2025-03-23 20:08:02.653738 | Job console starting...
2025-03-23 20:08:02.681923 | Updating repositories
2025-03-23 20:08:02.736744 | Preparing job workspace
2025-03-23 20:08:04.288200 | Running Ansible setup...
2025-03-23 20:08:09.192341 | PRE-RUN START: [trusted : github.com/osism/zuul-config/playbooks/base/pre.yaml@main]
2025-03-23 20:08:09.926117 |
2025-03-23 20:08:09.926258 | PLAY [Base pre]
2025-03-23 20:08:09.959025 |
2025-03-23 20:08:09.959157 | TASK [Setup log path fact]
2025-03-23 20:08:10.003684 | orchestrator | ok
2025-03-23 20:08:10.023269 |
2025-03-23 20:08:10.023426 | TASK [set-zuul-log-path-fact : Set log path for a build]
2025-03-23 20:08:10.055095 | orchestrator | ok
2025-03-23 20:08:10.070440 |
2025-03-23 20:08:10.070552 | TASK [emit-job-header : Print job information]
2025-03-23 20:08:10.123384 | # Job Information
2025-03-23 20:08:10.123559 | Ansible Version: 2.15.3
2025-03-23 20:08:10.123592 | Job: testbed-deploy-stable-in-a-nutshell-ubuntu-24.04
2025-03-23 20:08:10.123622 | Pipeline: post
2025-03-23 20:08:10.123643 | Executor: 7d211f194f6a
2025-03-23 20:08:10.123662 | Triggered by: https://github.com/osism/testbed/commit/3aa92b535ec3f95c42c00364e5f235ac84060556
2025-03-23 20:08:10.123681 | Event ID: 81fd230a-0822-11f0-9635-c8ee8c3c7e74
2025-03-23 20:08:10.130871 |
2025-03-23 20:08:10.130985 | LOOP [emit-job-header : Print node information]
2025-03-23 20:08:10.276948 | orchestrator | ok:
2025-03-23 20:08:10.277219 | orchestrator | # Node Information
2025-03-23 20:08:10.277259 | orchestrator | Inventory Hostname: orchestrator
2025-03-23 20:08:10.277284 | orchestrator | Hostname: zuul-static-regiocloud-infra-1
2025-03-23 20:08:10.277306 | orchestrator | Username: zuul-testbed02
2025-03-23 20:08:10.277326 | orchestrator | Distro: Debian 12.10
2025-03-23 20:08:10.277349 | orchestrator | Provider: static-testbed
2025-03-23 20:08:10.277369 | orchestrator | Label: testbed-orchestrator
2025-03-23 20:08:10.277428 | orchestrator | Product Name: OpenStack Nova
2025-03-23 20:08:10.277451 | orchestrator | Interface IP: 81.163.193.140
2025-03-23 20:08:10.296238 |
2025-03-23 20:08:10.296358 | TASK [log-inventory : Ensure Zuul Ansible directory exists]
2025-03-23 20:08:10.813110 | orchestrator -> localhost | changed
2025-03-23 20:08:10.822278 |
2025-03-23 20:08:10.822404 | TASK [log-inventory : Copy ansible inventory to logs dir]
2025-03-23 20:08:11.882928 | orchestrator -> localhost | changed
2025-03-23 20:08:11.905693 |
2025-03-23 20:08:11.905815 | TASK [add-build-sshkey : Check to see if ssh key was already created for this build]
2025-03-23 20:08:12.183652 | orchestrator -> localhost | ok
2025-03-23 20:08:12.200372 |
2025-03-23 20:08:12.200595 | TASK [add-build-sshkey : Create a new key in workspace based on build UUID]
2025-03-23 20:08:12.237460 | orchestrator | ok
2025-03-23 20:08:12.255061 | orchestrator | included: /var/lib/zuul/builds/86516234068b4cbdb5f6e1ee4173655a/trusted/project_1/opendev.org/zuul/zuul-jobs/roles/add-build-sshkey/tasks/create-key-and-replace.yaml
2025-03-23 20:08:12.263953 |
2025-03-23 20:08:12.264049 | TASK [add-build-sshkey : Create Temp SSH key]
2025-03-23 20:08:13.117188 | orchestrator -> localhost | Generating public/private rsa key pair.
2025-03-23 20:08:13.117732 | orchestrator -> localhost | Your identification has been saved in /var/lib/zuul/builds/86516234068b4cbdb5f6e1ee4173655a/work/86516234068b4cbdb5f6e1ee4173655a_id_rsa
2025-03-23 20:08:13.117839 | orchestrator -> localhost | Your public key has been saved in /var/lib/zuul/builds/86516234068b4cbdb5f6e1ee4173655a/work/86516234068b4cbdb5f6e1ee4173655a_id_rsa.pub
2025-03-23 20:08:13.117910 | orchestrator -> localhost | The key fingerprint is:
2025-03-23 20:08:13.117976 | orchestrator -> localhost | SHA256:5Ul1t6cjlE8Qd1r9oPMqM93sE6IMWt32taswepmupfQ zuul-build-sshkey
2025-03-23 20:08:13.118036 | orchestrator -> localhost | The key's randomart image is:
2025-03-23 20:08:13.118094 | orchestrator -> localhost | +---[RSA 3072]----+
2025-03-23 20:08:13.118150 | orchestrator -> localhost | | +.o =|
2025-03-23 20:08:13.118206 | orchestrator -> localhost | | . =.=o|
2025-03-23 20:08:13.118283 | orchestrator -> localhost | | o o.ooo|
2025-03-23 20:08:13.118341 | orchestrator -> localhost | | + ooo .o|
2025-03-23 20:08:13.118415 | orchestrator -> localhost | | S.o..o+ |
2025-03-23 20:08:13.118472 | orchestrator -> localhost | | o . +.o..|
2025-03-23 20:08:13.118545 | orchestrator -> localhost | | o + Bo* o.|
2025-03-23 20:08:13.118604 | orchestrator -> localhost | | . . @+= =. |
2025-03-23 20:08:13.118665 | orchestrator -> localhost | | ++E ooo.|
2025-03-23 20:08:13.118724 | orchestrator -> localhost | +----[SHA256]-----+
2025-03-23 20:08:13.119187 | orchestrator -> localhost | ok: Runtime: 0:00:00.356279
2025-03-23 20:08:13.136531 |
2025-03-23 20:08:13.136679 | TASK [add-build-sshkey : Remote setup ssh keys (linux)]
2025-03-23 20:08:13.170931 | orchestrator | ok
2025-03-23 20:08:13.183352 | orchestrator | included: /var/lib/zuul/builds/86516234068b4cbdb5f6e1ee4173655a/trusted/project_1/opendev.org/zuul/zuul-jobs/roles/add-build-sshkey/tasks/remote-linux.yaml
2025-03-23 20:08:13.194050 |
2025-03-23 20:08:13.194151 | TASK [add-build-sshkey : Remove previously added zuul-build-sshkey]
2025-03-23 20:08:13.228787 | orchestrator | skipping: Conditional result was False
2025-03-23 20:08:13.243543 |
2025-03-23 20:08:13.243675 | TASK [add-build-sshkey : Enable access via build key on all nodes]
2025-03-23 20:08:13.827882 | orchestrator | changed
2025-03-23 20:08:13.839520 |
2025-03-23 20:08:13.839643 | TASK [add-build-sshkey : Make sure user has a .ssh]
2025-03-23 20:08:14.113092 | orchestrator | ok
2025-03-23 20:08:14.125690 |
2025-03-23 20:08:14.125815 | TASK [add-build-sshkey : Install build private key as SSH key on all nodes]
2025-03-23 20:08:14.528375 | orchestrator | ok
2025-03-23 20:08:14.538927 |
2025-03-23 20:08:14.539195 | TASK [add-build-sshkey : Install build public key as SSH key on all nodes]
2025-03-23 20:08:14.941695 | orchestrator | ok
2025-03-23 20:08:14.949635 |
2025-03-23 20:08:14.949747 | TASK [add-build-sshkey : Remote setup ssh keys (windows)]
2025-03-23 20:08:14.976144 | orchestrator | skipping: Conditional result was False
2025-03-23 20:08:15.028020 |
2025-03-23 20:08:15.028147 | TASK [remove-zuul-sshkey : Remove master key from local agent]
2025-03-23 20:08:15.439470 | orchestrator -> localhost | changed
2025-03-23 20:08:15.455264 |
2025-03-23 20:08:15.455384 | TASK [add-build-sshkey : Add back temp key]
2025-03-23 20:08:15.802165 | orchestrator -> localhost | Identity added: /var/lib/zuul/builds/86516234068b4cbdb5f6e1ee4173655a/work/86516234068b4cbdb5f6e1ee4173655a_id_rsa (zuul-build-sshkey)
2025-03-23 20:08:15.802706 | orchestrator -> localhost | ok: Runtime: 0:00:00.015974
2025-03-23 20:08:15.818587 |
2025-03-23 20:08:15.818724 | TASK [add-build-sshkey : Verify we can still SSH to all nodes]
2025-03-23 20:08:16.182980 | orchestrator | ok
2025-03-23 20:08:16.192260 |
2025-03-23 20:08:16.192423 | TASK [add-build-sshkey : Verify we can still SSH to all nodes (windows)]
2025-03-23 20:08:16.227722 | orchestrator | skipping: Conditional result was False
2025-03-23 20:08:16.247603 |
2025-03-23 20:08:16.247722 | TASK [start-zuul-console : Start zuul_console daemon.]
2025-03-23 20:08:16.652121 | orchestrator | ok
2025-03-23 20:08:16.669314 |
2025-03-23 20:08:16.669445 | TASK [validate-host : Define zuul_info_dir fact]
2025-03-23 20:08:16.709628 | orchestrator | ok
2025-03-23 20:08:16.719018 |
2025-03-23 20:08:16.719122 | TASK [validate-host : Ensure Zuul Ansible directory exists]
2025-03-23 20:08:17.010136 | orchestrator -> localhost | ok
2025-03-23 20:08:17.019627 |
2025-03-23 20:08:17.019739 | TASK [validate-host : Collect information about the host]
2025-03-23 20:08:18.146383 | orchestrator | ok
2025-03-23 20:08:18.172070 |
2025-03-23 20:08:18.173029 | TASK [validate-host : Sanitize hostname]
2025-03-23 20:08:18.255852 | orchestrator | ok
2025-03-23 20:08:18.265217 |
2025-03-23 20:08:18.265341 | TASK [validate-host : Write out all ansible variables/facts known for each host]
2025-03-23 20:08:18.790359 | orchestrator -> localhost | changed
2025-03-23 20:08:18.803309 |
2025-03-23 20:08:18.803514 | TASK [validate-host : Collect information about zuul worker]
2025-03-23 20:08:19.329345 | orchestrator | ok
2025-03-23 20:08:19.340343 |
2025-03-23 20:08:19.340814 | TASK [validate-host : Write out all zuul information for each host]
2025-03-23 20:08:19.930461 | orchestrator -> localhost | changed
2025-03-23 20:08:19.944193 |
2025-03-23 20:08:19.944304 | TASK [prepare-workspace-log : Start zuul_console daemon.]
2025-03-23 20:08:20.242300 | orchestrator | ok
2025-03-23 20:08:20.263165 |
2025-03-23 20:08:20.263309 | TASK [prepare-workspace-log : Synchronize src repos to workspace directory.]
2025-03-23 20:09:05.448501 | orchestrator | changed:
2025-03-23 20:09:05.448756 | orchestrator | .d..t...... src/
2025-03-23 20:09:05.448795 | orchestrator | .d..t...... src/github.com/
2025-03-23 20:09:05.448820 | orchestrator | .d..t...... src/github.com/osism/
2025-03-23 20:09:05.448841 | orchestrator | .d..t...... src/github.com/osism/ansible-collection-commons/
2025-03-23 20:09:05.448862 | orchestrator | RedHat.yml
2025-03-23 20:09:05.462917 | orchestrator | .L..t...... src/github.com/osism/ansible-collection-commons/roles/repository/tasks/CentOS.yml -> RedHat.yml
2025-03-23 20:09:05.462935 | orchestrator | RedHat.yml
2025-03-23 20:09:05.462987 | orchestrator | = 2.2.0"...
2025-03-23 20:09:23.295612 | orchestrator | 20:09:23.295 STDOUT terraform: - Finding latest version of hashicorp/null...
2025-03-23 20:09:23.366711 | orchestrator | 20:09:23.366 STDOUT terraform: - Finding terraform-provider-openstack/openstack versions matching ">= 1.53.0"...
2025-03-23 20:09:24.357239 | orchestrator | 20:09:24.356 STDOUT terraform: - Installing terraform-provider-openstack/openstack v3.0.0...
2025-03-23 20:09:25.518767 | orchestrator | 20:09:25.518 STDOUT terraform: - Installed terraform-provider-openstack/openstack v3.0.0 (signed, key ID 4F80527A391BEFD2)
2025-03-23 20:09:26.691901 | orchestrator | 20:09:26.691 STDOUT terraform: - Installing hashicorp/local v2.5.2...
2025-03-23 20:09:27.668056 | orchestrator | 20:09:27.667 STDOUT terraform: - Installed hashicorp/local v2.5.2 (signed, key ID 0C0AF313E5FD9F80)
2025-03-23 20:09:28.813075 | orchestrator | 20:09:28.812 STDOUT terraform: - Installing hashicorp/null v3.2.3...
2025-03-23 20:09:29.508288 | orchestrator | 20:09:29.508 STDOUT terraform: - Installed hashicorp/null v3.2.3 (signed, key ID 0C0AF313E5FD9F80)
2025-03-23 20:09:29.508357 | orchestrator | 20:09:29.508 STDOUT terraform: Providers are signed by their developers.
2025-03-23 20:09:29.508370 | orchestrator | 20:09:29.508 STDOUT terraform: If you'd like to know more about provider signing, you can read about it here:
2025-03-23 20:09:29.508379 | orchestrator | 20:09:29.508 STDOUT terraform: https://opentofu.org/docs/cli/plugins/signing/
2025-03-23 20:09:29.508387 | orchestrator | 20:09:29.508 STDOUT terraform: OpenTofu has created a lock file .terraform.lock.hcl to record the provider
2025-03-23 20:09:29.508431 | orchestrator | 20:09:29.508 STDOUT terraform: selections it made above. Include this file in your version control repository
2025-03-23 20:09:29.508611 | orchestrator | 20:09:29.508 STDOUT terraform: so that OpenTofu can guarantee to make the same selections by default when
2025-03-23 20:09:29.508676 | orchestrator | 20:09:29.508 STDOUT terraform: you run "tofu init" in the future.
2025-03-23 20:09:29.508687 | orchestrator | 20:09:29.508 STDOUT terraform: OpenTofu has been successfully initialized!
2025-03-23 20:09:29.508698 | orchestrator | 20:09:29.508 STDOUT terraform: You may now begin working with OpenTofu. Try running "tofu plan" to see
2025-03-23 20:09:29.508726 | orchestrator | 20:09:29.508 STDOUT terraform: any changes that are required for your infrastructure. All OpenTofu commands
2025-03-23 20:09:29.508735 | orchestrator | 20:09:29.508 STDOUT terraform: should now work.
2025-03-23 20:09:29.508753 | orchestrator | 20:09:29.508 STDOUT terraform: If you ever set or change modules or backend configuration for OpenTofu,
2025-03-23 20:09:29.508789 | orchestrator | 20:09:29.508 STDOUT terraform: rerun this command to reinitialize your working directory. If you forget, other
2025-03-23 20:09:29.508822 | orchestrator | 20:09:29.508 STDOUT terraform: commands will detect it and remind you to do so if necessary.
2025-03-23 20:09:29.651733 | orchestrator | 20:09:29.650 WARN  The `TERRAGRUNT_TFPATH` environment variable is deprecated and will be removed in a future version of Terragrunt. Use `TG_TF_PATH=/home/zuul-testbed02/terraform` instead.
2025-03-23 20:09:29.842135 | orchestrator | 20:09:29.840 STDOUT terraform: Created and switched to workspace "ci"!
2025-03-23 20:09:30.016844 | orchestrator | 20:09:29.841 STDOUT terraform: You're now on a new, empty workspace. Workspaces isolate their state,
2025-03-23 20:09:30.016960 | orchestrator | 20:09:29.841 STDOUT terraform: so if you run "tofu plan" OpenTofu will not see any existing state
2025-03-23 20:09:30.016979 | orchestrator | 20:09:29.841 STDOUT terraform: for this configuration.
2025-03-23 20:09:30.017029 | orchestrator | 20:09:30.016 WARN  The `TERRAGRUNT_TFPATH` environment variable is deprecated and will be removed in a future version of Terragrunt. Use `TG_TF_PATH=/home/zuul-testbed02/terraform` instead.
2025-03-23 20:09:30.094297 | orchestrator | 20:09:30.094 STDOUT terraform: ci.auto.tfvars
2025-03-23 20:09:30.272380 | orchestrator | 20:09:30.272 WARN  The `TERRAGRUNT_TFPATH` environment variable is deprecated and will be removed in a future version of Terragrunt. Use `TG_TF_PATH=/home/zuul-testbed02/terraform` instead.
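For orientation, provider resolution like the init output above would typically come from a required_providers block roughly along these lines. This is only an illustrative sketch, not the actual osism/testbed Terraform configuration; the only constraint actually visible in this excerpt is ">= 1.53.0" for the openstack provider, everything else is an assumption.

    terraform {
      required_providers {
        # The log shows ">= 1.53.0" resolving to v3.0.0 and being recorded in .terraform.lock.hcl.
        openstack = {
          source  = "terraform-provider-openstack/openstack"
          version = ">= 1.53.0"
        }
        # "Finding latest version of hashicorp/null..." suggests no explicit constraint; v3.2.3 was selected.
        null = {
          source = "hashicorp/null"
        }
        # hashicorp/local v2.5.2 was installed; its constraint is not visible in this excerpt.
        local = {
          source = "hashicorp/local"
        }
      }
    }

Terragrunt then drives OpenTofu (hence the TERRAGRUNT_TFPATH/TG_TF_PATH deprecation warnings), creates and switches to the "ci" workspace, and picks up ci.auto.tfvars before planning.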
2025-03-23 20:09:31.102688 | orchestrator | 20:09:31.102 STDOUT terraform: data.openstack_networking_network_v2.public: Reading...
2025-03-23 20:09:31.602077 | orchestrator | 20:09:31.601 STDOUT terraform: data.openstack_networking_network_v2.public: Read complete after 1s [id=e6be7364-bfd8-4de7-8120-8f41c69a139a]
2025-03-23 20:09:31.813610 | orchestrator | 20:09:31.813 STDOUT terraform: OpenTofu used the selected providers to generate the following execution
2025-03-23 20:09:31.813687 | orchestrator | 20:09:31.813 STDOUT terraform: plan. Resource actions are indicated with the following symbols:
2025-03-23 20:09:31.813697 | orchestrator | 20:09:31.813 STDOUT terraform:  + create
2025-03-23 20:09:31.813880 | orchestrator | 20:09:31.813 STDOUT terraform:  <= read (data resources)
2025-03-23 20:09:31.813890 | orchestrator | 20:09:31.813 STDOUT terraform: OpenTofu will perform the following actions:
2025-03-23 20:09:31.813898 | orchestrator | 20:09:31.813 STDOUT terraform:  # data.openstack_images_image_v2.image will be read during apply
2025-03-23 20:09:31.813952 | orchestrator | 20:09:31.813 STDOUT terraform:  # (config refers to values not yet known)
2025-03-23 20:09:31.813965 | orchestrator | 20:09:31.813 STDOUT terraform:  <= data "openstack_images_image_v2" "image" {
2025-03-23 20:09:31.814038 | orchestrator | 20:09:31.813 STDOUT terraform:  + checksum = (known after apply)
2025-03-23 20:09:31.814049 | orchestrator | 20:09:31.813 STDOUT terraform:  + created_at = (known after apply)
2025-03-23 20:09:31.814055 | orchestrator | 20:09:31.813 STDOUT terraform:  + file = (known after apply)
2025-03-23 20:09:31.814061 | orchestrator | 20:09:31.814 STDOUT terraform:  + id = (known after apply)
2025-03-23 20:09:31.814119 | orchestrator | 20:09:31.814 STDOUT terraform:  + metadata = (known after apply)
2025-03-23 20:09:31.814198 | orchestrator | 20:09:31.814 STDOUT terraform:  + min_disk_gb = (known after apply)
2025-03-23 20:09:31.814274 | orchestrator | 20:09:31.814 STDOUT terraform:  + min_ram_mb = (known after apply)
2025-03-23 20:09:31.814286 | orchestrator | 20:09:31.814 STDOUT terraform:  + most_recent = true
2025-03-23 20:09:31.814293 | orchestrator | 20:09:31.814 STDOUT terraform:  + name = (known after apply)
2025-03-23 20:09:31.814314 | orchestrator | 20:09:31.814 STDOUT terraform:  + protected = (known after apply)
2025-03-23 20:09:31.814319 | orchestrator | 20:09:31.814 STDOUT terraform:  + region = (known after apply)
2025-03-23 20:09:31.814326 | orchestrator | 20:09:31.814 STDOUT terraform:  + schema = (known after apply)
2025-03-23 20:09:31.814365 | orchestrator | 20:09:31.814 STDOUT terraform:  + size_bytes = (known after apply)
2025-03-23 20:09:31.814374 | orchestrator | 20:09:31.814 STDOUT terraform:  + tags = (known after apply)
2025-03-23 20:09:31.814451 | orchestrator | 20:09:31.814 STDOUT terraform:  + updated_at = (known after apply)
2025-03-23 20:09:31.814588 | orchestrator | 20:09:31.814 STDOUT terraform:  }
2025-03-23 20:09:31.814601 | orchestrator | 20:09:31.814 STDOUT terraform:  # data.openstack_images_image_v2.image_node will be read during apply
2025-03-23 20:09:31.814609 | orchestrator | 20:09:31.814 STDOUT terraform:  # (config refers to values not yet known)
2025-03-23 20:09:31.814666 | orchestrator | 20:09:31.814 STDOUT terraform:  <= data "openstack_images_image_v2" "image_node" {
2025-03-23 20:09:31.814677 | orchestrator | 20:09:31.814 STDOUT terraform:  + checksum = (known after apply)
2025-03-23
20:09:31.814744 | orchestrator | 20:09:31.814 STDOUT terraform:  + created_at = (known after apply) 2025-03-23 20:09:31.814820 | orchestrator | 20:09:31.814 STDOUT terraform:  + file = (known after apply) 2025-03-23 20:09:31.814829 | orchestrator | 20:09:31.814 STDOUT terraform:  + id = (known after apply) 2025-03-23 20:09:31.814898 | orchestrator | 20:09:31.814 STDOUT terraform:  + metadata = (known after apply) 2025-03-23 20:09:31.814905 | orchestrator | 20:09:31.814 STDOUT terraform:  + min_disk_gb = (known after apply) 2025-03-23 20:09:31.814911 | orchestrator | 20:09:31.814 STDOUT terraform:  + min_ram_mb = (known after apply) 2025-03-23 20:09:31.814977 | orchestrator | 20:09:31.814 STDOUT terraform:  + most_recent = true 2025-03-23 20:09:31.814983 | orchestrator | 20:09:31.814 STDOUT terraform:  + name = (known after apply) 2025-03-23 20:09:31.814988 | orchestrator | 20:09:31.814 STDOUT terraform:  + protected = (known after apply) 2025-03-23 20:09:31.814995 | orchestrator | 20:09:31.814 STDOUT terraform:  + region = (known after apply) 2025-03-23 20:09:31.815054 | orchestrator | 20:09:31.814 STDOUT terraform:  + schema = (known after apply) 2025-03-23 20:09:31.815062 | orchestrator | 20:09:31.814 STDOUT terraform:  + size_bytes = (known after apply) 2025-03-23 20:09:31.815068 | orchestrator | 20:09:31.814 STDOUT terraform:  + tags = (known after apply) 2025-03-23 20:09:31.815183 | orchestrator | 20:09:31.815 STDOUT terraform:  + updated_at = (known after apply) 2025-03-23 20:09:31.815191 | orchestrator | 20:09:31.815 STDOUT terraform:  } 2025-03-23 20:09:31.815198 | orchestrator | 20:09:31.815 STDOUT terraform:  # local_file.MANAGER_ADDRESS will be created 2025-03-23 20:09:31.815260 | orchestrator | 20:09:31.815 STDOUT terraform:  + resource "local_file" "MANAGER_ADDRESS" { 2025-03-23 20:09:31.815269 | orchestrator | 20:09:31.815 STDOUT terraform:  + content = (known after apply) 2025-03-23 20:09:31.815339 | orchestrator | 20:09:31.815 STDOUT terraform:  + content_base64sha256 = (known after apply) 2025-03-23 20:09:31.815357 | orchestrator | 20:09:31.815 STDOUT terraform:  + content_base64sha512 = (known after apply) 2025-03-23 20:09:31.815417 | orchestrator | 20:09:31.815 STDOUT terraform:  + content_md5 = (known after apply) 2025-03-23 20:09:31.815429 | orchestrator | 20:09:31.815 STDOUT terraform:  + content_sha1 = (known after apply) 2025-03-23 20:09:31.815495 | orchestrator | 20:09:31.815 STDOUT terraform:  + content_sha256 = (known after apply) 2025-03-23 20:09:31.815504 | orchestrator | 20:09:31.815 STDOUT terraform:  + content_sha512 = (known after apply) 2025-03-23 20:09:31.815574 | orchestrator | 20:09:31.815 STDOUT terraform:  + directory_permission = "0777" 2025-03-23 20:09:31.815581 | orchestrator | 20:09:31.815 STDOUT terraform:  + file_permission = "0644" 2025-03-23 20:09:31.815589 | orchestrator | 20:09:31.815 STDOUT terraform:  + filename = ".MANAGER_ADDRESS.ci" 2025-03-23 20:09:31.815707 | orchestrator | 20:09:31.815 STDOUT terraform:  + id = (known after apply) 2025-03-23 20:09:31.815720 | orchestrator | 20:09:31.815 STDOUT terraform:  } 2025-03-23 20:09:31.815728 | orchestrator | 20:09:31.815 STDOUT terraform:  # local_file.id_rsa_pub will be created 2025-03-23 20:09:31.815778 | orchestrator | 20:09:31.815 STDOUT terraform:  + resource "local_file" "id_rsa_pub" { 2025-03-23 20:09:31.815787 | orchestrator | 20:09:31.815 STDOUT terraform:  + content = (known after apply) 2025-03-23 20:09:31.815859 | orchestrator | 20:09:31.815 STDOUT terraform:  + 
content_base64sha256 = (known after apply) 2025-03-23 20:09:31.815869 | orchestrator | 20:09:31.815 STDOUT terraform:  + content_base64sha512 = (known after apply) 2025-03-23 20:09:31.815938 | orchestrator | 20:09:31.815 STDOUT terraform:  + content_md5 = (known after apply) 2025-03-23 20:09:31.815947 | orchestrator | 20:09:31.815 STDOUT terraform:  + content_sha1 = (known after apply) 2025-03-23 20:09:31.816017 | orchestrator | 20:09:31.815 STDOUT terraform:  + content_sha256 = (known after apply) 2025-03-23 20:09:31.816027 | orchestrator | 20:09:31.815 STDOUT terraform:  + content_sha512 = (known after apply) 2025-03-23 20:09:31.816033 | orchestrator | 20:09:31.815 STDOUT terraform:  + directory_permission = "0777" 2025-03-23 20:09:31.816038 | orchestrator | 20:09:31.815 STDOUT terraform:  + file_permission = "0644" 2025-03-23 20:09:31.816045 | orchestrator | 20:09:31.816 STDOUT terraform:  + filename = ".id_rsa.ci.pub" 2025-03-23 20:09:31.816096 | orchestrator | 20:09:31.816 STDOUT terraform:  + id = (known after apply) 2025-03-23 20:09:31.816200 | orchestrator | 20:09:31.816 STDOUT terraform:  } 2025-03-23 20:09:31.816211 | orchestrator | 20:09:31.816 STDOUT terraform:  # local_file.inventory will be created 2025-03-23 20:09:31.816218 | orchestrator | 20:09:31.816 STDOUT terraform:  + resource "local_file" "inventory" { 2025-03-23 20:09:31.816280 | orchestrator | 20:09:31.816 STDOUT terraform:  + content = (known after apply) 2025-03-23 20:09:31.816288 | orchestrator | 20:09:31.816 STDOUT terraform:  + content_base64sha256 = (known after apply) 2025-03-23 20:09:31.816359 | orchestrator | 20:09:31.816 STDOUT terraform:  + content_base64sha512 = (known after apply) 2025-03-23 20:09:31.816436 | orchestrator | 20:09:31.816 STDOUT terraform:  + content_md5 = (known after apply) 2025-03-23 20:09:31.816485 | orchestrator | 20:09:31.816 STDOUT terraform:  + content_sha1 = (known after apply) 2025-03-23 20:09:31.816514 | orchestrator | 20:09:31.816 STDOUT terraform:  + content_sha256 = (known after apply) 2025-03-23 20:09:31.816521 | orchestrator | 20:09:31.816 STDOUT terraform:  + content_sha512 = (known after apply) 2025-03-23 20:09:31.816528 | orchestrator | 20:09:31.816 STDOUT terraform:  + directory_permission = "0777" 2025-03-23 20:09:31.816583 | orchestrator | 20:09:31.816 STDOUT terraform:  + file_permission = "0644" 2025-03-23 20:09:31.816660 | orchestrator | 20:09:31.816 STDOUT terraform:  + filename = "inventory.ci" 2025-03-23 20:09:31.816669 | orchestrator | 20:09:31.816 STDOUT terraform:  + id = (known after apply) 2025-03-23 20:09:31.816767 | orchestrator | 20:09:31.816 STDOUT terraform:  } 2025-03-23 20:09:31.816777 | orchestrator | 20:09:31.816 STDOUT terraform:  # local_sensitive_file.id_rsa will be created 2025-03-23 20:09:31.816784 | orchestrator | 20:09:31.816 STDOUT terraform:  + resource "local_sensitive_file" "id_rsa" { 2025-03-23 20:09:31.816847 | orchestrator | 20:09:31.816 STDOUT terraform:  + content = (sensitive value) 2025-03-23 20:09:31.816856 | orchestrator | 20:09:31.816 STDOUT terraform:  + content_base64sha256 = (known after apply) 2025-03-23 20:09:31.816927 | orchestrator | 20:09:31.816 STDOUT terraform:  + content_base64sha512 = (known after apply) 2025-03-23 20:09:31.817003 | orchestrator | 20:09:31.816 STDOUT terraform:  + content_md5 = (known after apply) 2025-03-23 20:09:31.817017 | orchestrator | 20:09:31.816 STDOUT terraform:  + content_sha1 = (known after apply) 2025-03-23 20:09:31.817023 | orchestrator | 20:09:31.816 STDOUT terraform:  + 
content_sha256 = (known after apply) 2025-03-23 20:09:31.817029 | orchestrator | 20:09:31.816 STDOUT terraform:  + content_sha512 = (known after apply) 2025-03-23 20:09:31.817038 | orchestrator | 20:09:31.817 STDOUT terraform:  + directory_permission = "0700" 2025-03-23 20:09:31.817083 | orchestrator | 20:09:31.817 STDOUT terraform:  + file_permission = "0600" 2025-03-23 20:09:31.817091 | orchestrator | 20:09:31.817 STDOUT terraform:  + filename = ".id_rsa.ci" 2025-03-23 20:09:31.817174 | orchestrator | 20:09:31.817 STDOUT terraform:  + id = (known after apply) 2025-03-23 20:09:31.817246 | orchestrator | 20:09:31.817 STDOUT terraform:  } 2025-03-23 20:09:31.817257 | orchestrator | 20:09:31.817 STDOUT terraform:  # null_resource.node_semaphore will be created 2025-03-23 20:09:31.817381 | orchestrator | 20:09:31.817 STDOUT terraform:  + resource "null_resource" "node_semaphore" { 2025-03-23 20:09:31.817398 | orchestrator | 20:09:31.817 STDOUT terraform:  + id = (known after apply) 2025-03-23 20:09:31.817404 | orchestrator | 20:09:31.817 STDOUT terraform:  } 2025-03-23 20:09:31.817411 | orchestrator | 20:09:31.817 STDOUT terraform:  # openstack_blockstorage_volume_v3.manager_base_volume[0] will be created 2025-03-23 20:09:31.817419 | orchestrator | 20:09:31.817 STDOUT terraform:  + resource "openstack_blockstorage_volume_v3" "manager_base_volume" { 2025-03-23 20:09:31.817481 | orchestrator | 20:09:31.817 STDOUT terraform:  + attachment = (known after apply) 2025-03-23 20:09:31.817558 | orchestrator | 20:09:31.817 STDOUT terraform:  + availability_zone = "nova" 2025-03-23 20:09:31.817578 | orchestrator | 20:09:31.817 STDOUT terraform:  + id = (known after apply) 2025-03-23 20:09:31.817637 | orchestrator | 20:09:31.817 STDOUT terraform:  + image_id = (known after apply) 2025-03-23 20:09:31.817646 | orchestrator | 20:09:31.817 STDOUT terraform:  + metadata = (known after apply) 2025-03-23 20:09:31.817654 | orchestrator | 20:09:31.817 STDOUT terraform:  + name = "testbed-volume-manager-base" 2025-03-23 20:09:31.817717 | orchestrator | 20:09:31.817 STDOUT terraform:  + region = (known after apply) 2025-03-23 20:09:31.817726 | orchestrator | 20:09:31.817 STDOUT terraform:  + size = 80 2025-03-23 20:09:31.817734 | orchestrator | 20:09:31.817 STDOUT terraform:  + volume_type = "ssd" 2025-03-23 20:09:31.817833 | orchestrator | 20:09:31.817 STDOUT terraform:  } 2025-03-23 20:09:31.817845 | orchestrator | 20:09:31.817 STDOUT terraform:  # openstack_blockstorage_volume_v3.node_base_volume[0] will be created 2025-03-23 20:09:31.817914 | orchestrator | 20:09:31.817 STDOUT terraform:  + resource "openstack_blockstorage_volume_v3" "node_base_volume" { 2025-03-23 20:09:31.817924 | orchestrator | 20:09:31.817 STDOUT terraform:  + attachment = (known after apply) 2025-03-23 20:09:31.817989 | orchestrator | 20:09:31.817 STDOUT terraform:  + availability_zone = "nova" 2025-03-23 20:09:31.817997 | orchestrator | 20:09:31.817 STDOUT terraform:  + id = (known after apply) 2025-03-23 20:09:31.818004 | orchestrator | 20:09:31.817 STDOUT terraform:  + image_id = (known after apply) 2025-03-23 20:09:31.818041 | orchestrator | 20:09:31.817 STDOUT terraform:  + metadata = (known after apply) 2025-03-23 20:09:31.818052 | orchestrator | 20:09:31.817 STDOUT terraform:  + name = "testbed-volume-0-node-base" 2025-03-23 20:09:31.818100 | orchestrator | 20:09:31.818 STDOUT terraform:  + region = (known after apply) 2025-03-23 20:09:31.818175 | orchestrator | 20:09:31.818 STDOUT terraform:  + size = 80 2025-03-23 20:09:31.818188 
| orchestrator | 20:09:31.818 STDOUT terraform:  + volume_type = "ssd" 2025-03-23 20:09:31.818259 | orchestrator | 20:09:31.818 STDOUT terraform:  } 2025-03-23 20:09:31.818277 | orchestrator | 20:09:31.818 STDOUT terraform:  # openstack_blockstorage_volume_v3.node_base_volume[1] will be created 2025-03-23 20:09:31.818306 | orchestrator | 20:09:31.818 STDOUT terraform:  + resource "openstack_blockstorage_volume_v3" "node_base_volume" { 2025-03-23 20:09:31.818336 | orchestrator | 20:09:31.818 STDOUT terraform:  + attachment = (known after apply) 2025-03-23 20:09:31.818377 | orchestrator | 20:09:31.818 STDOUT terraform:  + availability_zone = "nova" 2025-03-23 20:09:31.818385 | orchestrator | 20:09:31.818 STDOUT terraform:  + id = (known after apply) 2025-03-23 20:09:31.818419 | orchestrator | 20:09:31.818 STDOUT terraform:  + image_id = (known after apply) 2025-03-23 20:09:31.818476 | orchestrator | 20:09:31.818 STDOUT terraform:  + metadata = (known after apply) 2025-03-23 20:09:31.818516 | orchestrator | 20:09:31.818 STDOUT terraform:  + name = "testbed-volume-1-node-base" 2025-03-23 20:09:31.818566 | orchestrator | 20:09:31.818 STDOUT terraform:  + region = (known after apply) 2025-03-23 20:09:31.818588 | orchestrator | 20:09:31.818 STDOUT terraform:  + size = 80 2025-03-23 20:09:31.818596 | orchestrator | 20:09:31.818 STDOUT terraform:  + volume_type = "ssd" 2025-03-23 20:09:31.818603 | orchestrator | 20:09:31.818 STDOUT terraform:  } 2025-03-23 20:09:31.818746 | orchestrator | 20:09:31.818 STDOUT terraform:  # openstack_blockstorage_volume_v3.node_base_volume[2] will be created 2025-03-23 20:09:31.818794 | orchestrator | 20:09:31.818 STDOUT terraform:  + resource "openstack_blockstorage_volume_v3" "node_base_volume" { 2025-03-23 20:09:31.818827 | orchestrator | 20:09:31.818 STDOUT terraform:  + attachment = (known after apply) 2025-03-23 20:09:31.818868 | orchestrator | 20:09:31.818 STDOUT terraform:  + availability_zone = "nova" 2025-03-23 20:09:31.818875 | orchestrator | 20:09:31.818 STDOUT terraform:  + id = (known after apply) 2025-03-23 20:09:31.818912 | orchestrator | 20:09:31.818 STDOUT terraform:  + image_id = (known after apply) 2025-03-23 20:09:31.818955 | orchestrator | 20:09:31.818 STDOUT terraform:  + metadata = (known after apply) 2025-03-23 20:09:31.818988 | orchestrator | 20:09:31.818 STDOUT terraform:  + name = "testbed-volume-2-node-base" 2025-03-23 20:09:31.819038 | orchestrator | 20:09:31.818 STDOUT terraform:  + region = (known after apply) 2025-03-23 20:09:31.819058 | orchestrator | 20:09:31.819 STDOUT terraform:  + size = 80 2025-03-23 20:09:31.819066 | orchestrator | 20:09:31.819 STDOUT terraform:  + volume_type = "ssd" 2025-03-23 20:09:31.819184 | orchestrator | 20:09:31.819 STDOUT terraform:  } 2025-03-23 20:09:31.819192 | orchestrator | 20:09:31.819 STDOUT terraform:  # openstack_blockstorage_volume_v3.node_base_volume[3] will be created 2025-03-23 20:09:31.819234 | orchestrator | 20:09:31.819 STDOUT terraform:  + resource "openstack_blockstorage_volume_v3" "node_base_volume" { 2025-03-23 20:09:31.819263 | orchestrator | 20:09:31.819 STDOUT terraform:  + attachment = (known after apply) 2025-03-23 20:09:31.819295 | orchestrator | 20:09:31.819 STDOUT terraform:  + availability_zone = "nova" 2025-03-23 20:09:31.819320 | orchestrator | 20:09:31.819 STDOUT terraform:  + id = (known after apply) 2025-03-23 20:09:31.819351 | orchestrator | 20:09:31.819 STDOUT terraform:  + image_id = (known after apply) 2025-03-23 20:09:31.819397 | orchestrator | 20:09:31.819 STDOUT 
terraform:  + metadata = (known after apply) 2025-03-23 20:09:31.819422 | orchestrator | 20:09:31.819 STDOUT terraform:  + name = "testbed-volume-3-node-base" 2025-03-23 20:09:31.819495 | orchestrator | 20:09:31.819 STDOUT terraform:  + region = (known after apply) 2025-03-23 20:09:31.819505 | orchestrator | 20:09:31.819 STDOUT terraform:  + size = 80 2025-03-23 20:09:31.819529 | orchestrator | 20:09:31.819 STDOUT terraform:  + volume_type = "ssd" 2025-03-23 20:09:31.819537 | orchestrator | 20:09:31.819 STDOUT terraform:  } 2025-03-23 20:09:31.819637 | orchestrator | 20:09:31.819 STDOUT terraform:  # openstack_blockstorage_volume_v3.node_base_volume[4] will be created 2025-03-23 20:09:31.819689 | orchestrator | 20:09:31.819 STDOUT terraform:  + resource "openstack_blockstorage_volume_v3" "node_base_volume" { 2025-03-23 20:09:31.819713 | orchestrator | 20:09:31.819 STDOUT terraform:  + attachment = (known after apply) 2025-03-23 20:09:31.819728 | orchestrator | 20:09:31.819 STDOUT terraform:  + availability_zone = "nova" 2025-03-23 20:09:31.819776 | orchestrator | 20:09:31.819 STDOUT terraform:  + id = (known after apply) 2025-03-23 20:09:31.819795 | orchestrator | 20:09:31.819 STDOUT terraform:  + image_id = (known after apply) 2025-03-23 20:09:31.819826 | orchestrator | 20:09:31.819 STDOUT terraform:  + metadata = (known after apply) 2025-03-23 20:09:31.819866 | orchestrator | 20:09:31.819 STDOUT terraform:  + name = "testbed-volume-4-node-base" 2025-03-23 20:09:31.819896 | orchestrator | 20:09:31.819 STDOUT terraform:  + region = (known after apply) 2025-03-23 20:09:31.819917 | orchestrator | 20:09:31.819 STDOUT terraform:  + size = 80 2025-03-23 20:09:31.819933 | orchestrator | 20:09:31.819 STDOUT terraform:  + volume_type = "ssd" 2025-03-23 20:09:31.819939 | orchestrator | 20:09:31.819 STDOUT terraform:  } 2025-03-23 20:09:31.820058 | orchestrator | 20:09:31.820 STDOUT terraform:  # openstack_blockstorage_volume_v3.node_base_volume[5] will be created 2025-03-23 20:09:31.820105 | orchestrator | 20:09:31.820 STDOUT terraform:  + resource "openstack_blockstorage_volume_v3" "node_base_volume" { 2025-03-23 20:09:31.820136 | orchestrator | 20:09:31.820 STDOUT terraform:  + attachment = (known after apply) 2025-03-23 20:09:31.820168 | orchestrator | 20:09:31.820 STDOUT terraform:  + availability_zone = "nova" 2025-03-23 20:09:31.820197 | orchestrator | 20:09:31.820 STDOUT terraform:  + id = (known after apply) 2025-03-23 20:09:31.820230 | orchestrator | 20:09:31.820 STDOUT terraform:  + image_id = (known after apply) 2025-03-23 20:09:31.820268 | orchestrator | 20:09:31.820 STDOUT terraform:  + metadata = (known after apply) 2025-03-23 20:09:31.820300 | orchestrator | 20:09:31.820 STDOUT terraform:  + name = "testbed-volume-5-node-base" 2025-03-23 20:09:31.820339 | orchestrator | 20:09:31.820 STDOUT terraform:  + region = (known after apply) 2025-03-23 20:09:31.820346 | orchestrator | 20:09:31.820 STDOUT terraform:  + size = 80 2025-03-23 20:09:31.820371 | orchestrator | 20:09:31.820 STDOUT terraform:  + volume_type = "ssd" 2025-03-23 20:09:31.820378 | orchestrator | 20:09:31.820 STDOUT terraform:  } 2025-03-23 20:09:31.820498 | orchestrator | 20:09:31.820 STDOUT terraform:  # openstack_blockstorage_volume_v3.node_volume[0] will be created 2025-03-23 20:09:31.820542 | orchestrator | 20:09:31.820 STDOUT terraform:  + resource "openstack_blockstorage_volume_v3" "node_volume" { 2025-03-23 20:09:31.820573 | orchestrator | 20:09:31.820 STDOUT terraform:  + attachment = (known after apply) 
2025-03-23 20:09:31.820615 | orchestrator | 20:09:31.820 STDOUT terraform:  + availability_zone = "nova" 2025-03-23 20:09:31.820624 | orchestrator | 20:09:31.820 STDOUT terraform:  + id = (known after apply) 2025-03-23 20:09:31.820656 | orchestrator | 20:09:31.820 STDOUT terraform:  + metadata = (known after apply) 2025-03-23 20:09:31.820699 | orchestrator | 20:09:31.820 STDOUT terraform:  + name = "testbed-volume-0-node-0" 2025-03-23 20:09:31.820724 | orchestrator | 20:09:31.820 STDOUT terraform:  + region = (known after apply) 2025-03-23 20:09:31.820745 | orchestrator | 20:09:31.820 STDOUT terraform:  + size = 20 2025-03-23 20:09:31.820784 | orchestrator | 20:09:31.820 STDOUT terraform:  + volume_type = "ssd" 2025-03-23 20:09:31.820897 | orchestrator | 20:09:31.820 STDOUT terraform:  } 2025-03-23 20:09:31.820905 | orchestrator | 20:09:31.820 STDOUT terraform:  # openstack_blockstorage_volume_v3.node_volume[1] will be created 2025-03-23 20:09:31.820947 | orchestrator | 20:09:31.820 STDOUT terraform:  + resource "openstack_blockstorage_volume_v3" "node_volume" { 2025-03-23 20:09:31.820979 | orchestrator | 20:09:31.820 STDOUT terraform:  + attachment = (known after apply) 2025-03-23 20:09:31.821015 | orchestrator | 20:09:31.820 STDOUT terraform:  + availability_zone = "nova" 2025-03-23 20:09:31.821035 | orchestrator | 20:09:31.820 STDOUT terraform:  + id = (known after apply) 2025-03-23 20:09:31.821067 | orchestrator | 20:09:31.821 STDOUT terraform:  + metadata = (known after apply) 2025-03-23 20:09:31.821106 | orchestrator | 20:09:31.821 STDOUT terraform:  + name = "testbed-volume-1-node-1" 2025-03-23 20:09:31.821138 | orchestrator | 20:09:31.821 STDOUT terraform:  + region = (known after apply) 2025-03-23 20:09:31.821168 | orchestrator | 20:09:31.821 STDOUT terraform:  + size = 20 2025-03-23 20:09:31.821176 | orchestrator | 20:09:31.821 STDOUT terraform:  + volume_type = "ssd" 2025-03-23 20:09:31.821195 | orchestrator | 20:09:31.821 STDOUT terraform:  } 2025-03-23 20:09:31.821304 | orchestrator | 20:09:31.821 STDOUT terraform:  # openstack_blockstorage_volume_v3.node_volume[2] will be created 2025-03-23 20:09:31.821360 | orchestrator | 20:09:31.821 STDOUT terraform:  + resource "openstack_blockstorage_volume_v3" "node_volume" { 2025-03-23 20:09:31.821382 | orchestrator | 20:09:31.821 STDOUT terraform:  + attachment = (known after apply) 2025-03-23 20:09:31.821405 | orchestrator | 20:09:31.821 STDOUT terraform:  + availability_zone = "nova" 2025-03-23 20:09:31.821447 | orchestrator | 20:09:31.821 STDOUT terraform:  + id = (known after apply) 2025-03-23 20:09:31.821480 | orchestrator | 20:09:31.821 STDOUT terraform:  + metadata = (known after apply) 2025-03-23 20:09:31.821533 | orchestrator | 20:09:31.821 STDOUT terraform:  + name = "testbed-volume-2-node-2" 2025-03-23 20:09:31.821540 | orchestrator | 20:09:31.821 STDOUT terraform:  + region = (known after apply) 2025-03-23 20:09:31.821585 | orchestrator | 20:09:31.821 STDOUT terraform:  + size = 20 2025-03-23 20:09:31.821620 | orchestrator | 20:09:31.821 STDOUT terraform:  + volume_type = "ssd" 2025-03-23 20:09:31.821729 | orchestrator | 20:09:31.821 STDOUT terraform:  } 2025-03-23 20:09:31.821736 | orchestrator | 20:09:31.821 STDOUT terraform:  # openstack_blockstorage_volume_v3.node_volume[3] will be created 2025-03-23 20:09:31.821789 | orchestrator | 20:09:31.821 STDOUT terraform:  + resource "openstack_blockstorage_volume_v3" "node_volume" { 2025-03-23 20:09:31.821809 | orchestrator | 20:09:31.821 STDOUT terraform:  + attachment = 
(known after apply) 2025-03-23 20:09:31.821830 | orchestrator | 20:09:31.821 STDOUT terraform:  + availability_zone = "nova" 2025-03-23 20:09:31.821875 | orchestrator | 20:09:31.821 STDOUT terraform:  + id = (known after apply) 2025-03-23 20:09:31.821895 | orchestrator | 20:09:31.821 STDOUT terraform:  + metadata = (known after apply) 2025-03-23 20:09:31.821933 | orchestrator | 20:09:31.821 STDOUT terraform:  + name = "testbed-volume-3-node-3" 2025-03-23 20:09:31.821971 | orchestrator | 20:09:31.821 STDOUT terraform:  + region = (known after apply) 2025-03-23 20:09:31.821990 | orchestrator | 20:09:31.821 STDOUT terraform:  + size = 20 2025-03-23 20:09:31.822008 | orchestrator | 20:09:31.821 STDOUT terraform:  + volume_type = "ssd" 2025-03-23 20:09:31.822029 | orchestrator | 20:09:31.821 STDOUT terraform:  } 2025-03-23 20:09:31.826284 | orchestrator | 20:09:31.826 STDOUT terraform:  # openstack_blockstorage_volume_v3.node_volume[4] will be created 2025-03-23 20:09:31.826374 | orchestrator | 20:09:31.826 STDOUT terraform:  + resource "openstack_blockstorage_volume_v3" "node_volume" { 2025-03-23 20:09:31.826438 | orchestrator | 20:09:31.826 STDOUT terraform:  + attachment = (known after apply) 2025-03-23 20:09:31.826493 | orchestrator | 20:09:31.826 STDOUT terraform:  + availability_zone = "nova" 2025-03-23 20:09:31.826544 | orchestrator | 20:09:31.826 STDOUT terraform:  + id = (known after apply) 2025-03-23 20:09:31.826601 | orchestrator | 20:09:31.826 STDOUT terraform:  + metadata = (known after apply) 2025-03-23 20:09:31.826660 | orchestrator | 20:09:31.826 STDOUT terraform:  + name = "testbed-volume-4-node-4" 2025-03-23 20:09:31.826709 | orchestrator | 20:09:31.826 STDOUT terraform:  + region = (known after apply) 2025-03-23 20:09:31.826755 | orchestrator | 20:09:31.826 STDOUT terraform:  + size = 20 2025-03-23 20:09:31.826787 | orchestrator | 20:09:31.826 STDOUT terraform:  + volume_type = "ssd" 2025-03-23 20:09:31.826821 | orchestrator | 20:09:31.826 STDOUT terraform:  } 2025-03-23 20:09:31.826882 | orchestrator | 20:09:31.826 STDOUT terraform:  # openstack_blockstorage_volume_v3.node_volume[5] will be created 2025-03-23 20:09:31.826949 | orchestrator | 20:09:31.826 STDOUT terraform:  + resource "openstack_blockstorage_volume_v3" "node_volume" { 2025-03-23 20:09:31.827004 | orchestrator | 20:09:31.826 STDOUT terraform:  + attachment = (known after apply) 2025-03-23 20:09:31.827035 | orchestrator | 20:09:31.827 STDOUT terraform:  + availability_zone = "nova" 2025-03-23 20:09:31.827089 | orchestrator | 20:09:31.827 STDOUT terraform:  + id = (known after apply) 2025-03-23 20:09:31.827140 | orchestrator | 20:09:31.827 STDOUT terraform:  + metadata = (known after apply) 2025-03-23 20:09:31.827196 | orchestrator | 20:09:31.827 STDOUT terraform:  + name = "testbed-volume-5-node-5" 2025-03-23 20:09:31.827250 | orchestrator | 20:09:31.827 STDOUT terraform:  + region = (known after apply) 2025-03-23 20:09:31.827281 | orchestrator | 20:09:31.827 STDOUT terraform:  + size = 20 2025-03-23 20:09:31.827327 | orchestrator | 20:09:31.827 STDOUT terraform:  + volume_type = "ssd" 2025-03-23 20:09:31.827352 | orchestrator | 20:09:31.827 STDOUT terraform:  } 2025-03-23 20:09:31.827423 | orchestrator | 20:09:31.827 STDOUT terraform:  # openstack_blockstorage_volume_v3.node_volume[6] will be created 2025-03-23 20:09:31.827508 | orchestrator | 20:09:31.827 STDOUT terraform:  + resource "openstack_blockstorage_volume_v3" "node_volume" { 2025-03-23 20:09:31.827565 | orchestrator | 20:09:31.827 STDOUT terraform:  
+ attachment = (known after apply) 2025-03-23 20:09:31.827605 | orchestrator | 20:09:31.827 STDOUT terraform:  + availability_zone = "nova" 2025-03-23 20:09:31.827662 | orchestrator | 20:09:31.827 STDOUT terraform:  + id = (known after apply) 2025-03-23 20:09:31.827717 | orchestrator | 20:09:31.827 STDOUT terraform:  + metadata = (known after apply) 2025-03-23 20:09:31.827768 | orchestrator | 20:09:31.827 STDOUT terraform:  + name = "testbed-volume-6-node-0" 2025-03-23 20:09:31.827823 | orchestrator | 20:09:31.827 STDOUT terraform:  + region = (known after apply) 2025-03-23 20:09:31.827863 | orchestrator | 20:09:31.827 STDOUT terraform:  + size = 20 2025-03-23 20:09:31.827902 | orchestrator | 20:09:31.827 STDOUT terraform:  + volume_type = "ssd" 2025-03-23 20:09:31.827927 | orchestrator | 20:09:31.827 STDOUT terraform:  } 2025-03-23 20:09:31.827995 | orchestrator | 20:09:31.827 STDOUT terraform:  # openstack_blockstorage_volume_v3.node_volume[7] will be created 2025-03-23 20:09:31.828062 | orchestrator | 20:09:31.828 STDOUT terraform:  + resource "openstack_blockstorage_volume_v3" "node_volume" { 2025-03-23 20:09:31.828117 | orchestrator | 20:09:31.828 STDOUT terraform:  + attachment = (known after apply) 2025-03-23 20:09:31.828148 | orchestrator | 20:09:31.828 STDOUT terraform:  + availability_zone = "nova" 2025-03-23 20:09:31.828202 | orchestrator | 20:09:31.828 STDOUT terraform:  + id = (known after apply) 2025-03-23 20:09:31.828243 | orchestrator | 20:09:31.828 STDOUT terraform:  + metadata = (known after apply) 2025-03-23 20:09:31.828304 | orchestrator | 20:09:31.828 STDOUT terraform:  + name = "testbed-volume-7-node-1" 2025-03-23 20:09:31.828361 | orchestrator | 20:09:31.828 STDOUT terraform:  + region = (known after apply) 2025-03-23 20:09:31.828393 | orchestrator | 20:09:31.828 STDOUT terraform:  + size = 20 2025-03-23 20:09:31.828437 | orchestrator | 20:09:31.828 STDOUT terraform:  + volume_type = "ssd" 2025-03-23 20:09:31.828475 | orchestrator | 20:09:31.828 STDOUT terraform:  } 2025-03-23 20:09:31.828544 | orchestrator | 20:09:31.828 STDOUT terraform:  # openstack_blockstorage_volume_v3.node_volume[8] will be created 2025-03-23 20:09:31.828615 | orchestrator | 20:09:31.828 STDOUT terraform:  + resource "openstack_blockstorage_volume_v3" "node_volume" { 2025-03-23 20:09:31.828670 | orchestrator | 20:09:31.828 STDOUT terraform:  + attachment = (known after apply) 2025-03-23 20:09:31.828706 | orchestrator | 20:09:31.828 STDOUT terraform:  + availability_zone = "nova" 2025-03-23 20:09:31.828761 | orchestrator | 20:09:31.828 STDOUT terraform:  + id = (known after apply) 2025-03-23 20:09:31.828817 | orchestrator | 20:09:31.828 STDOUT terraform:  + metadata = (known after apply) 2025-03-23 20:09:31.828864 | orchestrator | 20:09:31.828 STDOUT terraform:  + name = "testbed-volume-8-node-2" 2025-03-23 20:09:31.828918 | orchestrator | 20:09:31.828 STDOUT terraform:  + region = (known after apply) 2025-03-23 20:09:31.828957 | orchestrator | 20:09:31.828 STDOUT terraform:  + size = 20 2025-03-23 20:09:31.828993 | orchestrator | 20:09:31.828 STDOUT terraform:  + volume_type = "ssd" 2025-03-23 20:09:31.829016 | orchestrator | 20:09:31.829 STDOUT terraform:  } 2025-03-23 20:09:31.829094 | orchestrator | 20:09:31.829 STDOUT terraform:  # openstack_blockstorage_volume_v3.node_volume[9] will be created 2025-03-23 20:09:31.829163 | orchestrator | 20:09:31.829 STDOUT terraform:  + resource "openstack_blockstorage_volume_v3" "node_volume" { 2025-03-23 20:09:31.829230 | orchestrator | 20:09:31.829 
STDOUT terraform:  + attachment = (known after apply) 2025-03-23 20:09:31.829261 | orchestrator | 20:09:31.829 STDOUT terraform:  + availability_zone = "nova" 2025-03-23 20:09:31.829316 | orchestrator | 20:09:31.829 STDOUT terraform:  + id = (known after apply) 2025-03-23 20:09:31.829371 | orchestrator | 20:09:31.829 STDOUT terraform:  + metadata = (known after apply) 2025-03-23 20:09:31.829418 | orchestrator | 20:09:31.829 STDOUT terraform:  + name = "testbed-volume-9-node-3" 2025-03-23 20:09:31.829500 | orchestrator | 20:09:31.829 STDOUT terraform:  + region = (known after apply) 2025-03-23 20:09:31.829547 | orchestrator | 20:09:31.829 STDOUT terraform:  + size = 20 2025-03-23 20:09:31.829578 | orchestrator | 20:09:31.829 STDOUT terraform:  + volume_type = "ssd" 2025-03-23 20:09:31.829615 | orchestrator | 20:09:31.829 STDOUT terraform:  } 2025-03-23 20:09:31.829685 | orchestrator | 20:09:31.829 STDOUT terraform:  # openstack_blockstorage_volume_v3.node_volume[10] will be created 2025-03-23 20:09:31.829763 | orchestrator | 20:09:31.829 STDOUT terraform:  + resource "openstack_blockstorage_volume_v3" "node_volume" { 2025-03-23 20:09:31.829847 | orchestrator | 20:09:31.829 STDOUT terraform:  + attachment = (known after apply) 2025-03-23 20:09:31.829881 | orchestrator | 20:09:31.829 STDOUT terraform:  + availability_zone = "nova" 2025-03-23 20:09:31.829953 | orchestrator | 20:09:31.829 STDOUT terraform:  + id = (known after apply) 2025-03-23 20:09:31.830026 | orchestrator | 20:09:31.829 STDOUT terraform:  + metadata = (known after apply) 2025-03-23 20:09:31.830089 | orchestrator | 20:09:31.830 STDOUT terraform:  + name = "testbed-volume-10-node-4" 2025-03-23 20:09:31.830149 | orchestrator | 20:09:31.830 STDOUT terraform:  + region = (known after apply) 2025-03-23 20:09:31.830180 | orchestrator | 20:09:31.830 STDOUT terraform:  + size = 20 2025-03-23 20:09:31.830212 | orchestrator | 20:09:31.830 STDOUT terraform:  + volume_type = "ssd" 2025-03-23 20:09:31.830235 | orchestrator | 20:09:31.830 STDOUT terraform:  } 2025-03-23 20:09:31.830303 | orchestrator | 20:09:31.830 STDOUT terraform:  # openstack_blockstorage_volume_v3.node_volume[11] will be created 2025-03-23 20:09:31.830356 | orchestrator | 20:09:31.830 STDOUT terraform:  + resource "openstack_blockstorage_volume_v3" "node_volume" { 2025-03-23 20:09:31.830399 | orchestrator | 20:09:31.830 STDOUT terraform:  + attachment = (known after apply) 2025-03-23 20:09:31.830429 | orchestrator | 20:09:31.830 STDOUT terraform:  + availability_zone = "nova" 2025-03-23 20:09:31.830519 | orchestrator | 20:09:31.830 STDOUT terraform:  + id = (known after apply) 2025-03-23 20:09:31.830562 | orchestrator | 20:09:31.830 STDOUT terraform:  + metadata = (known after apply) 2025-03-23 20:09:31.838108 | orchestrator | 20:09:31.830 STDOUT terraform:  + name = "testbed-volume-11-node-5" 2025-03-23 20:09:31.838160 | orchestrator | 20:09:31.831 STDOUT terraform:  + region = (known after apply) 2025-03-23 20:09:31.838168 | orchestrator | 20:09:31.831 STDOUT terraform:  + size = 20 2025-03-23 20:09:31.838173 | orchestrator | 20:09:31.831 STDOUT terraform:  + volume_type = "ssd" 2025-03-23 20:09:31.838179 | orchestrator | 20:09:31.831 STDOUT terraform:  } 2025-03-23 20:09:31.838184 | orchestrator | 20:09:31.831 STDOUT terraform:  # openstack_blockstorage_volume_v3.node_volume[12] will be created 2025-03-23 20:09:31.838189 | orchestrator | 20:09:31.831 STDOUT terraform:  + resource "openstack_blockstorage_volume_v3" "node_volume" { 2025-03-23 20:09:31.838195 | 
orchestrator | 20:09:31.831 STDOUT terraform:  + attachment = (known after apply) 2025-03-23 20:09:31.838199 | orchestrator | 20:09:31.832 STDOUT terraform:  + availability_zone = "nova" 2025-03-23 20:09:31.838205 | orchestrator | 20:09:31.832 STDOUT terraform:  + id = (known after apply) 2025-03-23 20:09:31.838210 | orchestrator | 20:09:31.832 STDOUT terraform:  + metadata = (known after apply) 2025-03-23 20:09:31.838215 | orchestrator | 20:09:31.832 STDOUT terraform:  + name = "testbed-volume-12-node-0" 2025-03-23 20:09:31.838220 | orchestrator | 20:09:31.832 STDOUT terraform:  + region = (known after apply) 2025-03-23 20:09:31.838225 | orchestrator | 20:09:31.832 STDOUT terraform:  + size = 20 2025-03-23 20:09:31.838230 | orchestrator | 20:09:31.832 STDOUT terraform:  + volume_type = "ssd" 2025-03-23 20:09:31.838235 | orchestrator | 20:09:31.832 STDOUT terraform:  } 2025-03-23 20:09:31.838240 | orchestrator | 20:09:31.832 STDOUT terraform:  # openstack_blockstorage_volume_v3.node_volume[13] will be created 2025-03-23 20:09:31.838245 | orchestrator | 20:09:31.832 STDOUT terraform:  + resource "openstack_blockstorage_volume_v3" "node_volume" { 2025-03-23 20:09:31.838250 | orchestrator | 20:09:31.832 STDOUT terraform:  + attachment = (known after apply) 2025-03-23 20:09:31.838255 | orchestrator | 20:09:31.832 STDOUT terraform:  + availability_zone = "nova" 2025-03-23 20:09:31.838260 | orchestrator | 20:09:31.832 STDOUT terraform:  + id = (known after apply) 2025-03-23 20:09:31.838265 | orchestrator | 20:09:31.832 STDOUT terraform:  + metadata = (known after apply) 2025-03-23 20:09:31.838270 | orchestrator | 20:09:31.832 STDOUT terraform:  + name = "testbed-volume-13-node-1" 2025-03-23 20:09:31.838275 | orchestrator | 20:09:31.832 STDOUT terraform:  + region = (known after apply) 2025-03-23 20:09:31.838280 | orchestrator | 20:09:31.832 STDOUT terraform:  + size = 20 2025-03-23 20:09:31.838285 | orchestrator | 20:09:31.832 STDOUT terraform:  + volume_type = "ssd" 2025-03-23 20:09:31.838290 | orchestrator | 20:09:31.832 STDOUT terraform:  } 2025-03-23 20:09:31.838295 | orchestrator | 20:09:31.832 STDOUT terraform:  # openstack_blockstorage_volume_v3.node_volume[14] will be created 2025-03-23 20:09:31.838300 | orchestrator | 20:09:31.832 STDOUT terraform:  + resource "openstack_blockstorage_volume_v3" "node_volume" { 2025-03-23 20:09:31.838305 | orchestrator | 20:09:31.832 STDOUT terraform:  + attachment = (known after apply) 2025-03-23 20:09:31.838310 | orchestrator | 20:09:31.832 STDOUT terraform:  + availability_zone = "nova" 2025-03-23 20:09:31.838318 | orchestrator | 20:09:31.832 STDOUT terraform:  + id = (known after apply) 2025-03-23 20:09:31.838323 | orchestrator | 20:09:31.832 STDOUT terraform:  + metadata = (known after apply) 2025-03-23 20:09:31.838328 | orchestrator | 20:09:31.832 STDOUT terraform:  + name = "testbed-volume-14-node-2" 2025-03-23 20:09:31.838333 | orchestrator | 20:09:31.832 STDOUT terraform:  + region = (known after apply) 2025-03-23 20:09:31.838338 | orchestrator | 20:09:31.832 STDOUT terraform:  + size = 20 2025-03-23 20:09:31.838343 | orchestrator | 20:09:31.832 STDOUT terraform:  + volume_type = "ssd" 2025-03-23 20:09:31.838348 | orchestrator | 20:09:31.832 STDOUT terraform:  } 2025-03-23 20:09:31.838358 | orchestrator | 20:09:31.832 STDOUT terraform:  # openstack_blockstorage_volume_v3.node_volume[15] will be created 2025-03-23 20:09:31.838363 | orchestrator | 20:09:31.832 STDOUT terraform:  + resource "openstack_blockstorage_volume_v3" "node_volume" { 
2025-03-23 20:09:31.838368 | orchestrator | 20:09:31.832 STDOUT terraform:  + attachment = (known after apply) 2025-03-23 20:09:31.838373 | orchestrator | 20:09:31.832 STDOUT terraform:  + availability_zone = "nova" 2025-03-23 20:09:31.838378 | orchestrator | 20:09:31.832 STDOUT terraform:  + id = (known after apply) 2025-03-23 20:09:31.838385 | orchestrator | 20:09:31.832 STDOUT terraform:  + metadata = (known after apply) 2025-03-23 20:09:31.838390 | orchestrator | 20:09:31.832 STDOUT terraform:  + name = "testbed-volume-15-node-3" 2025-03-23 20:09:31.838395 | orchestrator | 20:09:31.833 STDOUT terraform:  + region = (known after apply) 2025-03-23 20:09:31.838400 | orchestrator | 20:09:31.833 STDOUT terraform:  + size = 20 2025-03-23 20:09:31.838405 | orchestrator | 20:09:31.833 STDOUT terraform:  + volume_type = "ssd" 2025-03-23 20:09:31.838410 | orchestrator | 20:09:31.833 STDOUT terraform:  } 2025-03-23 20:09:31.838415 | orchestrator | 20:09:31.833 STDOUT terraform:  # openstack_blockstorage_volume_v3.node_volume[16] will be created 2025-03-23 20:09:31.838420 | orchestrator | 20:09:31.833 STDOUT terraform:  + resource "openstack_blockstorage_volume_v3" "node_volume" { 2025-03-23 20:09:31.838425 | orchestrator | 20:09:31.833 STDOUT terraform:  + attachment = (known after apply) 2025-03-23 20:09:31.838430 | orchestrator | 20:09:31.833 STDOUT terraform:  + availability_zone = "nova" 2025-03-23 20:09:31.838435 | orchestrator | 20:09:31.833 STDOUT terraform:  + id = (known after apply) 2025-03-23 20:09:31.838440 | orchestrator | 20:09:31.833 STDOUT terraform:  + metadata = (known after apply) 2025-03-23 20:09:31.838445 | orchestrator | 20:09:31.833 STDOUT terraform:  + name = "testbed-volume-16-node-4" 2025-03-23 20:09:31.838450 | orchestrator | 20:09:31.833 STDOUT terraform:  + region = (known after apply) 2025-03-23 20:09:31.838465 | orchestrator | 20:09:31.833 STDOUT terraform:  + size = 20 2025-03-23 20:09:31.838471 | orchestrator | 20:09:31.833 STDOUT terraform:  + volume_type = "ssd" 2025-03-23 20:09:31.838476 | orchestrator | 20:09:31.833 STDOUT terraform:  } 2025-03-23 20:09:31.838481 | orchestrator | 20:09:31.833 STDOUT terraform:  # openstack_blockstorage_volume_v3.node_volume[17] will be created 2025-03-23 20:09:31.838489 | orchestrator | 20:09:31.833 STDOUT terraform:  + resource "openstack_blockstorage_volume_v3" "node_volume" { 2025-03-23 20:09:31.838494 | orchestrator | 20:09:31.833 STDOUT terraform:  + attachment = (known after apply) 2025-03-23 20:09:31.838499 | orchestrator | 20:09:31.833 STDOUT terraform:  + availability_zone = "nova" 2025-03-23 20:09:31.838504 | orchestrator | 20:09:31.833 STDOUT terraform:  + id = (known after apply) 2025-03-23 20:09:31.838509 | orchestrator | 20:09:31.833 STDOUT terraform:  + metadata = (known after apply) 2025-03-23 20:09:31.838514 | orchestrator | 20:09:31.833 STDOUT terraform:  + name = "testbed-volume-17-node-5" 2025-03-23 20:09:31.838519 | orchestrator | 20:09:31.833 STDOUT terraform:  + region = (known after apply) 2025-03-23 20:09:31.838524 | orchestrator | 20:09:31.833 STDOUT terraform:  + size = 20 2025-03-23 20:09:31.838529 | orchestrator | 20:09:31.833 STDOUT terraform:  + volume_type = "ssd" 2025-03-23 20:09:31.838534 | orchestrator | 20:09:31.833 STDOUT terraform:  } 2025-03-23 20:09:31.838540 | orchestrator | 20:09:31.833 STDOUT terraform:  # openstack_compute_instance_v2.manager_server will be created 2025-03-23 20:09:31.838545 | orchestrator | 20:09:31.833 STDOUT terraform:  + resource "openstack_compute_instance_v2" 
"manager_server" { 2025-03-23 20:09:31.838550 | orchestrator | 20:09:31.833 STDOUT terraform:  + access_ip_v4 = (known after apply) 2025-03-23 20:09:31.838555 | orchestrator | 20:09:31.833 STDOUT terraform:  + access_ip_v6 = (known after apply) 2025-03-23 20:09:31.838566 | orchestrator | 20:09:31.833 STDOUT terraform:  + all_metadata = (known after apply) 2025-03-23 20:09:31.838571 | orchestrator | 20:09:31.833 STDOUT terraform:  + all_tags = (known after apply) 2025-03-23 20:09:31.838576 | orchestrator | 20:09:31.833 STDOUT terraform:  + availability_zone = "nova" 2025-03-23 20:09:31.838582 | orchestrator | 20:09:31.833 STDOUT terraform:  + config_drive = true 2025-03-23 20:09:31.838587 | orchestrator | 20:09:31.833 STDOUT terraform:  + created = (known after apply) 2025-03-23 20:09:31.838592 | orchestrator | 20:09:31.833 STDOUT terraform:  + flavor_id = (known after apply) 2025-03-23 20:09:31.838596 | orchestrator | 20:09:31.833 STDOUT terraform:  + flavor_name = "OSISM-4V-16" 2025-03-23 20:09:31.838601 | orchestrator | 20:09:31.833 STDOUT terraform:  + force_delete = false 2025-03-23 20:09:31.838606 | orchestrator | 20:09:31.833 STDOUT terraform:  + id = (known after apply) 2025-03-23 20:09:31.838611 | orchestrator | 20:09:31.833 STDOUT terraform:  + image_id = (known after apply) 2025-03-23 20:09:31.838616 | orchestrator | 20:09:31.834 STDOUT terraform:  + image_name = (known after apply) 2025-03-23 20:09:31.838621 | orchestrator | 20:09:31.834 STDOUT terraform:  + key_pair = "testbed" 2025-03-23 20:09:31.838626 | orchestrator | 20:09:31.834 STDOUT terraform:  + name = "testbed-manager" 2025-03-23 20:09:31.838631 | orchestrator | 20:09:31.834 STDOUT terraform:  + power_state = "active" 2025-03-23 20:09:31.838636 | orchestrator | 20:09:31.834 STDOUT terraform:  + region = (known after apply) 2025-03-23 20:09:31.838641 | orchestrator | 20:09:31.834 STDOUT terraform:  + security_groups = (known after apply) 2025-03-23 20:09:31.838650 | orchestrator | 20:09:31.834 STDOUT terraform:  + stop_before_destroy = false 2025-03-23 20:09:31.838655 | orchestrator | 20:09:31.834 STDOUT terraform:  + updated = (known after apply) 2025-03-23 20:09:31.838661 | orchestrator | 20:09:31.834 STDOUT terraform:  + user_data = (known after apply) 2025-03-23 20:09:31.838666 | orchestrator | 20:09:31.834 STDOUT terraform:  + block_device { 2025-03-23 20:09:31.838672 | orchestrator | 20:09:31.834 STDOUT terraform:  + boot_index = 0 2025-03-23 20:09:31.838677 | orchestrator | 20:09:31.834 STDOUT terraform:  + delete_on_termination = false 2025-03-23 20:09:31.838682 | orchestrator | 20:09:31.834 STDOUT terraform:  + destination_type = "volume" 2025-03-23 20:09:31.838687 | orchestrator | 20:09:31.834 STDOUT terraform:  + multiattach = false 2025-03-23 20:09:31.838692 | orchestrator | 20:09:31.834 STDOUT terraform:  + source_type = "volume" 2025-03-23 20:09:31.838697 | orchestrator | 20:09:31.834 STDOUT terraform:  + uuid = (known after apply) 2025-03-23 20:09:31.838702 | orchestrator | 20:09:31.834 STDOUT terraform:  } 2025-03-23 20:09:31.838707 | orchestrator | 20:09:31.834 STDOUT terraform:  + network { 2025-03-23 20:09:31.838712 | orchestrator | 20:09:31.834 STDOUT terraform:  + access_network = false 2025-03-23 20:09:31.838717 | orchestrator | 20:09:31.834 STDOUT terraform:  + fixed_ip_v4 = (known after apply) 2025-03-23 20:09:31.838722 | orchestrator | 20:09:31.834 STDOUT terraform:  + fixed_ip_v6 = (known after apply) 2025-03-23 20:09:31.838727 | orchestrator | 20:09:31.834 STDOUT terraform:  + mac = (known 
after apply) 2025-03-23 20:09:31.838732 | orchestrator | 20:09:31.834 STDOUT terraform:  + name = (known after apply) 2025-03-23 20:09:31.838737 | orchestrator | 20:09:31.834 STDOUT terraform:  + port = (known after apply) 2025-03-23 20:09:31.838742 | orchestrator | 20:09:31.834 STDOUT terraform:  + uuid = (known after apply) 2025-03-23 20:09:31.838768 | orchestrator | 20:09:31.834 STDOUT terraform:  } 2025-03-23 20:09:31.838773 | orchestrator | 20:09:31.834 STDOUT terraform:  } 2025-03-23 20:09:31.838778 | orchestrator | 20:09:31.834 STDOUT terraform:  # openstack_compute_instance_v2.node_server[0] will be created 2025-03-23 20:09:31.838783 | orchestrator | 20:09:31.834 STDOUT terraform:  + resource "openstack_compute_instance_v2" "node_server" { 2025-03-23 20:09:31.838791 | orchestrator | 20:09:31.834 STDOUT terraform:  + access_ip_v4 = (known after apply) 2025-03-23 20:09:31.838796 | orchestrator | 20:09:31.834 STDOUT terraform:  + access_ip_v6 = (known after apply) 2025-03-23 20:09:31.838801 | orchestrator | 20:09:31.834 STDOUT terraform:  + all_metadata = (known after apply) 2025-03-23 20:09:31.838806 | orchestrator | 20:09:31.834 STDOUT terraform:  + all_tags = (known after apply) 2025-03-23 20:09:31.838811 | orchestrator | 20:09:31.834 STDOUT terraform:  + availability_zone = "nova" 2025-03-23 20:09:31.838816 | orchestrator | 20:09:31.834 STDOUT terraform:  + config_drive = true 2025-03-23 20:09:31.838821 | orchestrator | 20:09:31.834 STDOUT terraform:  + created = (known after apply) 2025-03-23 20:09:31.838830 | orchestrator | 20:09:31.834 STDOUT terraform:  + flavor_id = (known after apply) 2025-03-23 20:09:31.838835 | orchestrator | 20:09:31.835 STDOUT terraform:  + flavor_name = "OSISM-8V-32" 2025-03-23 20:09:31.838840 | orchestrator | 20:09:31.835 STDOUT terraform:  + force_delete = false 2025-03-23 20:09:31.838845 | orchestrator | 20:09:31.835 STDOUT terraform:  + id = (known after apply) 2025-03-23 20:09:31.838850 | orchestrator | 20:09:31.835 STDOUT terraform:  + image_id = (known after apply) 2025-03-23 20:09:31.838855 | orchestrator | 20:09:31.835 STDOUT terraform:  + image_name = (known after apply) 2025-03-23 20:09:31.838860 | orchestrator | 20:09:31.835 STDOUT terraform:  + key_pair = "testbed" 2025-03-23 20:09:31.838865 | orchestrator | 20:09:31.835 STDOUT terraform:  + name = "testbed-node-0" 2025-03-23 20:09:31.838870 | orchestrator | 20:09:31.835 STDOUT terraform:  + power_state = "active" 2025-03-23 20:09:31.838875 | orchestrator | 20:09:31.835 STDOUT terraform:  + region = (known after apply) 2025-03-23 20:09:31.838880 | orchestrator | 20:09:31.835 STDOUT terraform:  + security_groups = (known after apply) 2025-03-23 20:09:31.838885 | orchestrator | 20:09:31.835 STDOUT terraform:  + stop_before_destroy = false 2025-03-23 20:09:31.838890 | orchestrator | 20:09:31.835 STDOUT terraform:  + updated = (known after apply) 2025-03-23 20:09:31.838895 | orchestrator | 20:09:31.835 STDOUT terraform:  + user_data = "ae09e46b224a6ca206a9ed4f8f8a4f8520827854" 2025-03-23 20:09:31.838900 | orchestrator | 20:09:31.835 STDOUT terraform:  + block_device { 2025-03-23 20:09:31.838905 | orchestrator | 20:09:31.835 STDOUT terraform:  + boot_index = 0 2025-03-23 20:09:31.838912 | orchestrator | 20:09:31.835 STDOUT terraform:  + delete_on_termination = false 2025-03-23 20:09:31.838917 | orchestrator | 20:09:31.835 STDOUT terraform:  + destination_type = "volume" 2025-03-23 20:09:31.838922 | orchestrator | 20:09:31.835 STDOUT terraform:  + multiattach = false 2025-03-23 20:09:31.838927 | 
orchestrator | 20:09:31.835 STDOUT terraform:  + source_type = "volume" 2025-03-23 20:09:31.838932 | orchestrator | 20:09:31.835 STDOUT terraform:  + uuid = (known after apply) 2025-03-23 20:09:31.838937 | orchestrator | 20:09:31.835 STDOUT terraform:  } 2025-03-23 20:09:31.838942 | orchestrator | 20:09:31.835 STDOUT terraform:  + network { 2025-03-23 20:09:31.838947 | orchestrator | 20:09:31.835 STDOUT terraform:  + access_network = false 2025-03-23 20:09:31.838952 | orchestrator | 20:09:31.835 STDOUT terraform:  + fixed_ip_v4 = (known after apply) 2025-03-23 20:09:31.838957 | orchestrator | 20:09:31.835 STDOUT terraform:  + fixed_ip_v6 = (known after apply) 2025-03-23 20:09:31.838962 | orchestrator | 20:09:31.835 STDOUT terraform:  + mac = (known after apply) 2025-03-23 20:09:31.838967 | orchestrator | 20:09:31.835 STDOUT terraform:  + name = (known after apply) 2025-03-23 20:09:31.838974 | orchestrator | 20:09:31.835 STDOUT terraform:  + port = (known after apply) 2025-03-23 20:09:31.838979 | orchestrator | 20:09:31.835 STDOUT terraform:  + uuid = (known after apply) 2025-03-23 20:09:31.838988 | orchestrator | 20:09:31.835 STDOUT terraform:  } 2025-03-23 20:09:31.838995 | orchestrator | 20:09:31.835 STDOUT terraform:  } 2025-03-23 20:09:31.842189 | orchestrator | 20:09:31.835 STDOUT terraform:  # openstack_compute_instance_v2.node_server[1] will be created 2025-03-23 20:09:31.842225 | orchestrator | 20:09:31.835 STDOUT terraform:  + resource "openstack_compute_instance_v2" "node_server" { 2025-03-23 20:09:31.842232 | orchestrator | 20:09:31.835 STDOUT terraform:  + access_ip_v4 = (known after apply) 2025-03-23 20:09:31.842238 | orchestrator | 20:09:31.835 STDOUT terraform:  + access_ip_v6 = (known after apply) 2025-03-23 20:09:31.842243 | orchestrator | 20:09:31.835 STDOUT terraform:  + all_metadata = (known after apply) 2025-03-23 20:09:31.842248 | orchestrator | 20:09:31.835 STDOUT terraform:  + all_tags = (known after apply) 2025-03-23 20:09:31.842254 | orchestrator | 20:09:31.835 STDOUT terraform:  + availability_zone = "nova" 2025-03-23 20:09:31.842260 | orchestrator | 20:09:31.835 STDOUT terraform:  + config_drive = true 2025-03-23 20:09:31.842265 | orchestrator | 20:09:31.835 STDOUT terraform:  + created = (known after apply) 2025-03-23 20:09:31.842270 | orchestrator | 20:09:31.836 STDOUT terraform:  + flavor_id = (known after apply) 2025-03-23 20:09:31.842281 | orchestrator | 20:09:31.836 STDOUT terraform:  + flavor_name = "OSISM-8V-32" 2025-03-23 20:09:31.842286 | orchestrator | 20:09:31.836 STDOUT terraform:  + force_delete = false 2025-03-23 20:09:31.842292 | orchestrator | 20:09:31.836 STDOUT terraform:  + id = (known after apply) 2025-03-23 20:09:31.842303 | orchestrator | 20:09:31.836 STDOUT terraform:  + image_id = (known after apply) 2025-03-23 20:09:31.842309 | orchestrator | 20:09:31.841 STDOUT terraform:  + image_name = (known after apply) 2025-03-23 20:09:31.842314 | orchestrator | 20:09:31.842 STDOUT terraform:  + key_pair = "testbed" 2025-03-23 20:09:31.842320 | orchestrator | 20:09:31.842 STDOUT terraform:  + name = "testbed-node-1" 2025-03-23 20:09:31.842325 | orchestrator | 20:09:31.842 STDOUT terraform:  + power_state = "active" 2025-03-23 20:09:31.842330 | orchestrator | 20:09:31.842 STDOUT terraform:  + region = (known after apply) 2025-03-23 20:09:31.842335 | orchestrator | 20:09:31.842 STDOUT terraform:  + security_groups = (known after apply) 2025-03-23 20:09:31.842340 | orchestrator | 20:09:31.842 STDOUT terraform:  + stop_before_destroy = false 
2025-03-23 20:09:31.842345 | orchestrator | 20:09:31.842 STDOUT terraform:  + updated = (known after apply) 2025-03-23 20:09:31.842350 | orchestrator | 20:09:31.842 STDOUT terraform:  + user_data = "ae09e46b224a6ca206a9ed4f8f8a4f8520827854" 2025-03-23 20:09:31.842355 | orchestrator | 20:09:31.842 STDOUT terraform:  + block_device { 2025-03-23 20:09:31.842362 | orchestrator | 20:09:31.842 STDOUT terraform:  + boot_index = 0 2025-03-23 20:09:31.842418 | orchestrator | 20:09:31.842 STDOUT terraform:  + delete_on_termination = false 2025-03-23 20:09:31.842427 | orchestrator | 20:09:31.842 STDOUT terraform:  + destination_type = "volume" 2025-03-23 20:09:31.842434 | orchestrator | 20:09:31.842 STDOUT terraform:  + multiattach = false 2025-03-23 20:09:31.842471 | orchestrator | 20:09:31.842 STDOUT terraform:  + source_type = "volume" 2025-03-23 20:09:31.842478 | orchestrator | 20:09:31.842 STDOUT terraform:  + uuid = (known after apply) 2025-03-23 20:09:31.842569 | orchestrator | 20:09:31.842 STDOUT terraform:  } 2025-03-23 20:09:31.842579 | orchestrator | 20:09:31.842 STDOUT terraform:  + network { 2025-03-23 20:09:31.842584 | orchestrator | 20:09:31.842 STDOUT terraform:  + access_network = false 2025-03-23 20:09:31.842590 | orchestrator | 20:09:31.842 STDOUT terraform:  + fixed_ip_v4 = (known after apply) 2025-03-23 20:09:31.842596 | orchestrator | 20:09:31.842 STDOUT terraform:  + fixed_ip_v6 = (known after apply) 2025-03-23 20:09:31.842769 | orchestrator | 20:09:31.842 STDOUT terraform:  + mac = (known after apply) 2025-03-23 20:09:31.842872 | orchestrator | 20:09:31.842 STDOUT terraform:  + name = (known after apply) 2025-03-23 20:09:31.842883 | orchestrator | 20:09:31.842 STDOUT terraform:  + port = (known after apply) 2025-03-23 20:09:31.842888 | orchestrator | 20:09:31.842 STDOUT terraform:  + uuid = (known after apply) 2025-03-23 20:09:31.842894 | orchestrator | 20:09:31.842 STDOUT terraform:  } 2025-03-23 20:09:31.842899 | orchestrator | 20:09:31.842 STDOUT terraform:  } 2025-03-23 20:09:31.842904 | orchestrator | 20:09:31.842 STDOUT terraform:  # openstack_compute_instance_v2.node_server[2] will be created 2025-03-23 20:09:31.842911 | orchestrator | 20:09:31.842 STDOUT terraform:  + resource "openstack_compute_instance_v2" "node_server" { 2025-03-23 20:09:31.842941 | orchestrator | 20:09:31.842 STDOUT terraform:  + access_ip_v4 = (known after apply) 2025-03-23 20:09:31.842947 | orchestrator | 20:09:31.842 STDOUT terraform:  + access_ip_v6 = (known after apply) 2025-03-23 20:09:31.842956 | orchestrator | 20:09:31.842 STDOUT terraform:  + all_metadata = (known after apply) 2025-03-23 20:09:31.842963 | orchestrator | 20:09:31.842 STDOUT terraform:  + all_tags = (known after apply) 2025-03-23 20:09:31.842968 | orchestrator | 20:09:31.842 STDOUT terraform:  + availability_zone = "nova" 2025-03-23 20:09:31.842975 | orchestrator | 20:09:31.842 STDOUT terraform:  + config_drive = true 2025-03-23 20:09:31.843007 | orchestrator | 20:09:31.842 STDOUT terraform:  + created = (known after apply) 2025-03-23 20:09:31.843112 | orchestrator | 20:09:31.843 STDOUT terraform:  + flavor_id = (known after apply) 2025-03-23 20:09:31.843119 | orchestrator | 20:09:31.843 STDOUT terraform:  + flavor_name = "OSISM-8V-32" 2025-03-23 20:09:31.843124 | orchestrator | 20:09:31.843 STDOUT terraform:  + force_delete = false 2025-03-23 20:09:31.843130 | orchestrator | 20:09:31.843 STDOUT terraform:  + id = (known after apply) 2025-03-23 20:09:31.843192 | orchestrator | 20:09:31.843 STDOUT terraform:  + image_id = 
(known after apply) 2025-03-23 20:09:31.843996 | orchestrator | 20:09:31.843 STDOUT terraform:  + image_name = (known after apply) 2025-03-23 20:09:31.844020 | orchestrator | 20:09:31.843 STDOUT terraform:  + key_pair = "testbed" 2025-03-23 20:09:31.844419 | orchestrator | 20:09:31.843 STDOUT terraform:  + name = "testbed-node-2" 2025-03-23 20:09:31.844443 | orchestrator | 20:09:31.843 STDOUT terraform:  + power_state = "active" 2025-03-23 20:09:31.844449 | orchestrator | 20:09:31.843 STDOUT terraform:  + region = (known after apply) 2025-03-23 20:09:31.844469 | orchestrator | 20:09:31.843 STDOUT terraform:  + security_groups = (known after apply) 2025-03-23 20:09:31.844475 | orchestrator | 20:09:31.843 STDOUT terraform:  + stop_before_destroy = false 2025-03-23 20:09:31.844480 | orchestrator | 20:09:31.843 STDOUT terraform:  + updated = (known after apply) 2025-03-23 20:09:31.844485 | orchestrator | 20:09:31.843 STDOUT terraform:  + user_data = "ae09e46b224a6ca206a9ed4f8f8a4f8520827854" 2025-03-23 20:09:31.844491 | orchestrator | 20:09:31.843 STDOUT terraform:  + block_device { 2025-03-23 20:09:31.844497 | orchestrator | 20:09:31.843 STDOUT terraform:  + boot_index = 0 2025-03-23 20:09:31.844502 | orchestrator | 20:09:31.843 STDOUT terraform:  + delete_on_termination = false 2025-03-23 20:09:31.844507 | orchestrator | 20:09:31.843 STDOUT terraform:  + destination_type = "volume" 2025-03-23 20:09:31.844512 | orchestrator | 20:09:31.843 STDOUT terraform:  + multiattach = false 2025-03-23 20:09:31.844517 | orchestrator | 20:09:31.843 STDOUT terraform:  + source_type = "volume" 2025-03-23 20:09:31.844522 | orchestrator | 20:09:31.843 STDOUT terraform:  + uuid = (known after apply) 2025-03-23 20:09:31.844527 | orchestrator | 20:09:31.843 STDOUT terraform:  } 2025-03-23 20:09:31.844532 | orchestrator | 20:09:31.843 STDOUT terraform:  + network { 2025-03-23 20:09:31.844537 | orchestrator | 20:09:31.843 STDOUT terraform:  + access_network = false 2025-03-23 20:09:31.844542 | orchestrator | 20:09:31.843 STDOUT terraform:  + fixed_ip_v4 = (known after apply) 2025-03-23 20:09:31.844547 | orchestrator | 20:09:31.843 STDOUT terraform:  + fixed_ip_v6 = (known after apply) 2025-03-23 20:09:31.844552 | orchestrator | 20:09:31.843 STDOUT terraform:  + mac = (known after apply) 2025-03-23 20:09:31.844557 | orchestrator | 20:09:31.843 STDOUT terraform:  + name = (known after apply) 2025-03-23 20:09:31.844562 | orchestrator | 20:09:31.843 STDOUT terraform:  + port = (known after apply) 2025-03-23 20:09:31.844567 | orchestrator | 20:09:31.843 STDOUT terraform:  + uuid = (known after apply) 2025-03-23 20:09:31.844571 | orchestrator | 20:09:31.843 STDOUT terraform:  } 2025-03-23 20:09:31.844577 | orchestrator | 20:09:31.843 STDOUT terraform:  } 2025-03-23 20:09:31.844582 | orchestrator | 20:09:31.843 STDOUT terraform:  # openstack_compute_instance_v2.node_server[3] will be created 2025-03-23 20:09:31.844587 | orchestrator | 20:09:31.843 STDOUT terraform:  + resource "openstack_compute_instance_v2" "node_server" { 2025-03-23 20:09:31.844592 | orchestrator | 20:09:31.843 STDOUT terraform:  + access_ip_v4 = (known after apply) 2025-03-23 20:09:31.844597 | orchestrator | 20:09:31.843 STDOUT terraform:  + access_ip_v6 = (known after apply) 2025-03-23 20:09:31.844602 | orchestrator | 20:09:31.843 STDOUT terraform:  + all_metadata = (known after apply) 2025-03-23 20:09:31.844607 | orchestrator | 20:09:31.843 STDOUT terraform:  + all_tags = (known after apply) 2025-03-23 20:09:31.844620 | orchestrator | 20:09:31.843 
STDOUT terraform:  + availability_zone = "nova" 2025-03-23 20:09:31.844629 | orchestrator | 20:09:31.844 STDOUT terraform:  + config_drive = true 2025-03-23 20:09:31.845502 | orchestrator | 20:09:31.844 STDOUT terraform:  + created = (known after apply) 2025-03-23 20:09:31.845527 | orchestrator | 20:09:31.844 STDOUT terraform:  + flavor_id = (known after apply) 2025-03-23 20:09:31.845539 | orchestrator | 20:09:31.844 STDOUT terraform:  + flavor_name = "OSISM-8V-32" 2025-03-23 20:09:31.845545 | orchestrator | 20:09:31.844 STDOUT terraform:  + force_delete = false 2025-03-23 20:09:31.845550 | orchestrator | 20:09:31.844 STDOUT terraform:  + id = (known after apply) 2025-03-23 20:09:31.845558 | orchestrator | 20:09:31.844 STDOUT terraform:  + image_id = (known after apply) 2025-03-23 20:09:31.845563 | orchestrator | 20:09:31.844 STDOUT terraform:  + image_name = (known after apply) 2025-03-23 20:09:31.845568 | orchestrator | 20:09:31.844 STDOUT terraform:  + key_pair = "testbed" 2025-03-23 20:09:31.845573 | orchestrator | 20:09:31.844 STDOUT terraform:  + name = "testbed-node-3" 2025-03-23 20:09:31.845578 | orchestrator | 20:09:31.844 STDOUT terraform:  + power_state = "active" 2025-03-23 20:09:31.845583 | orchestrator | 20:09:31.844 STDOUT terraform:  + region = (known after apply) 2025-03-23 20:09:31.845588 | orchestrator | 20:09:31.844 STDOUT terraform:  + security_groups = (known after apply) 2025-03-23 20:09:31.845593 | orchestrator | 20:09:31.844 STDOUT terraform:  + stop_before_destroy = false 2025-03-23 20:09:31.845598 | orchestrator | 20:09:31.844 STDOUT terraform:  + updated = (known after apply) 2025-03-23 20:09:31.845603 | orchestrator | 20:09:31.844 STDOUT terraform:  + user_data = "ae09e46b224a6ca206a9ed4f8f8a4f8520827854" 2025-03-23 20:09:31.845608 | orchestrator | 20:09:31.844 STDOUT terraform:  + block_device { 2025-03-23 20:09:31.845614 | orchestrator | 20:09:31.844 STDOUT terraform:  + boot_index = 0 2025-03-23 20:09:31.845619 | orchestrator | 20:09:31.844 STDOUT terraform:  + delete_on_termination = false 2025-03-23 20:09:31.845624 | orchestrator | 20:09:31.844 STDOUT terraform:  + destination_type = "volume" 2025-03-23 20:09:31.845629 | orchestrator | 20:09:31.844 STDOUT terraform:  + multiattach = false 2025-03-23 20:09:31.845633 | orchestrator | 20:09:31.844 STDOUT terraform:  + source_type = "volume" 2025-03-23 20:09:31.845644 | orchestrator | 20:09:31.844 STDOUT terraform:  + uuid = (known after apply) 2025-03-23 20:09:31.845649 | orchestrator | 20:09:31.844 STDOUT terraform:  } 2025-03-23 20:09:31.845654 | orchestrator | 20:09:31.844 STDOUT terraform:  + network { 2025-03-23 20:09:31.845659 | orchestrator | 20:09:31.844 STDOUT terraform:  + access_network = false 2025-03-23 20:09:31.845664 | orchestrator | 20:09:31.844 STDOUT terraform:  + fixed_ip_v4 = (known after apply) 2025-03-23 20:09:31.845669 | orchestrator | 20:09:31.844 STDOUT terraform:  + fixed_ip_v6 = (known after apply) 2025-03-23 20:09:31.845674 | orchestrator | 20:09:31.844 STDOUT terraform:  + mac = (known after apply) 2025-03-23 20:09:31.845686 | orchestrator | 20:09:31.844 STDOUT terraform:  + name = (known after apply) 2025-03-23 20:09:31.845692 | orchestrator | 20:09:31.844 STDOUT terraform:  + port = (known after apply) 2025-03-23 20:09:31.845697 | orchestrator | 20:09:31.844 STDOUT terraform:  + uuid = (known after apply) 2025-03-23 20:09:31.845702 | orchestrator | 20:09:31.844 STDOUT terraform:  } 2025-03-23 20:09:31.845707 | orchestrator | 20:09:31.844 STDOUT terraform:  } 2025-03-23 
20:09:31.845713 | orchestrator | 20:09:31.844 STDOUT terraform:  # openstack_compute_instance_v2.node_server[4] will be created 2025-03-23 20:09:31.845718 | orchestrator | 20:09:31.844 STDOUT terraform:  + resource "openstack_compute_instance_v2" "node_server" { 2025-03-23 20:09:31.845723 | orchestrator | 20:09:31.844 STDOUT terraform:  + access_ip_v4 = (known after apply) 2025-03-23 20:09:31.845728 | orchestrator | 20:09:31.844 STDOUT terraform:  + access_ip_v6 = (known after apply) 2025-03-23 20:09:31.845733 | orchestrator | 20:09:31.844 STDOUT terraform:  + all_metadata = (known after apply) 2025-03-23 20:09:31.845738 | orchestrator | 20:09:31.845 STDOUT terraform:  + all_tags = (known after apply) 2025-03-23 20:09:31.845743 | orchestrator | 20:09:31.845 STDOUT terraform:  + availability_zone = "nova" 2025-03-23 20:09:31.845748 | orchestrator | 20:09:31.845 STDOUT terraform:  + config_drive = true 2025-03-23 20:09:31.845753 | orchestrator | 20:09:31.845 STDOUT terraform:  + created = (known after apply) 2025-03-23 20:09:31.845758 | orchestrator | 20:09:31.845 STDOUT terraform:  + flavor_id = (known after apply) 2025-03-23 20:09:31.845764 | orchestrator | 20:09:31.845 STDOUT terraform:  + flavor_name = "OSISM-8V-32" 2025-03-23 20:09:31.845769 | orchestrator | 20:09:31.845 STDOUT terraform:  + force_delete = false 2025-03-23 20:09:31.845774 | orchestrator | 20:09:31.845 STDOUT terraform:  + id = (known after apply) 2025-03-23 20:09:31.845779 | orchestrator | 20:09:31.845 STDOUT terraform:  + image_id = (known after apply) 2025-03-23 20:09:31.845784 | orchestrator | 20:09:31.845 STDOUT terraform:  + image_name = (known after apply) 2025-03-23 20:09:31.845789 | orchestrator | 20:09:31.845 STDOUT terraform:  + key_pair = "testbed" 2025-03-23 20:09:31.845794 | orchestrator | 20:09:31.845 STDOUT terraform:  + name = "testbed-node-4" 2025-03-23 20:09:31.845799 | orchestrator | 20:09:31.845 STDOUT terraform:  + power_state = "active" 2025-03-23 20:09:31.845803 | orchestrator | 20:09:31.845 STDOUT terraform:  + region = (known after apply) 2025-03-23 20:09:31.845808 | orchestrator | 20:09:31.845 STDOUT terraform:  + security_groups = (known after apply) 2025-03-23 20:09:31.845813 | orchestrator | 20:09:31.845 STDOUT terraform:  + stop_before_destroy = false 2025-03-23 20:09:31.845818 | orchestrator | 20:09:31.845 STDOUT terraform:  + updated = (known after apply) 2025-03-23 20:09:31.845823 | orchestrator | 20:09:31.845 STDOUT terraform:  + user_data = "ae09e46b224a6ca206a9ed4f8f8a4f8520827854" 2025-03-23 20:09:31.845828 | orchestrator | 20:09:31.845 STDOUT terraform:  + block_device { 2025-03-23 20:09:31.845835 | orchestrator | 20:09:31.845 STDOUT terraform:  + boot_index = 0 2025-03-23 20:09:31.845847 | orchestrator | 20:09:31.845 STDOUT terraform:  + delete_on_termination = false 2025-03-23 20:09:31.845859 | orchestrator | 20:09:31.845 STDOUT terraform:  + destination_type = "volume" 2025-03-23 20:09:31.846286 | orchestrator | 20:09:31.845 STDOUT terraform:  + multiattach = false 2025-03-23 20:09:31.846313 | orchestrator | 20:09:31.845 STDOUT terraform:  + source_type = "volume" 2025-03-23 20:09:31.846323 | orchestrator | 20:09:31.845 STDOUT terraform:  + uuid = (known after apply) 2025-03-23 20:09:31.846332 | orchestrator | 20:09:31.845 STDOUT terraform:  } 2025-03-23 20:09:31.846340 | orchestrator | 20:09:31.845 STDOUT terraform:  + network { 2025-03-23 20:09:31.846348 | orchestrator | 20:09:31.845 STDOUT terraform:  + access_network = false 2025-03-23 20:09:31.846363 | orchestrator | 
20:09:31.845 STDOUT terraform:  + fixed_ip_v4 = (known after apply) 2025-03-23 20:09:31.846371 | orchestrator | 20:09:31.845 STDOUT terraform:  + fixed_ip_v6 = (known after apply) 2025-03-23 20:09:31.846380 | orchestrator | 20:09:31.845 STDOUT terraform:  + mac = (known after apply) 2025-03-23 20:09:31.846389 | orchestrator | 20:09:31.845 STDOUT terraform:  + name = (known after apply) 2025-03-23 20:09:31.846404 | orchestrator | 20:09:31.845 STDOUT terraform:  + port = (known after apply) 2025-03-23 20:09:31.846414 | orchestrator | 20:09:31.845 STDOUT terraform:  + uuid = (known after apply) 2025-03-23 20:09:31.846424 | orchestrator | 20:09:31.845 STDOUT terraform:  } 2025-03-23 20:09:31.846432 | orchestrator | 20:09:31.845 STDOUT terraform:  } 2025-03-23 20:09:31.846442 | orchestrator | 20:09:31.845 STDOUT terraform:  # openstack_compute_instance_v2.node_server[5] will be created 2025-03-23 20:09:31.846451 | orchestrator | 20:09:31.845 STDOUT terraform:  + resource "openstack_compute_instance_v2" "node_server" { 2025-03-23 20:09:31.846502 | orchestrator | 20:09:31.846 STDOUT terraform:  + access_ip_v4 = (known after apply) 2025-03-23 20:09:31.846511 | orchestrator | 20:09:31.846 STDOUT terraform:  + access_ip_v6 = (known after apply) 2025-03-23 20:09:31.846519 | orchestrator | 20:09:31.846 STDOUT terraform:  + all_metadata = (known after apply) 2025-03-23 20:09:31.846527 | orchestrator | 20:09:31.846 STDOUT terraform:  + all_tags = (known after apply) 2025-03-23 20:09:31.846535 | orchestrator | 20:09:31.846 STDOUT terraform:  + availability_zone = "nova" 2025-03-23 20:09:31.846540 | orchestrator | 20:09:31.846 STDOUT terraform:  + config_drive = true 2025-03-23 20:09:31.846545 | orchestrator | 20:09:31.846 STDOUT terraform:  + created = (known after apply) 2025-03-23 20:09:31.846550 | orchestrator | 20:09:31.846 STDOUT terraform:  + flavor_id = (known after apply) 2025-03-23 20:09:31.846559 | orchestrator | 20:09:31.846 STDOUT terraform:  + flavor_name = "OSISM-8V-32" 2025-03-23 20:09:31.846564 | orchestrator | 20:09:31.846 STDOUT terraform:  + force_delete = false 2025-03-23 20:09:31.846569 | orchestrator | 20:09:31.846 STDOUT terraform:  + id = (known after apply) 2025-03-23 20:09:31.846576 | orchestrator | 20:09:31.846 STDOUT terraform:  + image_id = (known after apply) 2025-03-23 20:09:31.846591 | orchestrator | 20:09:31.846 STDOUT terraform:  + image_name = (known after apply) 2025-03-23 20:09:31.846599 | orchestrator | 20:09:31.846 STDOUT terraform:  + key_pair = "testbed" 2025-03-23 20:09:31.847089 | orchestrator | 20:09:31.846 STDOUT terraform:  + name = "testbed-node-5" 2025-03-23 20:09:31.847104 | orchestrator | 20:09:31.846 STDOUT terraform:  + power_state = "active" 2025-03-23 20:09:31.847109 | orchestrator | 20:09:31.846 STDOUT terraform:  + region = (known after apply) 2025-03-23 20:09:31.847114 | orchestrator | 20:09:31.846 STDOUT terraform:  + security_groups = (known after apply) 2025-03-23 20:09:31.847119 | orchestrator | 20:09:31.846 STDOUT terraform:  + stop_before_destroy = false 2025-03-23 20:09:31.847124 | orchestrator | 20:09:31.846 STDOUT terraform:  + updated = (known after apply) 2025-03-23 20:09:31.847133 | orchestrator | 20:09:31.846 STDOUT terraform:  + user_data = "ae09e46b224a6ca206a9ed4f8f8a4f8520827854" 2025-03-23 20:09:31.847138 | orchestrator | 20:09:31.846 STDOUT terraform:  + block_device { 2025-03-23 20:09:31.847143 | orchestrator | 20:09:31.846 STDOUT terraform:  + boot_index = 0 2025-03-23 20:09:31.847148 | orchestrator | 20:09:31.846 STDOUT 
terraform:  + delete_on_termination = false 2025-03-23 20:09:31.847153 | orchestrator | 20:09:31.846 STDOUT terraform:  + destination_type = "volume" 2025-03-23 20:09:31.847158 | orchestrator | 20:09:31.846 STDOUT terraform:  + multiattach = false 2025-03-23 20:09:31.847163 | orchestrator | 20:09:31.846 STDOUT terraform:  + source_type = "volume" 2025-03-23 20:09:31.847168 | orchestrator | 20:09:31.846 STDOUT terraform:  + uuid = (known after apply) 2025-03-23 20:09:31.847173 | orchestrator | 20:09:31.846 STDOUT terraform:  } 2025-03-23 20:09:31.847178 | orchestrator | 20:09:31.846 STDOUT terraform:  + network { 2025-03-23 20:09:31.847184 | orchestrator | 20:09:31.846 STDOUT terraform:  + access_network = false 2025-03-23 20:09:31.847189 | orchestrator | 20:09:31.846 STDOUT terraform:  + fixed_ip_v4 = (known after apply) 2025-03-23 20:09:31.847193 | orchestrator | 20:09:31.846 STDOUT terraform:  + fixed_ip_v6 = (known after apply) 2025-03-23 20:09:31.847198 | orchestrator | 20:09:31.846 STDOUT terraform:  + mac = (known after apply) 2025-03-23 20:09:31.847203 | orchestrator | 20:09:31.846 STDOUT terraform:  + name = (known after apply) 2025-03-23 20:09:31.847208 | orchestrator | 20:09:31.846 STDOUT terraform:  + port = (known after apply) 2025-03-23 20:09:31.847213 | orchestrator | 20:09:31.846 STDOUT terraform:  + uuid = (known after apply) 2025-03-23 20:09:31.847218 | orchestrator | 20:09:31.846 STDOUT terraform:  } 2025-03-23 20:09:31.847223 | orchestrator | 20:09:31.847 STDOUT terraform:  } 2025-03-23 20:09:31.847233 | orchestrator | 20:09:31.847 STDOUT terraform:  # openstack_compute_keypair_v2.key will be created 2025-03-23 20:09:31.847239 | orchestrator | 20:09:31.847 STDOUT terraform:  + resource "openstack_compute_keypair_v2" "key" { 2025-03-23 20:09:31.847244 | orchestrator | 20:09:31.847 STDOUT terraform:  + fingerprint = (known after apply) 2025-03-23 20:09:31.847249 | orchestrator | 20:09:31.847 STDOUT terraform:  + id = (known after apply) 2025-03-23 20:09:31.847263 | orchestrator | 20:09:31.847 STDOUT terraform:  + name = "testbed" 2025-03-23 20:09:31.847584 | orchestrator | 20:09:31.847 STDOUT terraform:  + private_key = (sensitive value) 2025-03-23 20:09:31.847593 | orchestrator | 20:09:31.847 STDOUT terraform:  + public_key = (known after apply) 2025-03-23 20:09:31.847598 | orchestrator | 20:09:31.847 STDOUT terraform:  + region = (known after apply) 2025-03-23 20:09:31.847603 | orchestrator | 20:09:31.847 STDOUT terraform:  + user_id = (known after apply) 2025-03-23 20:09:31.847608 | orchestrator | 20:09:31.847 STDOUT terraform:  } 2025-03-23 20:09:31.847615 | orchestrator | 20:09:31.847 STDOUT terraform:  # openstack_compute_volume_attach_v2.node_volume_attachment[0] will be created 2025-03-23 20:09:31.847635 | orchestrator | 20:09:31.847 STDOUT terraform:  + resource "openstack_compute_volume_attach_v2" "node_volume_attachment" { 2025-03-23 20:09:31.847641 | orchestrator | 20:09:31.847 STDOUT terraform:  + device = (known after apply) 2025-03-23 20:09:31.847646 | orchestrator | 20:09:31.847 STDOUT terraform:  + id = (known after apply) 2025-03-23 20:09:31.847651 | orchestrator | 20:09:31.847 STDOUT terraform:  + instance_id = (known after apply) 2025-03-23 20:09:31.847656 | orchestrator | 20:09:31.847 STDOUT terraform:  + region = (known after apply) 2025-03-23 20:09:31.847661 | orchestrator | 20:09:31.847 STDOUT terraform:  + volume_id = (known after apply) 2025-03-23 20:09:31.847666 | orchestrator | 20:09:31.847 STDOUT terraform:  } 2025-03-23 20:09:31.847671 | 
orchestrator | 20:09:31.847 STDOUT terraform:  # openstack_compute_volume_attach_v2.node_volume_attachment[1] will be created 2025-03-23 20:09:31.847676 | orchestrator | 20:09:31.847 STDOUT terraform:  + resource "openstack_compute_volume_attach_v2" "node_volume_attachment" { 2025-03-23 20:09:31.847682 | orchestrator | 20:09:31.847 STDOUT terraform:  + device = (known after apply) 2025-03-23 20:09:31.847687 | orchestrator | 20:09:31.847 STDOUT terraform:  + id = (known after apply) 2025-03-23 20:09:31.847703 | orchestrator | 20:09:31.847 STDOUT terraform:  + instance_id = (known after apply) 2025-03-23 20:09:31.847869 | orchestrator | 20:09:31.847 STDOUT terraform:  + region = (known after apply) 2025-03-23 20:09:31.847882 | orchestrator | 20:09:31.847 STDOUT terraform:  + volume_id = (known after apply) 2025-03-23 20:09:31.847887 | orchestrator | 20:09:31.847 STDOUT terraform:  } 2025-03-23 20:09:31.847894 | orchestrator | 20:09:31.847 STDOUT terraform:  # openstack_compute_volume_attach_v2.node_volume_attachment[2] will be created 2025-03-23 20:09:31.847912 | orchestrator | 20:09:31.847 STDOUT terraform:  + resource "openstack_compute_volume_attach_v2" "node_volume_attachment" { 2025-03-23 20:09:31.847918 | orchestrator | 20:09:31.847 STDOUT terraform:  + device = (known after apply) 2025-03-23 20:09:31.847923 | orchestrator | 20:09:31.847 STDOUT terraform:  + id = (known after apply) 2025-03-23 20:09:31.847928 | orchestrator | 20:09:31.847 STDOUT terraform:  + instance_id = (known after apply) 2025-03-23 20:09:31.847933 | orchestrator | 20:09:31.847 STDOUT terraform:  + region = (known after apply) 2025-03-23 20:09:31.847940 | orchestrator | 20:09:31.847 STDOUT terraform:  + volume_id = (known after apply) 2025-03-23 20:09:31.848062 | orchestrator | 20:09:31.847 STDOUT terraform:  } 2025-03-23 20:09:31.848072 | orchestrator | 20:09:31.847 STDOUT terraform:  # openstack_compute_volume_attach_v2.node_volume_attachment[3] will be created 2025-03-23 20:09:31.848088 | orchestrator | 20:09:31.847 STDOUT terraform:  + resource "openstack_compute_volume_attach_v2" "node_volume_attachment" { 2025-03-23 20:09:31.848094 | orchestrator | 20:09:31.848 STDOUT terraform:  + device = (known after apply) 2025-03-23 20:09:31.848099 | orchestrator | 20:09:31.848 STDOUT terraform:  + id = (known after apply) 2025-03-23 20:09:31.848105 | orchestrator | 20:09:31.848 STDOUT terraform:  + instance_id = (known after apply) 2025-03-23 20:09:31.848112 | orchestrator | 20:09:31.848 STDOUT terraform:  + region = (known after apply) 2025-03-23 20:09:31.848142 | orchestrator | 20:09:31.848 STDOUT terraform:  + volume_id = (known after apply) 2025-03-23 20:09:31.848158 | orchestrator | 20:09:31.848 STDOUT terraform:  } 2025-03-23 20:09:31.848200 | orchestrator | 20:09:31.848 STDOUT terraform:  # openstack_compute_volume_attach_v2.node_volume_attachment[4] will be created 2025-03-23 20:09:31.848250 | orchestrator | 20:09:31.848 STDOUT terraform:  + resource "openstack_compute_volume_attach_v2" "node_volume_attachment" { 2025-03-23 20:09:31.848277 | orchestrator | 20:09:31.848 STDOUT terraform:  + device = (known after apply) 2025-03-23 20:09:31.848306 | orchestrator | 20:09:31.848 STDOUT terraform:  + id = (known after apply) 2025-03-23 20:09:31.848342 | orchestrator | 20:09:31.848 STDOUT terraform:  + instance_id = (known after apply) 2025-03-23 20:09:31.848362 | orchestrator | 20:09:31.848 STDOUT terraform:  + region = (known after apply) 2025-03-23 20:09:31.848390 | orchestrator | 20:09:31.848 STDOUT terraform:  + 
volume_id = (known after apply) 2025-03-23 20:09:31.848398 | orchestrator | 20:09:31.848 STDOUT terraform:  } 2025-03-23 20:09:31.848450 | orchestrator | 20:09:31.848 STDOUT terraform:  # openstack_compute_volume_attach_v2.node_volume_attachment[5] will be created 2025-03-23 20:09:31.848512 | orchestrator | 20:09:31.848 STDOUT terraform:  + resource "openstack_compute_volume_attach_v2" "node_volume_attachment" { 2025-03-23 20:09:31.848534 | orchestrator | 20:09:31.848 STDOUT terraform:  + device = (known after apply) 2025-03-23 20:09:31.848562 | orchestrator | 20:09:31.848 STDOUT terraform:  + id = (known after apply) 2025-03-23 20:09:31.848599 | orchestrator | 20:09:31.848 STDOUT terraform:  + instance_id = (known after apply) 2025-03-23 20:09:31.848618 | orchestrator | 20:09:31.848 STDOUT terraform:  + region = (known after apply) 2025-03-23 20:09:31.848649 | orchestrator | 20:09:31.848 STDOUT terraform:  + volume_id = (known after apply) 2025-03-23 20:09:31.848656 | orchestrator | 20:09:31.848 STDOUT terraform:  } 2025-03-23 20:09:31.848708 | orchestrator | 20:09:31.848 STDOUT terraform:  # openstack_compute_volume_attach_v2.node_volume_attachment[6] will be created 2025-03-23 20:09:31.848763 | orchestrator | 20:09:31.848 STDOUT terraform:  + resource "openstack_compute_volume_attach_v2" "node_volume_attachment" { 2025-03-23 20:09:31.848783 | orchestrator | 20:09:31.848 STDOUT terraform:  + device = (known after apply) 2025-03-23 20:09:31.848812 | orchestrator | 20:09:31.848 STDOUT terraform:  + id = (known after apply) 2025-03-23 20:09:31.848840 | orchestrator | 20:09:31.848 STDOUT terraform:  + instance_id = (known after apply) 2025-03-23 20:09:31.848875 | orchestrator | 20:09:31.848 STDOUT terraform:  + region = (known after apply) 2025-03-23 20:09:31.848896 | orchestrator | 20:09:31.848 STDOUT terraform:  + volume_id = (known after apply) 2025-03-23 20:09:31.848904 | orchestrator | 20:09:31.848 STDOUT terraform:  } 2025-03-23 20:09:31.848955 | orchestrator | 20:09:31.848 STDOUT terraform:  # openstack_compute_volume_attach_v2.node_volume_attachment[7] will be created 2025-03-23 20:09:31.849003 | orchestrator | 20:09:31.848 STDOUT terraform:  + resource "openstack_compute_volume_attach_v2" "node_volume_attachment" { 2025-03-23 20:09:31.849031 | orchestrator | 20:09:31.848 STDOUT terraform:  + device = (known after apply) 2025-03-23 20:09:31.849060 | orchestrator | 20:09:31.849 STDOUT terraform:  + id = (known after apply) 2025-03-23 20:09:31.849087 | orchestrator | 20:09:31.849 STDOUT terraform:  + instance_id = (known after apply) 2025-03-23 20:09:31.849138 | orchestrator | 20:09:31.849 STDOUT terraform:  + region = (known after apply) 2025-03-23 20:09:31.849147 | orchestrator | 20:09:31.849 STDOUT terraform:  + volume_id = (known after apply) 2025-03-23 20:09:31.849153 | orchestrator | 20:09:31.849 STDOUT terraform:  } 2025-03-23 20:09:31.849203 | orchestrator | 20:09:31.849 STDOUT terraform:  # openstack_compute_volume_attach_v2.node_volume_attachment[8] will be created 2025-03-23 20:09:31.849251 | orchestrator | 20:09:31.849 STDOUT terraform:  + resource "openstack_compute_volume_attach_v2" "node_volume_attachment" { 2025-03-23 20:09:31.849279 | orchestrator | 20:09:31.849 STDOUT terraform:  + device = (known after apply) 2025-03-23 20:09:31.849308 | orchestrator | 20:09:31.849 STDOUT terraform:  + id = (known after apply) 2025-03-23 20:09:31.849335 | orchestrator | 20:09:31.849 STDOUT terraform:  + instance_id = (known after apply) 2025-03-23 20:09:31.849364 | orchestrator | 
20:09:31.849 STDOUT terraform:  + region = (known after apply) 2025-03-23 20:09:31.849393 | orchestrator | 20:09:31.849 STDOUT terraform:  + volume_id = (known after apply) 2025-03-23 20:09:31.849400 | orchestrator | 20:09:31.849 STDOUT terraform:  } 2025-03-23 20:09:31.849451 | orchestrator | 20:09:31.849 STDOUT terraform:  # openstack_compute_volume_attach_v2.node_volume_attachment[9] will be created 2025-03-23 20:09:31.849520 | orchestrator | 20:09:31.849 STDOUT terraform:  + resource "openstack_compute_volume_attach_v2" "node_volume_attachment" { 2025-03-23 20:09:31.849548 | orchestrator | 20:09:31.849 STDOUT terraform:  + device = (known after apply) 2025-03-23 20:09:31.849596 | orchestrator | 20:09:31.849 STDOUT terraform:  + id = (known after apply) 2025-03-23 20:09:31.849604 | orchestrator | 20:09:31.849 STDOUT terraform:  + instance_id = (known after apply) 2025-03-23 20:09:31.849631 | orchestrator | 20:09:31.849 STDOUT terraform:  + region = (known after apply) 2025-03-23 20:09:31.849658 | orchestrator | 20:09:31.849 STDOUT terraform:  + volume_id = (known after apply) 2025-03-23 20:09:31.849666 | orchestrator | 20:09:31.849 STDOUT terraform:  } 2025-03-23 20:09:31.849718 | orchestrator | 20:09:31.849 STDOUT terraform:  # openstack_compute_volume_attach_v2.node_volume_attachment[10] will be created 2025-03-23 20:09:31.849767 | orchestrator | 20:09:31.849 STDOUT terraform:  + resource "openstack_compute_volume_attach_v2" "node_volume_attachment" { 2025-03-23 20:09:31.849794 | orchestrator | 20:09:31.849 STDOUT terraform:  + device = (known after apply) 2025-03-23 20:09:31.849824 | orchestrator | 20:09:31.849 STDOUT terraform:  + id = (known after apply) 2025-03-23 20:09:31.849852 | orchestrator | 20:09:31.849 STDOUT terraform:  + instance_id = (known after apply) 2025-03-23 20:09:31.849880 | orchestrator | 20:09:31.849 STDOUT terraform:  + region = (known after apply) 2025-03-23 20:09:31.849909 | orchestrator | 20:09:31.849 STDOUT terraform:  + volume_id = (known after apply) 2025-03-23 20:09:31.849916 | orchestrator | 20:09:31.849 STDOUT terraform:  } 2025-03-23 20:09:31.849968 | orchestrator | 20:09:31.849 STDOUT terraform:  # openstack_compute_volume_attach_v2.node_volume_attachment[11] will be created 2025-03-23 20:09:31.850031 | orchestrator | 20:09:31.849 STDOUT terraform:  + resource "openstack_compute_volume_attach_v2" "node_volume_attachment" { 2025-03-23 20:09:31.850103 | orchestrator | 20:09:31.850 STDOUT terraform:  + device = (known after apply) 2025-03-23 20:09:31.850141 | orchestrator | 20:09:31.850 STDOUT terraform:  + id = (known after apply) 2025-03-23 20:09:31.850172 | orchestrator | 20:09:31.850 STDOUT terraform:  + instance_id = (known after apply) 2025-03-23 20:09:31.850201 | orchestrator | 20:09:31.850 STDOUT terraform:  + region = (known after apply) 2025-03-23 20:09:31.850229 | orchestrator | 20:09:31.850 STDOUT terraform:  + volume_id = (known after apply) 2025-03-23 20:09:31.850237 | orchestrator | 20:09:31.850 STDOUT terraform:  } 2025-03-23 20:09:31.850292 | orchestrator | 20:09:31.850 STDOUT terraform:  # openstack_compute_volume_attach_v2.node_volume_attachment[12] will be created 2025-03-23 20:09:31.850341 | orchestrator | 20:09:31.850 STDOUT terraform:  + resource "openstack_compute_volume_attach_v2" "node_volume_attachment" { 2025-03-23 20:09:31.850371 | orchestrator | 20:09:31.850 STDOUT terraform:  + device = (known after apply) 2025-03-23 20:09:31.850399 | orchestrator | 20:09:31.850 STDOUT terraform:  + id = (known after apply) 2025-03-23 
20:09:31.850427 | orchestrator | 20:09:31.850 STDOUT terraform:  + instance_id = (known after apply) 2025-03-23 20:09:31.850483 | orchestrator | 20:09:31.850 STDOUT terraform:  + region = (known after apply) 2025-03-23 20:09:31.850492 | orchestrator | 20:09:31.850 STDOUT terraform:  + volume_id = (known after apply) 2025-03-23 20:09:31.850499 | orchestrator | 20:09:31.850 STDOUT terraform:  } 2025-03-23 20:09:31.850551 | orchestrator | 20:09:31.850 STDOUT terraform:  # openstack_compute_volume_attach_v2.node_volume_attachment[13] will be created 2025-03-23 20:09:31.850600 | orchestrator | 20:09:31.850 STDOUT terraform:  + resource "openstack_compute_volume_attach_v2" "node_volume_attachment" { 2025-03-23 20:09:31.850628 | orchestrator | 20:09:31.850 STDOUT terraform:  + device = (known after apply) 2025-03-23 20:09:31.850659 | orchestrator | 20:09:31.850 STDOUT terraform:  + id = (known after apply) 2025-03-23 20:09:31.850685 | orchestrator | 20:09:31.850 STDOUT terraform:  + instance_id = (known after apply) 2025-03-23 20:09:31.850713 | orchestrator | 20:09:31.850 STDOUT terraform:  + region = (known after apply) 2025-03-23 20:09:31.850741 | orchestrator | 20:09:31.850 STDOUT terraform:  + volume_id = (known after apply) 2025-03-23 20:09:31.850754 | orchestrator | 20:09:31.850 STDOUT terraform:  } 2025-03-23 20:09:31.850802 | orchestrator | 20:09:31.850 STDOUT terraform:  # openstack_compute_volume_attach_v2.node_volume_attachment[14] will be created 2025-03-23 20:09:31.850849 | orchestrator | 20:09:31.850 STDOUT terraform:  + resource "openstack_compute_volume_attach_v2" "node_volume_attachment" { 2025-03-23 20:09:31.850877 | orchestrator | 20:09:31.850 STDOUT terraform:  + device = (known after apply) 2025-03-23 20:09:31.850906 | orchestrator | 20:09:31.850 STDOUT terraform:  + id = (known after apply) 2025-03-23 20:09:31.850936 | orchestrator | 20:09:31.850 STDOUT terraform:  + instance_id = (known after apply) 2025-03-23 20:09:31.850964 | orchestrator | 20:09:31.850 STDOUT terraform:  + region = (known after apply) 2025-03-23 20:09:31.850993 | orchestrator | 20:09:31.850 STDOUT terraform:  + volume_id = (known after apply) 2025-03-23 20:09:31.851002 | orchestrator | 20:09:31.850 STDOUT terraform:  } 2025-03-23 20:09:31.851052 | orchestrator | 20:09:31.850 STDOUT terraform:  # openstack_compute_volume_attach_v2.node_volume_attachment[15] will be created 2025-03-23 20:09:31.851100 | orchestrator | 20:09:31.851 STDOUT terraform:  + resource "openstack_compute_volume_attach_v2" "node_volume_attachment" { 2025-03-23 20:09:31.851128 | orchestrator | 20:09:31.851 STDOUT terraform:  + device = (known after apply) 2025-03-23 20:09:31.851157 | orchestrator | 20:09:31.851 STDOUT terraform:  + id = (known after apply) 2025-03-23 20:09:31.851185 | orchestrator | 20:09:31.851 STDOUT terraform:  + instance_id = (known after apply) 2025-03-23 20:09:31.851213 | orchestrator | 20:09:31.851 STDOUT terraform:  + region = (known after apply) 2025-03-23 20:09:31.851242 | orchestrator | 20:09:31.851 STDOUT terraform:  + volume_id = (known after apply) 2025-03-23 20:09:31.851257 | orchestrator | 20:09:31.851 STDOUT terraform:  } 2025-03-23 20:09:31.851308 | orchestrator | 20:09:31.851 STDOUT terraform:  # openstack_compute_volume_attach_v2.node_volume_attachment[16] will be created 2025-03-23 20:09:31.851355 | orchestrator | 20:09:31.851 STDOUT terraform:  + resource "openstack_compute_volume_attach_v2" "node_volume_attachment" { 2025-03-23 20:09:31.851385 | orchestrator | 20:09:31.851 STDOUT terraform:  + 
device = (known after apply) 2025-03-23 20:09:31.851415 | orchestrator | 20:09:31.851 STDOUT terraform:  + id = (known after apply) 2025-03-23 20:09:31.851443 | orchestrator | 20:09:31.851 STDOUT terraform:  + instance_id = (known after apply) 2025-03-23 20:09:31.851480 | orchestrator | 20:09:31.851 STDOUT terraform:  + region = (known after apply) 2025-03-23 20:09:31.851508 | orchestrator | 20:09:31.851 STDOUT terraform:  + volume_id = (known after apply) 2025-03-23 20:09:31.851515 | orchestrator | 20:09:31.851 STDOUT terraform:  } 2025-03-23 20:09:31.851567 | orchestrator | 20:09:31.851 STDOUT terraform:  # openstack_compute_volume_attach_v2.node_volume_attachment[17] will be created 2025-03-23 20:09:31.851618 | orchestrator | 20:09:31.851 STDOUT terraform:  + resource "openstack_compute_volume_attach_v2" "node_volume_attachment" { 2025-03-23 20:09:31.851656 | orchestrator | 20:09:31.851 STDOUT terraform:  + device = (known after apply) 2025-03-23 20:09:31.851675 | orchestrator | 20:09:31.851 STDOUT terraform:  + id = (known after apply) 2025-03-23 20:09:31.851703 | orchestrator | 20:09:31.851 STDOUT terraform:  + instance_id = (known after apply) 2025-03-23 20:09:31.851731 | orchestrator | 20:09:31.851 STDOUT terraform:  + region = (known after apply) 2025-03-23 20:09:31.851761 | orchestrator | 20:09:31.851 STDOUT terraform:  + volume_id = (known after apply) 2025-03-23 20:09:31.851767 | orchestrator | 20:09:31.851 STDOUT terraform:  } 2025-03-23 20:09:31.851825 | orchestrator | 20:09:31.851 STDOUT terraform:  # openstack_networking_floatingip_associate_v2.manager_floating_ip_association will be created 2025-03-23 20:09:31.851881 | orchestrator | 20:09:31.851 STDOUT terraform:  + resource "openstack_networking_floatingip_associate_v2" "manager_floating_ip_association" { 2025-03-23 20:09:31.851909 | orchestrator | 20:09:31.851 STDOUT terraform:  + fixed_ip = (known after apply) 2025-03-23 20:09:31.851937 | orchestrator | 20:09:31.851 STDOUT terraform:  + floating_ip = (known after apply) 2025-03-23 20:09:31.851966 | orchestrator | 20:09:31.851 STDOUT terraform:  + id = (known after apply) 2025-03-23 20:09:31.851994 | orchestrator | 20:09:31.851 STDOUT terraform:  + port_id = (known after apply) 2025-03-23 20:09:31.852026 | orchestrator | 20:09:31.851 STDOUT terraform:  + region = (known after apply) 2025-03-23 20:09:31.852034 | orchestrator | 20:09:31.852 STDOUT terraform:  } 2025-03-23 20:09:31.852079 | orchestrator | 20:09:31.852 STDOUT terraform:  # openstack_networking_floatingip_v2.manager_floating_ip will be created 2025-03-23 20:09:31.852126 | orchestrator | 20:09:31.852 STDOUT terraform:  + resource "openstack_networking_floatingip_v2" "manager_floating_ip" { 2025-03-23 20:09:31.852153 | orchestrator | 20:09:31.852 STDOUT terraform:  + address = (known after apply) 2025-03-23 20:09:31.852178 | orchestrator | 20:09:31.852 STDOUT terraform:  + all_tags = (known after apply) 2025-03-23 20:09:31.852203 | orchestrator | 20:09:31.852 STDOUT terraform:  + dns_domain = (known after apply) 2025-03-23 20:09:31.852229 | orchestrator | 20:09:31.852 STDOUT terraform:  + dns_name = (known after apply) 2025-03-23 20:09:31.852254 | orchestrator | 20:09:31.852 STDOUT terraform:  + fixed_ip = (known after apply) 2025-03-23 20:09:31.852279 | orchestrator | 20:09:31.852 STDOUT terraform:  + id = (known after apply) 2025-03-23 20:09:31.852304 | orchestrator | 20:09:31.852 STDOUT terraform:  + pool = "public" 2025-03-23 20:09:31.852328 | orchestrator | 20:09:31.852 STDOUT terraform:  + port_id = 
(known after apply) 2025-03-23 20:09:31.852353 | orchestrator | 20:09:31.852 STDOUT terraform:  + region = (known after apply) 2025-03-23 20:09:31.852377 | orchestrator | 20:09:31.852 STDOUT terraform:  + subnet_id = (known after apply) 2025-03-23 20:09:31.852403 | orchestrator | 20:09:31.852 STDOUT terraform:  + tenant_id = (known after apply) 2025-03-23 20:09:31.852410 | orchestrator | 20:09:31.852 STDOUT terraform:  } 2025-03-23 20:09:31.852483 | orchestrator | 20:09:31.852 STDOUT terraform:  # openstack_networking_network_v2.net_management will be created 2025-03-23 20:09:31.852508 | orchestrator | 20:09:31.852 STDOUT terraform:  + resource "openstack_networking_network_v2" "net_management" { 2025-03-23 20:09:31.852545 | orchestrator | 20:09:31.852 STDOUT terraform:  + admin_state_up = (known after apply) 2025-03-23 20:09:31.852583 | orchestrator | 20:09:31.852 STDOUT terraform:  + all_tags = (known after apply) 2025-03-23 20:09:31.852606 | orchestrator | 20:09:31.852 STDOUT terraform:  + availability_zone_hints = [ 2025-03-23 20:09:31.852621 | orchestrator | 20:09:31.852 STDOUT terraform:  + "nova", 2025-03-23 20:09:31.852630 | orchestrator | 20:09:31.852 STDOUT terraform:  ] 2025-03-23 20:09:31.852669 | orchestrator | 20:09:31.852 STDOUT terraform:  + dns_domain = (known after apply) 2025-03-23 20:09:31.852707 | orchestrator | 20:09:31.852 STDOUT terraform:  + external = (known after apply) 2025-03-23 20:09:31.852744 | orchestrator | 20:09:31.852 STDOUT terraform:  + id = (known after apply) 2025-03-23 20:09:31.852782 | orchestrator | 20:09:31.852 STDOUT terraform:  + mtu = (known after apply) 2025-03-23 20:09:31.852821 | orchestrator | 20:09:31.852 STDOUT terraform:  + name = "net-testbed-management" 2025-03-23 20:09:31.852858 | orchestrator | 20:09:31.852 STDOUT terraform:  + port_security_enabled = (known after apply) 2025-03-23 20:09:31.852895 | orchestrator | 20:09:31.852 STDOUT terraform:  + qos_policy_id = (known after apply) 2025-03-23 20:09:31.852931 | orchestrator | 20:09:31.852 STDOUT terraform:  + region = (known after apply) 2025-03-23 20:09:31.852968 | orchestrator | 20:09:31.852 STDOUT terraform:  + shared = (known after apply) 2025-03-23 20:09:31.853007 | orchestrator | 20:09:31.852 STDOUT terraform:  + tenant_id = (known after apply) 2025-03-23 20:09:31.853042 | orchestrator | 20:09:31.853 STDOUT terraform:  + transparent_vlan = (known after apply) 2025-03-23 20:09:31.853067 | orchestrator | 20:09:31.853 STDOUT terraform:  + segments (known after apply) 2025-03-23 20:09:31.853073 | orchestrator | 20:09:31.853 STDOUT terraform:  } 2025-03-23 20:09:31.853122 | orchestrator | 20:09:31.853 STDOUT terraform:  # openstack_networking_port_v2.manager_port_management will be created 2025-03-23 20:09:31.853167 | orchestrator | 20:09:31.853 STDOUT terraform:  + resource "openstack_networking_port_v2" "manager_port_management" { 2025-03-23 20:09:31.853204 | orchestrator | 20:09:31.853 STDOUT terraform:  + admin_state_up = (known after apply) 2025-03-23 20:09:31.853240 | orchestrator | 20:09:31.853 STDOUT terraform:  + all_fixed_ips = (known after apply) 2025-03-23 20:09:31.853275 | orchestrator | 20:09:31.853 STDOUT terraform:  + all_security_group_ids = (known after apply) 2025-03-23 20:09:31.853312 | orchestrator | 20:09:31.853 STDOUT terraform:  + all_tags = (known after apply) 2025-03-23 20:09:31.853348 | orchestrator | 20:09:31.853 STDOUT terraform:  + device_id = (known after apply) 2025-03-23 20:09:31.853384 | orchestrator | 20:09:31.853 STDOUT terraform:  + device_owner 
= (known after apply) 2025-03-23 20:09:31.853420 | orchestrator | 20:09:31.853 STDOUT terraform:  + dns_assignment = (known after apply) 2025-03-23 20:09:31.853468 | orchestrator | 20:09:31.853 STDOUT terraform:  + dns_name = (known after apply) 2025-03-23 20:09:31.853517 | orchestrator | 20:09:31.853 STDOUT terraform:  + id = (known after apply) 2025-03-23 20:09:31.853553 | orchestrator | 20:09:31.853 STDOUT terraform:  + mac_address = (known after apply) 2025-03-23 20:09:31.853594 | orchestrator | 20:09:31.853 STDOUT terraform:  + network_id = (known after apply) 2025-03-23 20:09:31.853627 | orchestrator | 20:09:31.853 STDOUT terraform:  + port_security_enabled = (known after apply) 2025-03-23 20:09:31.853662 | orchestrator | 20:09:31.853 STDOUT terraform:  + qos_policy_id = (known after apply) 2025-03-23 20:09:31.853698 | orchestrator | 20:09:31.853 STDOUT terraform:  + region = (known after apply) 2025-03-23 20:09:31.853733 | orchestrator | 20:09:31.853 STDOUT terraform:  + security_group_ids = (known after apply) 2025-03-23 20:09:31.853770 | orchestrator | 20:09:31.853 STDOUT terraform:  + tenant_id = (known after apply) 2025-03-23 20:09:31.853792 | orchestrator | 20:09:31.853 STDOUT terraform:  + allowed_address_pairs { 2025-03-23 20:09:31.853820 | orchestrator | 20:09:31.853 STDOUT terraform:  + ip_address = "192.168.112.0/20" 2025-03-23 20:09:31.853827 | orchestrator | 20:09:31.853 STDOUT terraform:  } 2025-03-23 20:09:31.853851 | orchestrator | 20:09:31.853 STDOUT terraform:  + allowed_address_pairs { 2025-03-23 20:09:31.853879 | orchestrator | 20:09:31.853 STDOUT terraform:  + ip_address = "192.168.16.8/20" 2025-03-23 20:09:31.853894 | orchestrator | 20:09:31.853 STDOUT terraform:  } 2025-03-23 20:09:31.853919 | orchestrator | 20:09:31.853 STDOUT terraform:  + binding (known after apply) 2025-03-23 20:09:31.853934 | orchestrator | 20:09:31.853 STDOUT terraform:  + fixed_ip { 2025-03-23 20:09:31.853961 | orchestrator | 20:09:31.853 STDOUT terraform:  + ip_address = "192.168.16.5" 2025-03-23 20:09:31.853989 | orchestrator | 20:09:31.853 STDOUT terraform:  + subnet_id = (known after apply) 2025-03-23 20:09:31.853996 | orchestrator | 20:09:31.853 STDOUT terraform:  } 2025-03-23 20:09:31.854062 | orchestrator | 20:09:31.853 STDOUT terraform:  } 2025-03-23 20:09:31.854159 | orchestrator | 20:09:31.854 STDOUT terraform:  # openstack_networking_port_v2.node_port_management[0] will be created 2025-03-23 20:09:31.854205 | orchestrator | 20:09:31.854 STDOUT terraform:  + resource "openstack_networking_port_v2" "node_port_management" { 2025-03-23 20:09:31.854243 | orchestrator | 20:09:31.854 STDOUT terraform:  + admin_state_up = (known after apply) 2025-03-23 20:09:31.854278 | orchestrator | 20:09:31.854 STDOUT terraform:  + all_fixed_ips = (known after apply) 2025-03-23 20:09:31.854315 | orchestrator | 20:09:31.854 STDOUT terraform:  + all_security_group_ids = (known after apply) 2025-03-23 20:09:31.854351 | orchestrator | 20:09:31.854 STDOUT terraform:  + all_tags = (known after apply) 2025-03-23 20:09:31.854389 | orchestrator | 20:09:31.854 STDOUT terraform:  + device_id = (known after apply) 2025-03-23 20:09:31.854424 | orchestrator | 20:09:31.854 STDOUT terraform:  + device_owner = (known after apply) 2025-03-23 20:09:31.854470 | orchestrator | 20:09:31.854 STDOUT terraform:  + dns_assignment = (known after apply) 2025-03-23 20:09:31.854505 | orchestrator | 20:09:31.854 STDOUT terraform:  + dns_name = (known after apply) 2025-03-23 20:09:31.854542 | orchestrator | 20:09:31.854 STDOUT 
terraform:  + id = (known after apply) 2025-03-23 20:09:31.854579 | orchestrator | 20:09:31.854 STDOUT terraform:  + mac_address = (known after apply) 2025-03-23 20:09:31.854616 | orchestrator | 20:09:31.854 STDOUT terraform:  + network_id = (known after apply) 2025-03-23 20:09:31.854652 | orchestrator | 20:09:31.854 STDOUT terraform:  + port_security_enabled = (known after apply) 2025-03-23 20:09:31.854687 | orchestrator | 20:09:31.854 STDOUT terraform:  + qos_policy_id = (known after apply) 2025-03-23 20:09:31.854725 | orchestrator | 20:09:31.854 STDOUT terraform:  + region = (known after apply) 2025-03-23 20:09:31.854760 | orchestrator | 20:09:31.854 STDOUT terraform:  + security_group_ids = (known after apply) 2025-03-23 20:09:31.854798 | orchestrator | 20:09:31.854 STDOUT terraform:  + tenant_id = (known after apply) 2025-03-23 20:09:31.854818 | orchestrator | 20:09:31.854 STDOUT terraform:  + allowed_address_pairs { 2025-03-23 20:09:31.854847 | orchestrator | 20:09:31.854 STDOUT terraform:  + ip_address = "192.168.112.0/20" 2025-03-23 20:09:31.854854 | orchestrator | 20:09:31.854 STDOUT terraform:  } 2025-03-23 20:09:31.854884 | orchestrator | 20:09:31.854 STDOUT terraform:  + allowed_address_pairs { 2025-03-23 20:09:31.854907 | orchestrator | 20:09:31.854 STDOUT terraform:  + ip_address = "192.168.16.254/20" 2025-03-23 20:09:31.854923 | orchestrator | 20:09:31.854 STDOUT terraform:  } 2025-03-23 20:09:31.854943 | orchestrator | 20:09:31.854 STDOUT terraform:  + allowed_address_pairs { 2025-03-23 20:09:31.854974 | orchestrator | 20:09:31.854 STDOUT terraform:  + ip_address = "192.168.16.8/20" 2025-03-23 20:09:31.854981 | orchestrator | 20:09:31.854 STDOUT terraform:  } 2025-03-23 20:09:31.855002 | orchestrator | 20:09:31.854 STDOUT terraform:  + allowed_address_pairs { 2025-03-23 20:09:31.855032 | orchestrator | 20:09:31.854 STDOUT terraform:  + ip_address = "192.168.16.9/20" 2025-03-23 20:09:31.855046 | orchestrator | 20:09:31.855 STDOUT terraform:  } 2025-03-23 20:09:31.855072 | orchestrator | 20:09:31.855 STDOUT terraform:  + binding (known after apply) 2025-03-23 20:09:31.855080 | orchestrator | 20:09:31.855 STDOUT terraform:  + fixed_ip { 2025-03-23 20:09:31.855109 | orchestrator | 20:09:31.855 STDOUT terraform:  + ip_address = "192.168.16.10" 2025-03-23 20:09:31.855138 | orchestrator | 20:09:31.855 STDOUT terraform:  + subnet_id = (known after apply) 2025-03-23 20:09:31.855144 | orchestrator | 20:09:31.855 STDOUT terraform:  } 2025-03-23 20:09:31.855169 | orchestrator | 20:09:31.855 STDOUT terraform:  } 2025-03-23 20:09:31.855208 | orchestrator | 20:09:31.855 STDOUT terraform:  # openstack_networking_port_v2.node_port_management[1] will be created 2025-03-23 20:09:31.855260 | orchestrator | 20:09:31.855 STDOUT terraform:  + resource "openstack_networking_port_v2" "node_port_management" { 2025-03-23 20:09:31.855288 | orchestrator | 20:09:31.855 STDOUT terraform:  + admin_state_up = (known after apply) 2025-03-23 20:09:31.855324 | orchestrator | 20:09:31.855 STDOUT terraform:  + all_fixed_ips = (known after apply) 2025-03-23 20:09:31.855360 | orchestrator | 20:09:31.855 STDOUT terraform:  + all_security_group_ids = (known after apply) 2025-03-23 20:09:31.855398 | orchestrator | 20:09:31.855 STDOUT terraform:  + all_tags = (known after apply) 2025-03-23 20:09:31.855433 | orchestrator | 20:09:31.855 STDOUT terraform:  + device_id = (known after apply) 2025-03-23 20:09:31.855481 | orchestrator | 20:09:31.855 STDOUT terraform:  + device_owner = (known after apply) 2025-03-23 
20:09:31.855520 | orchestrator | 20:09:31.855 STDOUT terraform:  + dns_assignment = (known after apply) 2025-03-23 20:09:31.855556 | orchestrator | 20:09:31.855 STDOUT terraform:  + dns_name = (known after apply) 2025-03-23 20:09:31.855593 | orchestrator | 20:09:31.855 STDOUT terraform:  + id = (known after apply) 2025-03-23 20:09:31.855629 | orchestrator | 20:09:31.855 STDOUT terraform:  + mac_address = (known after apply) 2025-03-23 20:09:31.855667 | orchestrator | 20:09:31.855 STDOUT terraform:  + network_id = (known after apply) 2025-03-23 20:09:31.855701 | orchestrator | 20:09:31.855 STDOUT terraform:  + port_security_enabled = (known after apply) 2025-03-23 20:09:31.855739 | orchestrator | 20:09:31.855 STDOUT terraform:  + qos_policy_id = (known after apply) 2025-03-23 20:09:31.855775 | orchestrator | 20:09:31.855 STDOUT terraform:  + region = (known after apply) 2025-03-23 20:09:31.855812 | orchestrator | 20:09:31.855 STDOUT terraform:  + security_group_ids = (known after apply) 2025-03-23 20:09:31.855848 | orchestrator | 20:09:31.855 STDOUT terraform:  + tenant_id = (known after apply) 2025-03-23 20:09:31.855869 | orchestrator | 20:09:31.855 STDOUT terraform:  + allowed_address_pairs { 2025-03-23 20:09:31.855905 | orchestrator | 20:09:31.855 STDOUT terraform:  + ip_address = "192.168.112.0/20" 2025-03-23 20:09:31.855926 | orchestrator | 20:09:31.855 STDOUT terraform:  } 2025-03-23 20:09:31.855934 | orchestrator | 20:09:31.855 STDOUT terraform:  + allowed_address_pairs { 2025-03-23 20:09:31.855957 | orchestrator | 20:09:31.855 STDOUT terraform:  + ip_address = "192.168.16.254/20" 2025-03-23 20:09:31.855964 | orchestrator | 20:09:31.855 STDOUT terraform:  } 2025-03-23 20:09:31.855995 | orchestrator | 20:09:31.855 STDOUT terraform:  + allowed_address_pairs { 2025-03-23 20:09:31.856015 | orchestrator | 20:09:31.855 STDOUT terraform:  + ip_address = "192.168.16.8/20" 2025-03-23 20:09:31.856022 | orchestrator | 20:09:31.856 STDOUT terraform:  } 2025-03-23 20:09:31.856046 | orchestrator | 20:09:31.856 STDOUT terraform:  + allowed_address_pairs { 2025-03-23 20:09:31.856083 | orchestrator | 20:09:31.856 STDOUT terraform:  + ip_address = "192.168.16.9/20" 2025-03-23 20:09:31.856106 | orchestrator | 20:09:31.856 STDOUT terraform:  } 2025-03-23 20:09:31.856114 | orchestrator | 20:09:31.856 STDOUT terraform:  + binding (known after apply) 2025-03-23 20:09:31.856120 | orchestrator | 20:09:31.856 STDOUT terraform:  + fixed_ip { 2025-03-23 20:09:31.856144 | orchestrator | 20:09:31.856 STDOUT terraform:  + ip_address = "192.168.16.11" 2025-03-23 20:09:31.856174 | orchestrator | 20:09:31.856 STDOUT terraform:  + subnet_id = (known after apply) 2025-03-23 20:09:31.856181 | orchestrator | 20:09:31.856 STDOUT terraform:  } 2025-03-23 20:09:31.856199 | orchestrator | 20:09:31.856 STDOUT terraform:  } 2025-03-23 20:09:31.856267 | orchestrator | 20:09:31.856 STDOUT terraform:  # openstack_networking_port_v2.node_port_management[2] will be created 2025-03-23 20:09:31.856336 | orchestrator | 20:09:31.856 STDOUT terraform:  + resource "openstack_ne 2025-03-23 20:09:31.856356 | orchestrator | 20:09:31.856 STDOUT terraform: tworking_port_v2" "node_port_management" { 2025-03-23 20:09:31.856367 | orchestrator | 20:09:31.856 STDOUT terraform:  + admin_state_up = (known after apply) 2025-03-23 20:09:31.856403 | orchestrator | 20:09:31.856 STDOUT terraform:  + all_fixed_ips = (known after apply) 2025-03-23 20:09:31.856438 | orchestrator | 20:09:31.856 STDOUT terraform:  + all_security_group_ids = (known after 
apply) 2025-03-23 20:09:31.856498 | orchestrator | 20:09:31.856 STDOUT terraform:  + all_tags = (known after apply) 2025-03-23 20:09:31.856544 | orchestrator | 20:09:31.856 STDOUT terraform:  + device_id = (known after apply) 2025-03-23 20:09:31.856572 | orchestrator | 20:09:31.856 STDOUT terraform:  + device_owner = (known after apply) 2025-03-23 20:09:31.856608 | orchestrator | 20:09:31.856 STDOUT terraform:  + dns_assignment = (known after apply) 2025-03-23 20:09:31.856648 | orchestrator | 20:09:31.856 STDOUT terraform:  + dns_name = (known after apply) 2025-03-23 20:09:31.856684 | orchestrator | 20:09:31.856 STDOUT terraform:  + id = (known after apply) 2025-03-23 20:09:31.856722 | orchestrator | 20:09:31.856 STDOUT terraform:  + mac_address = (known after apply) 2025-03-23 20:09:31.856759 | orchestrator | 20:09:31.856 STDOUT terraform:  + network_id = (known after apply) 2025-03-23 20:09:31.856796 | orchestrator | 20:09:31.856 STDOUT terraform:  + port_security_enabled = (known after apply) 2025-03-23 20:09:31.856831 | orchestrator | 20:09:31.856 STDOUT terraform:  + qos_policy_id = (known after apply) 2025-03-23 20:09:31.856870 | orchestrator | 20:09:31.856 STDOUT terraform:  + region = (known after apply) 2025-03-23 20:09:31.856904 | orchestrator | 20:09:31.856 STDOUT terraform:  + security_group_ids = (known after apply) 2025-03-23 20:09:31.856940 | orchestrator | 20:09:31.856 STDOUT terraform:  + tenant_id = (known after apply) 2025-03-23 20:09:31.856962 | orchestrator | 20:09:31.856 STDOUT terraform:  + allowed_address_pairs { 2025-03-23 20:09:31.856992 | orchestrator | 20:09:31.856 STDOUT terraform:  + ip_address = "192.168.112.0/20" 2025-03-23 20:09:31.856999 | orchestrator | 20:09:31.856 STDOUT terraform:  } 2025-03-23 20:09:31.857024 | orchestrator | 20:09:31.856 STDOUT terraform:  + allowed_address_pairs { 2025-03-23 20:09:31.857053 | orchestrator | 20:09:31.857 STDOUT terraform:  + ip_address = "192.168.16.254/20" 2025-03-23 20:09:31.857072 | orchestrator | 20:09:31.857 STDOUT terraform:  } 2025-03-23 20:09:31.857090 | orchestrator | 20:09:31.857 STDOUT terraform:  + allowed_address_pairs { 2025-03-23 20:09:31.857119 | orchestrator | 20:09:31.857 STDOUT terraform:  + ip_address = "192.168.16.8/20" 2025-03-23 20:09:31.857126 | orchestrator | 20:09:31.857 STDOUT terraform:  } 2025-03-23 20:09:31.857153 | orchestrator | 20:09:31.857 STDOUT terraform:  + allowed_address_pairs { 2025-03-23 20:09:31.857181 | orchestrator | 20:09:31.857 STDOUT terraform:  + ip_address = "192.168.16.9/20" 2025-03-23 20:09:31.857188 | orchestrator | 20:09:31.857 STDOUT terraform:  } 2025-03-23 20:09:31.857223 | orchestrator | 20:09:31.857 STDOUT terraform:  + binding (known after apply) 2025-03-23 20:09:31.857250 | orchestrator | 20:09:31.857 STDOUT terraform:  + fixed_ip { 2025-03-23 20:09:31.857261 | orchestrator | 20:09:31.857 STDOUT terraform:  + ip_address = "192.168.16.12" 2025-03-23 20:09:31.857284 | orchestrator | 20:09:31.857 STDOUT terraform:  + subnet_id = (known after apply) 2025-03-23 20:09:31.857291 | orchestrator | 20:09:31.857 STDOUT terraform:  } 2025-03-23 20:09:31.857314 | orchestrator | 20:09:31.857 STDOUT terraform:  } 2025-03-23 20:09:31.857353 | orchestrator | 20:09:31.857 STDOUT terraform:  # openstack_networking_port_v2.node_port_management[3] will be created 2025-03-23 20:09:31.857403 | orchestrator | 20:09:31.857 STDOUT terraform:  + resource "openstack_networking_port_v2" "node_port_management" { 2025-03-23 20:09:31.857434 | orchestrator | 20:09:31.857 STDOUT terraform:  
+ admin_state_up = (known after apply) 2025-03-23 20:09:31.857478 | orchestrator | 20:09:31.857 STDOUT terraform:  + all_fixed_ips = (known after apply) 2025-03-23 20:09:31.857513 | orchestrator | 20:09:31.857 STDOUT terraform:  + all_security_group_ids = (known after apply) 2025-03-23 20:09:31.857550 | orchestrator | 20:09:31.857 STDOUT terraform:  + all_tags = (known after apply) 2025-03-23 20:09:31.857586 | orchestrator | 20:09:31.857 STDOUT terraform:  + device_id = (known after apply) 2025-03-23 20:09:31.857622 | orchestrator | 20:09:31.857 STDOUT terraform:  + device_owner = (known after apply) 2025-03-23 20:09:31.857660 | orchestrator | 20:09:31.857 STDOUT terraform:  + dns_assignment = (known after apply) 2025-03-23 20:09:31.857694 | orchestrator | 20:09:31.857 STDOUT terraform:  + dns_name = (known after apply) 2025-03-23 20:09:31.857732 | orchestrator | 20:09:31.857 STDOUT terraform:  + id = (known after apply) 2025-03-23 20:09:31.857769 | orchestrator | 20:09:31.857 STDOUT terraform:  + mac_address = (known after apply) 2025-03-23 20:09:31.857805 | orchestrator | 20:09:31.857 STDOUT terraform:  + network_id = (known after apply) 2025-03-23 20:09:31.857840 | orchestrator | 20:09:31.857 STDOUT terraform:  + port_security_enabled = (known after apply) 2025-03-23 20:09:31.857875 | orchestrator | 20:09:31.857 STDOUT terraform:  + qos_policy_id = (known after apply) 2025-03-23 20:09:31.857913 | orchestrator | 20:09:31.857 STDOUT terraform:  + region = (known after apply) 2025-03-23 20:09:31.857948 | orchestrator | 20:09:31.857 STDOUT terraform:  + security_group_ids = (known after apply) 2025-03-23 20:09:31.857987 | orchestrator | 20:09:31.857 STDOUT terraform:  + tenant_id = (known after apply) 2025-03-23 20:09:31.858006 | orchestrator | 20:09:31.857 STDOUT terraform:  + allowed_address_pairs { 2025-03-23 20:09:31.859719 | orchestrator | 20:09:31.858 STDOUT terraform:  + ip_address = "192.168.112.0/20" 2025-03-23 20:09:31.859756 | orchestrator | 20:09:31.859 STDOUT terraform:  } 2025-03-23 20:09:31.859765 | orchestrator | 20:09:31.859 STDOUT terraform:  + allowed_address_pairs { 2025-03-23 20:09:31.859784 | orchestrator | 20:09:31.859 STDOUT terraform:  + ip_address = "192.168.16.254/20" 2025-03-23 20:09:31.859804 | orchestrator | 20:09:31.859 STDOUT terraform:  } 2025-03-23 20:09:31.859811 | orchestrator | 20:09:31.859 STDOUT terraform:  + allowed_address_pairs { 2025-03-23 20:09:31.859842 | orchestrator | 20:09:31.859 STDOUT terraform:  + ip_address = "192.168.16.8/20" 2025-03-23 20:09:31.859857 | orchestrator | 20:09:31.859 STDOUT terraform:  } 2025-03-23 20:09:31.859864 | orchestrator | 20:09:31.859 STDOUT terraform:  + allowed_address_pairs { 2025-03-23 20:09:31.859894 | orchestrator | 20:09:31.859 STDOUT terraform:  + ip_address = "192.168.16.9/20" 2025-03-23 20:09:31.859901 | orchestrator | 20:09:31.859 STDOUT terraform:  } 2025-03-23 20:09:31.859936 | orchestrator | 20:09:31.859 STDOUT terraform:  + binding (known after apply) 2025-03-23 20:09:31.859943 | orchestrator | 20:09:31.859 STDOUT terraform:  + fixed_ip { 2025-03-23 20:09:31.859970 | orchestrator | 20:09:31.859 STDOUT terraform:  + ip_address = "192.168.16.13" 2025-03-23 20:09:31.860004 | orchestrator | 20:09:31.859 STDOUT terraform:  + subnet_id = (known after apply) 2025-03-23 20:09:31.860011 | orchestrator | 20:09:31.859 STDOUT terraform:  } 2025-03-23 20:09:31.860029 | orchestrator | 20:09:31.860 STDOUT terraform:  } 2025-03-23 20:09:31.860087 | orchestrator | 20:09:31.860 STDOUT terraform:  # 
openstack_networking_port_v2.node_port_management[4] will be created 2025-03-23 20:09:31.860123 | orchestrator | 20:09:31.860 STDOUT terraform:  + resource "openstack_networking_port_v2" "node_port_management" { 2025-03-23 20:09:31.860160 | orchestrator | 20:09:31.860 STDOUT terraform:  + admin_state_up = (known after apply) 2025-03-23 20:09:31.860196 | orchestrator | 20:09:31.860 STDOUT terraform:  + all_fixed_ips = (known after apply) 2025-03-23 20:09:31.860231 | orchestrator | 20:09:31.860 STDOUT terraform:  + all_security_group_ids = (known after apply) 2025-03-23 20:09:31.860267 | orchestrator | 20:09:31.860 STDOUT terraform:  + all_tags = (known after apply) 2025-03-23 20:09:31.860305 | orchestrator | 20:09:31.860 STDOUT terraform:  + device_id = (known after apply) 2025-03-23 20:09:31.860342 | orchestrator | 20:09:31.860 STDOUT terraform:  + device_owner = (known after apply) 2025-03-23 20:09:31.860379 | orchestrator | 20:09:31.860 STDOUT terraform:  + dns_assignment = (known after apply) 2025-03-23 20:09:31.860420 | orchestrator | 20:09:31.860 STDOUT terraform:  + dns_name = (known after apply) 2025-03-23 20:09:31.860482 | orchestrator | 20:09:31.860 STDOUT terraform:  + id = (known after apply) 2025-03-23 20:09:31.860521 | orchestrator | 20:09:31.860 STDOUT terraform:  + mac_address = (known after apply) 2025-03-23 20:09:31.862191 | orchestrator | 20:09:31.862 STDOUT terraform:  + network_id = (known after apply) 2025-03-23 20:09:31.862225 | orchestrator | 20:09:31.862 STDOUT terraform:  + port_security_enabled = (known after apply) 2025-03-23 20:09:31.862246 | orchestrator | 20:09:31.862 STDOUT terraform:  + qos_policy_id = (known after apply) 2025-03-23 20:09:31.862283 | orchestrator | 20:09:31.862 STDOUT terraform:  + region = (known after apply) 2025-03-23 20:09:31.862319 | orchestrator | 20:09:31.862 STDOUT terraform:  + security_group_ids = (known after apply) 2025-03-23 20:09:31.862359 | orchestrator | 20:09:31.862 STDOUT terraform:  + tenant_id = (known after apply) 2025-03-23 20:09:31.862383 | orchestrator | 20:09:31.862 STDOUT terraform:  + allowed_address_pairs { 2025-03-23 20:09:31.862415 | orchestrator | 20:09:31.862 STDOUT terraform:  + ip_address = "192.168.112.0/20" 2025-03-23 20:09:31.862444 | orchestrator | 20:09:31.862 STDOUT terraform:  } 2025-03-23 20:09:31.862451 | orchestrator | 20:09:31.862 STDOUT terraform:  + allowed_address_pairs { 2025-03-23 20:09:31.862508 | orchestrator | 20:09:31.862 STDOUT terraform:  + ip_address = "192.168.16.254/20" 2025-03-23 20:09:31.862517 | orchestrator | 20:09:31.862 STDOUT terraform:  } 2025-03-23 20:09:31.862542 | orchestrator | 20:09:31.862 STDOUT terraform:  + allowed_address_pairs { 2025-03-23 20:09:31.862574 | orchestrator | 20:09:31.862 STDOUT terraform:  + ip_address = "192.168.16.8/20" 2025-03-23 20:09:31.862589 | orchestrator | 20:09:31.862 STDOUT terraform:  } 2025-03-23 20:09:31.862611 | orchestrator | 20:09:31.862 STDOUT terraform:  + allowed_address_pairs { 2025-03-23 20:09:31.862641 | orchestrator | 20:09:31.862 STDOUT terraform:  + ip_address = "192.168.16.9/20" 2025-03-23 20:09:31.862657 | orchestrator | 20:09:31.862 STDOUT terraform:  } 2025-03-23 20:09:31.862681 | orchestrator | 20:09:31.862 STDOUT terraform:  + binding (known after apply) 2025-03-23 20:09:31.862697 | orchestrator | 20:09:31.862 STDOUT terraform:  + fixed_ip { 2025-03-23 20:09:31.862724 | orchestrator | 20:09:31.862 STDOUT terraform:  + ip_address = "192.168.16.14" 2025-03-23 20:09:31.862755 | orchestrator | 20:09:31.862 STDOUT terraform:  
+ subnet_id = (known after apply) 2025-03-23 20:09:31.862763 | orchestrator | 20:09:31.862 STDOUT terraform:  } 2025-03-23 20:09:31.862781 | orchestrator | 20:09:31.862 STDOUT terraform:  } 2025-03-23 20:09:31.862828 | orchestrator | 20:09:31.862 STDOUT terraform:  # openstack_networking_port_v2.node_port_management[5] will be created 2025-03-23 20:09:31.862873 | orchestrator | 20:09:31.862 STDOUT terraform:  + resource "openstack_networking_port_v2" "node_port_management" { 2025-03-23 20:09:31.862909 | orchestrator | 20:09:31.862 STDOUT terraform:  + admin_state_up = (known after apply) 2025-03-23 20:09:31.862946 | orchestrator | 20:09:31.862 STDOUT terraform:  + all_fixed_ips = (known after apply) 2025-03-23 20:09:31.862981 | orchestrator | 20:09:31.862 STDOUT terraform:  + all_security_group_ids = (known after apply) 2025-03-23 20:09:31.863018 | orchestrator | 20:09:31.862 STDOUT terraform:  + all_tags = (known after apply) 2025-03-23 20:09:31.863054 | orchestrator | 20:09:31.863 STDOUT terraform:  + device_id = (known after apply) 2025-03-23 20:09:31.863090 | orchestrator | 20:09:31.863 STDOUT terraform:  + device_owner = (known after apply) 2025-03-23 20:09:31.863126 | orchestrator | 20:09:31.863 STDOUT terraform:  + dns_assignment = (known after apply) 2025-03-23 20:09:31.863164 | orchestrator | 20:09:31.863 STDOUT terraform:  + dns_name = (known after apply) 2025-03-23 20:09:31.863204 | orchestrator | 20:09:31.863 STDOUT terraform:  + id = (known after apply) 2025-03-23 20:09:31.863239 | orchestrator | 20:09:31.863 STDOUT terraform:  + mac_address = (known after apply) 2025-03-23 20:09:31.863275 | orchestrator | 20:09:31.863 STDOUT terraform:  + network_id = (known after apply) 2025-03-23 20:09:31.863311 | orchestrator | 20:09:31.863 STDOUT terraform:  + port_security_enabled = (known after apply) 2025-03-23 20:09:31.863349 | orchestrator | 20:09:31.863 STDOUT terraform:  + qos_policy_id = (known after apply) 2025-03-23 20:09:31.863389 | orchestrator | 20:09:31.863 STDOUT terraform:  + region = (known after apply) 2025-03-23 20:09:31.863424 | orchestrator | 20:09:31.863 STDOUT terraform:  + security_group_ids = (known after apply) 2025-03-23 20:09:31.863474 | orchestrator | 20:09:31.863 STDOUT terraform:  + tenant_id = (known after apply) 2025-03-23 20:09:31.863486 | orchestrator | 20:09:31.863 STDOUT terraform:  + allowed_address_pairs { 2025-03-23 20:09:31.864503 | orchestrator | 20:09:31.863 STDOUT terraform:  + ip_address = "192.168.112.0/20" 2025-03-23 20:09:31.864549 | orchestrator | 20:09:31.863 STDOUT terraform:  } 2025-03-23 20:09:31.864555 | orchestrator | 20:09:31.863 STDOUT terraform:  + allowed_address_pairs { 2025-03-23 20:09:31.864560 | orchestrator | 20:09:31.863 STDOUT terraform:  + ip_address = "192.168.16.254/20" 2025-03-23 20:09:31.864565 | orchestrator | 20:09:31.863 STDOUT terraform:  } 2025-03-23 20:09:31.864571 | orchestrator | 20:09:31.863 STDOUT terraform:  + allowed_address_pairs { 2025-03-23 20:09:31.864576 | orchestrator | 20:09:31.863 STDOUT terraform:  + ip_address = "192.168.16.8/20" 2025-03-23 20:09:31.864581 | orchestrator | 20:09:31.863 STDOUT terraform:  } 2025-03-23 20:09:31.864586 | orchestrator | 20:09:31.863 STDOUT terraform:  + allowed_address_pairs { 2025-03-23 20:09:31.864591 | orchestrator | 20:09:31.863 STDOUT terraform:  + ip_address = "192.168.16.9/20" 2025-03-23 20:09:31.864597 | orchestrator | 20:09:31.863 STDOUT terraform:  } 2025-03-23 20:09:31.864602 | orchestrator | 20:09:31.863 STDOUT terraform:  + binding (known after apply) 
2025-03-23 20:09:31.864607 | orchestrator | 20:09:31.863 STDOUT terraform:  + fixed_ip { 2025-03-23 20:09:31.864617 | orchestrator | 20:09:31.863 STDOUT terraform:  + ip_address = "192.168.16.15" 2025-03-23 20:09:31.864622 | orchestrator | 20:09:31.863 STDOUT terraform:  + subnet_id = (known after apply) 2025-03-23 20:09:31.864627 | orchestrator | 20:09:31.863 STDOUT terraform:  } 2025-03-23 20:09:31.864632 | orchestrator | 20:09:31.863 STDOUT terraform:  } 2025-03-23 20:09:31.864638 | orchestrator | 20:09:31.863 STDOUT terraform:  # openstack_networking_router_interface_v2.router_interface will be created 2025-03-23 20:09:31.864643 | orchestrator | 20:09:31.863 STDOUT terraform:  + resource "openstack_networking_router_interface_v2" "router_interface" { 2025-03-23 20:09:31.864648 | orchestrator | 20:09:31.863 STDOUT terraform:  + force_destroy = false 2025-03-23 20:09:31.864654 | orchestrator | 20:09:31.863 STDOUT terraform:  + id = (known after apply) 2025-03-23 20:09:31.864661 | orchestrator | 20:09:31.863 STDOUT terraform:  + port_id = (known after apply) 2025-03-23 20:09:31.864666 | orchestrator | 20:09:31.863 STDOUT terraform:  + region = (known after apply) 2025-03-23 20:09:31.864672 | orchestrator | 20:09:31.863 STDOUT terraform:  + router_id = (known after apply) 2025-03-23 20:09:31.864677 | orchestrator | 20:09:31.863 STDOUT terraform:  + subnet_id = (known after apply) 2025-03-23 20:09:31.864682 | orchestrator | 20:09:31.863 STDOUT terraform:  } 2025-03-23 20:09:31.864687 | orchestrator | 20:09:31.863 STDOUT terraform:  # openstack_networking_router_v2.router will be created 2025-03-23 20:09:31.864699 | orchestrator | 20:09:31.864 STDOUT terraform:  + resource "openstack_networking_router_v2" "router" { 2025-03-23 20:09:31.864704 | orchestrator | 20:09:31.864 STDOUT terraform:  + admin_state_up = (known after apply) 2025-03-23 20:09:31.864709 | orchestrator | 20:09:31.864 STDOUT terraform:  + all_tags = (known after apply) 2025-03-23 20:09:31.864715 | orchestrator | 20:09:31.864 STDOUT terraform:  + availability_zone_hints = [ 2025-03-23 20:09:31.864720 | orchestrator | 20:09:31.864 STDOUT terraform:  + "nova", 2025-03-23 20:09:31.864725 | orchestrator | 20:09:31.864 STDOUT terraform:  ] 2025-03-23 20:09:31.864730 | orchestrator | 20:09:31.864 STDOUT terraform:  + distributed = (known after apply) 2025-03-23 20:09:31.864735 | orchestrator | 20:09:31.864 STDOUT terraform:  + enable_snat = (known after apply) 2025-03-23 20:09:31.864740 | orchestrator | 20:09:31.864 STDOUT terraform:  + external_network_id = "e6be7364-bfd8-4de7-8120-8f41c69a139a" 2025-03-23 20:09:31.864745 | orchestrator | 20:09:31.864 STDOUT terraform:  + id = (known after apply) 2025-03-23 20:09:31.864750 | orchestrator | 20:09:31.864 STDOUT terraform:  + name = "testbed" 2025-03-23 20:09:31.864755 | orchestrator | 20:09:31.864 STDOUT terraform:  + region = (known after apply) 2025-03-23 20:09:31.864765 | orchestrator | 20:09:31.864 STDOUT terraform:  + tenant_id = (known after apply) 2025-03-23 20:09:31.865029 | orchestrator | 20:09:31.864 STDOUT terraform:  + external_fixed_ip (known after apply) 2025-03-23 20:09:31.865039 | orchestrator | 20:09:31.864 STDOUT terraform:  } 2025-03-23 20:09:31.865044 | orchestrator | 20:09:31.864 STDOUT terraform:  # openstack_networking_secgroup_rule_v2.security_group_management_rule1 will be created 2025-03-23 20:09:31.865050 | orchestrator | 20:09:31.864 STDOUT terraform:  + resource "openstack_networking_secgroup_rule_v2" "security_group_management_rule1" { 2025-03-23 
20:09:31.865055 | orchestrator | 20:09:31.864 STDOUT terraform:  + description = "ssh" 2025-03-23 20:09:31.865060 | orchestrator | 20:09:31.864 STDOUT terraform:  + direction = "ingress" 2025-03-23 20:09:31.865065 | orchestrator | 20:09:31.864 STDOUT terraform:  + ethertype = "IPv4" 2025-03-23 20:09:31.865070 | orchestrator | 20:09:31.864 STDOUT terraform:  + id = (known after apply) 2025-03-23 20:09:31.865075 | orchestrator | 20:09:31.864 STDOUT terraform:  + port_range_max = 22 2025-03-23 20:09:31.865080 | orchestrator | 20:09:31.864 STDOUT terraform:  + port_range_min = 22 2025-03-23 20:09:31.865086 | orchestrator | 20:09:31.864 STDOUT terraform:  + protocol = "tcp" 2025-03-23 20:09:31.865091 | orchestrator | 20:09:31.864 STDOUT terraform:  + region = (known after apply) 2025-03-23 20:09:31.865095 | orchestrator | 20:09:31.864 STDOUT terraform:  + remote_group_id = (known after apply) 2025-03-23 20:09:31.865100 | orchestrator | 20:09:31.864 STDOUT terraform:  + remote_ip_prefix = "0.0.0.0/0" 2025-03-23 20:09:31.865108 | orchestrator | 20:09:31.864 STDOUT terraform:  + security_group_id = (known after apply) 2025-03-23 20:09:31.865113 | orchestrator | 20:09:31.864 STDOUT terraform:  + tenant_id = (known after apply) 2025-03-23 20:09:31.865118 | orchestrator | 20:09:31.864 STDOUT terraform:  } 2025-03-23 20:09:31.865128 | orchestrator | 20:09:31.864 STDOUT terraform:  # openstack_networking_secgroup_rule_v2.security_group_management_rule2 will be created 2025-03-23 20:09:31.865134 | orchestrator | 20:09:31.864 STDOUT terraform:  + resource "openstack_networking_secgroup_rule_v2" "security_group_management_rule2" { 2025-03-23 20:09:31.865139 | orchestrator | 20:09:31.864 STDOUT terraform:  + description = "wireguard" 2025-03-23 20:09:31.865144 | orchestrator | 20:09:31.864 STDOUT terraform:  + direction = "ingress" 2025-03-23 20:09:31.865149 | orchestrator | 20:09:31.864 STDOUT terraform:  + ethertype = "IPv4" 2025-03-23 20:09:31.865153 | orchestrator | 20:09:31.864 STDOUT terraform:  + id = (known after apply) 2025-03-23 20:09:31.865158 | orchestrator | 20:09:31.865 STDOUT terraform:  + port_range_max = 51820 2025-03-23 20:09:31.865164 | orchestrator | 20:09:31.865 STDOUT terraform:  + port_range_min = 51820 2025-03-23 20:09:31.865168 | orchestrator | 20:09:31.865 STDOUT terraform:  + protocol = "udp" 2025-03-23 20:09:31.865173 | orchestrator | 20:09:31.865 STDOUT terraform:  + region = (known after apply) 2025-03-23 20:09:31.865180 | orchestrator | 20:09:31.865 STDOUT terraform:  + remote_group_id = (known after apply) 2025-03-23 20:09:31.865185 | orchestrator | 20:09:31.865 STDOUT terraform:  + remote_ip_prefix = "0.0.0.0/0" 2025-03-23 20:09:31.865190 | orchestrator | 20:09:31.865 STDOUT terraform:  + security_group_id = (known after apply) 2025-03-23 20:09:31.865196 | orchestrator | 20:09:31.865 STDOUT terraform:  + tenant_id = (known after apply) 2025-03-23 20:09:31.869779 | orchestrator | 20:09:31.865 STDOUT terraform:  } 2025-03-23 20:09:31.869821 | orchestrator | 20:09:31.865 STDOUT terraform:  # openstack_networking_secgroup_rule_v2.security_group_management_rule3 will be created 2025-03-23 20:09:31.869828 | orchestrator | 20:09:31.865 STDOUT terraform:  + resource "openstack_networking_secgroup_rule_v2" "security_group_management_rule3" { 2025-03-23 20:09:31.869834 | orchestrator | 20:09:31.865 STDOUT terraform:  + direction = "ingress" 2025-03-23 20:09:31.869839 | orchestrator | 20:09:31.865 STDOUT terraform:  + ethertype = "IPv4" 2025-03-23 20:09:31.869844 | orchestrator | 
20:09:31.865 STDOUT terraform:  + id = (known after apply) 2025-03-23 20:09:31.869849 | orchestrator | 20:09:31.865 STDOUT terraform:  + protocol = "tcp" 2025-03-23 20:09:31.869854 | orchestrator | 20:09:31.865 STDOUT terraform:  + region = (known after apply) 2025-03-23 20:09:31.869859 | orchestrator | 20:09:31.865 STDOUT terraform:  + remote_group_id = (known after apply) 2025-03-23 20:09:31.869864 | orchestrator | 20:09:31.865 STDOUT terraform:  + remote_ip_prefix = "192.168.16.0/20" 2025-03-23 20:09:31.869869 | orchestrator | 20:09:31.865 STDOUT terraform:  + security_group_id = (known after apply) 2025-03-23 20:09:31.869874 | orchestrator | 20:09:31.865 STDOUT terraform:  + tenant_id = (known after apply) 2025-03-23 20:09:31.869879 | orchestrator | 20:09:31.865 STDOUT terraform:  } 2025-03-23 20:09:31.869884 | orchestrator | 20:09:31.865 STDOUT terraform:  # openstack_networking_secgroup_rule_v2.security_group_management_rule4 will be created 2025-03-23 20:09:31.869888 | orchestrator | 20:09:31.865 STDOUT terraform:  + resource "openstack_networking_secgroup_rule_v2" "security_group_management_rule4" { 2025-03-23 20:09:31.869904 | orchestrator | 20:09:31.865 STDOUT terraform:  + direction = "ingress" 2025-03-23 20:09:31.869909 | orchestrator | 20:09:31.865 STDOUT terraform:  + ethertype = "IPv4" 2025-03-23 20:09:31.869914 | orchestrator | 20:09:31.865 STDOUT terraform:  + id = (known after apply) 2025-03-23 20:09:31.869923 | orchestrator | 20:09:31.865 STDOUT terraform:  + protocol = "udp" 2025-03-23 20:09:31.869931 | orchestrator | 20:09:31.865 STDOUT terraform:  + region = (known after apply) 2025-03-23 20:09:31.869936 | orchestrator | 20:09:31.865 STDOUT terraform:  + remote_group_id = (known after apply) 2025-03-23 20:09:31.869941 | orchestrator | 20:09:31.865 STDOUT terraform:  + remote_ip_prefix = "192.168.16.0/20" 2025-03-23 20:09:31.869946 | orchestrator | 20:09:31.865 STDOUT terraform:  + security_group_id = (known after apply) 2025-03-23 20:09:31.869951 | orchestrator | 20:09:31.865 STDOUT terraform:  + tenant_id = (known after apply) 2025-03-23 20:09:31.869956 | orchestrator | 20:09:31.865 STDOUT terraform:  } 2025-03-23 20:09:31.869961 | orchestrator | 20:09:31.865 STDOUT terraform:  # openstack_networking_secgroup_rule_v2.security_group_management_rule5 will be created 2025-03-23 20:09:31.869965 | orchestrator | 20:09:31.865 STDOUT terraform:  + resource "openstack_networking_secgroup_rule_v2" "security_group_management_rule5" { 2025-03-23 20:09:31.869973 | orchestrator | 20:09:31.865 STDOUT terraform:  + direction = "ingress" 2025-03-23 20:09:31.869978 | orchestrator | 20:09:31.866 STDOUT terraform:  + ethertype = "IPv4" 2025-03-23 20:09:31.869983 | orchestrator | 20:09:31.866 STDOUT terraform:  + id = (known after apply) 2025-03-23 20:09:31.869988 | orchestrator | 20:09:31.866 STDOUT terraform:  + protocol = "icmp" 2025-03-23 20:09:31.869993 | orchestrator | 20:09:31.866 STDOUT terraform:  + region = (known after apply) 2025-03-23 20:09:31.869998 | orchestrator | 20:09:31.866 STDOUT terraform:  + remote_group_id = (known after apply) 2025-03-23 20:09:31.870003 | orchestrator | 20:09:31.866 STDOUT terraform:  + remote_ip_prefix = "0.0.0.0/0" 2025-03-23 20:09:31.870008 | orchestrator | 20:09:31.866 STDOUT terraform:  + security_group_id = (known after apply) 2025-03-23 20:09:31.870027 | orchestrator | 20:09:31.866 STDOUT terraform:  + tenant_id = (known after apply) 2025-03-23 20:09:31.870033 | orchestrator | 20:09:31.866 STDOUT terraform:  } 2025-03-23 20:09:31.870043 | 
orchestrator | 20:09:31.866 STDOUT terraform:  # openstack_networking_secgroup_rule_v2.security_group_node_rule1 will be created 2025-03-23 20:09:31.870049 | orchestrator | 20:09:31.866 STDOUT terraform:  + resource "openstack_networking_secgroup_rule_v2" "security_group_node_rule1" { 2025-03-23 20:09:31.870054 | orchestrator | 20:09:31.866 STDOUT terraform:  + direction = "ingress" 2025-03-23 20:09:31.870059 | orchestrator | 20:09:31.866 STDOUT terraform:  + ethertype = "IPv4" 2025-03-23 20:09:31.870064 | orchestrator | 20:09:31.866 STDOUT terraform:  + id = (known after apply) 2025-03-23 20:09:31.870069 | orchestrator | 20:09:31.866 STDOUT terraform:  + protocol = "tcp" 2025-03-23 20:09:31.870074 | orchestrator | 20:09:31.866 STDOUT terraform:  + region = (known after apply) 2025-03-23 20:09:31.870082 | orchestrator | 20:09:31.866 STDOUT terraform:  + remote_group_id = (known after apply) 2025-03-23 20:09:31.870087 | orchestrator | 20:09:31.866 STDOUT terraform:  + remote_ip_prefix = "0.0.0.0/0" 2025-03-23 20:09:31.870092 | orchestrator | 20:09:31.866 STDOUT terraform:  + security_group_id = (known after apply) 2025-03-23 20:09:31.870097 | orchestrator | 20:09:31.866 STDOUT terraform:  + tenant_id = (known after apply) 2025-03-23 20:09:31.870102 | orchestrator | 20:09:31.866 STDOUT terraform:  } 2025-03-23 20:09:31.870107 | orchestrator | 20:09:31.866 STDOUT terraform:  # openstack_networking_secgroup_rule_v2.security_group_node_rule2 will be created 2025-03-23 20:09:31.870112 | orchestrator | 20:09:31.866 STDOUT terraform:  + resource "openstack_networking_secgroup_rule_v2" "security_group_node_rule2" { 2025-03-23 20:09:31.870117 | orchestrator | 20:09:31.866 STDOUT terraform:  + direction = "ingress" 2025-03-23 20:09:31.870122 | orchestrator | 20:09:31.866 STDOUT terraform:  + ethertype = "IPv4" 2025-03-23 20:09:31.870126 | orchestrator | 20:09:31.866 STDOUT terraform:  + id = (known after apply) 2025-03-23 20:09:31.870131 | orchestrator | 20:09:31.866 STDOUT terraform:  + protocol = "udp" 2025-03-23 20:09:31.870136 | orchestrator | 20:09:31.866 STDOUT terraform:  + region = (known after apply) 2025-03-23 20:09:31.870141 | orchestrator | 20:09:31.866 STDOUT terraform:  + remote_group_id = (known after apply) 2025-03-23 20:09:31.870146 | orchestrator | 20:09:31.866 STDOUT terraform:  + remote_ip_prefix = "0.0.0.0/0" 2025-03-23 20:09:31.870151 | orchestrator | 20:09:31.866 STDOUT terraform:  + security_group_id = (known after apply) 2025-03-23 20:09:31.870156 | orchestrator | 20:09:31.866 STDOUT terraform:  + tenant_id = (known after apply) 2025-03-23 20:09:31.870161 | orchestrator | 20:09:31.866 STDOUT terraform:  } 2025-03-23 20:09:31.870166 | orchestrator | 20:09:31.866 STDOUT terraform:  # openstack_networking_secgroup_rule_v2.security_group_node_rule3 will be created 2025-03-23 20:09:31.870171 | orchestrator | 20:09:31.866 STDOUT terraform:  + resource "openstack_networking_secgroup_rule_v2" "security_group_node_rule3" { 2025-03-23 20:09:31.870176 | orchestrator | 20:09:31.866 STDOUT terraform:  + direction = "ingress" 2025-03-23 20:09:31.870181 | orchestrator | 20:09:31.867 STDOUT terraform:  + ethertype = "IPv4" 2025-03-23 20:09:31.870186 | orchestrator | 20:09:31.867 STDOUT terraform:  + id = (known after apply) 2025-03-23 20:09:31.870190 | orchestrator | 20:09:31.867 STDOUT terraform:  + protocol = "icmp" 2025-03-23 20:09:31.870195 | orchestrator | 20:09:31.867 STDOUT terraform:  + region = (known after apply) 2025-03-23 20:09:31.870200 | orchestrator | 20:09:31.867 STDOUT 
terraform:  + remote_group_id = (known after apply) 2025-03-23 20:09:31.870205 | orchestrator | 20:09:31.867 STDOUT terraform:  + remote_ip_prefix = "0.0.0.0/0" 2025-03-23 20:09:31.870210 | orchestrator | 20:09:31.867 STDOUT terraform:  + security_group_id = (known after apply) 2025-03-23 20:09:31.870215 | orchestrator | 20:09:31.867 STDOUT terraform:  + tenant_id = (known after apply) 2025-03-23 20:09:31.870220 | orchestrator | 20:09:31.867 STDOUT terraform:  } 2025-03-23 20:09:31.870225 | orchestrator | 20:09:31.867 STDOUT terraform:  # openstack_networking_secgroup_rule_v2.security_group_rule_vrrp will be created 2025-03-23 20:09:31.870235 | orchestrator | 20:09:31.867 STDOUT terraform:  + resource "openstack_networking_secgroup_rule_v2" "security_group_rule_vrrp" { 2025-03-23 20:09:31.870241 | orchestrator | 20:09:31.867 STDOUT terraform:  + description = "vrrp" 2025-03-23 20:09:31.870246 | orchestrator | 20:09:31.867 STDOUT terraform:  + direction = "ingress" 2025-03-23 20:09:31.870251 | orchestrator | 20:09:31.867 STDOUT terraform:  + ethertype = "IPv4" 2025-03-23 20:09:31.870256 | orchestrator | 20:09:31.867 STDOUT terraform:  + id = (known after apply) 2025-03-23 20:09:31.870261 | orchestrator | 20:09:31.867 STDOUT terraform:  + protocol = "112" 2025-03-23 20:09:31.870266 | orchestrator | 20:09:31.867 STDOUT terraform:  + region = (known after apply) 2025-03-23 20:09:31.870276 | orchestrator | 20:09:31.867 STDOUT terraform:  + remote_group_id = (known after apply) 2025-03-23 20:09:31.870281 | orchestrator | 20:09:31.867 STDOUT terraform:  + remote_ip_prefix = "0.0.0.0/0" 2025-03-23 20:09:31.870286 | orchestrator | 20:09:31.867 STDOUT terraform:  + security_group_id = (known after apply) 2025-03-23 20:09:31.870291 | orchestrator | 20:09:31.867 STDOUT terraform:  + tenant_id = (known after apply) 2025-03-23 20:09:31.870296 | orchestrator | 20:09:31.867 STDOUT terraform:  } 2025-03-23 20:09:31.870301 | orchestrator | 20:09:31.867 STDOUT terraform:  # openstack_networking_secgroup_v2.security_group_management will be created 2025-03-23 20:09:31.870306 | orchestrator | 20:09:31.867 STDOUT terraform:  + resource "openstack_networking_secgroup_v2" "security_group_management" { 2025-03-23 20:09:31.870311 | orchestrator | 20:09:31.867 STDOUT terraform:  + all_tags = (known after apply) 2025-03-23 20:09:31.870316 | orchestrator | 20:09:31.867 STDOUT terraform:  + description = "management security group" 2025-03-23 20:09:31.870321 | orchestrator | 20:09:31.867 STDOUT terraform:  + id = (known after apply) 2025-03-23 20:09:31.870326 | orchestrator | 20:09:31.867 STDOUT terraform:  + name = "testbed-management" 2025-03-23 20:09:31.870331 | orchestrator | 20:09:31.867 STDOUT terraform:  + region = (known after apply) 2025-03-23 20:09:31.870336 | orchestrator | 20:09:31.867 STDOUT terraform:  + stateful = (known after apply) 2025-03-23 20:09:31.870343 | orchestrator | 20:09:31.867 STDOUT terraform:  + tenant_id = (known after apply) 2025-03-23 20:09:31.870348 | orchestrator | 20:09:31.867 STDOUT terraform:  } 2025-03-23 20:09:31.870353 | orchestrator | 20:09:31.867 STDOUT terraform:  # openstack_networking_secgroup_v2.security_group_node will be created 2025-03-23 20:09:31.870358 | orchestrator | 20:09:31.867 STDOUT terraform:  + resource "openstack_networking_secgroup_v2" "security_group_node" { 2025-03-23 20:09:31.870363 | orchestrator | 20:09:31.867 STDOUT terraform:  + all_tags = (known after apply) 2025-03-23 20:09:31.870368 | orchestrator | 20:09:31.867 STDOUT terraform:  + description = 
"node security group" 2025-03-23 20:09:31.870373 | orchestrator | 20:09:31.867 STDOUT terraform:  + id = (known after apply) 2025-03-23 20:09:31.870378 | orchestrator | 20:09:31.867 STDOUT terraform:  + name = "testbed-node" 2025-03-23 20:09:31.870383 | orchestrator | 20:09:31.867 STDOUT terraform:  + region = (known after apply) 2025-03-23 20:09:31.870390 | orchestrator | 20:09:31.867 STDOUT terraform:  + stateful = (known after apply) 2025-03-23 20:09:31.870395 | orchestrator | 20:09:31.868 STDOUT terraform:  + tenant_id = (known after apply) 2025-03-23 20:09:31.870400 | orchestrator | 20:09:31.868 STDOUT terraform:  } 2025-03-23 20:09:31.870405 | orchestrator | 20:09:31.868 STDOUT terraform:  # openstack_networking_subnet_v2.subnet_management will be created 2025-03-23 20:09:31.870410 | orchestrator | 20:09:31.868 STDOUT terraform:  + resource "openstack_networking_subnet_v2" "subnet_management" { 2025-03-23 20:09:31.870415 | orchestrator | 20:09:31.868 STDOUT terraform:  + all_tags = (known after apply) 2025-03-23 20:09:31.870420 | orchestrator | 20:09:31.868 STDOUT terraform:  + cidr = "192.168.16.0/20" 2025-03-23 20:09:31.870425 | orchestrator | 20:09:31.868 STDOUT terraform:  + dns_nameservers = [ 2025-03-23 20:09:31.870432 | orchestrator | 20:09:31.868 STDOUT terraform:  + "8.8.8.8", 2025-03-23 20:09:32.052659 | orchestrator | 20:09:31.868 STDOUT terraform:  + "9.9.9.9", 2025-03-23 20:09:32.052721 | orchestrator | 20:09:31.868 STDOUT terraform:  ] 2025-03-23 20:09:32.052729 | orchestrator | 20:09:31.868 STDOUT terraform:  + enable_dhcp = true 2025-03-23 20:09:32.052735 | orchestrator | 20:09:31.868 STDOUT terraform:  + gateway_ip = (known after apply) 2025-03-23 20:09:32.052743 | orchestrator | 20:09:31.868 STDOUT terraform:  + id = (known after apply) 2025-03-23 20:09:32.052748 | orchestrator | 20:09:31.868 STDOUT terraform:  + ip_version = 4 2025-03-23 20:09:32.052754 | orchestrator | 20:09:31.868 STDOUT terraform:  + ipv6_address_mode = (known after apply) 2025-03-23 20:09:32.052760 | orchestrator | 20:09:31.868 STDOUT terraform:  + ipv6_ra_mode = (known after apply) 2025-03-23 20:09:32.052766 | orchestrator | 20:09:31.868 STDOUT terraform:  + name = "subnet-testbed-management" 2025-03-23 20:09:32.052772 | orchestrator | 20:09:31.868 STDOUT terraform:  + network_id = (known after apply) 2025-03-23 20:09:32.052778 | orchestrator | 20:09:31.868 STDOUT terraform:  + no_gateway = false 2025-03-23 20:09:32.052783 | orchestrator | 20:09:31.868 STDOUT terraform:  + region = (known after apply) 2025-03-23 20:09:32.052788 | orchestrator | 20:09:31.868 STDOUT terraform:  + service_types = (known after apply) 2025-03-23 20:09:32.052794 | orchestrator | 20:09:31.868 STDOUT terraform:  + tenant_id = (known after apply) 2025-03-23 20:09:32.052799 | orchestrator | 20:09:31.868 STDOUT terraform:  + allocation_pool { 2025-03-23 20:09:32.052805 | orchestrator | 20:09:31.868 STDOUT terraform:  + end = "192.168.31.250" 2025-03-23 20:09:32.052810 | orchestrator | 20:09:31.868 STDOUT terraform:  + start = "192.168.31.200" 2025-03-23 20:09:32.052816 | orchestrator | 20:09:31.868 STDOUT terraform:  } 2025-03-23 20:09:32.052821 | orchestrator | 20:09:31.868 STDOUT terraform:  } 2025-03-23 20:09:32.052827 | orchestrator | 20:09:31.868 STDOUT terraform:  # terraform_data.image will be created 2025-03-23 20:09:32.052832 | orchestrator | 20:09:31.868 STDOUT terraform:  + resource "terraform_data" "image" { 2025-03-23 20:09:32.052838 | orchestrator | 20:09:31.868 STDOUT terraform:  + id = (known after apply) 
2025-03-23 20:09:32.052852 | orchestrator | 20:09:31.868 STDOUT terraform:  + input = "Ubuntu 24.04" 2025-03-23 20:09:32.052858 | orchestrator | 20:09:31.868 STDOUT terraform:  + output = (known after apply) 2025-03-23 20:09:32.052863 | orchestrator | 20:09:31.868 STDOUT terraform:  } 2025-03-23 20:09:32.052869 | orchestrator | 20:09:31.868 STDOUT terraform:  # terraform_data.image_node will be created 2025-03-23 20:09:32.052874 | orchestrator | 20:09:31.868 STDOUT terraform:  + resource "terraform_data" "image_node" { 2025-03-23 20:09:32.052880 | orchestrator | 20:09:31.868 STDOUT terraform:  + id = (known after apply) 2025-03-23 20:09:32.052885 | orchestrator | 20:09:31.868 STDOUT terraform:  + input = "Ubuntu 24.04" 2025-03-23 20:09:32.052890 | orchestrator | 20:09:31.868 STDOUT terraform:  + output = (known after apply) 2025-03-23 20:09:32.052896 | orchestrator | 20:09:31.868 STDOUT terraform:  } 2025-03-23 20:09:32.052901 | orchestrator | 20:09:31.868 STDOUT terraform: Plan: 82 to add, 0 to change, 0 to destroy. 2025-03-23 20:09:32.052907 | orchestrator | 20:09:31.868 STDOUT terraform: Changes to Outputs: 2025-03-23 20:09:32.052912 | orchestrator | 20:09:31.868 STDOUT terraform:  + manager_address = (sensitive value) 2025-03-23 20:09:32.052918 | orchestrator | 20:09:31.868 STDOUT terraform:  + private_key = (sensitive value) 2025-03-23 20:09:32.052930 | orchestrator | 20:09:32.051 STDOUT terraform: terraform_data.image: Creating... 2025-03-23 20:09:32.067629 | orchestrator | 20:09:32.051 STDOUT terraform: terraform_data.image_node: Creating... 2025-03-23 20:09:32.067685 | orchestrator | 20:09:32.051 STDOUT terraform: terraform_data.image: Creation complete after 0s [id=dc081f2a-0439-8dd6-f908-af4774a64170] 2025-03-23 20:09:32.067693 | orchestrator | 20:09:32.052 STDOUT terraform: terraform_data.image_node: Creation complete after 0s [id=d3208904-b0bd-dd08-d3bc-aba1bae7ef9b] 2025-03-23 20:09:32.067705 | orchestrator | 20:09:32.067 STDOUT terraform: data.openstack_images_image_v2.image: Reading... 2025-03-23 20:09:32.086566 | orchestrator | 20:09:32.086 STDOUT terraform: openstack_blockstorage_volume_v3.node_volume[2]: Creating... 2025-03-23 20:09:32.498540 | orchestrator | 20:09:32.086 STDOUT terraform: openstack_networking_network_v2.net_management: Creating... 2025-03-23 20:09:32.498606 | orchestrator | 20:09:32.086 STDOUT terraform: openstack_blockstorage_volume_v3.node_volume[9]: Creating... 2025-03-23 20:09:32.498623 | orchestrator | 20:09:32.086 STDOUT terraform: openstack_blockstorage_volume_v3.node_volume[5]: Creating... 2025-03-23 20:09:32.498700 | orchestrator | 20:09:32.086 STDOUT terraform: openstack_blockstorage_volume_v3.node_volume[14]: Creating... 2025-03-23 20:09:32.498719 | orchestrator | 20:09:32.086 STDOUT terraform: openstack_blockstorage_volume_v3.node_volume[16]: Creating... 2025-03-23 20:09:32.498726 | orchestrator | 20:09:32.086 STDOUT terraform: openstack_blockstorage_volume_v3.node_volume[10]: Creating... 2025-03-23 20:09:32.498733 | orchestrator | 20:09:32.086 STDOUT terraform: openstack_compute_keypair_v2.key: Creating... 2025-03-23 20:09:32.498741 | orchestrator | 20:09:32.086 STDOUT terraform: openstack_blockstorage_volume_v3.node_volume[1]: Creating... 
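Editor's note: the node_port_management entries in the plan above (per-node fixed IPs 192.168.16.10 through 192.168.16.15 plus one common set of allowed_address_pairs) are consistent with a counted port resource along the lines of the sketch below. The count value and the references to the network, subnet and security group are assumptions; the addresses are taken from the plan output.

# Minimal sketch of a counted management port, assuming the referenced
# network/subnet/security-group resources exist under these names.
resource "openstack_networking_port_v2" "node_port_management" {
  count      = 6
  network_id = openstack_networking_network_v2.net_management.id          # assumed reference

  security_group_ids = [openstack_networking_secgroup_v2.security_group_node.id]  # assumed

  fixed_ip {
    subnet_id  = openstack_networking_subnet_v2.subnet_management.id
    ip_address = cidrhost("192.168.16.0/20", 10 + count.index)            # 192.168.16.10 .. .15
  }

  # The same address pairs appear on every port in the plan output.
  allowed_address_pairs { ip_address = "192.168.112.0/20" }
  allowed_address_pairs { ip_address = "192.168.16.254/20" }
  allowed_address_pairs { ip_address = "192.168.16.8/20" }
  allowed_address_pairs { ip_address = "192.168.16.9/20" }
}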
2025-03-23 20:09:32.498755 | orchestrator | 20:09:32.498 STDOUT terraform: data.openstack_images_image_v2.image: Read complete after 0s [id=cd9ae1ce-c4eb-4380-9087-2aa040df6990] 2025-03-23 20:09:32.506155 | orchestrator | 20:09:32.505 STDOUT terraform: openstack_blockstorage_volume_v3.node_volume[13]: Creating... 2025-03-23 20:09:32.778333 | orchestrator | 20:09:32.778 STDOUT terraform: openstack_compute_keypair_v2.key: Creation complete after 1s [id=testbed] 2025-03-23 20:09:32.786317 | orchestrator | 20:09:32.786 STDOUT terraform: openstack_blockstorage_volume_v3.node_volume[15]: Creating... 2025-03-23 20:09:38.004708 | orchestrator | 20:09:38.004 STDOUT terraform: openstack_networking_network_v2.net_management: Creation complete after 6s [id=8f34503c-207e-4751-a047-995337789365] 2025-03-23 20:09:38.011501 | orchestrator | 20:09:38.011 STDOUT terraform: openstack_blockstorage_volume_v3.node_volume[4]: Creating... 2025-03-23 20:09:42.084874 | orchestrator | 20:09:42.084 STDOUT terraform: openstack_blockstorage_volume_v3.node_volume[2]: Still creating... [10s elapsed] 2025-03-23 20:09:42.084993 | orchestrator | 20:09:42.084 STDOUT terraform: openstack_blockstorage_volume_v3.node_volume[9]: Still creating... [10s elapsed] 2025-03-23 20:09:42.085077 | orchestrator | 20:09:42.084 STDOUT terraform: openstack_blockstorage_volume_v3.node_volume[5]: Still creating... [10s elapsed] 2025-03-23 20:09:42.085755 | orchestrator | 20:09:42.085 STDOUT terraform: openstack_blockstorage_volume_v3.node_volume[16]: Still creating... [10s elapsed] 2025-03-23 20:09:42.085879 | orchestrator | 20:09:42.085 STDOUT terraform: openstack_blockstorage_volume_v3.node_volume[10]: Still creating... [10s elapsed] 2025-03-23 20:09:42.086072 | orchestrator | 20:09:42.085 STDOUT terraform: openstack_blockstorage_volume_v3.node_volume[14]: Still creating... [10s elapsed] 2025-03-23 20:09:42.086890 | orchestrator | 20:09:42.086 STDOUT terraform: openstack_blockstorage_volume_v3.node_volume[1]: Still creating... [10s elapsed] 2025-03-23 20:09:42.506675 | orchestrator | 20:09:42.506 STDOUT terraform: openstack_blockstorage_volume_v3.node_volume[13]: Still creating... [10s elapsed] 2025-03-23 20:09:42.685896 | orchestrator | 20:09:42.685 STDOUT terraform: openstack_blockstorage_volume_v3.node_volume[1]: Creation complete after 11s [id=ec0eaa49-f31e-4f34-9bf5-00a65ad435ca] 2025-03-23 20:09:42.689270 | orchestrator | 20:09:42.688 STDOUT terraform: openstack_blockstorage_volume_v3.node_volume[5]: Creation complete after 11s [id=1aae855b-0878-455b-beee-c51e17b854da] 2025-03-23 20:09:42.699508 | orchestrator | 20:09:42.699 STDOUT terraform: openstack_blockstorage_volume_v3.node_volume[3]: Creating... 2025-03-23 20:09:42.705175 | orchestrator | 20:09:42.704 STDOUT terraform: openstack_blockstorage_volume_v3.node_volume[0]: Creating... 2025-03-23 20:09:42.714009 | orchestrator | 20:09:42.713 STDOUT terraform: openstack_blockstorage_volume_v3.node_volume[16]: Creation complete after 11s [id=f00ea6e7-d866-4241-9ba8-a6f4135a6384] 2025-03-23 20:09:42.720424 | orchestrator | 20:09:42.720 STDOUT terraform: openstack_blockstorage_volume_v3.node_volume[14]: Creation complete after 11s [id=1e600cae-effb-4d1d-924b-3d561c8b4c34] 2025-03-23 20:09:42.721870 | orchestrator | 20:09:42.721 STDOUT terraform: openstack_blockstorage_volume_v3.node_volume[6]: Creating... 
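Editor's note: the terraform_data.image / terraform_data.image_node resources and the data.openstack_images_image_v2 lookups above suggest that the image name ("Ubuntu 24.04", from the plan output) is pinned and then resolved to a Glance image roughly as sketched here; the most_recent flag and the exact data source arguments are assumptions.

# Pin the image name, then resolve it to an image ID via a data source.
resource "terraform_data" "image" {
  input = "Ubuntu 24.04"                        # value taken from the plan output
}

data "openstack_images_image_v2" "image" {
  name        = terraform_data.image.output     # resolves the pinned name to an image ID
  most_recent = true                            # assumption
}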
2025-03-23 20:09:42.722069 | orchestrator | 20:09:42.721 STDOUT terraform: openstack_blockstorage_volume_v3.node_volume[2]: Creation complete after 11s [id=81262b7a-e68a-468c-9a1b-ebbf4a5ddecf] 2025-03-23 20:09:42.726143 | orchestrator | 20:09:42.725 STDOUT terraform: data.openstack_images_image_v2.image_node: Reading... 2025-03-23 20:09:42.728968 | orchestrator | 20:09:42.728 STDOUT terraform: openstack_blockstorage_volume_v3.node_volume[12]: Creating... 2025-03-23 20:09:42.747901 | orchestrator | 20:09:42.747 STDOUT terraform: openstack_blockstorage_volume_v3.node_volume[9]: Creation complete after 11s [id=ed5da3c7-1374-4bfc-b341-605cae6b6ed5] 2025-03-23 20:09:42.751877 | orchestrator | 20:09:42.751 STDOUT terraform: openstack_blockstorage_volume_v3.node_volume[7]: Creating... 2025-03-23 20:09:42.765979 | orchestrator | 20:09:42.765 STDOUT terraform: openstack_blockstorage_volume_v3.node_volume[10]: Creation complete after 11s [id=ed33c331-9d6a-419f-a2ff-c44c346902af] 2025-03-23 20:09:42.774106 | orchestrator | 20:09:42.773 STDOUT terraform: openstack_blockstorage_volume_v3.node_volume[11]: Creating... 2025-03-23 20:09:42.777613 | orchestrator | 20:09:42.777 STDOUT terraform: data.openstack_images_image_v2.image_node: Read complete after 0s [id=cd9ae1ce-c4eb-4380-9087-2aa040df6990] 2025-03-23 20:09:42.777730 | orchestrator | 20:09:42.777 STDOUT terraform: openstack_blockstorage_volume_v3.node_volume[13]: Creation complete after 10s [id=e582b8ba-25ef-45a1-82f9-427bde16f6b4] 2025-03-23 20:09:42.781918 | orchestrator | 20:09:42.781 STDOUT terraform: openstack_blockstorage_volume_v3.node_volume[8]: Creating... 2025-03-23 20:09:42.782621 | orchestrator | 20:09:42.782 STDOUT terraform: openstack_blockstorage_volume_v3.node_volume[17]: Creating... 2025-03-23 20:09:42.787524 | orchestrator | 20:09:42.787 STDOUT terraform: openstack_blockstorage_volume_v3.node_volume[15]: Still creating... [10s elapsed] 2025-03-23 20:09:42.974330 | orchestrator | 20:09:42.973 STDOUT terraform: openstack_blockstorage_volume_v3.node_volume[15]: Creation complete after 10s [id=ce372efe-ce53-40a7-9ab4-8764278391af] 2025-03-23 20:09:42.987244 | orchestrator | 20:09:42.987 STDOUT terraform: openstack_blockstorage_volume_v3.manager_base_volume[0]: Creating... 2025-03-23 20:09:48.014208 | orchestrator | 20:09:48.013 STDOUT terraform: openstack_blockstorage_volume_v3.node_volume[4]: Still creating... [10s elapsed] 2025-03-23 20:09:48.182195 | orchestrator | 20:09:48.181 STDOUT terraform: openstack_blockstorage_volume_v3.node_volume[4]: Creation complete after 10s [id=50c8fc43-f5ea-4ff6-9b1c-3a82fc0b92db] 2025-03-23 20:09:48.202104 | orchestrator | 20:09:48.201 STDOUT terraform: local_file.id_rsa_pub: Creating... 2025-03-23 20:09:48.207487 | orchestrator | 20:09:48.207 STDOUT terraform: local_file.id_rsa_pub: Creation complete after 0s [id=6403bec7c18433608a6a45710d0b8664269a4098] 2025-03-23 20:09:48.215997 | orchestrator | 20:09:48.215 STDOUT terraform: local_sensitive_file.id_rsa: Creating... 2025-03-23 20:09:48.222972 | orchestrator | 20:09:48.222 STDOUT terraform: local_sensitive_file.id_rsa: Creation complete after 0s [id=8a2fb62564e3c5ed29cdd5919e8fd44e45ff95c2] 2025-03-23 20:09:48.230774 | orchestrator | 20:09:48.230 STDOUT terraform: openstack_networking_subnet_v2.subnet_management: Creating... 2025-03-23 20:09:52.701130 | orchestrator | 20:09:52.700 STDOUT terraform: openstack_blockstorage_volume_v3.node_volume[3]: Still creating... 
[10s elapsed] 2025-03-23 20:09:52.706243 | orchestrator | 20:09:52.705 STDOUT terraform: openstack_blockstorage_volume_v3.node_volume[0]: Still creating... [10s elapsed] 2025-03-23 20:09:52.723423 | orchestrator | 20:09:52.723 STDOUT terraform: openstack_blockstorage_volume_v3.node_volume[6]: Still creating... [10s elapsed] 2025-03-23 20:09:52.729804 | orchestrator | 20:09:52.729 STDOUT terraform: openstack_blockstorage_volume_v3.node_volume[12]: Still creating... [10s elapsed] 2025-03-23 20:09:52.753057 | orchestrator | 20:09:52.752 STDOUT terraform: openstack_blockstorage_volume_v3.node_volume[7]: Still creating... [10s elapsed] 2025-03-23 20:09:52.774487 | orchestrator | 20:09:52.774 STDOUT terraform: openstack_blockstorage_volume_v3.node_volume[11]: Still creating... [10s elapsed] 2025-03-23 20:09:52.782669 | orchestrator | 20:09:52.782 STDOUT terraform: openstack_blockstorage_volume_v3.node_volume[8]: Still creating... [10s elapsed] 2025-03-23 20:09:52.783730 | orchestrator | 20:09:52.783 STDOUT terraform: openstack_blockstorage_volume_v3.node_volume[17]: Still creating... [10s elapsed] 2025-03-23 20:09:52.889674 | orchestrator | 20:09:52.889 STDOUT terraform: openstack_blockstorage_volume_v3.node_volume[0]: Creation complete after 10s [id=67c0bee0-d01a-40a6-b42a-87f59926aeb7] 2025-03-23 20:09:52.904551 | orchestrator | 20:09:52.904 STDOUT terraform: openstack_blockstorage_volume_v3.node_base_volume[1]: Creating... 2025-03-23 20:09:52.916261 | orchestrator | 20:09:52.915 STDOUT terraform: openstack_blockstorage_volume_v3.node_volume[3]: Creation complete after 10s [id=456a620e-4d8a-4da0-8fad-68a9dae98a07] 2025-03-23 20:09:52.924954 | orchestrator | 20:09:52.924 STDOUT terraform: openstack_blockstorage_volume_v3.node_base_volume[5]: Creating... 2025-03-23 20:09:52.940211 | orchestrator | 20:09:52.939 STDOUT terraform: openstack_blockstorage_volume_v3.node_volume[6]: Creation complete after 10s [id=cf3f799f-1003-4b60-bca6-f0cfc6b51825] 2025-03-23 20:09:52.949323 | orchestrator | 20:09:52.949 STDOUT terraform: openstack_blockstorage_volume_v3.node_base_volume[0]: Creating... 2025-03-23 20:09:52.954271 | orchestrator | 20:09:52.953 STDOUT terraform: openstack_blockstorage_volume_v3.node_volume[12]: Creation complete after 10s [id=d3439608-91a7-4b46-9305-8cd72b648eb4] 2025-03-23 20:09:52.959805 | orchestrator | 20:09:52.959 STDOUT terraform: openstack_blockstorage_volume_v3.node_base_volume[4]: Creating... 2025-03-23 20:09:52.983404 | orchestrator | 20:09:52.983 STDOUT terraform: openstack_blockstorage_volume_v3.node_volume[7]: Creation complete after 10s [id=6c9a484d-d13d-480e-b0c2-a5d296d667b7] 2025-03-23 20:09:52.987958 | orchestrator | 20:09:52.987 STDOUT terraform: openstack_blockstorage_volume_v3.manager_base_volume[0]: Still creating... [10s elapsed] 2025-03-23 20:09:52.988219 | orchestrator | 20:09:52.988 STDOUT terraform: openstack_blockstorage_volume_v3.node_base_volume[3]: Creating... 2025-03-23 20:09:53.003539 | orchestrator | 20:09:53.002 STDOUT terraform: openstack_blockstorage_volume_v3.node_volume[11]: Creation complete after 10s [id=558eae64-1e92-4eba-9b7a-c9f2592aca3c] 2025-03-23 20:09:53.008019 | orchestrator | 20:09:53.007 STDOUT terraform: openstack_blockstorage_volume_v3.node_base_volume[2]: Creating... 
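Editor's note: the apply output shows 18 plain node volumes plus a set of per-node base volumes (and one manager base volume) being created. A minimal sketch follows; only the resource names and counts follow the log, while the volume names, sizes, and the use of the resolved node image as a bootable source are assumptions.

# Data volumes for the nodes; name and size are assumptions.
resource "openstack_blockstorage_volume_v3" "node_volume" {
  count = 18
  name  = "testbed-node-volume-${count.index}"
  size  = 20
}

# Per-node base volumes, presumably created from the resolved node image
# (the base volumes start creating right after the image_node lookup above).
resource "openstack_blockstorage_volume_v3" "node_base_volume" {
  count    = 6
  name     = "testbed-node-base-${count.index}"               # assumption
  size     = 50                                               # assumption
  image_id = data.openstack_images_image_v2.image_node.id
}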
2025-03-23 20:09:53.021689 | orchestrator | 20:09:53.021 STDOUT terraform: openstack_blockstorage_volume_v3.node_volume[8]: Creation complete after 10s [id=5da7aba9-9f2d-45cd-9daa-a6f3147ca79c] 2025-03-23 20:09:53.041038 | orchestrator | 20:09:53.040 STDOUT terraform: openstack_blockstorage_volume_v3.node_volume[17]: Creation complete after 10s [id=4ae64758-af7a-4ee3-835e-8ab2b9979c52] 2025-03-23 20:09:53.325009 | orchestrator | 20:09:53.324 STDOUT terraform: openstack_blockstorage_volume_v3.manager_base_volume[0]: Creation complete after 10s [id=fefced78-cb99-42c9-887b-5fe6ec071bfd] 2025-03-23 20:09:53.904887 | orchestrator | 20:09:53.904 STDOUT terraform: openstack_networking_subnet_v2.subnet_management: Creation complete after 6s [id=5462e082-06f6-4a6b-b0a2-738477585053] 2025-03-23 20:09:53.912803 | orchestrator | 20:09:53.912 STDOUT terraform: openstack_networking_router_v2.router: Creating... 2025-03-23 20:10:01.309703 | orchestrator | 20:10:01.309 STDOUT terraform: openstack_networking_router_v2.router: Creation complete after 7s [id=c0628f0a-5401-4500-8eb3-19645794787e] 2025-03-23 20:10:01.315414 | orchestrator | 20:10:01.315 STDOUT terraform: openstack_networking_router_interface_v2.router_interface: Creating... 2025-03-23 20:10:01.318179 | orchestrator | 20:10:01.317 STDOUT terraform: openstack_networking_secgroup_v2.security_group_management: Creating... 2025-03-23 20:10:01.318944 | orchestrator | 20:10:01.318 STDOUT terraform: openstack_networking_secgroup_v2.security_group_node: Creating... 2025-03-23 20:10:01.456340 | orchestrator | 20:10:01.455 STDOUT terraform: openstack_networking_secgroup_v2.security_group_node: Creation complete after 0s [id=78aa9c60-b846-4088-a7e1-85775093f286] 2025-03-23 20:10:01.468421 | orchestrator | 20:10:01.468 STDOUT terraform: openstack_networking_secgroup_rule_v2.security_group_node_rule2: Creating... 2025-03-23 20:10:01.472801 | orchestrator | 20:10:01.472 STDOUT terraform: openstack_networking_secgroup_rule_v2.security_group_node_rule1: Creating... 2025-03-23 20:10:01.508331 | orchestrator | 20:10:01.508 STDOUT terraform: openstack_networking_secgroup_v2.security_group_management: Creation complete after 1s [id=40e02c7d-dfa4-4096-a19d-f2f5aeeeaf9b] 2025-03-23 20:10:01.520546 | orchestrator | 20:10:01.520 STDOUT terraform: openstack_networking_port_v2.node_port_management[5]: Creating... 2025-03-23 20:10:01.575083 | orchestrator | 20:10:01.574 STDOUT terraform: openstack_networking_secgroup_rule_v2.security_group_node_rule2: Creation complete after 1s [id=a1c2d5c6-0bee-4048-a672-cfea68168ebd] 2025-03-23 20:10:01.585048 | orchestrator | 20:10:01.584 STDOUT terraform: openstack_networking_port_v2.node_port_management[4]: Creating... 2025-03-23 20:10:01.674132 | orchestrator | 20:10:01.673 STDOUT terraform: openstack_networking_secgroup_rule_v2.security_group_node_rule1: Creation complete after 1s [id=ac167388-4a26-4a3b-b0f6-01cb79e576ae] 2025-03-23 20:10:01.688966 | orchestrator | 20:10:01.688 STDOUT terraform: openstack_networking_port_v2.node_port_management[0]: Creating... 2025-03-23 20:10:02.905679 | orchestrator | 20:10:02.905 STDOUT terraform: openstack_blockstorage_volume_v3.node_base_volume[1]: Still creating... [10s elapsed] 2025-03-23 20:10:02.926425 | orchestrator | 20:10:02.926 STDOUT terraform: openstack_blockstorage_volume_v3.node_base_volume[5]: Still creating... [10s elapsed] 2025-03-23 20:10:02.949797 | orchestrator | 20:10:02.949 STDOUT terraform: openstack_blockstorage_volume_v3.node_base_volume[0]: Still creating... 
[10s elapsed] 2025-03-23 20:10:02.961000 | orchestrator | 20:10:02.960 STDOUT terraform: openstack_blockstorage_volume_v3.node_base_volume[4]: Still creating... [10s elapsed] 2025-03-23 20:10:02.989707 | orchestrator | 20:10:02.989 STDOUT terraform: openstack_blockstorage_volume_v3.node_base_volume[3]: Still creating... [10s elapsed] 2025-03-23 20:10:03.009091 | orchestrator | 20:10:03.008 STDOUT terraform: openstack_blockstorage_volume_v3.node_base_volume[2]: Still creating... [10s elapsed] 2025-03-23 20:10:03.301231 | orchestrator | 20:10:03.300 STDOUT terraform: openstack_blockstorage_volume_v3.node_base_volume[1]: Creation complete after 10s [id=b962c656-6d3d-42c0-b306-ec5071fd7293] 2025-03-23 20:10:03.319515 | orchestrator | 20:10:03.319 STDOUT terraform: openstack_networking_port_v2.node_port_management[2]: Creating... 2025-03-23 20:10:03.322435 | orchestrator | 20:10:03.322 STDOUT terraform: openstack_blockstorage_volume_v3.node_base_volume[5]: Creation complete after 10s [id=2c3245a0-bb3b-48a6-962d-1b5b9b49262d] 2025-03-23 20:10:03.326228 | orchestrator | 20:10:03.326 STDOUT terraform: openstack_networking_secgroup_rule_v2.security_group_rule_vrrp: Creating... 2025-03-23 20:10:03.348048 | orchestrator | 20:10:03.347 STDOUT terraform: openstack_blockstorage_volume_v3.node_base_volume[4]: Creation complete after 10s [id=4f3230c7-420c-46b1-a455-5c4b6f068dba] 2025-03-23 20:10:03.361158 | orchestrator | 20:10:03.360 STDOUT terraform: openstack_blockstorage_volume_v3.node_base_volume[0]: Creation complete after 10s [id=5649f73d-2968-4ac7-8b60-0690f5ee0e4e] 2025-03-23 20:10:03.361520 | orchestrator | 20:10:03.361 STDOUT terraform: openstack_networking_port_v2.node_port_management[3]: Creating... 2025-03-23 20:10:03.365199 | orchestrator | 20:10:03.365 STDOUT terraform: openstack_blockstorage_volume_v3.node_base_volume[3]: Creation complete after 10s [id=dcc46cc2-9048-4c81-bc2f-465e491970df] 2025-03-23 20:10:03.371120 | orchestrator | 20:10:03.370 STDOUT terraform: openstack_networking_secgroup_rule_v2.security_group_node_rule3: Creating... 2025-03-23 20:10:03.373281 | orchestrator | 20:10:03.373 STDOUT terraform: openstack_networking_port_v2.node_port_management[1]: Creating... 2025-03-23 20:10:03.429057 | orchestrator | 20:10:03.428 STDOUT terraform: openstack_blockstorage_volume_v3.node_base_volume[2]: Creation complete after 10s [id=06e70b21-3e45-436c-9069-d19a0fc66a54] 2025-03-23 20:10:03.437762 | orchestrator | 20:10:03.437 STDOUT terraform: openstack_networking_secgroup_rule_v2.security_group_rule_vrrp: Creation complete after 0s [id=3cb59a97-65fe-4c39-b833-f11962a57368] 2025-03-23 20:10:03.444636 | orchestrator | 20:10:03.444 STDOUT terraform: openstack_networking_secgroup_rule_v2.security_group_management_rule2: Creating... 2025-03-23 20:10:03.449924 | orchestrator | 20:10:03.449 STDOUT terraform: openstack_networking_secgroup_rule_v2.security_group_management_rule3: Creating... 2025-03-23 20:10:03.559369 | orchestrator | 20:10:03.559 STDOUT terraform: openstack_networking_secgroup_rule_v2.security_group_node_rule3: Creation complete after 1s [id=fe613d6e-f4e4-41ba-82b9-48ff06a02d4a] 2025-03-23 20:10:03.566509 | orchestrator | 20:10:03.566 STDOUT terraform: openstack_networking_secgroup_rule_v2.security_group_management_rule4: Creating... 
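Editor's note: the security group rules being created here match the plan entries shown earlier (ssh on tcp/22, wireguard on udp/51820, tcp and udp from 192.168.16.0/20, icmp, and VRRP as IP protocol 112). Two of them written out as configuration follow; the attribute values come from the plan output, while the group each rule attaches to is an assumption.

resource "openstack_networking_secgroup_v2" "security_group_management" {
  name        = "testbed-management"
  description = "management security group"
}

# ssh from anywhere, as in security_group_management_rule1 above.
resource "openstack_networking_secgroup_rule_v2" "security_group_management_rule1" {
  description       = "ssh"
  direction         = "ingress"
  ethertype         = "IPv4"
  protocol          = "tcp"
  port_range_min    = 22
  port_range_max    = 22
  remote_ip_prefix  = "0.0.0.0/0"
  security_group_id = openstack_networking_secgroup_v2.security_group_management.id
}

# VRRP by IP protocol number, as in security_group_rule_vrrp above.
resource "openstack_networking_secgroup_rule_v2" "security_group_rule_vrrp" {
  description       = "vrrp"
  direction         = "ingress"
  ethertype         = "IPv4"
  protocol          = "112"
  remote_ip_prefix  = "0.0.0.0/0"
  security_group_id = openstack_networking_secgroup_v2.security_group_management.id  # target group is an assumption
}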
2025-03-23 20:10:04.403211 | orchestrator | 20:10:04.402 STDOUT terraform: openstack_networking_secgroup_rule_v2.security_group_management_rule2: Creation complete after 1s [id=4141b803-d74c-4a7b-8ab1-e8d36cb3af8c] 2025-03-23 20:10:04.410913 | orchestrator | 20:10:04.410 STDOUT terraform: openstack_networking_secgroup_rule_v2.security_group_management_rule5: Creating... 2025-03-23 20:10:04.515650 | orchestrator | 20:10:04.515 STDOUT terraform: openstack_networking_secgroup_rule_v2.security_group_management_rule3: Creation complete after 2s [id=8106a58f-fb32-4f7c-9da1-dbf8bac13648] 2025-03-23 20:10:04.521146 | orchestrator | 20:10:04.520 STDOUT terraform: openstack_networking_secgroup_rule_v2.security_group_management_rule1: Creating... 2025-03-23 20:10:04.624204 | orchestrator | 20:10:04.623 STDOUT terraform: openstack_networking_secgroup_rule_v2.security_group_management_rule4: Creation complete after 1s [id=958cae54-1100-4885-b5a6-420a45763c84] 2025-03-23 20:10:04.637708 | orchestrator | 20:10:04.637 STDOUT terraform: openstack_networking_port_v2.manager_port_management: Creating... 2025-03-23 20:10:04.731065 | orchestrator | 20:10:04.730 STDOUT terraform: openstack_networking_secgroup_rule_v2.security_group_management_rule5: Creation complete after 1s [id=084ba02e-47af-463c-9539-6cabc870a880] 2025-03-23 20:10:04.830617 | orchestrator | 20:10:04.830 STDOUT terraform: openstack_networking_secgroup_rule_v2.security_group_management_rule1: Creation complete after 0s [id=3bf7df80-d092-4cd9-885b-ac7891cdcd4a] 2025-03-23 20:10:07.297875 | orchestrator | 20:10:07.297 STDOUT terraform: openstack_networking_port_v2.node_port_management[5]: Creation complete after 5s [id=9cbb8f06-6f3c-4982-89ba-6c1f49243bb7] 2025-03-23 20:10:07.369857 | orchestrator | 20:10:07.369 STDOUT terraform: openstack_networking_port_v2.node_port_management[4]: Creation complete after 5s [id=41c27fe4-dee6-403b-9d84-7c9a1140fe25] 2025-03-23 20:10:07.705847 | orchestrator | 20:10:07.705 STDOUT terraform: openstack_networking_port_v2.node_port_management[0]: Creation complete after 6s [id=0342b2ff-3137-484e-b076-3972428420ea] 2025-03-23 20:10:08.025306 | orchestrator | 20:10:08.024 STDOUT terraform: openstack_networking_router_interface_v2.router_interface: Creation complete after 7s [id=20698692-beae-4d7d-aa1a-ff722d7bec21] 2025-03-23 20:10:08.032616 | orchestrator | 20:10:08.032 STDOUT terraform: openstack_networking_floatingip_v2.manager_floating_ip: Creating... 2025-03-23 20:10:08.757188 | orchestrator | 20:10:08.756 STDOUT terraform: openstack_networking_port_v2.node_port_management[2]: Creation complete after 6s [id=7abb6e45-c7c7-4ee1-b3e9-af7be1538613] 2025-03-23 20:10:09.650864 | orchestrator | 20:10:09.650 STDOUT terraform: openstack_networking_port_v2.node_port_management[1]: Creation complete after 7s [id=e61436fa-3f64-4b99-a28e-47af35fdbc09] 2025-03-23 20:10:09.779272 | orchestrator | 20:10:09.778 STDOUT terraform: openstack_networking_port_v2.node_port_management[3]: Creation complete after 7s [id=bde1bfc6-e8bf-4a2c-a7f6-d75d46467f8b] 2025-03-23 20:10:09.806163 | orchestrator | 20:10:09.805 STDOUT terraform: openstack_compute_instance_v2.node_server[5]: Creating... 2025-03-23 20:10:09.816375 | orchestrator | 20:10:09.816 STDOUT terraform: openstack_compute_instance_v2.node_server[0]: Creating... 2025-03-23 20:10:09.817605 | orchestrator | 20:10:09.817 STDOUT terraform: openstack_compute_instance_v2.node_server[4]: Creating... 
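Editor's note: the subnet, router, router interface and manager floating IP created in this stretch of the apply correspond to configuration roughly like the following. The CIDR, DNS servers, allocation pool, router name, availability zone hint and external network ID are taken from the plan output; the floating IP pool name and the cross-resource references are assumptions.

resource "openstack_networking_subnet_v2" "subnet_management" {
  name            = "subnet-testbed-management"
  network_id      = openstack_networking_network_v2.net_management.id
  cidr            = "192.168.16.0/20"
  ip_version      = 4
  enable_dhcp     = true
  dns_nameservers = ["8.8.8.8", "9.9.9.9"]

  allocation_pool {
    start = "192.168.31.200"
    end   = "192.168.31.250"
  }
}

resource "openstack_networking_router_v2" "router" {
  name                    = "testbed"
  external_network_id     = "e6be7364-bfd8-4de7-8120-8f41c69a139a"
  availability_zone_hints = ["nova"]
}

resource "openstack_networking_router_interface_v2" "router_interface" {
  router_id = openstack_networking_router_v2.router.id
  subnet_id = openstack_networking_subnet_v2.subnet_management.id
}

resource "openstack_networking_floatingip_v2" "manager_floating_ip" {
  pool = "external"   # pool name is an assumption
}

resource "openstack_networking_floatingip_associate_v2" "manager_floating_ip_association" {
  floating_ip = openstack_networking_floatingip_v2.manager_floating_ip.address
  port_id     = openstack_networking_port_v2.manager_port_management.id
}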
2025-03-23 20:10:09.825030 | orchestrator | 20:10:09.822 STDOUT terraform: openstack_compute_instance_v2.node_server[1]: Creating... 2025-03-23 20:10:09.827422 | orchestrator | 20:10:09.822 STDOUT terraform: openstack_compute_instance_v2.node_server[2]: Creating... 2025-03-23 20:10:09.827503 | orchestrator | 20:10:09.827 STDOUT terraform: openstack_compute_instance_v2.node_server[3]: Creating... 2025-03-23 20:10:10.091322 | orchestrator | 20:10:10.090 STDOUT terraform: openstack_networking_port_v2.manager_port_management: Creation complete after 5s [id=917c662b-ba88-4d93-8d2d-3b4e084a7faa] 2025-03-23 20:10:14.198606 | orchestrator | 20:10:14.198 STDOUT terraform: openstack_networking_floatingip_v2.manager_floating_ip: Creation complete after 6s [id=47a23ef3-7b2b-4970-bf9c-5cd9c0a77ea6] 2025-03-23 20:10:14.214823 | orchestrator | 20:10:14.214 STDOUT terraform: openstack_networking_floatingip_associate_v2.manager_floating_ip_association: Creating... 2025-03-23 20:10:14.224162 | orchestrator | 20:10:14.224 STDOUT terraform: local_file.inventory: Creating... 2025-03-23 20:10:14.224774 | orchestrator | 20:10:14.224 STDOUT terraform: local_file.MANAGER_ADDRESS: Creating... 2025-03-23 20:10:14.228890 | orchestrator | 20:10:14.228 STDOUT terraform: local_file.inventory: Creation complete after 0s [id=aaed80ef2b5d5688d0f0858898ad4a667438f933] 2025-03-23 20:10:14.232277 | orchestrator | 20:10:14.232 STDOUT terraform: local_file.MANAGER_ADDRESS: Creation complete after 0s [id=9730358b115aaa95423789dd0812b9cedda0ce93] 2025-03-23 20:10:14.714187 | orchestrator | 20:10:14.713 STDOUT terraform: openstack_networking_floatingip_associate_v2.manager_floating_ip_association: Creation complete after 1s [id=47a23ef3-7b2b-4970-bf9c-5cd9c0a77ea6] 2025-03-23 20:10:19.808093 | orchestrator | 20:10:19.807 STDOUT terraform: openstack_compute_instance_v2.node_server[5]: Still creating... [10s elapsed] 2025-03-23 20:10:19.817042 | orchestrator | 20:10:19.816 STDOUT terraform: openstack_compute_instance_v2.node_server[0]: Still creating... [10s elapsed] 2025-03-23 20:10:19.817977 | orchestrator | 20:10:19.817 STDOUT terraform: openstack_compute_instance_v2.node_server[4]: Still creating... [10s elapsed] 2025-03-23 20:10:19.823236 | orchestrator | 20:10:19.823 STDOUT terraform: openstack_compute_instance_v2.node_server[1]: Still creating... [10s elapsed] 2025-03-23 20:10:19.825388 | orchestrator | 20:10:19.825 STDOUT terraform: openstack_compute_instance_v2.node_server[2]: Still creating... [10s elapsed] 2025-03-23 20:10:19.828625 | orchestrator | 20:10:19.828 STDOUT terraform: openstack_compute_instance_v2.node_server[3]: Still creating... [10s elapsed] 2025-03-23 20:10:29.808512 | orchestrator | 20:10:29.808 STDOUT terraform: openstack_compute_instance_v2.node_server[5]: Still creating... [20s elapsed] 2025-03-23 20:10:29.817145 | orchestrator | 20:10:29.816 STDOUT terraform: openstack_compute_instance_v2.node_server[0]: Still creating... [20s elapsed] 2025-03-23 20:10:29.818224 | orchestrator | 20:10:29.818 STDOUT terraform: openstack_compute_instance_v2.node_server[4]: Still creating... [20s elapsed] 2025-03-23 20:10:29.824415 | orchestrator | 20:10:29.824 STDOUT terraform: openstack_compute_instance_v2.node_server[1]: Still creating... [20s elapsed] 2025-03-23 20:10:29.825565 | orchestrator | 20:10:29.825 STDOUT terraform: openstack_compute_instance_v2.node_server[2]: Still creating... 
[20s elapsed] 2025-03-23 20:10:29.829807 | orchestrator | 20:10:29.829 STDOUT terraform: openstack_compute_instance_v2.node_server[3]: Still creating... [20s elapsed] 2025-03-23 20:10:30.130291 | orchestrator | 20:10:30.129 STDOUT terraform: openstack_compute_instance_v2.node_server[3]: Creation complete after 20s [id=d3008f02-d822-4b5d-80d6-ffadd5c3da90] 2025-03-23 20:10:30.194452 | orchestrator | 20:10:30.194 STDOUT terraform: openstack_compute_instance_v2.node_server[5]: Creation complete after 20s [id=f7dd0949-9f90-43c1-ba19-8557b9c871d0] 2025-03-23 20:10:39.817481 | orchestrator | 20:10:39.817 STDOUT terraform: openstack_compute_instance_v2.node_server[0]: Still creating... [30s elapsed] 2025-03-23 20:10:39.818528 | orchestrator | 20:10:39.818 STDOUT terraform: openstack_compute_instance_v2.node_server[4]: Still creating... [30s elapsed] 2025-03-23 20:10:39.824572 | orchestrator | 20:10:39.824 STDOUT terraform: openstack_compute_instance_v2.node_server[1]: Still creating... [30s elapsed] 2025-03-23 20:10:39.825786 | orchestrator | 20:10:39.825 STDOUT terraform: openstack_compute_instance_v2.node_server[2]: Still creating... [30s elapsed] 2025-03-23 20:10:40.329418 | orchestrator | 20:10:40.329 STDOUT terraform: openstack_compute_instance_v2.node_server[2]: Creation complete after 30s [id=dd80c1de-2565-4bbd-947c-aebbfe4165fe] 2025-03-23 20:10:40.418991 | orchestrator | 20:10:40.418 STDOUT terraform: openstack_compute_instance_v2.node_server[4]: Creation complete after 30s [id=b89d3d8b-9a42-4e19-9420-9c8123f02d92] 2025-03-23 20:10:40.475600 | orchestrator | 20:10:40.475 STDOUT terraform: openstack_compute_instance_v2.node_server[0]: Creation complete after 30s [id=574f12c9-a455-477c-8d9a-5e8ecf9c9dc9] 2025-03-23 20:10:40.546600 | orchestrator | 20:10:40.546 STDOUT terraform: openstack_compute_instance_v2.node_server[1]: Creation complete after 31s [id=52dde698-1f4d-4b83-a215-3c25513ba7f7] 2025-03-23 20:10:40.575035 | orchestrator | 20:10:40.574 STDOUT terraform: openstack_compute_volume_attach_v2.node_volume_attachment[3]: Creating... 2025-03-23 20:10:40.576217 | orchestrator | 20:10:40.576 STDOUT terraform: openstack_compute_volume_attach_v2.node_volume_attachment[16]: Creating... 2025-03-23 20:10:40.578375 | orchestrator | 20:10:40.578 STDOUT terraform: openstack_compute_volume_attach_v2.node_volume_attachment[15]: Creating... 2025-03-23 20:10:40.579859 | orchestrator | 20:10:40.579 STDOUT terraform: openstack_compute_volume_attach_v2.node_volume_attachment[11]: Creating... 2025-03-23 20:10:40.580814 | orchestrator | 20:10:40.580 STDOUT terraform: openstack_compute_volume_attach_v2.node_volume_attachment[6]: Creating... 2025-03-23 20:10:40.591155 | orchestrator | 20:10:40.591 STDOUT terraform: openstack_compute_volume_attach_v2.node_volume_attachment[7]: Creating... 2025-03-23 20:10:40.592281 | orchestrator | 20:10:40.592 STDOUT terraform: openstack_compute_volume_attach_v2.node_volume_attachment[0]: Creating... 2025-03-23 20:10:40.592350 | orchestrator | 20:10:40.592 STDOUT terraform: null_resource.node_semaphore: Creating... 2025-03-23 20:10:40.595449 | orchestrator | 20:10:40.595 STDOUT terraform: null_resource.node_semaphore: Creation complete after 0s [id=6445320592382265343] 2025-03-23 20:10:40.603008 | orchestrator | 20:10:40.602 STDOUT terraform: openstack_compute_volume_attach_v2.node_volume_attachment[1]: Creating... 2025-03-23 20:10:40.603508 | orchestrator | 20:10:40.603 STDOUT terraform: openstack_compute_volume_attach_v2.node_volume_attachment[4]: Creating... 
2025-03-23 20:10:40.612246 | orchestrator | 20:10:40.612 STDOUT terraform: openstack_compute_volume_attach_v2.node_volume_attachment[8]: Creating... 2025-03-23 20:10:45.918694 | orchestrator | 20:10:45.918 STDOUT terraform: openstack_compute_volume_attach_v2.node_volume_attachment[16]: Creation complete after 5s [id=b89d3d8b-9a42-4e19-9420-9c8123f02d92/f00ea6e7-d866-4241-9ba8-a6f4135a6384] 2025-03-23 20:10:45.932352 | orchestrator | 20:10:45.932 STDOUT terraform: openstack_compute_volume_attach_v2.node_volume_attachment[13]: Creating... 2025-03-23 20:10:45.953178 | orchestrator | 20:10:45.952 STDOUT terraform: openstack_compute_volume_attach_v2.node_volume_attachment[3]: Creation complete after 5s [id=d3008f02-d822-4b5d-80d6-ffadd5c3da90/456a620e-4d8a-4da0-8fad-68a9dae98a07] 2025-03-23 20:10:45.962390 | orchestrator | 20:10:45.962 STDOUT terraform: openstack_compute_volume_attach_v2.node_volume_attachment[10]: Creating... 2025-03-23 20:10:45.988944 | orchestrator | 20:10:45.988 STDOUT terraform: openstack_compute_volume_attach_v2.node_volume_attachment[8]: Creation complete after 5s [id=dd80c1de-2565-4bbd-947c-aebbfe4165fe/5da7aba9-9f2d-45cd-9daa-a6f3147ca79c] 2025-03-23 20:10:46.000057 | orchestrator | 20:10:45.999 STDOUT terraform: openstack_compute_volume_attach_v2.node_volume_attachment[4]: Creation complete after 5s [id=b89d3d8b-9a42-4e19-9420-9c8123f02d92/50c8fc43-f5ea-4ff6-9b1c-3a82fc0b92db] 2025-03-23 20:10:46.003733 | orchestrator | 20:10:46.003 STDOUT terraform: openstack_compute_volume_attach_v2.node_volume_attachment[17]: Creating... 2025-03-23 20:10:46.012872 | orchestrator | 20:10:46.012 STDOUT terraform: openstack_compute_volume_attach_v2.node_volume_attachment[9]: Creating... 2025-03-23 20:10:46.022477 | orchestrator | 20:10:46.022 STDOUT terraform: openstack_compute_volume_attach_v2.node_volume_attachment[15]: Creation complete after 5s [id=d3008f02-d822-4b5d-80d6-ffadd5c3da90/ce372efe-ce53-40a7-9ab4-8764278391af] 2025-03-23 20:10:46.034774 | orchestrator | 20:10:46.034 STDOUT terraform: openstack_compute_volume_attach_v2.node_volume_attachment[12]: Creating... 2025-03-23 20:10:46.037671 | orchestrator | 20:10:46.037 STDOUT terraform: openstack_compute_volume_attach_v2.node_volume_attachment[11]: Creation complete after 5s [id=f7dd0949-9f90-43c1-ba19-8557b9c871d0/558eae64-1e92-4eba-9b7a-c9f2592aca3c] 2025-03-23 20:10:46.050972 | orchestrator | 20:10:46.050 STDOUT terraform: openstack_compute_volume_attach_v2.node_volume_attachment[5]: Creating... 2025-03-23 20:10:46.063334 | orchestrator | 20:10:46.063 STDOUT terraform: openstack_compute_volume_attach_v2.node_volume_attachment[6]: Creation complete after 5s [id=574f12c9-a455-477c-8d9a-5e8ecf9c9dc9/cf3f799f-1003-4b60-bca6-f0cfc6b51825] 2025-03-23 20:10:46.065029 | orchestrator | 20:10:46.064 STDOUT terraform: openstack_compute_volume_attach_v2.node_volume_attachment[7]: Creation complete after 5s [id=52dde698-1f4d-4b83-a215-3c25513ba7f7/6c9a484d-d13d-480e-b0c2-a5d296d667b7] 2025-03-23 20:10:46.076335 | orchestrator | 20:10:46.076 STDOUT terraform: openstack_compute_volume_attach_v2.node_volume_attachment[2]: Creating... 2025-03-23 20:10:46.076853 | orchestrator | 20:10:46.076 STDOUT terraform: openstack_compute_volume_attach_v2.node_volume_attachment[14]: Creating... 
2025-03-23 20:10:46.090171 | orchestrator | 20:10:46.089 STDOUT terraform: openstack_compute_volume_attach_v2.node_volume_attachment[1]: Creation complete after 5s [id=52dde698-1f4d-4b83-a215-3c25513ba7f7/ec0eaa49-f31e-4f34-9bf5-00a65ad435ca] 2025-03-23 20:10:46.106899 | orchestrator | 20:10:46.106 STDOUT terraform: openstack_compute_instance_v2.manager_server: Creating... 2025-03-23 20:10:46.126724 | orchestrator | 20:10:46.126 STDOUT terraform: openstack_compute_volume_attach_v2.node_volume_attachment[0]: Creation complete after 5s [id=574f12c9-a455-477c-8d9a-5e8ecf9c9dc9/67c0bee0-d01a-40a6-b42a-87f59926aeb7] 2025-03-23 20:10:51.320803 | orchestrator | 20:10:51.320 STDOUT terraform: openstack_compute_volume_attach_v2.node_volume_attachment[13]: Creation complete after 5s [id=52dde698-1f4d-4b83-a215-3c25513ba7f7/e582b8ba-25ef-45a1-82f9-427bde16f6b4] 2025-03-23 20:10:51.348823 | orchestrator | 20:10:51.348 STDOUT terraform: openstack_compute_volume_attach_v2.node_volume_attachment[10]: Creation complete after 5s [id=b89d3d8b-9a42-4e19-9420-9c8123f02d92/ed33c331-9d6a-419f-a2ff-c44c346902af] 2025-03-23 20:10:51.481435 | orchestrator | 20:10:51.481 STDOUT terraform: openstack_compute_volume_attach_v2.node_volume_attachment[17]: Creation complete after 5s [id=f7dd0949-9f90-43c1-ba19-8557b9c871d0/4ae64758-af7a-4ee3-835e-8ab2b9979c52] 2025-03-23 20:10:51.514922 | orchestrator | 20:10:51.514 STDOUT terraform: openstack_compute_volume_attach_v2.node_volume_attachment[12]: Creation complete after 6s [id=574f12c9-a455-477c-8d9a-5e8ecf9c9dc9/d3439608-91a7-4b46-9305-8cd72b648eb4] 2025-03-23 20:10:51.535807 | orchestrator | 20:10:51.535 STDOUT terraform: openstack_compute_volume_attach_v2.node_volume_attachment[9]: Creation complete after 6s [id=d3008f02-d822-4b5d-80d6-ffadd5c3da90/ed5da3c7-1374-4bfc-b341-605cae6b6ed5] 2025-03-23 20:10:51.565716 | orchestrator | 20:10:51.565 STDOUT terraform: openstack_compute_volume_attach_v2.node_volume_attachment[5]: Creation complete after 6s [id=f7dd0949-9f90-43c1-ba19-8557b9c871d0/1aae855b-0878-455b-beee-c51e17b854da] 2025-03-23 20:10:51.581491 | orchestrator | 20:10:51.581 STDOUT terraform: openstack_compute_volume_attach_v2.node_volume_attachment[2]: Creation complete after 6s [id=dd80c1de-2565-4bbd-947c-aebbfe4165fe/81262b7a-e68a-468c-9a1b-ebbf4a5ddecf] 2025-03-23 20:10:51.668331 | orchestrator | 20:10:51.668 STDOUT terraform: openstack_compute_volume_attach_v2.node_volume_attachment[14]: Creation complete after 6s [id=dd80c1de-2565-4bbd-947c-aebbfe4165fe/1e600cae-effb-4d1d-924b-3d561c8b4c34] 2025-03-23 20:10:56.108571 | orchestrator | 20:10:56.108 STDOUT terraform: openstack_compute_instance_v2.manager_server: Still creating... [10s elapsed] 2025-03-23 20:11:06.109049 | orchestrator | 20:11:06.108 STDOUT terraform: openstack_compute_instance_v2.manager_server: Still creating... [20s elapsed] 2025-03-23 20:11:07.237362 | orchestrator | 20:11:07.237 STDOUT terraform: openstack_compute_instance_v2.manager_server: Creation complete after 21s [id=be068504-443a-470a-b44b-78dd6708adce] 2025-03-23 20:11:07.275526 | orchestrator | 20:11:07.275 STDOUT terraform: Apply complete! Resources: 82 added, 0 changed, 0 destroyed. 
2025-03-23 20:11:07.284367 | orchestrator | 20:11:07.275 STDOUT terraform: Outputs: 2025-03-23 20:11:07.284449 | orchestrator | 20:11:07.275 STDOUT terraform: manager_address = 2025-03-23 20:11:07.284514 | orchestrator | 20:11:07.275 STDOUT terraform: private_key = 2025-03-23 20:11:17.373126 | orchestrator | changed 2025-03-23 20:11:17.400147 | 2025-03-23 20:11:17.400261 | TASK [Fetch manager address] 2025-03-23 20:11:17.744154 | orchestrator | ok 2025-03-23 20:11:17.753817 | 2025-03-23 20:11:17.753927 | TASK [Set manager_host address] 2025-03-23 20:11:17.864467 | orchestrator | ok 2025-03-23 20:11:17.874727 | 2025-03-23 20:11:17.874837 | LOOP [Update ansible collections] 2025-03-23 20:11:18.587567 | orchestrator | changed 2025-03-23 20:11:19.281813 | orchestrator | changed 2025-03-23 20:11:19.299605 | 2025-03-23 20:11:19.299850 | TASK [Wait up to 300 seconds for port 22 to become open and contain "OpenSSH"] 2025-03-23 20:11:29.857851 | orchestrator | ok 2025-03-23 20:11:29.867772 | 2025-03-23 20:11:29.867875 | TASK [Wait a little longer for the manager so that everything is ready] 2025-03-23 20:12:29.909994 | orchestrator | ok 2025-03-23 20:12:29.920785 | 2025-03-23 20:12:29.920890 | TASK [Fetch manager ssh hostkey] 2025-03-23 20:12:30.994713 | orchestrator | Output suppressed because no_log was given 2025-03-23 20:12:31.007894 | 2025-03-23 20:12:31.008036 | TASK [Get ssh keypair from terraform environment] 2025-03-23 20:12:31.553176 | orchestrator | changed 2025-03-23 20:12:31.571358 | 2025-03-23 20:12:31.571506 | TASK [Point out that the following task takes some time and does not give any output] 2025-03-23 20:12:31.610208 | orchestrator | ok: The task 'Run manager part 0' runs an Ansible playbook on the manager. There is no further output of this here. It takes a few minutes for this task to complete. 
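The manager_address and private_key outputs are simply not echoed to the console; the apply also wrote local_file.MANAGER_ADDRESS, which the follow-up tasks can presumably read. A rough shell equivalent of "Fetch manager address" followed by the 300-second OpenSSH wait would look like the sketch below (retry interval and count are illustrative, and it assumes the job's Terraform working directory):

  # Read the manager address back from the Terraform output shown above.
  MANAGER_ADDRESS=$(terraform output -raw manager_address)

  # Poll until the manager answers on port 22 with an SSH host key, similar to
  # the "Wait up to 300 seconds for port 22 ..." and "Fetch manager ssh hostkey" tasks.
  for i in $(seq 1 60); do
      if [ -n "$(ssh-keyscan -T 5 "$MANAGER_ADDRESS" 2>/dev/null)" ]; then
          echo "SSH is reachable on $MANAGER_ADDRESS"
          break
      fi
      sleep 5
  done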
2025-03-23 20:12:31.618611 | 2025-03-23 20:12:31.618721 | TASK [Run manager part 0] 2025-03-23 20:12:32.432634 | orchestrator | [WARNING]: Collection osism.commons does not support Ansible version 2.15.2 2025-03-23 20:12:32.474742 | orchestrator | 2025-03-23 20:12:34.599287 | orchestrator | PLAY [Wait for cloud-init to finish] ******************************************* 2025-03-23 20:12:34.599345 | orchestrator | 2025-03-23 20:12:34.599369 | orchestrator | TASK [Check /var/lib/cloud/instance/boot-finished] ***************************** 2025-03-23 20:12:34.599385 | orchestrator | ok: [testbed-manager] 2025-03-23 20:12:36.650186 | orchestrator | 2025-03-23 20:12:36.650252 | orchestrator | PLAY [Run manager part 0] ****************************************************** 2025-03-23 20:12:36.650263 | orchestrator | 2025-03-23 20:12:36.650269 | orchestrator | TASK [Gathering Facts] ********************************************************* 2025-03-23 20:12:36.650282 | orchestrator | ok: [testbed-manager] 2025-03-23 20:12:37.335593 | orchestrator | 2025-03-23 20:12:37.335652 | orchestrator | TASK [Get home directory of ansible user] ************************************** 2025-03-23 20:12:37.335669 | orchestrator | ok: [testbed-manager] 2025-03-23 20:12:37.384285 | orchestrator | 2025-03-23 20:12:37.384350 | orchestrator | TASK [Set repo_path fact] ****************************************************** 2025-03-23 20:12:37.384371 | orchestrator | skipping: [testbed-manager] 2025-03-23 20:12:37.412261 | orchestrator | 2025-03-23 20:12:37.412287 | orchestrator | TASK [Update package cache] **************************************************** 2025-03-23 20:12:37.412299 | orchestrator | skipping: [testbed-manager] 2025-03-23 20:12:37.437798 | orchestrator | 2025-03-23 20:12:37.437828 | orchestrator | TASK [Install required packages] *********************************************** 2025-03-23 20:12:37.437841 | orchestrator | skipping: [testbed-manager] 2025-03-23 20:12:37.461466 | orchestrator | 2025-03-23 20:12:37.461488 | orchestrator | TASK [Remove some python packages] ********************************************* 2025-03-23 20:12:37.461497 | orchestrator | skipping: [testbed-manager] 2025-03-23 20:12:37.494758 | orchestrator | 2025-03-23 20:12:37.494782 | orchestrator | TASK [Set venv_command fact (RedHat)] ****************************************** 2025-03-23 20:12:37.494792 | orchestrator | skipping: [testbed-manager] 2025-03-23 20:12:37.528273 | orchestrator | 2025-03-23 20:12:37.528299 | orchestrator | TASK [Fail if Ubuntu version is lower than 22.04] ****************************** 2025-03-23 20:12:37.528311 | orchestrator | skipping: [testbed-manager] 2025-03-23 20:12:37.558693 | orchestrator | 2025-03-23 20:12:37.558713 | orchestrator | TASK [Fail if Debian version is lower than 12] ********************************* 2025-03-23 20:12:37.558724 | orchestrator | skipping: [testbed-manager] 2025-03-23 20:12:38.489068 | orchestrator | 2025-03-23 20:12:38.489124 | orchestrator | TASK [Set APT options on manager] ********************************************** 2025-03-23 20:12:38.489141 | orchestrator | changed: [testbed-manager] 2025-03-23 20:15:35.657703 | orchestrator | 2025-03-23 20:15:35.657798 | orchestrator | TASK [Update APT cache and run dist-upgrade] *********************************** 2025-03-23 20:15:35.657837 | orchestrator | changed: [testbed-manager] 2025-03-23 20:16:56.100719 | orchestrator | 2025-03-23 20:16:56.100945 | orchestrator | TASK [Install HWE kernel package on Ubuntu] 
************************************ 2025-03-23 20:16:56.100987 | orchestrator | changed: [testbed-manager] 2025-03-23 20:17:22.110785 | orchestrator | 2025-03-23 20:17:22.110890 | orchestrator | TASK [Install required packages] *********************************************** 2025-03-23 20:17:22.110921 | orchestrator | changed: [testbed-manager] 2025-03-23 20:17:32.292022 | orchestrator | 2025-03-23 20:17:32.292152 | orchestrator | TASK [Remove some python packages] ********************************************* 2025-03-23 20:17:32.292189 | orchestrator | changed: [testbed-manager] 2025-03-23 20:17:32.343201 | orchestrator | 2025-03-23 20:17:32.343257 | orchestrator | TASK [Set venv_command fact (Debian)] ****************************************** 2025-03-23 20:17:32.343283 | orchestrator | ok: [testbed-manager] 2025-03-23 20:17:33.183349 | orchestrator | 2025-03-23 20:17:33.183483 | orchestrator | TASK [Get current user] ******************************************************** 2025-03-23 20:17:33.183520 | orchestrator | ok: [testbed-manager] 2025-03-23 20:17:33.935915 | orchestrator | 2025-03-23 20:17:33.936020 | orchestrator | TASK [Create venv directory] *************************************************** 2025-03-23 20:17:33.936063 | orchestrator | changed: [testbed-manager] 2025-03-23 20:17:41.633962 | orchestrator | 2025-03-23 20:17:41.634100 | orchestrator | TASK [Install netaddr in venv] ************************************************* 2025-03-23 20:17:41.634139 | orchestrator | changed: [testbed-manager] 2025-03-23 20:17:48.879196 | orchestrator | 2025-03-23 20:17:48.879328 | orchestrator | TASK [Install ansible-core in venv] ******************************************** 2025-03-23 20:17:48.879386 | orchestrator | changed: [testbed-manager] 2025-03-23 20:17:51.820134 | orchestrator | 2025-03-23 20:17:51.821060 | orchestrator | TASK [Install requests >= 2.32.2] ********************************************** 2025-03-23 20:17:51.821121 | orchestrator | changed: [testbed-manager] 2025-03-23 20:17:53.853326 | orchestrator | 2025-03-23 20:17:53.853431 | orchestrator | TASK [Install docker >= 7.1.0] ************************************************* 2025-03-23 20:17:53.853491 | orchestrator | changed: [testbed-manager] 2025-03-23 20:17:55.027905 | orchestrator | 2025-03-23 20:17:55.028036 | orchestrator | TASK [Create directories in /opt/src] ****************************************** 2025-03-23 20:17:55.028081 | orchestrator | changed: [testbed-manager] => (item=osism/ansible-collection-commons) 2025-03-23 20:17:55.072880 | orchestrator | changed: [testbed-manager] => (item=osism/ansible-collection-services) 2025-03-23 20:17:55.072932 | orchestrator | 2025-03-23 20:17:55.072941 | orchestrator | TASK [Sync sources in /opt/src] ************************************************ 2025-03-23 20:17:55.072955 | orchestrator | [DEPRECATION WARNING]: The connection's stdin object is deprecated. Call 2025-03-23 20:17:58.254186 | orchestrator | display.prompt_until(msg) instead. This feature will be removed in version 2025-03-23 20:17:58.254288 | orchestrator | 2.19. Deprecation warnings can be disabled by setting 2025-03-23 20:17:58.254306 | orchestrator | deprecation_warnings=False in ansible.cfg. 
2025-03-23 20:17:58.254335 | orchestrator | changed: [testbed-manager] => (item=osism/ansible-collection-commons) 2025-03-23 20:17:58.878178 | orchestrator | changed: [testbed-manager] => (item=osism/ansible-collection-services) 2025-03-23 20:17:58.878279 | orchestrator | 2025-03-23 20:17:58.878297 | orchestrator | TASK [Create /usr/share/ansible directory] ************************************* 2025-03-23 20:17:58.878326 | orchestrator | changed: [testbed-manager] 2025-03-23 20:18:24.254351 | orchestrator | 2025-03-23 20:18:24.254404 | orchestrator | TASK [Install collections from Ansible galaxy] ********************************* 2025-03-23 20:18:24.254572 | orchestrator | changed: [testbed-manager] => (item=ansible.netcommon) 2025-03-23 20:18:26.910247 | orchestrator | changed: [testbed-manager] => (item=ansible.posix) 2025-03-23 20:18:26.910349 | orchestrator | changed: [testbed-manager] => (item=community.docker>=3.10.2) 2025-03-23 20:18:26.910368 | orchestrator | 2025-03-23 20:18:26.910386 | orchestrator | TASK [Install local collections] *********************************************** 2025-03-23 20:18:26.910418 | orchestrator | changed: [testbed-manager] => (item=ansible-collection-commons) 2025-03-23 20:18:28.343919 | orchestrator | changed: [testbed-manager] => (item=ansible-collection-services) 2025-03-23 20:18:28.344010 | orchestrator | 2025-03-23 20:18:28.344027 | orchestrator | PLAY [Create operator user] **************************************************** 2025-03-23 20:18:28.344040 | orchestrator | 2025-03-23 20:18:28.344053 | orchestrator | TASK [Gathering Facts] ********************************************************* 2025-03-23 20:18:28.344080 | orchestrator | ok: [testbed-manager] 2025-03-23 20:18:28.393625 | orchestrator | 2025-03-23 20:18:28.393713 | orchestrator | TASK [osism.commons.operator : Gather variables for each operating system] ***** 2025-03-23 20:18:28.393746 | orchestrator | ok: [testbed-manager] 2025-03-23 20:18:28.455501 | orchestrator | 2025-03-23 20:18:28.455573 | orchestrator | TASK [osism.commons.operator : Set operator_groups variable to default value] *** 2025-03-23 20:18:28.455603 | orchestrator | ok: [testbed-manager] 2025-03-23 20:18:29.212582 | orchestrator | 2025-03-23 20:18:29.212678 | orchestrator | TASK [osism.commons.operator : Create operator group] ************************** 2025-03-23 20:18:29.212709 | orchestrator | changed: [testbed-manager] 2025-03-23 20:18:29.983227 | orchestrator | 2025-03-23 20:18:29.983278 | orchestrator | TASK [osism.commons.operator : Create user] ************************************ 2025-03-23 20:18:29.983295 | orchestrator | changed: [testbed-manager] 2025-03-23 20:18:31.460500 | orchestrator | 2025-03-23 20:18:31.460599 | orchestrator | TASK [osism.commons.operator : Add user to additional groups] ****************** 2025-03-23 20:18:31.460633 | orchestrator | changed: [testbed-manager] => (item=adm) 2025-03-23 20:18:32.964697 | orchestrator | changed: [testbed-manager] => (item=sudo) 2025-03-23 20:18:32.964751 | orchestrator | 2025-03-23 20:18:32.964763 | orchestrator | TASK [osism.commons.operator : Copy user sudoers file] ************************* 2025-03-23 20:18:32.964782 | orchestrator | changed: [testbed-manager] 2025-03-23 20:18:34.816501 | orchestrator | 2025-03-23 20:18:34.816543 | orchestrator | TASK [osism.commons.operator : Set language variables in .bashrc configuration file] *** 2025-03-23 20:18:34.816556 | orchestrator | changed: [testbed-manager] => (item=export LANGUAGE=C.UTF-8) 2025-03-23 
20:18:35.425840 | orchestrator | changed: [testbed-manager] => (item=export LANG=C.UTF-8) 2025-03-23 20:18:35.425948 | orchestrator | changed: [testbed-manager] => (item=export LC_ALL=C.UTF-8) 2025-03-23 20:18:35.425969 | orchestrator | 2025-03-23 20:18:35.425986 | orchestrator | TASK [osism.commons.operator : Create .ssh directory] ************************** 2025-03-23 20:18:35.426047 | orchestrator | changed: [testbed-manager] 2025-03-23 20:18:35.497061 | orchestrator | 2025-03-23 20:18:35.497169 | orchestrator | TASK [osism.commons.operator : Check number of SSH authorized keys] ************ 2025-03-23 20:18:35.497205 | orchestrator | skipping: [testbed-manager] 2025-03-23 20:18:36.375747 | orchestrator | 2025-03-23 20:18:36.375809 | orchestrator | TASK [osism.commons.operator : Set ssh authorized keys] ************************ 2025-03-23 20:18:36.375826 | orchestrator | changed: [testbed-manager] => (item=None) 2025-03-23 20:18:36.398323 | orchestrator | changed: [testbed-manager] 2025-03-23 20:18:36.398363 | orchestrator | 2025-03-23 20:18:36.398369 | orchestrator | TASK [osism.commons.operator : Delete ssh authorized keys] ********************* 2025-03-23 20:18:36.398381 | orchestrator | skipping: [testbed-manager] 2025-03-23 20:18:36.420857 | orchestrator | 2025-03-23 20:18:36.420892 | orchestrator | TASK [osism.commons.operator : Set authorized GitHub accounts] ***************** 2025-03-23 20:18:36.420905 | orchestrator | skipping: [testbed-manager] 2025-03-23 20:18:36.442213 | orchestrator | 2025-03-23 20:18:36.442248 | orchestrator | TASK [osism.commons.operator : Delete authorized GitHub accounts] ************** 2025-03-23 20:18:36.442260 | orchestrator | skipping: [testbed-manager] 2025-03-23 20:18:36.493413 | orchestrator | 2025-03-23 20:18:36.493528 | orchestrator | TASK [osism.commons.operator : Set password] *********************************** 2025-03-23 20:18:36.493549 | orchestrator | skipping: [testbed-manager] 2025-03-23 20:18:37.263938 | orchestrator | 2025-03-23 20:18:37.263980 | orchestrator | TASK [osism.commons.operator : Unset & lock password] ************************** 2025-03-23 20:18:37.263995 | orchestrator | ok: [testbed-manager] 2025-03-23 20:18:38.709454 | orchestrator | 2025-03-23 20:18:38.709530 | orchestrator | PLAY [Run manager part 0] ****************************************************** 2025-03-23 20:18:38.709542 | orchestrator | 2025-03-23 20:18:38.709551 | orchestrator | TASK [Gathering Facts] ********************************************************* 2025-03-23 20:18:38.709572 | orchestrator | ok: [testbed-manager] 2025-03-23 20:18:39.814204 | orchestrator | 2025-03-23 20:18:39.814248 | orchestrator | TASK [Recursively change ownership of /opt/venv] ******************************* 2025-03-23 20:18:39.814262 | orchestrator | changed: [testbed-manager] 2025-03-23 20:18:39.914544 | orchestrator | 2025-03-23 20:18:39.914759 | orchestrator | PLAY RECAP ********************************************************************* 2025-03-23 20:18:39.914773 | orchestrator | testbed-manager : ok=33 changed=23 unreachable=0 failed=0 skipped=12 rescued=0 ignored=0 2025-03-23 20:18:39.914779 | orchestrator | 2025-03-23 20:18:40.372798 | orchestrator | changed 2025-03-23 20:18:40.393720 | 2025-03-23 20:18:40.393855 | TASK [Point out that the log in on the manager is now possible] 2025-03-23 20:18:40.426928 | orchestrator | ok: It is now already possible to log in to the manager with 'make login'. 
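Condensed, the bootstrap that 'Run manager part 0' just performed on the manager corresponds roughly to the following shell steps. This is a sketch reconstructed from the task names in the log, not the playbook itself; the package selection is abbreviated and the paths and version constraints are the ones reported above:

  # APT maintenance, as in "Update APT cache and run dist-upgrade".
  sudo apt-get update && sudo apt-get dist-upgrade -y

  # Ansible control environment in a virtualenv (path taken from the log).
  sudo python3 -m venv /opt/venv
  sudo /opt/venv/bin/pip install netaddr ansible-core 'requests>=2.32.2' 'docker>=7.1.0'

  # Collections from Ansible Galaxy, matching the loop items above.
  sudo /opt/venv/bin/ansible-galaxy collection install \
      ansible.netcommon ansible.posix 'community.docker:>=3.10.2'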
2025-03-23 20:18:40.435487 | 2025-03-23 20:18:40.435584 | TASK [Point out that the following task takes some time and does not give any output] 2025-03-23 20:18:40.473576 | orchestrator | ok: The task 'Run manager part 1 + 2' runs an Ansible playbook on the manager. There is no further output of this here. It takes a few minutes for this task to complete. 2025-03-23 20:18:40.484810 | 2025-03-23 20:18:40.484915 | TASK [Run manager part 1 + 2] 2025-03-23 20:18:41.375898 | orchestrator | [WARNING]: Collection osism.commons does not support Ansible version 2.15.2 2025-03-23 20:18:41.428952 | orchestrator | 2025-03-23 20:18:44.045039 | orchestrator | PLAY [Run manager part 1] ****************************************************** 2025-03-23 20:18:44.045130 | orchestrator | 2025-03-23 20:18:44.045187 | orchestrator | TASK [Gathering Facts] ********************************************************* 2025-03-23 20:18:44.045248 | orchestrator | ok: [testbed-manager] 2025-03-23 20:18:44.078598 | orchestrator | 2025-03-23 20:18:44.078681 | orchestrator | TASK [Set venv_command fact (RedHat)] ****************************************** 2025-03-23 20:18:44.078718 | orchestrator | skipping: [testbed-manager] 2025-03-23 20:18:44.122768 | orchestrator | 2025-03-23 20:18:44.122824 | orchestrator | TASK [Set venv_command fact (Debian)] ****************************************** 2025-03-23 20:18:44.122849 | orchestrator | ok: [testbed-manager] 2025-03-23 20:18:44.163708 | orchestrator | 2025-03-23 20:18:44.163748 | orchestrator | TASK [osism.commons.repository : Gather variables for each operating system] *** 2025-03-23 20:18:44.163762 | orchestrator | ok: [testbed-manager] 2025-03-23 20:18:44.230803 | orchestrator | 2025-03-23 20:18:44.230839 | orchestrator | TASK [osism.commons.repository : Set repository_default fact to default value] *** 2025-03-23 20:18:44.230851 | orchestrator | ok: [testbed-manager] 2025-03-23 20:18:44.298447 | orchestrator | 2025-03-23 20:18:44.298514 | orchestrator | TASK [osism.commons.repository : Set repositories to default] ****************** 2025-03-23 20:18:44.298542 | orchestrator | ok: [testbed-manager] 2025-03-23 20:18:44.349609 | orchestrator | 2025-03-23 20:18:44.349647 | orchestrator | TASK [osism.commons.repository : Include distribution specific repository tasks] *** 2025-03-23 20:18:44.349663 | orchestrator | included: /home/zuul-testbed02/.ansible/collections/ansible_collections/osism/commons/roles/repository/tasks/Ubuntu.yml for testbed-manager 2025-03-23 20:18:45.122127 | orchestrator | 2025-03-23 20:18:45.122176 | orchestrator | TASK [osism.commons.repository : Create /etc/apt/sources.list.d directory] ***** 2025-03-23 20:18:45.122193 | orchestrator | ok: [testbed-manager] 2025-03-23 20:18:45.166352 | orchestrator | 2025-03-23 20:18:45.166392 | orchestrator | TASK [osism.commons.repository : Include tasks for Ubuntu < 24.04] ************* 2025-03-23 20:18:45.166406 | orchestrator | skipping: [testbed-manager] 2025-03-23 20:18:46.521512 | orchestrator | 2025-03-23 20:18:46.521557 | orchestrator | TASK [osism.commons.repository : Copy 99osism apt configuration] *************** 2025-03-23 20:18:46.521577 | orchestrator | changed: [testbed-manager] 2025-03-23 20:18:47.102333 | orchestrator | 2025-03-23 20:18:47.102409 | orchestrator | TASK [osism.commons.repository : Remove sources.list file] ********************* 2025-03-23 20:18:47.102458 | orchestrator | ok: [testbed-manager] 2025-03-23 20:18:48.249248 | orchestrator | 2025-03-23 20:18:48.249281 | orchestrator | TASK
[osism.commons.repository : Copy ubuntu.sources file] ********************* 2025-03-23 20:18:48.249292 | orchestrator | changed: [testbed-manager] 2025-03-23 20:19:02.161747 | orchestrator | 2025-03-23 20:19:02.161836 | orchestrator | TASK [osism.commons.repository : Update package cache] ************************* 2025-03-23 20:19:02.161869 | orchestrator | changed: [testbed-manager] 2025-03-23 20:19:02.860180 | orchestrator | 2025-03-23 20:19:02.860275 | orchestrator | TASK [Get home directory of ansible user] ************************************** 2025-03-23 20:19:02.860308 | orchestrator | ok: [testbed-manager] 2025-03-23 20:19:02.911957 | orchestrator | 2025-03-23 20:19:02.912014 | orchestrator | TASK [Set repo_path fact] ****************************************************** 2025-03-23 20:19:02.912040 | orchestrator | skipping: [testbed-manager] 2025-03-23 20:19:03.909213 | orchestrator | 2025-03-23 20:19:03.909324 | orchestrator | TASK [Copy SSH public key] ***************************************************** 2025-03-23 20:19:03.909359 | orchestrator | changed: [testbed-manager] 2025-03-23 20:19:04.931654 | orchestrator | 2025-03-23 20:19:04.931773 | orchestrator | TASK [Copy SSH private key] **************************************************** 2025-03-23 20:19:04.931811 | orchestrator | changed: [testbed-manager] 2025-03-23 20:19:05.581822 | orchestrator | 2025-03-23 20:19:05.581943 | orchestrator | TASK [Create configuration directory] ****************************************** 2025-03-23 20:19:05.581983 | orchestrator | changed: [testbed-manager] 2025-03-23 20:19:05.622518 | orchestrator | 2025-03-23 20:19:05.622580 | orchestrator | TASK [Copy testbed repo] ******************************************************* 2025-03-23 20:19:05.622597 | orchestrator | [DEPRECATION WARNING]: The connection's stdin object is deprecated. Call 2025-03-23 20:19:07.989911 | orchestrator | display.prompt_until(msg) instead. This feature will be removed in version 2025-03-23 20:19:07.990212 | orchestrator | 2.19. Deprecation warnings can be disabled by setting 2025-03-23 20:19:07.990237 | orchestrator | deprecation_warnings=False in ansible.cfg. 
2025-03-23 20:19:07.990269 | orchestrator | changed: [testbed-manager] 2025-03-23 20:19:17.949557 | orchestrator | 2025-03-23 20:19:17.949624 | orchestrator | TASK [Install python requirements in venv] ************************************* 2025-03-23 20:19:17.949644 | orchestrator | ok: [testbed-manager] => (item=Jinja2) 2025-03-23 20:19:19.070293 | orchestrator | ok: [testbed-manager] => (item=PyYAML) 2025-03-23 20:19:19.070361 | orchestrator | ok: [testbed-manager] => (item=packaging) 2025-03-23 20:19:19.070373 | orchestrator | changed: [testbed-manager] => (item=python-gilt==1.2.3) 2025-03-23 20:19:19.070384 | orchestrator | ok: [testbed-manager] => (item=requests>=2.32.2) 2025-03-23 20:19:19.070393 | orchestrator | ok: [testbed-manager] => (item=docker>=7.1.0) 2025-03-23 20:19:19.070403 | orchestrator | 2025-03-23 20:19:19.070412 | orchestrator | TASK [Copy testbed custom CA certificate on Debian/Ubuntu] ********************* 2025-03-23 20:19:19.070496 | orchestrator | changed: [testbed-manager] 2025-03-23 20:19:19.114725 | orchestrator | 2025-03-23 20:19:19.114785 | orchestrator | TASK [Copy testbed custom CA certificate on CentOS] **************************** 2025-03-23 20:19:19.114803 | orchestrator | skipping: [testbed-manager] 2025-03-23 20:19:22.487678 | orchestrator | 2025-03-23 20:19:22.487747 | orchestrator | TASK [Run update-ca-certificates on Debian/Ubuntu] ***************************** 2025-03-23 20:19:22.487767 | orchestrator | changed: [testbed-manager] 2025-03-23 20:19:22.528920 | orchestrator | 2025-03-23 20:19:22.528960 | orchestrator | TASK [Run update-ca-trust on RedHat] ******************************************* 2025-03-23 20:19:22.528977 | orchestrator | skipping: [testbed-manager] 2025-03-23 20:21:06.779915 | orchestrator | 2025-03-23 20:21:06.779974 | orchestrator | TASK [Run manager part 2] ****************************************************** 2025-03-23 20:21:06.779990 | orchestrator | changed: [testbed-manager] 2025-03-23 20:21:08.025087 | orchestrator | 2025-03-23 20:21:08.025137 | orchestrator | RUNNING HANDLER [osism.commons.repository : Force update of package cache] ***** 2025-03-23 20:21:08.025152 | orchestrator | ok: [testbed-manager] 2025-03-23 20:21:08.117895 | orchestrator | 2025-03-23 20:21:08.117967 | orchestrator | PLAY RECAP ********************************************************************* 2025-03-23 20:21:08.117976 | orchestrator | testbed-manager : ok=21 changed=11 unreachable=0 failed=0 skipped=5 rescued=0 ignored=0 2025-03-23 20:21:08.117982 | orchestrator | 2025-03-23 20:21:08.170180 | orchestrator | changed 2025-03-23 20:21:08.182185 | 2025-03-23 20:21:08.182297 | TASK [Reboot manager] 2025-03-23 20:21:09.739014 | orchestrator | changed 2025-03-23 20:21:09.753216 | 2025-03-23 20:21:09.753344 | TASK [Wait up to 300 seconds for port 22 to become open and contain "OpenSSH"] 2025-03-23 20:21:26.187792 | orchestrator | ok 2025-03-23 20:21:26.198945 | 2025-03-23 20:21:26.199054 | TASK [Wait a little longer for the manager so that everything is ready] 2025-03-23 20:22:26.247434 | orchestrator | ok 2025-03-23 20:22:26.261646 | 2025-03-23 20:22:26.261779 | TASK [Deploy manager + bootstrap nodes] 2025-03-23 20:22:28.896789 | orchestrator | 2025-03-23 20:22:28.899211 | orchestrator | # DEPLOY MANAGER 2025-03-23 20:22:28.899251 | orchestrator | 2025-03-23 20:22:28.899271 | orchestrator | + set -e 2025-03-23 20:22:28.899317 | orchestrator | + echo 2025-03-23 20:22:28.899338 | orchestrator | + echo '# DEPLOY MANAGER' 2025-03-23 20:22:28.899387 | 
orchestrator | + echo 2025-03-23 20:22:28.899414 | orchestrator | + cat /opt/manager-vars.sh 2025-03-23 20:22:28.899451 | orchestrator | export NUMBER_OF_NODES=6 2025-03-23 20:22:28.900637 | orchestrator | 2025-03-23 20:22:28.900659 | orchestrator | export CEPH_VERSION=quincy 2025-03-23 20:22:28.900675 | orchestrator | export CONFIGURATION_VERSION=main 2025-03-23 20:22:28.900689 | orchestrator | export MANAGER_VERSION=8.1.0 2025-03-23 20:22:28.900703 | orchestrator | export OPENSTACK_VERSION=2024.1 2025-03-23 20:22:28.900717 | orchestrator | 2025-03-23 20:22:28.900733 | orchestrator | export ARA=false 2025-03-23 20:22:28.900748 | orchestrator | export TEMPEST=false 2025-03-23 20:22:28.900762 | orchestrator | export IS_ZUUL=true 2025-03-23 20:22:28.900775 | orchestrator | 2025-03-23 20:22:28.900789 | orchestrator | export MANAGER_PUBLIC_IP_ADDRESS=81.163.193.116 2025-03-23 20:22:28.900808 | orchestrator | export EXTERNAL_API=false 2025-03-23 20:22:28.900825 | orchestrator | 2025-03-23 20:22:28.900840 | orchestrator | export IMAGE_USER=ubuntu 2025-03-23 20:22:28.900854 | orchestrator | export IMAGE_NODE_USER=ubuntu 2025-03-23 20:22:28.900869 | orchestrator | 2025-03-23 20:22:28.900882 | orchestrator | export CEPH_STACK=ceph-ansible 2025-03-23 20:22:28.900896 | orchestrator | 2025-03-23 20:22:28.900910 | orchestrator | + echo 2025-03-23 20:22:28.900924 | orchestrator | + source /opt/configuration/scripts/include.sh 2025-03-23 20:22:28.900945 | orchestrator | ++ export INTERACTIVE=false 2025-03-23 20:22:28.901047 | orchestrator | ++ INTERACTIVE=false 2025-03-23 20:22:28.901065 | orchestrator | ++ export OSISM_APPLY_RETRY=1 2025-03-23 20:22:28.901087 | orchestrator | ++ OSISM_APPLY_RETRY=1 2025-03-23 20:22:28.901101 | orchestrator | + source /opt/manager-vars.sh 2025-03-23 20:22:28.901115 | orchestrator | ++ export NUMBER_OF_NODES=6 2025-03-23 20:22:28.901129 | orchestrator | ++ NUMBER_OF_NODES=6 2025-03-23 20:22:28.901143 | orchestrator | ++ export CEPH_VERSION=quincy 2025-03-23 20:22:28.901156 | orchestrator | ++ CEPH_VERSION=quincy 2025-03-23 20:22:28.901170 | orchestrator | ++ export CONFIGURATION_VERSION=main 2025-03-23 20:22:28.901184 | orchestrator | ++ CONFIGURATION_VERSION=main 2025-03-23 20:22:28.901205 | orchestrator | ++ export MANAGER_VERSION=8.1.0 2025-03-23 20:22:28.901219 | orchestrator | ++ MANAGER_VERSION=8.1.0 2025-03-23 20:22:28.901233 | orchestrator | ++ export OPENSTACK_VERSION=2024.1 2025-03-23 20:22:28.901247 | orchestrator | ++ OPENSTACK_VERSION=2024.1 2025-03-23 20:22:28.901261 | orchestrator | ++ export ARA=false 2025-03-23 20:22:28.901289 | orchestrator | ++ ARA=false 2025-03-23 20:22:28.901314 | orchestrator | ++ export TEMPEST=false 2025-03-23 20:22:28.901329 | orchestrator | ++ TEMPEST=false 2025-03-23 20:22:28.901343 | orchestrator | ++ export IS_ZUUL=true 2025-03-23 20:22:28.901410 | orchestrator | ++ IS_ZUUL=true 2025-03-23 20:22:28.901425 | orchestrator | ++ export MANAGER_PUBLIC_IP_ADDRESS=81.163.193.116 2025-03-23 20:22:28.901439 | orchestrator | ++ MANAGER_PUBLIC_IP_ADDRESS=81.163.193.116 2025-03-23 20:22:28.901461 | orchestrator | ++ export EXTERNAL_API=false 2025-03-23 20:22:28.901476 | orchestrator | ++ EXTERNAL_API=false 2025-03-23 20:22:28.901489 | orchestrator | ++ export IMAGE_USER=ubuntu 2025-03-23 20:22:28.901503 | orchestrator | ++ IMAGE_USER=ubuntu 2025-03-23 20:22:28.901517 | orchestrator | ++ export IMAGE_NODE_USER=ubuntu 2025-03-23 20:22:28.901531 | orchestrator | ++ IMAGE_NODE_USER=ubuntu 2025-03-23 20:22:28.901548 | orchestrator | ++ export 
CEPH_STACK=ceph-ansible 2025-03-23 20:22:28.901567 | orchestrator | ++ CEPH_STACK=ceph-ansible 2025-03-23 20:22:28.967130 | orchestrator | + sudo ln -sf /opt/configuration/contrib/semver2.sh /usr/local/bin/semver 2025-03-23 20:22:28.967172 | orchestrator | + docker version 2025-03-23 20:22:29.230321 | orchestrator | Client: Docker Engine - Community 2025-03-23 20:22:29.232565 | orchestrator | Version: 26.1.4 2025-03-23 20:22:29.232596 | orchestrator | API version: 1.45 2025-03-23 20:22:29.232611 | orchestrator | Go version: go1.21.11 2025-03-23 20:22:29.232624 | orchestrator | Git commit: 5650f9b 2025-03-23 20:22:29.232638 | orchestrator | Built: Wed Jun 5 11:28:57 2024 2025-03-23 20:22:29.232653 | orchestrator | OS/Arch: linux/amd64 2025-03-23 20:22:29.232667 | orchestrator | Context: default 2025-03-23 20:22:29.232680 | orchestrator | 2025-03-23 20:22:29.232695 | orchestrator | Server: Docker Engine - Community 2025-03-23 20:22:29.232709 | orchestrator | Engine: 2025-03-23 20:22:29.232723 | orchestrator | Version: 26.1.4 2025-03-23 20:22:29.232736 | orchestrator | API version: 1.45 (minimum version 1.24) 2025-03-23 20:22:29.232750 | orchestrator | Go version: go1.21.11 2025-03-23 20:22:29.232773 | orchestrator | Git commit: de5c9cf 2025-03-23 20:22:29.232821 | orchestrator | Built: Wed Jun 5 11:28:57 2024 2025-03-23 20:22:29.232836 | orchestrator | OS/Arch: linux/amd64 2025-03-23 20:22:29.232850 | orchestrator | Experimental: false 2025-03-23 20:22:29.232864 | orchestrator | containerd: 2025-03-23 20:22:29.232878 | orchestrator | Version: 1.7.25 2025-03-23 20:22:29.232891 | orchestrator | GitCommit: bcc810d6b9066471b0b6fa75f557a15a1cbf31bb 2025-03-23 20:22:29.232906 | orchestrator | runc: 2025-03-23 20:22:29.232919 | orchestrator | Version: 1.2.4 2025-03-23 20:22:29.232933 | orchestrator | GitCommit: v1.2.4-0-g6c52b3f 2025-03-23 20:22:29.232947 | orchestrator | docker-init: 2025-03-23 20:22:29.232961 | orchestrator | Version: 0.19.0 2025-03-23 20:22:29.232975 | orchestrator | GitCommit: de40ad0 2025-03-23 20:22:29.232996 | orchestrator | + sh -c /opt/configuration/scripts/deploy/000-manager.sh 2025-03-23 20:22:29.239504 | orchestrator | + set -e 2025-03-23 20:22:29.244791 | orchestrator | + source /opt/manager-vars.sh 2025-03-23 20:22:29.244824 | orchestrator | ++ export NUMBER_OF_NODES=6 2025-03-23 20:22:29.244839 | orchestrator | ++ NUMBER_OF_NODES=6 2025-03-23 20:22:29.244853 | orchestrator | ++ export CEPH_VERSION=quincy 2025-03-23 20:22:29.244867 | orchestrator | ++ CEPH_VERSION=quincy 2025-03-23 20:22:29.244880 | orchestrator | ++ export CONFIGURATION_VERSION=main 2025-03-23 20:22:29.244894 | orchestrator | ++ CONFIGURATION_VERSION=main 2025-03-23 20:22:29.244908 | orchestrator | ++ export MANAGER_VERSION=8.1.0 2025-03-23 20:22:29.244921 | orchestrator | ++ MANAGER_VERSION=8.1.0 2025-03-23 20:22:29.244935 | orchestrator | ++ export OPENSTACK_VERSION=2024.1 2025-03-23 20:22:29.244949 | orchestrator | ++ OPENSTACK_VERSION=2024.1 2025-03-23 20:22:29.244963 | orchestrator | ++ export ARA=false 2025-03-23 20:22:29.244977 | orchestrator | ++ ARA=false 2025-03-23 20:22:29.244990 | orchestrator | ++ export TEMPEST=false 2025-03-23 20:22:29.245004 | orchestrator | ++ TEMPEST=false 2025-03-23 20:22:29.245018 | orchestrator | ++ export IS_ZUUL=true 2025-03-23 20:22:29.245031 | orchestrator | ++ IS_ZUUL=true 2025-03-23 20:22:29.245045 | orchestrator | ++ export MANAGER_PUBLIC_IP_ADDRESS=81.163.193.116 2025-03-23 20:22:29.245059 | orchestrator | ++ MANAGER_PUBLIC_IP_ADDRESS=81.163.193.116 
2025-03-23 20:22:29.245072 | orchestrator | ++ export EXTERNAL_API=false 2025-03-23 20:22:29.245086 | orchestrator | ++ EXTERNAL_API=false 2025-03-23 20:22:29.245100 | orchestrator | ++ export IMAGE_USER=ubuntu 2025-03-23 20:22:29.245113 | orchestrator | ++ IMAGE_USER=ubuntu 2025-03-23 20:22:29.245127 | orchestrator | ++ export IMAGE_NODE_USER=ubuntu 2025-03-23 20:22:29.245141 | orchestrator | ++ IMAGE_NODE_USER=ubuntu 2025-03-23 20:22:29.245159 | orchestrator | ++ export CEPH_STACK=ceph-ansible 2025-03-23 20:22:29.245173 | orchestrator | ++ CEPH_STACK=ceph-ansible 2025-03-23 20:22:29.245186 | orchestrator | + source /opt/configuration/scripts/include.sh 2025-03-23 20:22:29.245206 | orchestrator | ++ export INTERACTIVE=false 2025-03-23 20:22:29.245219 | orchestrator | ++ INTERACTIVE=false 2025-03-23 20:22:29.245233 | orchestrator | ++ export OSISM_APPLY_RETRY=1 2025-03-23 20:22:29.245246 | orchestrator | ++ OSISM_APPLY_RETRY=1 2025-03-23 20:22:29.245260 | orchestrator | + [[ 8.1.0 != \l\a\t\e\s\t ]] 2025-03-23 20:22:29.245276 | orchestrator | + /opt/configuration/scripts/set-manager-version.sh 8.1.0 2025-03-23 20:22:29.245295 | orchestrator | + set -e 2025-03-23 20:22:29.250395 | orchestrator | + VERSION=8.1.0 2025-03-23 20:22:29.250421 | orchestrator | + sed -i 's/manager_version: .*/manager_version: 8.1.0/g' /opt/configuration/environments/manager/configuration.yml 2025-03-23 20:22:29.250451 | orchestrator | + [[ 8.1.0 != \l\a\t\e\s\t ]] 2025-03-23 20:22:29.253933 | orchestrator | + sed -i /ceph_version:/d /opt/configuration/environments/manager/configuration.yml 2025-03-23 20:22:29.253961 | orchestrator | + sed -i /openstack_version:/d /opt/configuration/environments/manager/configuration.yml 2025-03-23 20:22:29.257879 | orchestrator | + sh -c /opt/configuration/scripts/sync-configuration-repository.sh 2025-03-23 20:22:29.263968 | orchestrator | /opt/configuration ~ 2025-03-23 20:22:29.266953 | orchestrator | + set -e 2025-03-23 20:22:29.266976 | orchestrator | + pushd /opt/configuration 2025-03-23 20:22:29.266991 | orchestrator | + [[ -e /opt/venv/bin/activate ]] 2025-03-23 20:22:29.267011 | orchestrator | + source /opt/venv/bin/activate 2025-03-23 20:22:29.268020 | orchestrator | ++ deactivate nondestructive 2025-03-23 20:22:29.268165 | orchestrator | ++ '[' -n '' ']' 2025-03-23 20:22:29.268185 | orchestrator | ++ '[' -n '' ']' 2025-03-23 20:22:29.268200 | orchestrator | ++ hash -r 2025-03-23 20:22:29.268214 | orchestrator | ++ '[' -n '' ']' 2025-03-23 20:22:29.268228 | orchestrator | ++ unset VIRTUAL_ENV 2025-03-23 20:22:29.268243 | orchestrator | ++ unset VIRTUAL_ENV_PROMPT 2025-03-23 20:22:29.268257 | orchestrator | ++ '[' '!' 
nondestructive = nondestructive ']' 2025-03-23 20:22:29.268288 | orchestrator | ++ '[' linux-gnu = cygwin ']' 2025-03-23 20:22:30.696139 | orchestrator | ++ '[' linux-gnu = msys ']' 2025-03-23 20:22:30.696278 | orchestrator | ++ export VIRTUAL_ENV=/opt/venv 2025-03-23 20:22:30.696297 | orchestrator | ++ VIRTUAL_ENV=/opt/venv 2025-03-23 20:22:30.696313 | orchestrator | ++ _OLD_VIRTUAL_PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin 2025-03-23 20:22:30.696328 | orchestrator | ++ PATH=/opt/venv/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin 2025-03-23 20:22:30.696341 | orchestrator | ++ export PATH 2025-03-23 20:22:30.696393 | orchestrator | ++ '[' -n '' ']' 2025-03-23 20:22:30.696408 | orchestrator | ++ '[' -z '' ']' 2025-03-23 20:22:30.696422 | orchestrator | ++ _OLD_VIRTUAL_PS1= 2025-03-23 20:22:30.696436 | orchestrator | ++ PS1='(venv) ' 2025-03-23 20:22:30.696450 | orchestrator | ++ export PS1 2025-03-23 20:22:30.696464 | orchestrator | ++ VIRTUAL_ENV_PROMPT='(venv) ' 2025-03-23 20:22:30.696478 | orchestrator | ++ export VIRTUAL_ENV_PROMPT 2025-03-23 20:22:30.696492 | orchestrator | ++ hash -r 2025-03-23 20:22:30.696507 | orchestrator | + pip3 install --no-cache-dir python-gilt==1.2.3 requests Jinja2 PyYAML packaging 2025-03-23 20:22:30.696543 | orchestrator | Requirement already satisfied: python-gilt==1.2.3 in /opt/venv/lib/python3.12/site-packages (1.2.3) 2025-03-23 20:22:30.697244 | orchestrator | Requirement already satisfied: requests in /opt/venv/lib/python3.12/site-packages (2.32.3) 2025-03-23 20:22:30.698988 | orchestrator | Requirement already satisfied: Jinja2 in /opt/venv/lib/python3.12/site-packages (3.1.6) 2025-03-23 20:22:30.700508 | orchestrator | Requirement already satisfied: PyYAML in /opt/venv/lib/python3.12/site-packages (6.0.2) 2025-03-23 20:22:30.702603 | orchestrator | Requirement already satisfied: packaging in /opt/venv/lib/python3.12/site-packages (24.2) 2025-03-23 20:22:30.717099 | orchestrator | Requirement already satisfied: click in /opt/venv/lib/python3.12/site-packages (from python-gilt==1.2.3) (8.1.8) 2025-03-23 20:22:30.718786 | orchestrator | Requirement already satisfied: colorama in /opt/venv/lib/python3.12/site-packages (from python-gilt==1.2.3) (0.4.6) 2025-03-23 20:22:30.719892 | orchestrator | Requirement already satisfied: fasteners in /opt/venv/lib/python3.12/site-packages (from python-gilt==1.2.3) (0.19) 2025-03-23 20:22:30.721232 | orchestrator | Requirement already satisfied: sh in /opt/venv/lib/python3.12/site-packages (from python-gilt==1.2.3) (2.2.2) 2025-03-23 20:22:30.756843 | orchestrator | Requirement already satisfied: charset-normalizer<4,>=2 in /opt/venv/lib/python3.12/site-packages (from requests) (3.4.1) 2025-03-23 20:22:30.758225 | orchestrator | Requirement already satisfied: idna<4,>=2.5 in /opt/venv/lib/python3.12/site-packages (from requests) (3.10) 2025-03-23 20:22:30.759791 | orchestrator | Requirement already satisfied: urllib3<3,>=1.21.1 in /opt/venv/lib/python3.12/site-packages (from requests) (2.3.0) 2025-03-23 20:22:30.761594 | orchestrator | Requirement already satisfied: certifi>=2017.4.17 in /opt/venv/lib/python3.12/site-packages (from requests) (2025.1.31) 2025-03-23 20:22:30.765983 | orchestrator | Requirement already satisfied: MarkupSafe>=2.0 in /opt/venv/lib/python3.12/site-packages (from Jinja2) (3.0.2) 2025-03-23 20:22:31.000901 | orchestrator | ++ which gilt 2025-03-23 20:22:31.005932 | 
orchestrator | + GILT=/opt/venv/bin/gilt 2025-03-23 20:22:31.320169 | orchestrator | + /opt/venv/bin/gilt overlay 2025-03-23 20:22:31.320338 | orchestrator | osism.cfg-generics: 2025-03-23 20:22:32.805946 | orchestrator | - cloning osism.cfg-generics to /home/dragon/.gilt/clone/github.com/osism.cfg-generics 2025-03-23 20:22:32.806130 | orchestrator | - copied (main) /home/dragon/.gilt/clone/github.com/osism.cfg-generics/environments/manager/images.yml to /opt/configuration/environments/manager/ 2025-03-23 20:22:33.863745 | orchestrator | - copied (main) /home/dragon/.gilt/clone/github.com/osism.cfg-generics/src/render-images.py to /opt/configuration/environments/manager/ 2025-03-23 20:22:33.863867 | orchestrator | - copied (main) /home/dragon/.gilt/clone/github.com/osism.cfg-generics/src/set-versions.py to /opt/configuration/environments/ 2025-03-23 20:22:33.863886 | orchestrator | - running `/opt/configuration/scripts/wrapper-gilt.sh render-images` in /opt/configuration/environments/manager/ 2025-03-23 20:22:33.863921 | orchestrator | - running `rm render-images.py` in /opt/configuration/environments/manager/ 2025-03-23 20:22:33.872738 | orchestrator | - running `/opt/configuration/scripts/wrapper-gilt.sh set-versions` in /opt/configuration/environments/ 2025-03-23 20:22:34.424321 | orchestrator | - running `rm set-versions.py` in /opt/configuration/environments/ 2025-03-23 20:22:34.479084 | orchestrator | + [[ -e /opt/venv/bin/activate ]] 2025-03-23 20:22:34.479129 | orchestrator | + deactivate 2025-03-23 20:22:34.479174 | orchestrator | + '[' -n /usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin ']' 2025-03-23 20:22:34.479191 | orchestrator | + PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin 2025-03-23 20:22:34.479203 | orchestrator | + export PATH 2025-03-23 20:22:34.479228 | orchestrator | + unset _OLD_VIRTUAL_PATH 2025-03-23 20:22:34.479303 | orchestrator | + '[' -n '' ']' 2025-03-23 20:22:34.479694 | orchestrator | + hash -r 2025-03-23 20:22:34.479845 | orchestrator | + '[' -n '' ']' 2025-03-23 20:22:34.479864 | orchestrator | + unset VIRTUAL_ENV 2025-03-23 20:22:34.479880 | orchestrator | + unset VIRTUAL_ENV_PROMPT 2025-03-23 20:22:34.479895 | orchestrator | + '[' '!' 
'' = nondestructive ']' 2025-03-23 20:22:34.479910 | orchestrator | + unset -f deactivate 2025-03-23 20:22:34.479933 | orchestrator | ~ 2025-03-23 20:22:34.481896 | orchestrator | + popd 2025-03-23 20:22:34.481929 | orchestrator | + [[ 8.1.0 == \l\a\t\e\s\t ]] 2025-03-23 20:22:34.483084 | orchestrator | + [[ ceph-ansible == \r\o\o\k ]] 2025-03-23 20:22:34.483126 | orchestrator | ++ semver 8.1.0 7.0.0 2025-03-23 20:22:34.543538 | orchestrator | + [[ 1 -ge 0 ]] 2025-03-23 20:22:34.590279 | orchestrator | + echo 'enable_osism_kubernetes: true' 2025-03-23 20:22:34.590322 | orchestrator | + /opt/configuration/scripts/enable-resource-nodes.sh 2025-03-23 20:22:34.590383 | orchestrator | + [[ -e /opt/venv/bin/activate ]] 2025-03-23 20:22:34.590486 | orchestrator | + source /opt/venv/bin/activate 2025-03-23 20:22:34.590504 | orchestrator | ++ deactivate nondestructive 2025-03-23 20:22:34.590536 | orchestrator | ++ '[' -n '' ']' 2025-03-23 20:22:34.590552 | orchestrator | ++ '[' -n '' ']' 2025-03-23 20:22:34.590567 | orchestrator | ++ hash -r 2025-03-23 20:22:34.590581 | orchestrator | ++ '[' -n '' ']' 2025-03-23 20:22:34.590595 | orchestrator | ++ unset VIRTUAL_ENV 2025-03-23 20:22:34.590609 | orchestrator | ++ unset VIRTUAL_ENV_PROMPT 2025-03-23 20:22:34.590630 | orchestrator | ++ '[' '!' nondestructive = nondestructive ']' 2025-03-23 20:22:34.590733 | orchestrator | ++ '[' linux-gnu = cygwin ']' 2025-03-23 20:22:34.590750 | orchestrator | ++ '[' linux-gnu = msys ']' 2025-03-23 20:22:34.590764 | orchestrator | ++ export VIRTUAL_ENV=/opt/venv 2025-03-23 20:22:34.590779 | orchestrator | ++ VIRTUAL_ENV=/opt/venv 2025-03-23 20:22:34.590793 | orchestrator | ++ _OLD_VIRTUAL_PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin 2025-03-23 20:22:34.590808 | orchestrator | ++ PATH=/opt/venv/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin 2025-03-23 20:22:34.590823 | orchestrator | ++ export PATH 2025-03-23 20:22:34.590837 | orchestrator | ++ '[' -n '' ']' 2025-03-23 20:22:34.590851 | orchestrator | ++ '[' -z '' ']' 2025-03-23 20:22:34.590865 | orchestrator | ++ _OLD_VIRTUAL_PS1= 2025-03-23 20:22:34.590879 | orchestrator | ++ PS1='(venv) ' 2025-03-23 20:22:34.590893 | orchestrator | ++ export PS1 2025-03-23 20:22:34.590907 | orchestrator | ++ VIRTUAL_ENV_PROMPT='(venv) ' 2025-03-23 20:22:34.590921 | orchestrator | ++ export VIRTUAL_ENV_PROMPT 2025-03-23 20:22:34.590936 | orchestrator | ++ hash -r 2025-03-23 20:22:34.590955 | orchestrator | + ansible-playbook -i testbed-manager, --vault-password-file /opt/configuration/environments/.vault_pass /opt/configuration/ansible/manager-part-3.yml 2025-03-23 20:22:36.058293 | orchestrator | 2025-03-23 20:22:36.708759 | orchestrator | PLAY [Copy custom facts] ******************************************************* 2025-03-23 20:22:36.708874 | orchestrator | 2025-03-23 20:22:36.708887 | orchestrator | TASK [Create custom facts directory] ******************************************* 2025-03-23 20:22:36.708912 | orchestrator | ok: [testbed-manager] 2025-03-23 20:22:37.816923 | orchestrator | 2025-03-23 20:22:37.817035 | orchestrator | TASK [Copy fact files] ********************************************************* 2025-03-23 20:22:37.817074 | orchestrator | changed: [testbed-manager] 2025-03-23 20:22:40.627478 | orchestrator | 2025-03-23 20:22:40.627587 | orchestrator | PLAY [Before the deployment of the manager] ************************************ 2025-03-23 
20:22:40.627607 | orchestrator | 2025-03-23 20:22:40.627623 | orchestrator | TASK [Gathering Facts] ********************************************************* 2025-03-23 20:22:40.627656 | orchestrator | ok: [testbed-manager] 2025-03-23 20:22:46.914743 | orchestrator | 2025-03-23 20:22:46.914869 | orchestrator | TASK [Pull images] ************************************************************* 2025-03-23 20:22:46.914933 | orchestrator | changed: [testbed-manager] => (item=registry.osism.tech/osism/ara-server:1.7.2) 2025-03-23 20:23:43.325016 | orchestrator | changed: [testbed-manager] => (item=index.docker.io/library/mariadb:11.6.2) 2025-03-23 20:23:43.325158 | orchestrator | changed: [testbed-manager] => (item=registry.osism.tech/osism/ceph-ansible:8.1.0) 2025-03-23 20:23:43.325178 | orchestrator | changed: [testbed-manager] => (item=registry.osism.tech/osism/inventory-reconciler:8.1.0) 2025-03-23 20:23:43.325193 | orchestrator | changed: [testbed-manager] => (item=registry.osism.tech/osism/kolla-ansible:8.1.0) 2025-03-23 20:23:43.325209 | orchestrator | changed: [testbed-manager] => (item=index.docker.io/library/redis:7.4.1-alpine) 2025-03-23 20:23:43.325223 | orchestrator | changed: [testbed-manager] => (item=registry.osism.tech/osism/netbox:v4.1.7) 2025-03-23 20:23:43.325237 | orchestrator | changed: [testbed-manager] => (item=registry.osism.tech/osism/osism-ansible:8.1.0) 2025-03-23 20:23:43.325251 | orchestrator | changed: [testbed-manager] => (item=registry.osism.tech/osism/osism:0.20241219.2) 2025-03-23 20:23:43.325272 | orchestrator | changed: [testbed-manager] => (item=index.docker.io/library/postgres:16.6-alpine) 2025-03-23 20:23:43.325288 | orchestrator | changed: [testbed-manager] => (item=index.docker.io/library/traefik:v3.2.1) 2025-03-23 20:23:43.325340 | orchestrator | changed: [testbed-manager] => (item=index.docker.io/hashicorp/vault:1.18.2) 2025-03-23 20:23:43.325354 | orchestrator | 2025-03-23 20:23:43.325369 | orchestrator | TASK [Check status] ************************************************************ 2025-03-23 20:23:43.325403 | orchestrator | FAILED - RETRYING: [testbed-manager]: Check status (120 retries left). 2025-03-23 20:23:43.381948 | orchestrator | FAILED - RETRYING: [testbed-manager]: Check status (119 retries left). 2025-03-23 20:23:43.382112 | orchestrator | changed: [testbed-manager] => (item={'failed': 0, 'started': 1, 'finished': 0, 'ansible_job_id': 'j424625646800.1564', 'results_file': '/home/dragon/.ansible_async/j424625646800.1564', 'changed': True, 'item': 'registry.osism.tech/osism/ara-server:1.7.2', 'ansible_loop_var': 'item'}) 2025-03-23 20:23:43.382150 | orchestrator | changed: [testbed-manager] => (item={'failed': 0, 'started': 1, 'finished': 0, 'ansible_job_id': 'j755937403454.1589', 'results_file': '/home/dragon/.ansible_async/j755937403454.1589', 'changed': True, 'item': 'index.docker.io/library/mariadb:11.6.2', 'ansible_loop_var': 'item'}) 2025-03-23 20:23:43.382166 | orchestrator | FAILED - RETRYING: [testbed-manager]: Check status (120 retries left). 
2025-03-23 20:23:43.382184 | orchestrator | changed: [testbed-manager] => (item={'failed': 0, 'started': 1, 'finished': 0, 'ansible_job_id': 'j541395771175.1614', 'results_file': '/home/dragon/.ansible_async/j541395771175.1614', 'changed': True, 'item': 'registry.osism.tech/osism/ceph-ansible:8.1.0', 'ansible_loop_var': 'item'}) 2025-03-23 20:23:43.382206 | orchestrator | changed: [testbed-manager] => (item={'failed': 0, 'started': 1, 'finished': 0, 'ansible_job_id': 'j974788821945.1646', 'results_file': '/home/dragon/.ansible_async/j974788821945.1646', 'changed': True, 'item': 'registry.osism.tech/osism/inventory-reconciler:8.1.0', 'ansible_loop_var': 'item'}) 2025-03-23 20:23:43.382225 | orchestrator | FAILED - RETRYING: [testbed-manager]: Check status (120 retries left). 2025-03-23 20:23:43.382240 | orchestrator | changed: [testbed-manager] => (item={'failed': 0, 'started': 1, 'finished': 0, 'ansible_job_id': 'j553851823912.1678', 'results_file': '/home/dragon/.ansible_async/j553851823912.1678', 'changed': True, 'item': 'registry.osism.tech/osism/kolla-ansible:8.1.0', 'ansible_loop_var': 'item'}) 2025-03-23 20:23:43.382255 | orchestrator | changed: [testbed-manager] => (item={'failed': 0, 'started': 1, 'finished': 0, 'ansible_job_id': 'j104623300679.1710', 'results_file': '/home/dragon/.ansible_async/j104623300679.1710', 'changed': True, 'item': 'index.docker.io/library/redis:7.4.1-alpine', 'ansible_loop_var': 'item'}) 2025-03-23 20:23:43.382269 | orchestrator | changed: [testbed-manager] => (item={'failed': 0, 'started': 1, 'finished': 0, 'ansible_job_id': 'j589935392684.1744', 'results_file': '/home/dragon/.ansible_async/j589935392684.1744', 'changed': True, 'item': 'registry.osism.tech/osism/netbox:v4.1.7', 'ansible_loop_var': 'item'}) 2025-03-23 20:23:43.382346 | orchestrator | FAILED - RETRYING: [testbed-manager]: Check status (120 retries left). 
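Annotation: the "Pull images" task above starts one asynchronous pull per image (each result records an ansible_job_id and results_file), and the following "Check status" task polls those async jobs until every pull has finished, which is why the log interleaves FAILED - RETRYING messages with per-image results. A rough shell analogue of that fire-and-wait pattern (an illustration only, not the playbook's actual implementation; the image list is shortened):

    # start every pull in the background, then block until all of them return
    for image in \
        registry.osism.tech/osism/ara-server:1.7.2 \
        index.docker.io/library/mariadb:11.6.2 \
        registry.osism.tech/osism/kolla-ansible:8.1.0; do
        docker pull "$image" &
    done
    wait   # rough equivalent of the "Check status" retry loop: wait for all pulls to finish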
2025-03-23 20:23:43.382362 | orchestrator | changed: [testbed-manager] => (item={'failed': 0, 'started': 1, 'finished': 0, 'ansible_job_id': 'j331886553353.1778', 'results_file': '/home/dragon/.ansible_async/j331886553353.1778', 'changed': True, 'item': 'registry.osism.tech/osism/osism-ansible:8.1.0', 'ansible_loop_var': 'item'}) 2025-03-23 20:23:43.382377 | orchestrator | changed: [testbed-manager] => (item={'failed': 0, 'started': 1, 'finished': 0, 'ansible_job_id': 'j10118567373.1817', 'results_file': '/home/dragon/.ansible_async/j10118567373.1817', 'changed': True, 'item': 'registry.osism.tech/osism/osism:0.20241219.2', 'ansible_loop_var': 'item'}) 2025-03-23 20:23:43.382391 | orchestrator | changed: [testbed-manager] => (item={'failed': 0, 'started': 1, 'finished': 0, 'ansible_job_id': 'j727148163996.1850', 'results_file': '/home/dragon/.ansible_async/j727148163996.1850', 'changed': True, 'item': 'index.docker.io/library/postgres:16.6-alpine', 'ansible_loop_var': 'item'}) 2025-03-23 20:23:43.382405 | orchestrator | changed: [testbed-manager] => (item={'failed': 0, 'started': 1, 'finished': 0, 'ansible_job_id': 'j830294118892.1875', 'results_file': '/home/dragon/.ansible_async/j830294118892.1875', 'changed': True, 'item': 'index.docker.io/library/traefik:v3.2.1', 'ansible_loop_var': 'item'}) 2025-03-23 20:23:43.382419 | orchestrator | changed: [testbed-manager] => (item={'failed': 0, 'started': 1, 'finished': 0, 'ansible_job_id': 'j494898924908.1907', 'results_file': '/home/dragon/.ansible_async/j494898924908.1907', 'changed': True, 'item': 'index.docker.io/hashicorp/vault:1.18.2', 'ansible_loop_var': 'item'}) 2025-03-23 20:23:43.382433 | orchestrator | 2025-03-23 20:23:43.382449 | orchestrator | TASK [Get /opt/manager-vars.sh] ************************************************ 2025-03-23 20:23:43.382478 | orchestrator | ok: [testbed-manager] 2025-03-23 20:23:43.880797 | orchestrator | 2025-03-23 20:23:43.880911 | orchestrator | TASK [Add ara_server_mariadb_volume_type parameter] **************************** 2025-03-23 20:23:43.880948 | orchestrator | changed: [testbed-manager] 2025-03-23 20:23:44.244278 | orchestrator | 2025-03-23 20:23:44.244414 | orchestrator | TASK [Add netbox_postgres_volume_type parameter] ******************************* 2025-03-23 20:23:44.244440 | orchestrator | changed: [testbed-manager] 2025-03-23 20:23:44.648172 | orchestrator | 2025-03-23 20:23:44.648278 | orchestrator | TASK [Install HWE kernel package on Ubuntu] ************************************ 2025-03-23 20:23:44.648358 | orchestrator | changed: [testbed-manager] 2025-03-23 20:23:44.701252 | orchestrator | 2025-03-23 20:23:44.701280 | orchestrator | TASK [Use insecure glance configuration] *************************************** 2025-03-23 20:23:44.701325 | orchestrator | skipping: [testbed-manager] 2025-03-23 20:23:45.074608 | orchestrator | 2025-03-23 20:23:45.074703 | orchestrator | TASK [Check if /etc/OTC_region exist] ****************************************** 2025-03-23 20:23:45.074737 | orchestrator | ok: [testbed-manager] 2025-03-23 20:23:45.274169 | orchestrator | 2025-03-23 20:23:45.274215 | orchestrator | TASK [Add nova_compute_virt_type parameter] ************************************ 2025-03-23 20:23:45.274240 | orchestrator | skipping: [testbed-manager] 2025-03-23 20:23:47.402457 | orchestrator | 2025-03-23 20:23:47.402571 | orchestrator | PLAY [Apply role traefik & netbox] ********************************************* 2025-03-23 20:23:47.402587 | orchestrator | 2025-03-23 20:23:47.402600 
| orchestrator | TASK [Gathering Facts] ********************************************************* 2025-03-23 20:23:47.402629 | orchestrator | ok: [testbed-manager] 2025-03-23 20:23:47.663449 | orchestrator | 2025-03-23 20:23:47.663546 | orchestrator | TASK [Apply traefik role] ****************************************************** 2025-03-23 20:23:47.663580 | orchestrator | 2025-03-23 20:23:47.770414 | orchestrator | TASK [osism.services.traefik : Include config tasks] *************************** 2025-03-23 20:23:47.770491 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/services/roles/traefik/tasks/config.yml for testbed-manager 2025-03-23 20:23:49.039223 | orchestrator | 2025-03-23 20:23:49.039332 | orchestrator | TASK [osism.services.traefik : Create required directories] ******************** 2025-03-23 20:23:49.039358 | orchestrator | changed: [testbed-manager] => (item=/opt/traefik) 2025-03-23 20:23:51.080011 | orchestrator | changed: [testbed-manager] => (item=/opt/traefik/certificates) 2025-03-23 20:23:51.080133 | orchestrator | changed: [testbed-manager] => (item=/opt/traefik/configuration) 2025-03-23 20:23:51.080144 | orchestrator | 2025-03-23 20:23:51.080153 | orchestrator | TASK [osism.services.traefik : Copy configuration files] *********************** 2025-03-23 20:23:51.080176 | orchestrator | changed: [testbed-manager] => (item=traefik.yml) 2025-03-23 20:23:51.787356 | orchestrator | changed: [testbed-manager] => (item=traefik.env) 2025-03-23 20:23:51.787465 | orchestrator | changed: [testbed-manager] => (item=certificates.yml) 2025-03-23 20:23:51.787482 | orchestrator | 2025-03-23 20:23:51.787496 | orchestrator | TASK [osism.services.traefik : Copy certificate cert files] ******************** 2025-03-23 20:23:51.787528 | orchestrator | changed: [testbed-manager] => (item=None) 2025-03-23 20:23:52.502657 | orchestrator | changed: [testbed-manager] 2025-03-23 20:23:52.502770 | orchestrator | 2025-03-23 20:23:52.502790 | orchestrator | TASK [osism.services.traefik : Copy certificate key files] ********************* 2025-03-23 20:23:52.502825 | orchestrator | changed: [testbed-manager] => (item=None) 2025-03-23 20:23:52.611845 | orchestrator | changed: [testbed-manager] 2025-03-23 20:23:52.611928 | orchestrator | 2025-03-23 20:23:52.611945 | orchestrator | TASK [osism.services.traefik : Copy dynamic configuration] ********************* 2025-03-23 20:23:52.611972 | orchestrator | skipping: [testbed-manager] 2025-03-23 20:23:53.044423 | orchestrator | 2025-03-23 20:23:53.044541 | orchestrator | TASK [osism.services.traefik : Remove dynamic configuration] ******************* 2025-03-23 20:23:53.044578 | orchestrator | ok: [testbed-manager] 2025-03-23 20:23:53.164991 | orchestrator | 2025-03-23 20:23:53.165036 | orchestrator | TASK [osism.services.traefik : Include service tasks] ************************** 2025-03-23 20:23:53.165062 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/services/roles/traefik/tasks/service.yml for testbed-manager 2025-03-23 20:23:54.290195 | orchestrator | 2025-03-23 20:23:54.290345 | orchestrator | TASK [osism.services.traefik : Create traefik external network] **************** 2025-03-23 20:23:54.290381 | orchestrator | changed: [testbed-manager] 2025-03-23 20:23:55.379213 | orchestrator | 2025-03-23 20:23:55.379353 | orchestrator | TASK [osism.services.traefik : Copy docker-compose.yml file] ******************* 2025-03-23 20:23:55.379393 | orchestrator | changed: [testbed-manager] 
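Annotation: at this point the traefik role has created /opt/traefik with its certificates and configuration subdirectories, copied traefik.yml, traefik.env and certificates.yml, declared the shared external Docker network, and rendered docker-compose.yml; the next task ("Manage traefik service") brings the stack up. Done by hand, the equivalent would look roughly like this (a sketch only; the external network is assumed to be named "traefik", the role's real network name is not visible in the log):

    # create the shared ingress network once, then start the traefik compose project
    docker network inspect traefik >/dev/null 2>&1 || docker network create traefik
    docker compose --project-directory /opt/traefik up -d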
2025-03-23 20:23:58.819525 | orchestrator | 2025-03-23 20:23:58.819658 | orchestrator | TASK [osism.services.traefik : Manage traefik service] ************************* 2025-03-23 20:23:58.819702 | orchestrator | changed: [testbed-manager] 2025-03-23 20:23:59.164048 | orchestrator | 2025-03-23 20:23:59.164152 | orchestrator | TASK [Apply netbox role] ******************************************************* 2025-03-23 20:23:59.164188 | orchestrator | 2025-03-23 20:23:59.296388 | orchestrator | TASK [osism.services.netbox : Include install tasks] *************************** 2025-03-23 20:23:59.296485 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/services/roles/netbox/tasks/install-Debian-family.yml for testbed-manager 2025-03-23 20:24:02.500504 | orchestrator | 2025-03-23 20:24:02.500612 | orchestrator | TASK [osism.services.netbox : Install required packages] *********************** 2025-03-23 20:24:02.500640 | orchestrator | ok: [testbed-manager] 2025-03-23 20:24:02.702904 | orchestrator | 2025-03-23 20:24:02.702955 | orchestrator | TASK [osism.services.netbox : Include config tasks] **************************** 2025-03-23 20:24:02.702975 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/services/roles/netbox/tasks/config.yml for testbed-manager 2025-03-23 20:24:03.869919 | orchestrator | 2025-03-23 20:24:03.870056 | orchestrator | TASK [osism.services.netbox : Create required directories] ********************* 2025-03-23 20:24:03.870095 | orchestrator | changed: [testbed-manager] => (item=/opt/netbox) 2025-03-23 20:24:03.996750 | orchestrator | changed: [testbed-manager] => (item=/opt/netbox/configuration) 2025-03-23 20:24:03.996787 | orchestrator | changed: [testbed-manager] => (item=/opt/netbox/secrets) 2025-03-23 20:24:03.996797 | orchestrator | 2025-03-23 20:24:03.996807 | orchestrator | TASK [osism.services.netbox : Include postgres config tasks] ******************* 2025-03-23 20:24:03.996825 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/services/roles/netbox/tasks/config-postgres.yml for testbed-manager 2025-03-23 20:24:04.661754 | orchestrator | 2025-03-23 20:24:04.661840 | orchestrator | TASK [osism.services.netbox : Copy postgres environment files] ***************** 2025-03-23 20:24:04.661864 | orchestrator | changed: [testbed-manager] => (item=postgres) 2025-03-23 20:24:05.368872 | orchestrator | 2025-03-23 20:24:05.368970 | orchestrator | TASK [osism.services.netbox : Copy secret files] ******************************* 2025-03-23 20:24:05.369006 | orchestrator | changed: [testbed-manager] => (item=None) 2025-03-23 20:24:05.821660 | orchestrator | changed: [testbed-manager] 2025-03-23 20:24:05.821743 | orchestrator | 2025-03-23 20:24:05.821754 | orchestrator | TASK [osism.services.netbox : Create docker-entrypoint-initdb.d directory] ***** 2025-03-23 20:24:05.821778 | orchestrator | changed: [testbed-manager] 2025-03-23 20:24:06.214482 | orchestrator | 2025-03-23 20:24:06.214606 | orchestrator | TASK [osism.services.netbox : Check if init.sql file exists] ******************* 2025-03-23 20:24:06.214647 | orchestrator | ok: [testbed-manager] 2025-03-23 20:24:06.279983 | orchestrator | 2025-03-23 20:24:06.280016 | orchestrator | TASK [osism.services.netbox : Copy init.sql file] ****************************** 2025-03-23 20:24:06.280038 | orchestrator | skipping: [testbed-manager] 2025-03-23 20:24:07.022153 | orchestrator | 2025-03-23 20:24:07.022249 | orchestrator | TASK 
[osism.services.netbox : Create init-netbox-database.sh script] *********** 2025-03-23 20:24:07.022278 | orchestrator | changed: [testbed-manager] 2025-03-23 20:24:07.146956 | orchestrator | 2025-03-23 20:24:07.147002 | orchestrator | TASK [osism.services.netbox : Include config tasks] **************************** 2025-03-23 20:24:07.147024 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/services/roles/netbox/tasks/config-netbox.yml for testbed-manager 2025-03-23 20:24:08.001854 | orchestrator | 2025-03-23 20:24:08.001968 | orchestrator | TASK [osism.services.netbox : Create directories required by netbox] *********** 2025-03-23 20:24:08.002001 | orchestrator | changed: [testbed-manager] => (item=/opt/netbox/configuration/initializers) 2025-03-23 20:24:08.720163 | orchestrator | changed: [testbed-manager] => (item=/opt/netbox/configuration/startup-scripts) 2025-03-23 20:24:08.720259 | orchestrator | 2025-03-23 20:24:08.720272 | orchestrator | TASK [osism.services.netbox : Copy netbox environment files] ******************* 2025-03-23 20:24:08.720326 | orchestrator | changed: [testbed-manager] => (item=netbox) 2025-03-23 20:24:09.457522 | orchestrator | 2025-03-23 20:24:09.457640 | orchestrator | TASK [osism.services.netbox : Copy netbox configuration file] ****************** 2025-03-23 20:24:09.457673 | orchestrator | changed: [testbed-manager] 2025-03-23 20:24:09.510168 | orchestrator | 2025-03-23 20:24:09.510217 | orchestrator | TASK [osism.services.netbox : Copy nginx unit configuration file (<= 1.26)] **** 2025-03-23 20:24:09.510243 | orchestrator | skipping: [testbed-manager] 2025-03-23 20:24:10.206344 | orchestrator | 2025-03-23 20:24:10.206469 | orchestrator | TASK [osism.services.netbox : Copy nginx unit configuration file (> 1.26)] ***** 2025-03-23 20:24:10.206508 | orchestrator | changed: [testbed-manager] 2025-03-23 20:24:12.236952 | orchestrator | 2025-03-23 20:24:12.237067 | orchestrator | TASK [osism.services.netbox : Copy secret files] ******************************* 2025-03-23 20:24:12.237100 | orchestrator | changed: [testbed-manager] => (item=None) 2025-03-23 20:24:18.700663 | orchestrator | changed: [testbed-manager] => (item=None) 2025-03-23 20:24:18.700786 | orchestrator | changed: [testbed-manager] => (item=None) 2025-03-23 20:24:18.700804 | orchestrator | changed: [testbed-manager] 2025-03-23 20:24:18.700819 | orchestrator | 2025-03-23 20:24:18.700834 | orchestrator | TASK [osism.services.netbox : Deploy initializers for netbox] ****************** 2025-03-23 20:24:18.700862 | orchestrator | changed: [testbed-manager] => (item=custom_fields) 2025-03-23 20:24:19.376414 | orchestrator | changed: [testbed-manager] => (item=device_roles) 2025-03-23 20:24:19.376536 | orchestrator | changed: [testbed-manager] => (item=device_types) 2025-03-23 20:24:19.376555 | orchestrator | changed: [testbed-manager] => (item=groups) 2025-03-23 20:24:19.376570 | orchestrator | changed: [testbed-manager] => (item=manufacturers) 2025-03-23 20:24:19.376586 | orchestrator | changed: [testbed-manager] => (item=object_permissions) 2025-03-23 20:24:19.376600 | orchestrator | changed: [testbed-manager] => (item=prefix_vlan_roles) 2025-03-23 20:24:19.376614 | orchestrator | changed: [testbed-manager] => (item=sites) 2025-03-23 20:24:19.376628 | orchestrator | changed: [testbed-manager] => (item=tags) 2025-03-23 20:24:19.376642 | orchestrator | changed: [testbed-manager] => (item=users) 2025-03-23 20:24:19.376656 | orchestrator | 2025-03-23 20:24:19.376672 | 
orchestrator | TASK [osism.services.netbox : Deploy startup scripts for netbox] *************** 2025-03-23 20:24:19.376704 | orchestrator | changed: [testbed-manager] => (item=/usr/share/ansible/collections/ansible_collections/osism/services/roles/netbox/files/startup-scripts/270_tags.py) 2025-03-23 20:24:19.566555 | orchestrator | 2025-03-23 20:24:19.566671 | orchestrator | TASK [osism.services.netbox : Include service tasks] *************************** 2025-03-23 20:24:19.566706 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/services/roles/netbox/tasks/service.yml for testbed-manager 2025-03-23 20:24:20.323333 | orchestrator | 2025-03-23 20:24:20.323415 | orchestrator | TASK [osism.services.netbox : Copy netbox systemd unit file] ******************* 2025-03-23 20:24:20.323447 | orchestrator | changed: [testbed-manager] 2025-03-23 20:24:21.029413 | orchestrator | 2025-03-23 20:24:21.029517 | orchestrator | TASK [osism.services.netbox : Create traefik external network] ***************** 2025-03-23 20:24:21.029552 | orchestrator | ok: [testbed-manager] 2025-03-23 20:24:21.863799 | orchestrator | 2025-03-23 20:24:21.863914 | orchestrator | TASK [osism.services.netbox : Copy docker-compose.yml file] ******************** 2025-03-23 20:24:21.863950 | orchestrator | changed: [testbed-manager] 2025-03-23 20:24:26.477338 | orchestrator | 2025-03-23 20:24:26.477466 | orchestrator | TASK [osism.services.netbox : Pull container images] *************************** 2025-03-23 20:24:26.477501 | orchestrator | changed: [testbed-manager] 2025-03-23 20:24:27.501984 | orchestrator | 2025-03-23 20:24:27.502172 | orchestrator | TASK [osism.services.netbox : Stop and disable old service docker-compose@netbox] *** 2025-03-23 20:24:27.502211 | orchestrator | ok: [testbed-manager] 2025-03-23 20:24:49.738350 | orchestrator | 2025-03-23 20:24:49.738488 | orchestrator | TASK [osism.services.netbox : Manage netbox service] *************************** 2025-03-23 20:24:49.738528 | orchestrator | FAILED - RETRYING: [testbed-manager]: Manage netbox service (10 retries left). 
2025-03-23 20:24:49.824082 | orchestrator | ok: [testbed-manager] 2025-03-23 20:24:49.824174 | orchestrator | 2025-03-23 20:24:49.824194 | orchestrator | TASK [osism.services.netbox : Register that netbox service was started] ******** 2025-03-23 20:24:49.824226 | orchestrator | skipping: [testbed-manager] 2025-03-23 20:24:49.879109 | orchestrator | 2025-03-23 20:24:49.879144 | orchestrator | TASK [osism.services.netbox : Flush handlers] ********************************** 2025-03-23 20:24:49.879159 | orchestrator | 2025-03-23 20:24:49.879173 | orchestrator | RUNNING HANDLER [osism.services.traefik : Restart traefik service] ************* 2025-03-23 20:24:49.879195 | orchestrator | skipping: [testbed-manager] 2025-03-23 20:24:49.982310 | orchestrator | 2025-03-23 20:24:49.982434 | orchestrator | RUNNING HANDLER [osism.services.netbox : Restart netbox service] *************** 2025-03-23 20:24:49.982468 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/services/roles/netbox/tasks/restart-service.yml for testbed-manager 2025-03-23 20:24:50.934826 | orchestrator | 2025-03-23 20:24:50.934944 | orchestrator | RUNNING HANDLER [osism.services.netbox : Get infos on postgres container] ****** 2025-03-23 20:24:50.934980 | orchestrator | ok: [testbed-manager] 2025-03-23 20:24:51.025143 | orchestrator | 2025-03-23 20:24:51.025256 | orchestrator | RUNNING HANDLER [osism.services.netbox : Set postgres container version fact] *** 2025-03-23 20:24:51.025352 | orchestrator | ok: [testbed-manager] 2025-03-23 20:24:51.081359 | orchestrator | 2025-03-23 20:24:51.081394 | orchestrator | RUNNING HANDLER [osism.services.netbox : Print major version of postgres container] *** 2025-03-23 20:24:51.081417 | orchestrator | ok: [testbed-manager] => { 2025-03-23 20:24:51.884389 | orchestrator | "msg": "The major version of the running postgres container is 16" 2025-03-23 20:24:51.884504 | orchestrator | } 2025-03-23 20:24:51.884522 | orchestrator | 2025-03-23 20:24:51.884537 | orchestrator | RUNNING HANDLER [osism.services.netbox : Pull postgres image] ****************** 2025-03-23 20:24:51.884569 | orchestrator | ok: [testbed-manager] 2025-03-23 20:24:52.948912 | orchestrator | 2025-03-23 20:24:52.949026 | orchestrator | RUNNING HANDLER [osism.services.netbox : Get infos on postgres image] ********** 2025-03-23 20:24:52.949065 | orchestrator | ok: [testbed-manager] 2025-03-23 20:24:53.043177 | orchestrator | 2025-03-23 20:24:53.043321 | orchestrator | RUNNING HANDLER [osism.services.netbox : Set postgres image version fact] ****** 2025-03-23 20:24:53.043356 | orchestrator | ok: [testbed-manager] 2025-03-23 20:24:53.113161 | orchestrator | 2025-03-23 20:24:53.113219 | orchestrator | RUNNING HANDLER [osism.services.netbox : Print major version of postgres image] *** 2025-03-23 20:24:53.113248 | orchestrator | ok: [testbed-manager] => { 2025-03-23 20:24:53.187880 | orchestrator | "msg": "The major version of the postgres image is 16" 2025-03-23 20:24:53.187945 | orchestrator | } 2025-03-23 20:24:53.187961 | orchestrator | 2025-03-23 20:24:53.187975 | orchestrator | RUNNING HANDLER [osism.services.netbox : Stop netbox service] ****************** 2025-03-23 20:24:53.188010 | orchestrator | skipping: [testbed-manager] 2025-03-23 20:24:53.271100 | orchestrator | 2025-03-23 20:24:53.271135 | orchestrator | RUNNING HANDLER [osism.services.netbox : Wait for netbox service to stop] ****** 2025-03-23 20:24:53.271156 | orchestrator | skipping: [testbed-manager] 2025-03-23 20:24:53.342417 | 
orchestrator | 2025-03-23 20:24:53.342506 | orchestrator | RUNNING HANDLER [osism.services.netbox : Get infos on postgres volume] ********* 2025-03-23 20:24:53.342541 | orchestrator | skipping: [testbed-manager] 2025-03-23 20:24:53.426466 | orchestrator | 2025-03-23 20:24:53.426530 | orchestrator | RUNNING HANDLER [osism.services.netbox : Upgrade postgres database] ************ 2025-03-23 20:24:53.426553 | orchestrator | skipping: [testbed-manager] 2025-03-23 20:24:53.503396 | orchestrator | 2025-03-23 20:24:53.503495 | orchestrator | RUNNING HANDLER [osism.services.netbox : Remove netbox-pgautoupgrade container] *** 2025-03-23 20:24:53.503527 | orchestrator | skipping: [testbed-manager] 2025-03-23 20:24:53.628536 | orchestrator | 2025-03-23 20:24:53.628586 | orchestrator | RUNNING HANDLER [osism.services.netbox : Start netbox service] ***************** 2025-03-23 20:24:53.628612 | orchestrator | skipping: [testbed-manager] 2025-03-23 20:24:54.937031 | orchestrator | 2025-03-23 20:24:54.937144 | orchestrator | RUNNING HANDLER [osism.services.netbox : Restart netbox service] *************** 2025-03-23 20:24:54.937179 | orchestrator | changed: [testbed-manager] 2025-03-23 20:24:55.057525 | orchestrator | 2025-03-23 20:24:55.057601 | orchestrator | RUNNING HANDLER [osism.services.netbox : Register that netbox service was started] *** 2025-03-23 20:24:55.057635 | orchestrator | ok: [testbed-manager] 2025-03-23 20:25:55.127128 | orchestrator | 2025-03-23 20:25:55.127393 | orchestrator | RUNNING HANDLER [osism.services.netbox : Wait for netbox service to start] ***** 2025-03-23 20:25:55.127463 | orchestrator | Pausing for 60 seconds 2025-03-23 20:25:55.231863 | orchestrator | changed: [testbed-manager] 2025-03-23 20:25:55.231957 | orchestrator | 2025-03-23 20:25:55.231976 | orchestrator | RUNNING HANDLER [osism.services.netbox : Wait for an healthy netbox service] *** 2025-03-23 20:25:55.232007 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/services/roles/netbox/tasks/wait-for-healthy-service.yml for testbed-manager 2025-03-23 20:30:41.080526 | orchestrator | 2025-03-23 20:30:41.080661 | orchestrator | RUNNING HANDLER [osism.services.netbox : Check that all containers are in a good state] *** 2025-03-23 20:30:41.080698 | orchestrator | FAILED - RETRYING: [testbed-manager]: Check that all containers are in a good state (60 retries left). 2025-03-23 20:30:43.581372 | orchestrator | FAILED - RETRYING: [testbed-manager]: Check that all containers are in a good state (59 retries left). 2025-03-23 20:30:43.581514 | orchestrator | FAILED - RETRYING: [testbed-manager]: Check that all containers are in a good state (58 retries left). 2025-03-23 20:30:43.581546 | orchestrator | FAILED - RETRYING: [testbed-manager]: Check that all containers are in a good state (57 retries left). 2025-03-23 20:30:43.581564 | orchestrator | FAILED - RETRYING: [testbed-manager]: Check that all containers are in a good state (56 retries left). 2025-03-23 20:30:43.581579 | orchestrator | FAILED - RETRYING: [testbed-manager]: Check that all containers are in a good state (55 retries left). 2025-03-23 20:30:43.581594 | orchestrator | FAILED - RETRYING: [testbed-manager]: Check that all containers are in a good state (54 retries left). 2025-03-23 20:30:43.581608 | orchestrator | FAILED - RETRYING: [testbed-manager]: Check that all containers are in a good state (53 retries left). 
2025-03-23 20:30:43.581622 | orchestrator | FAILED - RETRYING: [testbed-manager]: Check that all containers are in a good state (52 retries left). 2025-03-23 20:30:43.581640 | orchestrator | FAILED - RETRYING: [testbed-manager]: Check that all containers are in a good state (51 retries left). 2025-03-23 20:30:43.581664 | orchestrator | FAILED - RETRYING: [testbed-manager]: Check that all containers are in a good state (50 retries left). 2025-03-23 20:30:43.581686 | orchestrator | FAILED - RETRYING: [testbed-manager]: Check that all containers are in a good state (49 retries left). 2025-03-23 20:30:43.581709 | orchestrator | FAILED - RETRYING: [testbed-manager]: Check that all containers are in a good state (48 retries left). 2025-03-23 20:30:43.581732 | orchestrator | FAILED - RETRYING: [testbed-manager]: Check that all containers are in a good state (47 retries left). 2025-03-23 20:30:43.581796 | orchestrator | FAILED - RETRYING: [testbed-manager]: Check that all containers are in a good state (46 retries left). 2025-03-23 20:30:43.581813 | orchestrator | FAILED - RETRYING: [testbed-manager]: Check that all containers are in a good state (45 retries left). 2025-03-23 20:30:43.581830 | orchestrator | FAILED - RETRYING: [testbed-manager]: Check that all containers are in a good state (44 retries left). 2025-03-23 20:30:43.581853 | orchestrator | FAILED - RETRYING: [testbed-manager]: Check that all containers are in a good state (43 retries left). 2025-03-23 20:30:43.581877 | orchestrator | FAILED - RETRYING: [testbed-manager]: Check that all containers are in a good state (42 retries left). 2025-03-23 20:30:43.581913 | orchestrator | FAILED - RETRYING: [testbed-manager]: Check that all containers are in a good state (41 retries left). 2025-03-23 20:30:43.581938 | orchestrator | FAILED - RETRYING: [testbed-manager]: Check that all containers are in a good state (40 retries left). 2025-03-23 20:30:43.581960 | orchestrator | FAILED - RETRYING: [testbed-manager]: Check that all containers are in a good state (39 retries left). 2025-03-23 20:30:43.581983 | orchestrator | FAILED - RETRYING: [testbed-manager]: Check that all containers are in a good state (38 retries left). 2025-03-23 20:30:43.582006 | orchestrator | FAILED - RETRYING: [testbed-manager]: Check that all containers are in a good state (37 retries left). 2025-03-23 20:30:43.582108 | orchestrator | FAILED - RETRYING: [testbed-manager]: Check that all containers are in a good state (36 retries left). 2025-03-23 20:30:43.582134 | orchestrator | FAILED - RETRYING: [testbed-manager]: Check that all containers are in a good state (35 retries left). 2025-03-23 20:30:43.582185 | orchestrator | FAILED - RETRYING: [testbed-manager]: Check that all containers are in a good state (34 retries left). 
2025-03-23 20:30:43.582210 | orchestrator | changed: [testbed-manager] 2025-03-23 20:30:43.582234 | orchestrator | 2025-03-23 20:30:43.582260 | orchestrator | PLAY [Deploy manager service] ************************************************** 2025-03-23 20:30:43.582283 | orchestrator | 2025-03-23 20:30:43.582306 | orchestrator | TASK [Gathering Facts] ********************************************************* 2025-03-23 20:30:43.582349 | orchestrator | ok: [testbed-manager] 2025-03-23 20:30:43.725336 | orchestrator | 2025-03-23 20:30:43.725464 | orchestrator | TASK [Apply manager role] ****************************************************** 2025-03-23 20:30:43.725502 | orchestrator | 2025-03-23 20:30:43.792099 | orchestrator | TASK [osism.services.manager : Include install tasks] ************************** 2025-03-23 20:30:43.792186 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/services/roles/manager/tasks/install-Debian-family.yml for testbed-manager 2025-03-23 20:30:45.881669 | orchestrator | 2025-03-23 20:30:45.881783 | orchestrator | TASK [osism.services.manager : Install required packages] ********************** 2025-03-23 20:30:45.881816 | orchestrator | ok: [testbed-manager] 2025-03-23 20:30:45.941270 | orchestrator | 2025-03-23 20:30:45.941320 | orchestrator | TASK [osism.services.manager : Gather variables for each operating system] ***** 2025-03-23 20:30:45.941347 | orchestrator | ok: [testbed-manager] 2025-03-23 20:30:46.075840 | orchestrator | 2025-03-23 20:30:46.075939 | orchestrator | TASK [osism.services.manager : Include config tasks] *************************** 2025-03-23 20:30:46.075973 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/services/roles/manager/tasks/config.yml for testbed-manager 2025-03-23 20:30:49.152904 | orchestrator | 2025-03-23 20:30:49.153038 | orchestrator | TASK [osism.services.manager : Create required directories] ******************** 2025-03-23 20:30:49.153075 | orchestrator | changed: [testbed-manager] => (item=/opt/ansible) 2025-03-23 20:30:49.865077 | orchestrator | changed: [testbed-manager] => (item=/opt/archive) 2025-03-23 20:30:49.865234 | orchestrator | changed: [testbed-manager] => (item=/opt/manager/configuration) 2025-03-23 20:30:49.865255 | orchestrator | changed: [testbed-manager] => (item=/opt/manager/data) 2025-03-23 20:30:49.865271 | orchestrator | ok: [testbed-manager] => (item=/opt/manager) 2025-03-23 20:30:49.865286 | orchestrator | changed: [testbed-manager] => (item=/opt/manager/secrets) 2025-03-23 20:30:49.865302 | orchestrator | changed: [testbed-manager] => (item=/opt/ansible/secrets) 2025-03-23 20:30:49.865317 | orchestrator | changed: [testbed-manager] => (item=/opt/state) 2025-03-23 20:30:49.865362 | orchestrator | 2025-03-23 20:30:49.865379 | orchestrator | TASK [osism.services.manager : Copy client environment file] ******************* 2025-03-23 20:30:49.865412 | orchestrator | changed: [testbed-manager] 2025-03-23 20:30:49.976386 | orchestrator | 2025-03-23 20:30:49.976444 | orchestrator | TASK [osism.services.manager : Include ara config tasks] *********************** 2025-03-23 20:30:49.976473 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/services/roles/manager/tasks/config-ara.yml for testbed-manager 2025-03-23 20:30:51.296073 | orchestrator | 2025-03-23 20:30:51.296241 | orchestrator | TASK [osism.services.manager : Copy ARA environment files] ********************* 2025-03-23 20:30:51.296280 | orchestrator | 
changed: [testbed-manager] => (item=ara) 2025-03-23 20:30:51.975244 | orchestrator | changed: [testbed-manager] => (item=ara-server) 2025-03-23 20:30:51.975358 | orchestrator | 2025-03-23 20:30:51.975376 | orchestrator | TASK [osism.services.manager : Copy MariaDB environment file] ****************** 2025-03-23 20:30:51.975407 | orchestrator | changed: [testbed-manager] 2025-03-23 20:30:52.054492 | orchestrator | 2025-03-23 20:30:52.054598 | orchestrator | TASK [osism.services.manager : Include vault config tasks] ********************* 2025-03-23 20:30:52.054636 | orchestrator | skipping: [testbed-manager] 2025-03-23 20:30:52.141429 | orchestrator | 2025-03-23 20:30:52.141494 | orchestrator | TASK [osism.services.manager : Include ansible config tasks] ******************* 2025-03-23 20:30:52.141522 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/services/roles/manager/tasks/config-ansible.yml for testbed-manager 2025-03-23 20:30:53.805656 | orchestrator | 2025-03-23 20:30:53.805754 | orchestrator | TASK [osism.services.manager : Copy private ssh keys] ************************** 2025-03-23 20:30:53.805777 | orchestrator | changed: [testbed-manager] => (item=None) 2025-03-23 20:30:54.489890 | orchestrator | changed: [testbed-manager] => (item=None) 2025-03-23 20:30:54.490070 | orchestrator | changed: [testbed-manager] 2025-03-23 20:30:54.490770 | orchestrator | 2025-03-23 20:30:54.490796 | orchestrator | TASK [osism.services.manager : Copy ansible environment file] ****************** 2025-03-23 20:30:54.490826 | orchestrator | changed: [testbed-manager] 2025-03-23 20:30:54.590793 | orchestrator | 2025-03-23 20:30:54.590850 | orchestrator | TASK [osism.services.manager : Include netbox config tasks] ******************** 2025-03-23 20:30:54.590876 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/services/roles/manager/tasks/config-netbox.yml for testbed-manager 2025-03-23 20:30:55.286478 | orchestrator | 2025-03-23 20:30:55.286575 | orchestrator | TASK [osism.services.manager : Copy secret files] ****************************** 2025-03-23 20:30:55.286605 | orchestrator | changed: [testbed-manager] => (item=None) 2025-03-23 20:30:55.979281 | orchestrator | changed: [testbed-manager] 2025-03-23 20:30:55.979385 | orchestrator | 2025-03-23 20:30:55.979401 | orchestrator | TASK [osism.services.manager : Copy netbox environment file] ******************* 2025-03-23 20:30:55.979429 | orchestrator | changed: [testbed-manager] 2025-03-23 20:30:56.105213 | orchestrator | 2025-03-23 20:30:56.105292 | orchestrator | TASK [osism.services.manager : Include celery config tasks] ******************** 2025-03-23 20:30:56.105322 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/services/roles/manager/tasks/config-celery.yml for testbed-manager 2025-03-23 20:30:56.774196 | orchestrator | 2025-03-23 20:30:56.774301 | orchestrator | TASK [osism.services.manager : Set fs.inotify.max_user_watches] **************** 2025-03-23 20:30:56.774329 | orchestrator | changed: [testbed-manager] 2025-03-23 20:30:57.193599 | orchestrator | 2025-03-23 20:30:57.193716 | orchestrator | TASK [osism.services.manager : Set fs.inotify.max_user_instances] ************** 2025-03-23 20:30:57.193762 | orchestrator | changed: [testbed-manager] 2025-03-23 20:30:58.569355 | orchestrator | 2025-03-23 20:30:58.569470 | orchestrator | TASK [osism.services.manager : Copy celery environment files] ****************** 2025-03-23 20:30:58.569502 | 
orchestrator | changed: [testbed-manager] => (item=conductor) 2025-03-23 20:30:59.281255 | orchestrator | changed: [testbed-manager] => (item=openstack) 2025-03-23 20:30:59.281347 | orchestrator | 2025-03-23 20:30:59.281366 | orchestrator | TASK [osism.services.manager : Copy listener environment file] ***************** 2025-03-23 20:30:59.281396 | orchestrator | changed: [testbed-manager] 2025-03-23 20:30:59.625740 | orchestrator | 2025-03-23 20:30:59.625822 | orchestrator | TASK [osism.services.manager : Check for conductor.yml] ************************ 2025-03-23 20:30:59.625881 | orchestrator | ok: [testbed-manager] 2025-03-23 20:30:59.739881 | orchestrator | 2025-03-23 20:30:59.739938 | orchestrator | TASK [osism.services.manager : Copy conductor configuration file] ************** 2025-03-23 20:30:59.739965 | orchestrator | skipping: [testbed-manager] 2025-03-23 20:31:00.415712 | orchestrator | 2025-03-23 20:31:00.415786 | orchestrator | TASK [osism.services.manager : Copy empty conductor configuration file] ******** 2025-03-23 20:31:00.415804 | orchestrator | changed: [testbed-manager] 2025-03-23 20:31:00.508553 | orchestrator | 2025-03-23 20:31:00.508576 | orchestrator | TASK [osism.services.manager : Include wrapper config tasks] ******************* 2025-03-23 20:31:00.508587 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/services/roles/manager/tasks/config-wrapper.yml for testbed-manager 2025-03-23 20:31:00.570134 | orchestrator | 2025-03-23 20:31:00.570171 | orchestrator | TASK [osism.services.manager : Include wrapper vars file] ********************** 2025-03-23 20:31:00.570182 | orchestrator | ok: [testbed-manager] 2025-03-23 20:31:02.768929 | orchestrator | 2025-03-23 20:31:02.769044 | orchestrator | TASK [osism.services.manager : Copy wrapper scripts] *************************** 2025-03-23 20:31:02.769078 | orchestrator | changed: [testbed-manager] => (item=osism) 2025-03-23 20:31:03.550605 | orchestrator | changed: [testbed-manager] => (item=osism-update-docker) 2025-03-23 20:31:03.550688 | orchestrator | changed: [testbed-manager] => (item=osism-update-manager) 2025-03-23 20:31:03.550702 | orchestrator | 2025-03-23 20:31:03.550715 | orchestrator | TASK [osism.services.manager : Copy cilium wrapper script] ********************* 2025-03-23 20:31:03.550743 | orchestrator | changed: [testbed-manager] 2025-03-23 20:31:03.660475 | orchestrator | 2025-03-23 20:31:03.660528 | orchestrator | TASK [osism.services.manager : Include scripts config tasks] ******************* 2025-03-23 20:31:03.660554 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/services/roles/manager/tasks/config-scripts.yml for testbed-manager 2025-03-23 20:31:03.705045 | orchestrator | 2025-03-23 20:31:03.705076 | orchestrator | TASK [osism.services.manager : Include scripts vars file] ********************** 2025-03-23 20:31:03.705098 | orchestrator | ok: [testbed-manager] 2025-03-23 20:31:04.499329 | orchestrator | 2025-03-23 20:31:04.499425 | orchestrator | TASK [osism.services.manager : Copy scripts] *********************************** 2025-03-23 20:31:04.499459 | orchestrator | changed: [testbed-manager] => (item=osism-include) 2025-03-23 20:31:04.593066 | orchestrator | 2025-03-23 20:31:04.593166 | orchestrator | TASK [osism.services.manager : Include service tasks] ************************** 2025-03-23 20:31:04.593198 | orchestrator | included: 
/usr/share/ansible/collections/ansible_collections/osism/services/roles/manager/tasks/service.yml for testbed-manager 2025-03-23 20:31:05.351814 | orchestrator | 2025-03-23 20:31:05.351919 | orchestrator | TASK [osism.services.manager : Copy manager systemd unit file] ***************** 2025-03-23 20:31:05.351954 | orchestrator | changed: [testbed-manager] 2025-03-23 20:31:06.055860 | orchestrator | 2025-03-23 20:31:06.055976 | orchestrator | TASK [osism.services.manager : Create traefik external network] **************** 2025-03-23 20:31:06.056012 | orchestrator | ok: [testbed-manager] 2025-03-23 20:31:06.113222 | orchestrator | 2025-03-23 20:31:06.113255 | orchestrator | TASK [osism.services.manager : Set mariadb healthcheck for mariadb < 11.0.0] *** 2025-03-23 20:31:06.113279 | orchestrator | skipping: [testbed-manager] 2025-03-23 20:31:06.173576 | orchestrator | 2025-03-23 20:31:06.173600 | orchestrator | TASK [osism.services.manager : Set mariadb healthcheck for mariadb >= 11.0.0] *** 2025-03-23 20:31:06.173621 | orchestrator | ok: [testbed-manager] 2025-03-23 20:31:07.095121 | orchestrator | 2025-03-23 20:31:07.095269 | orchestrator | TASK [osism.services.manager : Copy docker-compose.yml file] ******************* 2025-03-23 20:31:07.095302 | orchestrator | changed: [testbed-manager] 2025-03-23 20:31:30.886104 | orchestrator | 2025-03-23 20:31:30.886271 | orchestrator | TASK [osism.services.manager : Pull container images] ************************** 2025-03-23 20:31:30.886310 | orchestrator | changed: [testbed-manager] 2025-03-23 20:31:31.640002 | orchestrator | 2025-03-23 20:31:31.640108 | orchestrator | TASK [osism.services.manager : Stop and disable old service docker-compose@manager] *** 2025-03-23 20:31:31.640163 | orchestrator | ok: [testbed-manager] 2025-03-23 20:31:34.413206 | orchestrator | 2025-03-23 20:31:34.413333 | orchestrator | TASK [osism.services.manager : Manage manager service] ************************* 2025-03-23 20:31:34.413371 | orchestrator | changed: [testbed-manager] 2025-03-23 20:31:34.491303 | orchestrator | 2025-03-23 20:31:34.491359 | orchestrator | TASK [osism.services.manager : Register that manager service was started] ****** 2025-03-23 20:31:34.491386 | orchestrator | ok: [testbed-manager] 2025-03-23 20:31:34.559583 | orchestrator | 2025-03-23 20:31:34.559648 | orchestrator | TASK [osism.services.manager : Flush handlers] ********************************* 2025-03-23 20:31:34.559663 | orchestrator | 2025-03-23 20:31:34.559678 | orchestrator | RUNNING HANDLER [osism.services.manager : Restart manager service] ************* 2025-03-23 20:31:34.559703 | orchestrator | skipping: [testbed-manager] 2025-03-23 20:32:34.610938 | orchestrator | 2025-03-23 20:32:34.611110 | orchestrator | RUNNING HANDLER [osism.services.manager : Wait for manager service to start] *** 2025-03-23 20:32:34.611180 | orchestrator | Pausing for 60 seconds 2025-03-23 20:32:41.878861 | orchestrator | changed: [testbed-manager] 2025-03-23 20:32:41.879020 | orchestrator | 2025-03-23 20:32:41.879045 | orchestrator | RUNNING HANDLER [osism.services.manager : Ensure that all containers are up] *** 2025-03-23 20:32:41.879080 | orchestrator | changed: [testbed-manager] 2025-03-23 20:33:24.163913 | orchestrator | 2025-03-23 20:33:24.164046 | orchestrator | RUNNING HANDLER [osism.services.manager : Wait for an healthy manager service] *** 2025-03-23 20:33:24.164084 | orchestrator | FAILED - RETRYING: [testbed-manager]: Wait for an healthy manager service (50 retries left). 
2025-03-23 20:33:31.135952 | orchestrator | FAILED - RETRYING: [testbed-manager]: Wait for an healthy manager service (49 retries left). 2025-03-23 20:33:31.136082 | orchestrator | changed: [testbed-manager] 2025-03-23 20:33:31.136124 | orchestrator | 2025-03-23 20:33:31.136142 | orchestrator | RUNNING HANDLER [osism.services.manager : Copy osismclient bash completion script] *** 2025-03-23 20:33:31.136173 | orchestrator | changed: [testbed-manager] 2025-03-23 20:33:31.235402 | orchestrator | 2025-03-23 20:33:31.235459 | orchestrator | TASK [osism.services.manager : Include initialize tasks] *********************** 2025-03-23 20:33:31.235489 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/services/roles/manager/tasks/initialize.yml for testbed-manager 2025-03-23 20:33:31.313323 | orchestrator | 2025-03-23 20:33:31.313365 | orchestrator | TASK [osism.services.manager : Flush handlers] ********************************* 2025-03-23 20:33:31.313380 | orchestrator | 2025-03-23 20:33:31.313407 | orchestrator | TASK [osism.services.manager : Include vault initialize tasks] ***************** 2025-03-23 20:33:31.313431 | orchestrator | skipping: [testbed-manager] 2025-03-23 20:33:31.499388 | orchestrator | 2025-03-23 20:33:31.499440 | orchestrator | PLAY RECAP ********************************************************************* 2025-03-23 20:33:31.499456 | orchestrator | testbed-manager : ok=103 changed=55 unreachable=0 failed=0 skipped=18 rescued=0 ignored=0 2025-03-23 20:33:31.499471 | orchestrator | 2025-03-23 20:33:31.499495 | orchestrator | + [[ -e /opt/venv/bin/activate ]] 2025-03-23 20:33:31.508531 | orchestrator | + deactivate 2025-03-23 20:33:31.508558 | orchestrator | + '[' -n /usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin ']' 2025-03-23 20:33:31.508574 | orchestrator | + PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin 2025-03-23 20:33:31.508589 | orchestrator | + export PATH 2025-03-23 20:33:31.508603 | orchestrator | + unset _OLD_VIRTUAL_PATH 2025-03-23 20:33:31.508617 | orchestrator | + '[' -n '' ']' 2025-03-23 20:33:31.508630 | orchestrator | + hash -r 2025-03-23 20:33:31.508644 | orchestrator | + '[' -n '' ']' 2025-03-23 20:33:31.508658 | orchestrator | + unset VIRTUAL_ENV 2025-03-23 20:33:31.508672 | orchestrator | + unset VIRTUAL_ENV_PROMPT 2025-03-23 20:33:31.508686 | orchestrator | + '[' '!' 
'' = nondestructive ']' 2025-03-23 20:33:31.508700 | orchestrator | + unset -f deactivate 2025-03-23 20:33:31.508714 | orchestrator | + cp /home/dragon/.ssh/id_rsa.pub /opt/ansible/secrets/id_rsa.operator.pub 2025-03-23 20:33:31.508735 | orchestrator | + [[ ceph-ansible == \c\e\p\h\-\a\n\s\i\b\l\e ]] 2025-03-23 20:33:31.509731 | orchestrator | + wait_for_container_healthy 60 ceph-ansible 2025-03-23 20:33:31.509754 | orchestrator | + local max_attempts=60 2025-03-23 20:33:31.509769 | orchestrator | + local name=ceph-ansible 2025-03-23 20:33:31.509783 | orchestrator | + local attempt_num=1 2025-03-23 20:33:31.509802 | orchestrator | ++ /usr/bin/docker inspect -f '{{.State.Health.Status}}' ceph-ansible 2025-03-23 20:33:31.546268 | orchestrator | + [[ healthy == \h\e\a\l\t\h\y ]] 2025-03-23 20:33:31.546989 | orchestrator | + wait_for_container_healthy 60 kolla-ansible 2025-03-23 20:33:31.547013 | orchestrator | + local max_attempts=60 2025-03-23 20:33:31.547062 | orchestrator | + local name=kolla-ansible 2025-03-23 20:33:31.547088 | orchestrator | + local attempt_num=1 2025-03-23 20:33:31.547133 | orchestrator | ++ /usr/bin/docker inspect -f '{{.State.Health.Status}}' kolla-ansible 2025-03-23 20:33:31.580993 | orchestrator | + [[ healthy == \h\e\a\l\t\h\y ]] 2025-03-23 20:33:31.581598 | orchestrator | + wait_for_container_healthy 60 osism-ansible 2025-03-23 20:33:31.581621 | orchestrator | + local max_attempts=60 2025-03-23 20:33:31.581637 | orchestrator | + local name=osism-ansible 2025-03-23 20:33:31.581651 | orchestrator | + local attempt_num=1 2025-03-23 20:33:31.581669 | orchestrator | ++ /usr/bin/docker inspect -f '{{.State.Health.Status}}' osism-ansible 2025-03-23 20:33:31.607322 | orchestrator | + [[ healthy == \h\e\a\l\t\h\y ]] 2025-03-23 20:33:32.945678 | orchestrator | + [[ true == \t\r\u\e ]] 2025-03-23 20:33:32.945801 | orchestrator | + sh -c /opt/configuration/scripts/disable-ara.sh 2025-03-23 20:33:32.945840 | orchestrator | ++ semver 8.1.0 9.0.0 2025-03-23 20:33:33.008035 | orchestrator | + [[ -1 -ge 0 ]] 2025-03-23 20:33:33.317084 | orchestrator | + [[ 8.1.0 == \l\a\t\e\s\t ]] 2025-03-23 20:33:33.317213 | orchestrator | + docker compose --project-directory /opt/manager ps 2025-03-23 20:33:33.317248 | orchestrator | NAME IMAGE COMMAND SERVICE CREATED STATUS PORTS 2025-03-23 20:33:33.322775 | orchestrator | ceph-ansible registry.osism.tech/osism/ceph-ansible:8.1.0 "/entrypoint.sh osis…" ceph-ansible About a minute ago Up About a minute (healthy) 2025-03-23 20:33:33.322812 | orchestrator | kolla-ansible registry.osism.tech/osism/kolla-ansible:8.1.0 "/entrypoint.sh osis…" kolla-ansible About a minute ago Up About a minute (healthy) 2025-03-23 20:33:33.322827 | orchestrator | manager-api-1 registry.osism.tech/osism/osism:0.20241219.2 "/usr/bin/tini -- os…" api About a minute ago Up About a minute (healthy) 192.168.16.5:8000->8000/tcp 2025-03-23 20:33:33.322860 | orchestrator | manager-ara-server-1 registry.osism.tech/osism/ara-server:1.7.2 "sh -c '/wait && /ru…" ara-server About a minute ago Up About a minute (healthy) 8000/tcp 2025-03-23 20:33:33.322876 | orchestrator | manager-beat-1 registry.osism.tech/osism/osism:0.20241219.2 "/usr/bin/tini -- os…" beat About a minute ago Up About a minute (healthy) 2025-03-23 20:33:33.322895 | orchestrator | manager-conductor-1 registry.osism.tech/osism/osism:0.20241219.2 "/usr/bin/tini -- os…" conductor About a minute ago Up About a minute (healthy) 2025-03-23 20:33:33.322909 | orchestrator | manager-flower-1 
registry.osism.tech/osism/osism:0.20241219.2 "/usr/bin/tini -- os…" flower About a minute ago Up About a minute (healthy) 2025-03-23 20:33:33.322923 | orchestrator | manager-inventory_reconciler-1 registry.osism.tech/osism/inventory-reconciler:8.1.0 "/sbin/tini -- /entr…" inventory_reconciler About a minute ago Up 51 seconds (healthy) 2025-03-23 20:33:33.322937 | orchestrator | manager-listener-1 registry.osism.tech/osism/osism:0.20241219.2 "/usr/bin/tini -- os…" listener About a minute ago Up About a minute (healthy) 2025-03-23 20:33:33.322951 | orchestrator | manager-mariadb-1 index.docker.io/library/mariadb:11.6.2 "docker-entrypoint.s…" mariadb About a minute ago Up About a minute (healthy) 3306/tcp 2025-03-23 20:33:33.322964 | orchestrator | manager-netbox-1 registry.osism.tech/osism/osism:0.20241219.2 "/usr/bin/tini -- os…" netbox About a minute ago Up About a minute (healthy) 2025-03-23 20:33:33.322978 | orchestrator | manager-openstack-1 registry.osism.tech/osism/osism:0.20241219.2 "/usr/bin/tini -- os…" openstack About a minute ago Up About a minute (healthy) 2025-03-23 20:33:33.322992 | orchestrator | manager-redis-1 index.docker.io/library/redis:7.4.1-alpine "docker-entrypoint.s…" redis About a minute ago Up About a minute (healthy) 6379/tcp 2025-03-23 20:33:33.323031 | orchestrator | manager-watchdog-1 registry.osism.tech/osism/osism:0.20241219.2 "/usr/bin/tini -- os…" watchdog About a minute ago Up About a minute (healthy) 2025-03-23 20:33:33.323046 | orchestrator | osism-ansible registry.osism.tech/osism/osism-ansible:8.1.0 "/entrypoint.sh osis…" osism-ansible About a minute ago Up About a minute (healthy) 2025-03-23 20:33:33.323059 | orchestrator | osism-kubernetes registry.osism.tech/osism/osism-kubernetes:8.1.0 "/entrypoint.sh osis…" osism-kubernetes About a minute ago Up About a minute (healthy) 2025-03-23 20:33:33.323073 | orchestrator | osismclient registry.osism.tech/osism/osism:0.20241219.2 "/usr/bin/tini -- sl…" osismclient About a minute ago Up About a minute (healthy) 2025-03-23 20:33:33.323095 | orchestrator | + docker compose --project-directory /opt/netbox ps 2025-03-23 20:33:33.549324 | orchestrator | NAME IMAGE COMMAND SERVICE CREATED STATUS PORTS 2025-03-23 20:33:33.555608 | orchestrator | netbox-netbox-1 registry.osism.tech/osism/netbox:v4.1.7 "/usr/bin/tini -- /o…" netbox 9 minutes ago Up 8 minutes (healthy) 2025-03-23 20:33:33.555674 | orchestrator | netbox-netbox-worker-1 registry.osism.tech/osism/netbox:v4.1.7 "/opt/netbox/venv/bi…" netbox-worker 9 minutes ago Up 3 minutes (healthy) 2025-03-23 20:33:33.555700 | orchestrator | netbox-postgres-1 index.docker.io/library/postgres:16.6-alpine "docker-entrypoint.s…" postgres 9 minutes ago Up 8 minutes (healthy) 5432/tcp 2025-03-23 20:33:33.555723 | orchestrator | netbox-redis-1 index.docker.io/library/redis:7.4.2-alpine "docker-entrypoint.s…" redis 9 minutes ago Up 8 minutes (healthy) 6379/tcp 2025-03-23 20:33:33.555748 | orchestrator | ++ semver 8.1.0 7.0.0 2025-03-23 20:33:33.615962 | orchestrator | + [[ 1 -ge 0 ]] 2025-03-23 20:33:33.620883 | orchestrator | + sed -i s/community.general.yaml/osism.commons.still_alive/ /opt/configuration/environments/ansible.cfg 2025-03-23 20:33:33.620948 | orchestrator | + osism apply resolvconf -l testbed-manager 2025-03-23 20:33:35.494749 | orchestrator | 2025-03-23 20:33:35 | INFO  | Task a894930b-0b2e-4a1c-8714-3cdbd4dfec71 (resolvconf) was prepared for execution. 
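Annotation: the trace above calls a wait_for_container_healthy helper for ceph-ansible, kolla-ansible and osism-ansible; from the traced variables (max_attempts, name, attempt_num) and the docker inspect call, it loops until the container reports a healthy status. A plausible reconstruction, assuming a simple sleep between attempts (the actual script in /opt/configuration may differ):

    wait_for_container_healthy() {
        local max_attempts=$1
        local name=$2
        local attempt_num=1
        # poll the container's health status until Docker reports "healthy"
        until [[ "$(/usr/bin/docker inspect -f '{{.State.Health.Status}}' "$name")" == "healthy" ]]; do
            if (( attempt_num++ >= max_attempts )); then
                echo "container $name did not become healthy in time" >&2
                return 1
            fi
            sleep 5   # assumed back-off interval, not shown in the trace
        done
    }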
2025-03-23 20:33:38.941349 | orchestrator | 2025-03-23 20:33:35 | INFO  | It takes a moment until task a894930b-0b2e-4a1c-8714-3cdbd4dfec71 (resolvconf) has been started and output is visible here. 2025-03-23 20:33:38.941502 | orchestrator | 2025-03-23 20:33:38.945991 | orchestrator | PLAY [Apply role resolvconf] *************************************************** 2025-03-23 20:33:38.946134 | orchestrator | 2025-03-23 20:33:38.948272 | orchestrator | TASK [Gathering Facts] ********************************************************* 2025-03-23 20:33:38.948353 | orchestrator | Sunday 23 March 2025 20:33:38 +0000 (0:00:00.101) 0:00:00.101 ********** 2025-03-23 20:33:43.790963 | orchestrator | ok: [testbed-manager] 2025-03-23 20:33:43.846195 | orchestrator | 2025-03-23 20:33:43.846245 | orchestrator | TASK [osism.commons.resolvconf : Check minimum and maximum number of name servers] *** 2025-03-23 20:33:43.846263 | orchestrator | Sunday 23 March 2025 20:33:43 +0000 (0:00:04.854) 0:00:04.956 ********** 2025-03-23 20:33:43.846287 | orchestrator | skipping: [testbed-manager] 2025-03-23 20:33:43.846964 | orchestrator | 2025-03-23 20:33:43.847392 | orchestrator | TASK [osism.commons.resolvconf : Include resolvconf tasks] ********************* 2025-03-23 20:33:43.848169 | orchestrator | Sunday 23 March 2025 20:33:43 +0000 (0:00:00.056) 0:00:05.013 ********** 2025-03-23 20:33:43.962469 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/resolvconf/tasks/configure-resolv.yml for testbed-manager 2025-03-23 20:33:43.962878 | orchestrator | 2025-03-23 20:33:43.964924 | orchestrator | TASK [osism.commons.resolvconf : Include distribution specific installation tasks] *** 2025-03-23 20:33:43.965809 | orchestrator | Sunday 23 March 2025 20:33:43 +0000 (0:00:00.116) 0:00:05.129 ********** 2025-03-23 20:33:44.046559 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/resolvconf/tasks/install-Debian-family.yml for testbed-manager 2025-03-23 20:33:44.047350 | orchestrator | 2025-03-23 20:33:44.047391 | orchestrator | TASK [osism.commons.resolvconf : Remove packages configuring /etc/resolv.conf] *** 2025-03-23 20:33:44.047847 | orchestrator | Sunday 23 March 2025 20:33:44 +0000 (0:00:00.083) 0:00:05.213 ********** 2025-03-23 20:33:45.367514 | orchestrator | ok: [testbed-manager] 2025-03-23 20:33:45.367787 | orchestrator | 2025-03-23 20:33:45.368575 | orchestrator | TASK [osism.commons.resolvconf : Install package systemd-resolved] ************* 2025-03-23 20:33:45.369522 | orchestrator | Sunday 23 March 2025 20:33:45 +0000 (0:00:01.318) 0:00:06.531 ********** 2025-03-23 20:33:45.432136 | orchestrator | skipping: [testbed-manager] 2025-03-23 20:33:45.432331 | orchestrator | 2025-03-23 20:33:45.432886 | orchestrator | TASK [osism.commons.resolvconf : Retrieve file status of /etc/resolv.conf] ***** 2025-03-23 20:33:45.433741 | orchestrator | Sunday 23 March 2025 20:33:45 +0000 (0:00:00.068) 0:00:06.600 ********** 2025-03-23 20:33:45.899022 | orchestrator | ok: [testbed-manager] 2025-03-23 20:33:45.900187 | orchestrator | 2025-03-23 20:33:45.900946 | orchestrator | TASK [osism.commons.resolvconf : Archive existing file /etc/resolv.conf] ******* 2025-03-23 20:33:45.901779 | orchestrator | Sunday 23 March 2025 20:33:45 +0000 (0:00:00.466) 0:00:07.066 ********** 2025-03-23 20:33:45.985834 | orchestrator | skipping: [testbed-manager] 2025-03-23 20:33:45.986853 | orchestrator | 2025-03-23 20:33:45.987456 | orchestrator | TASK 
[osism.commons.resolvconf : Link /run/systemd/resolve/stub-resolv.conf to /etc/resolv.conf] *** 2025-03-23 20:33:45.987913 | orchestrator | Sunday 23 March 2025 20:33:45 +0000 (0:00:00.086) 0:00:07.153 ********** 2025-03-23 20:33:46.515789 | orchestrator | changed: [testbed-manager] 2025-03-23 20:33:46.516001 | orchestrator | 2025-03-23 20:33:46.516027 | orchestrator | TASK [osism.commons.resolvconf : Copy configuration files] ********************* 2025-03-23 20:33:46.516048 | orchestrator | Sunday 23 March 2025 20:33:46 +0000 (0:00:00.530) 0:00:07.683 ********** 2025-03-23 20:33:47.540578 | orchestrator | changed: [testbed-manager] 2025-03-23 20:33:47.541321 | orchestrator | 2025-03-23 20:33:47.542305 | orchestrator | TASK [osism.commons.resolvconf : Start/enable systemd-resolved service] ******** 2025-03-23 20:33:47.543218 | orchestrator | Sunday 23 March 2025 20:33:47 +0000 (0:00:01.023) 0:00:08.706 ********** 2025-03-23 20:33:48.491131 | orchestrator | ok: [testbed-manager] 2025-03-23 20:33:48.491299 | orchestrator | 2025-03-23 20:33:48.491318 | orchestrator | TASK [osism.commons.resolvconf : Include distribution specific configuration tasks] *** 2025-03-23 20:33:48.491335 | orchestrator | Sunday 23 March 2025 20:33:48 +0000 (0:00:00.948) 0:00:09.655 ********** 2025-03-23 20:33:48.578563 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/resolvconf/tasks/configure-Debian-family.yml for testbed-manager 2025-03-23 20:33:48.580254 | orchestrator | 2025-03-23 20:33:48.580379 | orchestrator | TASK [osism.commons.resolvconf : Restart systemd-resolved service] ************* 2025-03-23 20:33:48.580746 | orchestrator | Sunday 23 March 2025 20:33:48 +0000 (0:00:00.090) 0:00:09.745 ********** 2025-03-23 20:33:49.846513 | orchestrator | changed: [testbed-manager] 2025-03-23 20:33:49.847029 | orchestrator | 2025-03-23 20:33:49.850418 | orchestrator | PLAY RECAP ********************************************************************* 2025-03-23 20:33:49.851613 | orchestrator | 2025-03-23 20:33:49 | INFO  | Play has been completed. There may now be a delay until all logs have been written. 2025-03-23 20:33:49.851641 | orchestrator | 2025-03-23 20:33:49 | INFO  | Please wait and do not abort execution. 
2025-03-23 20:33:49.851662 | orchestrator | testbed-manager : ok=10  changed=3  unreachable=0 failed=0 skipped=3  rescued=0 ignored=0 2025-03-23 20:33:49.853514 | orchestrator | 2025-03-23 20:33:49.853722 | orchestrator | Sunday 23 March 2025 20:33:49 +0000 (0:00:01.266) 0:00:11.012 ********** 2025-03-23 20:33:49.854773 | orchestrator | =============================================================================== 2025-03-23 20:33:49.855775 | orchestrator | Gathering Facts --------------------------------------------------------- 4.86s 2025-03-23 20:33:49.856099 | orchestrator | osism.commons.resolvconf : Remove packages configuring /etc/resolv.conf --- 1.32s 2025-03-23 20:33:49.856673 | orchestrator | osism.commons.resolvconf : Restart systemd-resolved service ------------- 1.27s 2025-03-23 20:33:49.857355 | orchestrator | osism.commons.resolvconf : Copy configuration files --------------------- 1.02s 2025-03-23 20:33:49.857645 | orchestrator | osism.commons.resolvconf : Start/enable systemd-resolved service -------- 0.95s 2025-03-23 20:33:49.859729 | orchestrator | osism.commons.resolvconf : Link /run/systemd/resolve/stub-resolv.conf to /etc/resolv.conf --- 0.53s 2025-03-23 20:33:49.860328 | orchestrator | osism.commons.resolvconf : Retrieve file status of /etc/resolv.conf ----- 0.47s 2025-03-23 20:33:49.861039 | orchestrator | osism.commons.resolvconf : Include resolvconf tasks --------------------- 0.12s 2025-03-23 20:33:49.861246 | orchestrator | osism.commons.resolvconf : Include distribution specific configuration tasks --- 0.09s 2025-03-23 20:33:49.862157 | orchestrator | osism.commons.resolvconf : Archive existing file /etc/resolv.conf ------- 0.09s 2025-03-23 20:33:49.863437 | orchestrator | osism.commons.resolvconf : Include distribution specific installation tasks --- 0.08s 2025-03-23 20:33:49.864646 | orchestrator | osism.commons.resolvconf : Install package systemd-resolved ------------- 0.07s 2025-03-23 20:33:49.865426 | orchestrator | osism.commons.resolvconf : Check minimum and maximum number of name servers --- 0.06s 2025-03-23 20:33:50.292336 | orchestrator | + osism apply sshconfig 2025-03-23 20:33:51.880500 | orchestrator | 2025-03-23 20:33:51 | INFO  | Task 788cc59d-7cda-48e8-9e2d-dde8702be06e (sshconfig) was prepared for execution. 2025-03-23 20:33:55.090228 | orchestrator | 2025-03-23 20:33:51 | INFO  | It takes a moment until task 788cc59d-7cda-48e8-9e2d-dde8702be06e (sshconfig) has been started and output is visible here. 
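Aside: osism apply sshconfig (whose output follows) and the subsequent osism apply known-hosts prepare password-less SSH from the manager to all nodes. A rough bash equivalent of what the two roles automate, with paths and host names taken from the task output; everything else here is an assumption, not the roles' actual implementation:

    # sshconfig: one fragment per host under ~/.ssh/config.d, then assembled.
    mkdir -p ~/.ssh/config.d
    for host in testbed-manager testbed-node-{0..5}; do
        printf 'Host %s\n' "$host" > ~/.ssh/config.d/"$host"
    done
    cat ~/.ssh/config.d/* > ~/.ssh/config

    # known-hosts: scan every host by name and by its ansible_host address
    # and append the collected keys to ~/.ssh/known_hosts.
    for host in testbed-manager testbed-node-{0..5}; do
        ssh-keyscan -t rsa,ecdsa,ed25519 "$host" >> ~/.ssh/known_hosts
    done
    # The role also fixes file permissions at the end (exact mode not shown).
    chmod go-w ~/.ssh/known_hosts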
2025-03-23 20:33:55.090382 | orchestrator | 2025-03-23 20:33:55.092094 | orchestrator | PLAY [Apply role sshconfig] **************************************************** 2025-03-23 20:33:55.092147 | orchestrator | 2025-03-23 20:33:55.092169 | orchestrator | TASK [osism.commons.sshconfig : Get home directory of operator user] *********** 2025-03-23 20:33:55.094081 | orchestrator | Sunday 23 March 2025 20:33:55 +0000 (0:00:00.121) 0:00:00.121 ********** 2025-03-23 20:33:55.697286 | orchestrator | ok: [testbed-manager] 2025-03-23 20:33:55.697727 | orchestrator | 2025-03-23 20:33:55.697759 | orchestrator | TASK [osism.commons.sshconfig : Ensure .ssh/config.d exist] ******************** 2025-03-23 20:33:55.697782 | orchestrator | Sunday 23 March 2025 20:33:55 +0000 (0:00:00.611) 0:00:00.732 ********** 2025-03-23 20:33:56.224391 | orchestrator | changed: [testbed-manager] 2025-03-23 20:33:56.224913 | orchestrator | 2025-03-23 20:33:56.225286 | orchestrator | TASK [osism.commons.sshconfig : Ensure config for each host exist] ************* 2025-03-23 20:33:56.225694 | orchestrator | Sunday 23 March 2025 20:33:56 +0000 (0:00:00.525) 0:00:01.258 ********** 2025-03-23 20:34:02.530566 | orchestrator | changed: [testbed-manager] => (item=testbed-manager) 2025-03-23 20:34:02.530850 | orchestrator | changed: [testbed-manager] => (item=testbed-node-3) 2025-03-23 20:34:02.531805 | orchestrator | changed: [testbed-manager] => (item=testbed-node-4) 2025-03-23 20:34:02.531842 | orchestrator | changed: [testbed-manager] => (item=testbed-node-5) 2025-03-23 20:34:02.532701 | orchestrator | changed: [testbed-manager] => (item=testbed-node-0) 2025-03-23 20:34:02.532936 | orchestrator | changed: [testbed-manager] => (item=testbed-node-1) 2025-03-23 20:34:02.533213 | orchestrator | changed: [testbed-manager] => (item=testbed-node-2) 2025-03-23 20:34:02.533596 | orchestrator | 2025-03-23 20:34:02.534771 | orchestrator | TASK [osism.commons.sshconfig : Add extra config] ****************************** 2025-03-23 20:34:02.536847 | orchestrator | Sunday 23 March 2025 20:34:02 +0000 (0:00:06.302) 0:00:07.561 ********** 2025-03-23 20:34:02.630471 | orchestrator | skipping: [testbed-manager] 2025-03-23 20:34:02.631237 | orchestrator | 2025-03-23 20:34:02.631268 | orchestrator | TASK [osism.commons.sshconfig : Assemble ssh config] *************************** 2025-03-23 20:34:02.631291 | orchestrator | Sunday 23 March 2025 20:34:02 +0000 (0:00:00.104) 0:00:07.665 ********** 2025-03-23 20:34:03.218997 | orchestrator | changed: [testbed-manager] 2025-03-23 20:34:03.220245 | orchestrator | 2025-03-23 20:34:03.221967 | orchestrator | 2025-03-23 20:34:03 | INFO  | Play has been completed. There may now be a delay until all logs have been written. 2025-03-23 20:34:03.222197 | orchestrator | 2025-03-23 20:34:03 | INFO  | Please wait and do not abort execution. 
2025-03-23 20:34:03.222230 | orchestrator | PLAY RECAP ********************************************************************* 2025-03-23 20:34:03.223199 | orchestrator | testbed-manager : ok=4  changed=3  unreachable=0 failed=0 skipped=1  rescued=0 ignored=0 2025-03-23 20:34:03.223233 | orchestrator | 2025-03-23 20:34:03.224358 | orchestrator | Sunday 23 March 2025 20:34:03 +0000 (0:00:00.589) 0:00:08.254 ********** 2025-03-23 20:34:03.224762 | orchestrator | =============================================================================== 2025-03-23 20:34:03.225560 | orchestrator | osism.commons.sshconfig : Ensure config for each host exist ------------- 6.30s 2025-03-23 20:34:03.226096 | orchestrator | osism.commons.sshconfig : Get home directory of operator user ----------- 0.61s 2025-03-23 20:34:03.226883 | orchestrator | osism.commons.sshconfig : Assemble ssh config --------------------------- 0.59s 2025-03-23 20:34:03.227432 | orchestrator | osism.commons.sshconfig : Ensure .ssh/config.d exist -------------------- 0.53s 2025-03-23 20:34:03.228062 | orchestrator | osism.commons.sshconfig : Add extra config ------------------------------ 0.10s 2025-03-23 20:34:03.680731 | orchestrator | + osism apply known-hosts 2025-03-23 20:34:05.198859 | orchestrator | 2025-03-23 20:34:05 | INFO  | Task 25cefb7b-890d-459b-9e20-b05e4eb92917 (known-hosts) was prepared for execution. 2025-03-23 20:34:08.522966 | orchestrator | 2025-03-23 20:34:05 | INFO  | It takes a moment until task 25cefb7b-890d-459b-9e20-b05e4eb92917 (known-hosts) has been started and output is visible here. 2025-03-23 20:34:08.523104 | orchestrator | 2025-03-23 20:34:08.523267 | orchestrator | PLAY [Apply role known_hosts] ************************************************** 2025-03-23 20:34:08.523822 | orchestrator | 2025-03-23 20:34:08.524415 | orchestrator | TASK [osism.commons.known_hosts : Run ssh-keyscan for all hosts with hostname] *** 2025-03-23 20:34:08.526561 | orchestrator | Sunday 23 March 2025 20:34:08 +0000 (0:00:00.124) 0:00:00.124 ********** 2025-03-23 20:34:14.783445 | orchestrator | ok: [testbed-manager] => (item=testbed-manager) 2025-03-23 20:34:14.783677 | orchestrator | ok: [testbed-manager] => (item=testbed-node-3) 2025-03-23 20:34:14.783706 | orchestrator | ok: [testbed-manager] => (item=testbed-node-4) 2025-03-23 20:34:14.783728 | orchestrator | ok: [testbed-manager] => (item=testbed-node-5) 2025-03-23 20:34:14.784788 | orchestrator | ok: [testbed-manager] => (item=testbed-node-0) 2025-03-23 20:34:14.785596 | orchestrator | ok: [testbed-manager] => (item=testbed-node-1) 2025-03-23 20:34:14.785628 | orchestrator | ok: [testbed-manager] => (item=testbed-node-2) 2025-03-23 20:34:14.785911 | orchestrator | 2025-03-23 20:34:14.786259 | orchestrator | TASK [osism.commons.known_hosts : Write scanned known_hosts entries for all hosts with hostname] *** 2025-03-23 20:34:14.786876 | orchestrator | Sunday 23 March 2025 20:34:14 +0000 (0:00:06.267) 0:00:06.391 ********** 2025-03-23 20:34:14.969512 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/known_hosts/tasks/write-scanned.yml for testbed-manager => (item=Scanned entries of testbed-manager) 2025-03-23 20:34:14.970195 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/known_hosts/tasks/write-scanned.yml for testbed-manager => (item=Scanned entries of testbed-node-3) 2025-03-23 20:34:14.972762 | orchestrator | included: 
/usr/share/ansible/collections/ansible_collections/osism/commons/roles/known_hosts/tasks/write-scanned.yml for testbed-manager => (item=Scanned entries of testbed-node-4) 2025-03-23 20:34:14.973220 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/known_hosts/tasks/write-scanned.yml for testbed-manager => (item=Scanned entries of testbed-node-5) 2025-03-23 20:34:14.973281 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/known_hosts/tasks/write-scanned.yml for testbed-manager => (item=Scanned entries of testbed-node-0) 2025-03-23 20:34:14.973303 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/known_hosts/tasks/write-scanned.yml for testbed-manager => (item=Scanned entries of testbed-node-1) 2025-03-23 20:34:14.973974 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/known_hosts/tasks/write-scanned.yml for testbed-manager => (item=Scanned entries of testbed-node-2) 2025-03-23 20:34:14.974882 | orchestrator | 2025-03-23 20:34:14.975043 | orchestrator | TASK [osism.commons.known_hosts : Write scanned known_hosts entries] *********** 2025-03-23 20:34:14.975754 | orchestrator | Sunday 23 March 2025 20:34:14 +0000 (0:00:00.186) 0:00:06.578 ********** 2025-03-23 20:34:16.237198 | orchestrator | changed: [testbed-manager] => (item=testbed-manager ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCt2cBBdgtDsIH7Fsbk/6pc76IqfulSiGH3e01T4S6o0rgcgYblgoQzM11kdfUwm/wrt5iIMP2BFJEAUc1hcwdc6DKk33tKtVxS0uvNYtDq0SKj7L8GWAMGbUddpxnY1ovO0mzZcYhyXTh5sT2d2dL5GV7hWjpwXHlQvgEgVs8jWnCvv+OmDg2Z45noNfyQ73pfzpPR6wShACQUoc//uX8XRS73GXsD2auJ+UppLZXlL4EWYsz2F7QqNYZeJbyYUzniXLCvVRm8PHF2Bk87YvpBuHY3KNaE0NaNbWpFOqgi6PtdjKYQPooNeE91tgKNcajziWOhnWc8LclRx8mgbeyEHpT0mT85b76CViYJG1nydrvjtfquXRpsgv+f7mXscF5eycSZ31lg2LsQZAkOFts+IWHh2Kq/sBI+eVSDJhy9iHXSbpmdKwi0G50PmqDOK1WjD8z+fzXsreDugWMRyblngk0WGiB4bc4A/jevWU5dZf1a1MlzyIknG85LD7jhlIM=) 2025-03-23 20:34:16.238591 | orchestrator | changed: [testbed-manager] => (item=testbed-manager ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBNPBJQXGMJWIxYaASO9AouOBKL6Lrjc0xgafBi5vEW8653lCTT9YQEbBuJIbq3KUw7EQYl/YKaId6MhrGKYiVtI=) 2025-03-23 20:34:16.238638 | orchestrator | changed: [testbed-manager] => (item=testbed-manager ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIJA5BCk6VvqcAu1E4oXNgbXRf1RREtiyQfcfFGpN6Pmw) 2025-03-23 20:34:16.239820 | orchestrator | 2025-03-23 20:34:16.240838 | orchestrator | TASK [osism.commons.known_hosts : Write scanned known_hosts entries] *********** 2025-03-23 20:34:16.240864 | orchestrator | Sunday 23 March 2025 20:34:16 +0000 (0:00:01.265) 0:00:07.844 ********** 2025-03-23 20:34:17.365699 | orchestrator | changed: [testbed-manager] => (item=testbed-node-3 ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDfp9XwmRcOgIT/F2v4agyhk7BO1Wq0nIT6MdkYAUy+vJ8ioH2tDymkmc5vNytgZajTIR/WVV167+S+OFU5tm/egLa/s78FyX0WX81FLQ3UVXv835CFXuYhIBLiyqrG/OARjLuSJ2meR3cn6VU8E0m6IaZUDUs9cHIarh2t+PlqKDaUtAPrI3+FgsBCekjS3JLgyX13tCuXp2y4dFUNNptzOFb2wZX9ZH+/uQtPyjOphU/emSRfsv3C1psLUrqDGD2tPbdsi5u8Be8MIGvbFf+j/QDm68POuvf31p05UyYzjMAGlG7dWeE+grQj1rcsxQLBBde9I6xJx7+WLsl25xzzCIRQyZp99I7VMmUUDbmZxolsvMMPfuarN6FXWFatXbev15oyfRXKFxzqXqX8K8RrlW5ELJfXl69s4QRm8G7QCzwpnJac4DYafQPp3sAdl0ELL6W1UW+Wkkh/G7tA2XJx0dNpy6GDTrchYscMMotK41RzDNWw+s//uRK+c9doeM8=) 2025-03-23 20:34:17.366956 | orchestrator | changed: [testbed-manager] => (item=testbed-node-3 ecdsa-sha2-nistp256 
AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBHhL7pbu+PT3YQ9klfMFsfCjr0DPezHNjjsumg9obLOBvQUcbUlDQfVmFBxeW28AZMMeeW2khGD8xMbuYFGFdsk=) 2025-03-23 20:34:17.366996 | orchestrator | changed: [testbed-manager] => (item=testbed-node-3 ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIIyxzq8ZQfNrhDZIeUjMao/niSr6MdD5nQqHglRhEoV/) 2025-03-23 20:34:17.367086 | orchestrator | 2025-03-23 20:34:18.609442 | orchestrator | TASK [osism.commons.known_hosts : Write scanned known_hosts entries] *********** 2025-03-23 20:34:18.609532 | orchestrator | Sunday 23 March 2025 20:34:17 +0000 (0:00:01.129) 0:00:08.973 ********** 2025-03-23 20:34:18.609554 | orchestrator | changed: [testbed-manager] => (item=testbed-node-4 ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDHBNb4MO1vgge2sl2cPfsACoxoy13+c94N7ldjo91ygwsqtLnJueMYVTZbuU/li5QIFwkSOE/7s7ekq5lwtR72hFl/00i8/dba9I9uuR1k5OI15FjELxfjQ7Ej4Jt41cLikfud8J2a2y457NuSCzaXk1BwLvs/SX5jvYPcr82/mLTIVgpdgfY8lRygT3Z+EEE2AWsMMj5cn6qsvqhMoyBMX33K1iRjmQtIjyYH+Sy4Qmp07OXM6PpE1XX2m0ejr4V9tKd8LyYNUfKk5ysuYMPQ4aNpBY1El/2iNYdFe5IT4RWY6ES15PnWdapdx5eImoYuQL0RX9UHlrq7nz47cC+0uCrCNk4FMggxC+I+2R83Ryms45tf/db1wyCVaZ4djb9QBTYtpkor66ov0aHtLTHntV8bX5c5CAIVBE5rXcOE65lUGDc3uhN379jMqCRThU7LhccxMKWYpzPHJJ9gUQOlmNTv8cihXDx1DGnF3FtspQgkweoXObTOw4UeMU4yzl8=) 2025-03-23 20:34:18.610856 | orchestrator | changed: [testbed-manager] => (item=testbed-node-4 ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBDrDEhxiOU5NfaUKPB2j9o57ZGc3VuUczSUyJ9muvd+5t6BqC88pnLTN+IYcWofj4Qon4jpi2/O3nuD2onbqYDk=) 2025-03-23 20:34:18.611251 | orchestrator | changed: [testbed-manager] => (item=testbed-node-4 ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIMX3AGANmYMFJAmNRT37U2pa1CbNiKUzBpS69vRbkm5r) 2025-03-23 20:34:18.613398 | orchestrator | 2025-03-23 20:34:18.614134 | orchestrator | TASK [osism.commons.known_hosts : Write scanned known_hosts entries] *********** 2025-03-23 20:34:18.614503 | orchestrator | Sunday 23 March 2025 20:34:18 +0000 (0:00:01.244) 0:00:10.217 ********** 2025-03-23 20:34:19.849844 | orchestrator | changed: [testbed-manager] => (item=testbed-node-5 ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC4qyu1ogRUu+7kVmiv4lx7pF8GWjIZN8S+j+0MHJE6MG6PjPWWH0yxudpgdqb6+X52ILRyT24pwuLptCRX3Gi1sgfDRzsuvYeTuKU2G702R4in6QsJ/8Ah3F5s27NZhbTdFgPVyY2AokU+zWbDRMHhJKfFf3fznSetdNbvnbgP1MSDZvHR+yGmEWDXO7kW/dlQ4SCYbXu3TRfL8wfERiQF3+O+dSNZubrAZaxrEpJ75Twa8SsRIjjg0BdiW7cy4DVzgDlv7sTyluzDgmVZ34qmloQjnL/mqau6KPJVhgQicltgo+Tq0KPQIGM8g4j9HijQuQlcnig3n18ayIhEZ3+NTldc6CDTRvcSkIBynPo87b0Ap+rsXMr5a/pBbKaLI2BfbdWQJOwfpWRkWN7KRj8Y3PdQ04X3DCE2r87+1lDB6z83sVcd2kcefa6o0MG37Czr1KQKywKidUhaQuIyUNXK91xJH3BVtCK9YkwwUKXvnM2Op5J6ed1URdkJH6IV8E8=) 2025-03-23 20:34:21.015325 | orchestrator | changed: [testbed-manager] => (item=testbed-node-5 ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBDlCjvEI1GE8fM/Mu6Wn4q/H5J3Am87JkxJgRJJq4gTFhftYNeUBAlqK/xX3XnQFc4FrbSsJvr0TiRLnvni8awE=) 2025-03-23 20:34:21.015484 | orchestrator | changed: [testbed-manager] => (item=testbed-node-5 ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIPcMZ5KAqgdLIcZ2RThXUBtiHmtI53VEQxYeK1hmQH3R) 2025-03-23 20:34:21.015508 | orchestrator | 2025-03-23 20:34:21.015521 | orchestrator | TASK [osism.commons.known_hosts : Write scanned known_hosts entries] *********** 2025-03-23 20:34:21.015532 | orchestrator | Sunday 23 March 2025 20:34:19 +0000 (0:00:01.237) 0:00:11.455 ********** 2025-03-23 20:34:21.015560 | orchestrator | changed: [testbed-manager] => (item=testbed-node-0 ssh-rsa 
AAAAB3NzaC1yc2EAAAADAQABAAABgQCrRSXFT2ej+HZ3Z5pMfEAYWRlt2MNUqx//k8FEZkEUN8EAiPazqqOfS2Sf/OcBmzsVxIQKhSbaxQO6Xu11g6zUOk4sguAFt4iqhgBpxGCRWRNsF9GRiIt8vFDxOJl8xGO7HzQSdhbYgVO9XnfjQXQJwBLqGdo9F1p5Nm7iLfzAwWjVf7BB/HLIQ4EN2UuYGNq9V2/itV0mrFhhM2sJ0MgVoaFXu0ZoHNywS4b0SF2aDd2n3iYdPqmT8DDlHlY9oTAc+7JFPxYvgHM5kEXr0iBHI6MpFF4S4liH4N0A/QqpGl7mbOhajG8M5SIZRIfL2kRDibIZ2MG61QgBahZIUFHu6nCEcg+P1tvXz+uhPg+iFAmNhF7GBM9XYqy9WCXprIi6+6N6lKmZGx6GlHFY3B38lWHEhZ0pTFHs8on56LkXSSgG/rZzU6WQ4PlLkXC2aRM2H43P3rpDb5F2g5MfAXe+gFec4bpS3EbLZjwl5h2xmtLVuJFNTeDDf6SHhA88vUM=) 2025-03-23 20:34:21.016015 | orchestrator | changed: [testbed-manager] => (item=testbed-node-0 ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBBnkSlreXhfH6VEi4dtTLJ5nAgOjc0etgIkEk40uL5F3dlSuaDjJET9nQbj6Iw939bAArswEBCTTK12nx4UaQ78=) 2025-03-23 20:34:21.016035 | orchestrator | changed: [testbed-manager] => (item=testbed-node-0 ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIChtkGs0YuZ1V38FASE596lSb0VHUat5Z37k1RMr/5+w) 2025-03-23 20:34:21.016049 | orchestrator | 2025-03-23 20:34:21.016981 | orchestrator | TASK [osism.commons.known_hosts : Write scanned known_hosts entries] *********** 2025-03-23 20:34:22.166798 | orchestrator | Sunday 23 March 2025 20:34:21 +0000 (0:00:01.167) 0:00:12.623 ********** 2025-03-23 20:34:22.167016 | orchestrator | changed: [testbed-manager] => (item=testbed-node-1 ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCffcBMD60l8HGZbmDr7ZXeKakY0ep+rFeaRUYzCMhXLtvUY+GeaNkRczY0oh7tqvDOQ/n1jBlSBr+dCVncvlnt4vg0MyEw3I1JYpIxY08/UwW+0rhn5IsZKqtmE/kWkNbPB6fZo6Wt+K/Uc+B5TSjQfzHFvaVa1Bncfso2EgbujwFp5Kzs0o7UZ6kIr3dDa9s9zj0L6P1kyM3IxfTKse+k1g82fP+G0tDdXct73bPvNG11a6j+vCWjXqq5EFf8JJ6hK5kr2mnNHQzwpnGrn8ZFzcYR5xSiZy0EtlArX7wEx7ljbitfyTglg4Y/lExgXoYqq++DJ9TbZSgBdzx3SAJh+QatQzhL1homArdxZBxx48l82zd7DIZd1IB0uq6quVbHFm2SDeKQG31jTiYjrJp/P+n1QR0+MQwmY0cDuBsNBVQO4NuF+weX9FZ5GaL43isFZDmYgg+v4JSNQ6dmDDstmrzf1cGykr0SoDqHd8YuMjicogQI5wnncl0Nvh9b/7U=) 2025-03-23 20:34:22.167444 | orchestrator | changed: [testbed-manager] => (item=testbed-node-1 ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBIVV44e+EFboep3IODL4s74szI3MSS31x2kEB1LKXMiDDQTJNtPDmR8k6RIa0kvrr0ieeCTz1wJrLGomgP9zdWA=) 2025-03-23 20:34:22.167505 | orchestrator | changed: [testbed-manager] => (item=testbed-node-1 ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIKjFC1hAkyA6UEmXfvovGzpZZWJZVtBXxIm+MJykO9qp) 2025-03-23 20:34:22.167616 | orchestrator | 2025-03-23 20:34:22.168755 | orchestrator | TASK [osism.commons.known_hosts : Write scanned known_hosts entries] *********** 2025-03-23 20:34:22.168785 | orchestrator | Sunday 23 March 2025 20:34:22 +0000 (0:00:01.151) 0:00:13.775 ********** 2025-03-23 20:34:23.422968 | orchestrator | changed: [testbed-manager] => (item=testbed-node-2 ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDPNTKAGh4uBsEhAmnmhTMpypCqJILpBd1Cs5f05a57OvFCIN/MZbk2zViZUMN4bxvBHYu2Ql0B8IkIDw2CB1KkGE9TY2hs8KhlwQ4AUJ6Nx0r3U5Znec1VGS0cYiFBCPe1uUPGjDjAYKivJDk+4YWpIx1zNKqBV5V6hZQtc9iYgXsoxejI+yQvYYcS3Szib9WuYzdjrvwq++tTJsDCdZHE5nVyvDuU9eVwYItIQ9Z/2XCQdPn2dYOzBsaI1U+ZvIjzsDkUtfrDYJ6Gxaeh0XdjpywhsEDaK5Qo/kU2MFnSazJecdrnYkKj9cv4LjasyQxuGIrJJDbf/19leZEqXdqCOEX5z9UehYXF9a+9+Yq1xQi3eWhjbm7vDGAVGK4uSYsQAqqpR4k2OzV6e/q5nYqKgxaJY8EOglE3AcJnf8/i0eXdNgOJ2BetZ2l5flEQh8a8bt7X9bDcxD1olYfHUeTDkAUiNP1brPl/B/rcVcAgyohTbel3R8VnRY8+jx0KscU=) 2025-03-23 20:34:23.423339 | orchestrator | changed: [testbed-manager] => (item=testbed-node-2 ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIGZCAhRJzWZb36Fuftm5XHmGrE8FsNXJ4d2EdaYZy2H9) 2025-03-23 20:34:23.423372 | orchestrator | changed: [testbed-manager] => 
(item=testbed-node-2 ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBIrJK6YYCOFqWrWjiZPyob07dS/96FpyXcq612jRFOLB/hJpVHvVx8207NWZ+MtxLYLvTso6h8UlMQlwwsipvDY=) 2025-03-23 20:34:23.423394 | orchestrator | 2025-03-23 20:34:23.424255 | orchestrator | TASK [osism.commons.known_hosts : Run ssh-keyscan for all hosts with ansible_host] *** 2025-03-23 20:34:23.424779 | orchestrator | Sunday 23 March 2025 20:34:23 +0000 (0:00:01.256) 0:00:15.031 ********** 2025-03-23 20:34:28.736730 | orchestrator | ok: [testbed-manager] => (item=testbed-manager) 2025-03-23 20:34:28.737115 | orchestrator | ok: [testbed-manager] => (item=testbed-node-3) 2025-03-23 20:34:28.737163 | orchestrator | ok: [testbed-manager] => (item=testbed-node-4) 2025-03-23 20:34:28.737184 | orchestrator | ok: [testbed-manager] => (item=testbed-node-5) 2025-03-23 20:34:28.737287 | orchestrator | ok: [testbed-manager] => (item=testbed-node-0) 2025-03-23 20:34:28.738189 | orchestrator | ok: [testbed-manager] => (item=testbed-node-1) 2025-03-23 20:34:28.738340 | orchestrator | ok: [testbed-manager] => (item=testbed-node-2) 2025-03-23 20:34:28.738367 | orchestrator | 2025-03-23 20:34:28.738973 | orchestrator | TASK [osism.commons.known_hosts : Write scanned known_hosts entries for all hosts with ansible_host] *** 2025-03-23 20:34:28.739504 | orchestrator | Sunday 23 March 2025 20:34:28 +0000 (0:00:05.313) 0:00:20.345 ********** 2025-03-23 20:34:28.910713 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/known_hosts/tasks/write-scanned.yml for testbed-manager => (item=Scanned entries of testbed-manager) 2025-03-23 20:34:28.910993 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/known_hosts/tasks/write-scanned.yml for testbed-manager => (item=Scanned entries of testbed-node-3) 2025-03-23 20:34:28.912111 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/known_hosts/tasks/write-scanned.yml for testbed-manager => (item=Scanned entries of testbed-node-4) 2025-03-23 20:34:28.913031 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/known_hosts/tasks/write-scanned.yml for testbed-manager => (item=Scanned entries of testbed-node-5) 2025-03-23 20:34:28.913864 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/known_hosts/tasks/write-scanned.yml for testbed-manager => (item=Scanned entries of testbed-node-0) 2025-03-23 20:34:28.914573 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/known_hosts/tasks/write-scanned.yml for testbed-manager => (item=Scanned entries of testbed-node-1) 2025-03-23 20:34:28.915604 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/known_hosts/tasks/write-scanned.yml for testbed-manager => (item=Scanned entries of testbed-node-2) 2025-03-23 20:34:28.916307 | orchestrator | 2025-03-23 20:34:28.917225 | orchestrator | TASK [osism.commons.known_hosts : Write scanned known_hosts entries] *********** 2025-03-23 20:34:28.917916 | orchestrator | Sunday 23 March 2025 20:34:28 +0000 (0:00:00.173) 0:00:20.519 ********** 2025-03-23 20:34:29.956000 | orchestrator | changed: [testbed-manager] => (item=192.168.16.5 ssh-rsa 
AAAAB3NzaC1yc2EAAAADAQABAAABgQCt2cBBdgtDsIH7Fsbk/6pc76IqfulSiGH3e01T4S6o0rgcgYblgoQzM11kdfUwm/wrt5iIMP2BFJEAUc1hcwdc6DKk33tKtVxS0uvNYtDq0SKj7L8GWAMGbUddpxnY1ovO0mzZcYhyXTh5sT2d2dL5GV7hWjpwXHlQvgEgVs8jWnCvv+OmDg2Z45noNfyQ73pfzpPR6wShACQUoc//uX8XRS73GXsD2auJ+UppLZXlL4EWYsz2F7QqNYZeJbyYUzniXLCvVRm8PHF2Bk87YvpBuHY3KNaE0NaNbWpFOqgi6PtdjKYQPooNeE91tgKNcajziWOhnWc8LclRx8mgbeyEHpT0mT85b76CViYJG1nydrvjtfquXRpsgv+f7mXscF5eycSZ31lg2LsQZAkOFts+IWHh2Kq/sBI+eVSDJhy9iHXSbpmdKwi0G50PmqDOK1WjD8z+fzXsreDugWMRyblngk0WGiB4bc4A/jevWU5dZf1a1MlzyIknG85LD7jhlIM=) 2025-03-23 20:34:29.956240 | orchestrator | changed: [testbed-manager] => (item=192.168.16.5 ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBNPBJQXGMJWIxYaASO9AouOBKL6Lrjc0xgafBi5vEW8653lCTT9YQEbBuJIbq3KUw7EQYl/YKaId6MhrGKYiVtI=) 2025-03-23 20:34:29.956278 | orchestrator | changed: [testbed-manager] => (item=192.168.16.5 ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIJA5BCk6VvqcAu1E4oXNgbXRf1RREtiyQfcfFGpN6Pmw) 2025-03-23 20:34:29.956660 | orchestrator | 2025-03-23 20:34:29.956924 | orchestrator | TASK [osism.commons.known_hosts : Write scanned known_hosts entries] *********** 2025-03-23 20:34:29.957175 | orchestrator | Sunday 23 March 2025 20:34:29 +0000 (0:00:01.044) 0:00:21.564 ********** 2025-03-23 20:34:31.171459 | orchestrator | changed: [testbed-manager] => (item=192.168.16.13 ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBHhL7pbu+PT3YQ9klfMFsfCjr0DPezHNjjsumg9obLOBvQUcbUlDQfVmFBxeW28AZMMeeW2khGD8xMbuYFGFdsk=) 2025-03-23 20:34:31.172003 | orchestrator | changed: [testbed-manager] => (item=192.168.16.13 ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDfp9XwmRcOgIT/F2v4agyhk7BO1Wq0nIT6MdkYAUy+vJ8ioH2tDymkmc5vNytgZajTIR/WVV167+S+OFU5tm/egLa/s78FyX0WX81FLQ3UVXv835CFXuYhIBLiyqrG/OARjLuSJ2meR3cn6VU8E0m6IaZUDUs9cHIarh2t+PlqKDaUtAPrI3+FgsBCekjS3JLgyX13tCuXp2y4dFUNNptzOFb2wZX9ZH+/uQtPyjOphU/emSRfsv3C1psLUrqDGD2tPbdsi5u8Be8MIGvbFf+j/QDm68POuvf31p05UyYzjMAGlG7dWeE+grQj1rcsxQLBBde9I6xJx7+WLsl25xzzCIRQyZp99I7VMmUUDbmZxolsvMMPfuarN6FXWFatXbev15oyfRXKFxzqXqX8K8RrlW5ELJfXl69s4QRm8G7QCzwpnJac4DYafQPp3sAdl0ELL6W1UW+Wkkh/G7tA2XJx0dNpy6GDTrchYscMMotK41RzDNWw+s//uRK+c9doeM8=) 2025-03-23 20:34:31.173487 | orchestrator | changed: [testbed-manager] => (item=192.168.16.13 ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIIyxzq8ZQfNrhDZIeUjMao/niSr6MdD5nQqHglRhEoV/) 2025-03-23 20:34:31.174214 | orchestrator | 2025-03-23 20:34:31.175287 | orchestrator | TASK [osism.commons.known_hosts : Write scanned known_hosts entries] *********** 2025-03-23 20:34:31.176550 | orchestrator | Sunday 23 March 2025 20:34:31 +0000 (0:00:01.214) 0:00:22.779 ********** 2025-03-23 20:34:32.389055 | orchestrator | changed: [testbed-manager] => (item=192.168.16.14 ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDHBNb4MO1vgge2sl2cPfsACoxoy13+c94N7ldjo91ygwsqtLnJueMYVTZbuU/li5QIFwkSOE/7s7ekq5lwtR72hFl/00i8/dba9I9uuR1k5OI15FjELxfjQ7Ej4Jt41cLikfud8J2a2y457NuSCzaXk1BwLvs/SX5jvYPcr82/mLTIVgpdgfY8lRygT3Z+EEE2AWsMMj5cn6qsvqhMoyBMX33K1iRjmQtIjyYH+Sy4Qmp07OXM6PpE1XX2m0ejr4V9tKd8LyYNUfKk5ysuYMPQ4aNpBY1El/2iNYdFe5IT4RWY6ES15PnWdapdx5eImoYuQL0RX9UHlrq7nz47cC+0uCrCNk4FMggxC+I+2R83Ryms45tf/db1wyCVaZ4djb9QBTYtpkor66ov0aHtLTHntV8bX5c5CAIVBE5rXcOE65lUGDc3uhN379jMqCRThU7LhccxMKWYpzPHJJ9gUQOlmNTv8cihXDx1DGnF3FtspQgkweoXObTOw4UeMU4yzl8=) 2025-03-23 20:34:32.389752 | orchestrator | changed: [testbed-manager] => (item=192.168.16.14 ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBDrDEhxiOU5NfaUKPB2j9o57ZGc3VuUczSUyJ9muvd+5t6BqC88pnLTN+IYcWofj4Qon4jpi2/O3nuD2onbqYDk=) 2025-03-23 
20:34:32.390588 | orchestrator | changed: [testbed-manager] => (item=192.168.16.14 ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIMX3AGANmYMFJAmNRT37U2pa1CbNiKUzBpS69vRbkm5r) 2025-03-23 20:34:32.391931 | orchestrator | 2025-03-23 20:34:32.394806 | orchestrator | TASK [osism.commons.known_hosts : Write scanned known_hosts entries] *********** 2025-03-23 20:34:32.395278 | orchestrator | Sunday 23 March 2025 20:34:32 +0000 (0:00:01.217) 0:00:23.996 ********** 2025-03-23 20:34:33.555581 | orchestrator | changed: [testbed-manager] => (item=192.168.16.15 ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC4qyu1ogRUu+7kVmiv4lx7pF8GWjIZN8S+j+0MHJE6MG6PjPWWH0yxudpgdqb6+X52ILRyT24pwuLptCRX3Gi1sgfDRzsuvYeTuKU2G702R4in6QsJ/8Ah3F5s27NZhbTdFgPVyY2AokU+zWbDRMHhJKfFf3fznSetdNbvnbgP1MSDZvHR+yGmEWDXO7kW/dlQ4SCYbXu3TRfL8wfERiQF3+O+dSNZubrAZaxrEpJ75Twa8SsRIjjg0BdiW7cy4DVzgDlv7sTyluzDgmVZ34qmloQjnL/mqau6KPJVhgQicltgo+Tq0KPQIGM8g4j9HijQuQlcnig3n18ayIhEZ3+NTldc6CDTRvcSkIBynPo87b0Ap+rsXMr5a/pBbKaLI2BfbdWQJOwfpWRkWN7KRj8Y3PdQ04X3DCE2r87+1lDB6z83sVcd2kcefa6o0MG37Czr1KQKywKidUhaQuIyUNXK91xJH3BVtCK9YkwwUKXvnM2Op5J6ed1URdkJH6IV8E8=) 2025-03-23 20:34:33.556233 | orchestrator | changed: [testbed-manager] => (item=192.168.16.15 ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBDlCjvEI1GE8fM/Mu6Wn4q/H5J3Am87JkxJgRJJq4gTFhftYNeUBAlqK/xX3XnQFc4FrbSsJvr0TiRLnvni8awE=) 2025-03-23 20:34:33.556758 | orchestrator | changed: [testbed-manager] => (item=192.168.16.15 ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIPcMZ5KAqgdLIcZ2RThXUBtiHmtI53VEQxYeK1hmQH3R) 2025-03-23 20:34:33.557302 | orchestrator | 2025-03-23 20:34:33.558181 | orchestrator | TASK [osism.commons.known_hosts : Write scanned known_hosts entries] *********** 2025-03-23 20:34:33.558562 | orchestrator | Sunday 23 March 2025 20:34:33 +0000 (0:00:01.167) 0:00:25.164 ********** 2025-03-23 20:34:34.780338 | orchestrator | changed: [testbed-manager] => (item=192.168.16.10 ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCrRSXFT2ej+HZ3Z5pMfEAYWRlt2MNUqx//k8FEZkEUN8EAiPazqqOfS2Sf/OcBmzsVxIQKhSbaxQO6Xu11g6zUOk4sguAFt4iqhgBpxGCRWRNsF9GRiIt8vFDxOJl8xGO7HzQSdhbYgVO9XnfjQXQJwBLqGdo9F1p5Nm7iLfzAwWjVf7BB/HLIQ4EN2UuYGNq9V2/itV0mrFhhM2sJ0MgVoaFXu0ZoHNywS4b0SF2aDd2n3iYdPqmT8DDlHlY9oTAc+7JFPxYvgHM5kEXr0iBHI6MpFF4S4liH4N0A/QqpGl7mbOhajG8M5SIZRIfL2kRDibIZ2MG61QgBahZIUFHu6nCEcg+P1tvXz+uhPg+iFAmNhF7GBM9XYqy9WCXprIi6+6N6lKmZGx6GlHFY3B38lWHEhZ0pTFHs8on56LkXSSgG/rZzU6WQ4PlLkXC2aRM2H43P3rpDb5F2g5MfAXe+gFec4bpS3EbLZjwl5h2xmtLVuJFNTeDDf6SHhA88vUM=) 2025-03-23 20:34:34.781022 | orchestrator | changed: [testbed-manager] => (item=192.168.16.10 ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBBnkSlreXhfH6VEi4dtTLJ5nAgOjc0etgIkEk40uL5F3dlSuaDjJET9nQbj6Iw939bAArswEBCTTK12nx4UaQ78=) 2025-03-23 20:34:34.781948 | orchestrator | changed: [testbed-manager] => (item=192.168.16.10 ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIChtkGs0YuZ1V38FASE596lSb0VHUat5Z37k1RMr/5+w) 2025-03-23 20:34:34.782500 | orchestrator | 2025-03-23 20:34:34.783525 | orchestrator | TASK [osism.commons.known_hosts : Write scanned known_hosts entries] *********** 2025-03-23 20:34:34.783896 | orchestrator | Sunday 23 March 2025 20:34:34 +0000 (0:00:01.223) 0:00:26.388 ********** 2025-03-23 20:34:35.939319 | orchestrator | changed: [testbed-manager] => (item=192.168.16.11 ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIKjFC1hAkyA6UEmXfvovGzpZZWJZVtBXxIm+MJykO9qp) 2025-03-23 20:34:35.939668 | orchestrator | changed: [testbed-manager] => (item=192.168.16.11 ssh-rsa 
AAAAB3NzaC1yc2EAAAADAQABAAABgQCffcBMD60l8HGZbmDr7ZXeKakY0ep+rFeaRUYzCMhXLtvUY+GeaNkRczY0oh7tqvDOQ/n1jBlSBr+dCVncvlnt4vg0MyEw3I1JYpIxY08/UwW+0rhn5IsZKqtmE/kWkNbPB6fZo6Wt+K/Uc+B5TSjQfzHFvaVa1Bncfso2EgbujwFp5Kzs0o7UZ6kIr3dDa9s9zj0L6P1kyM3IxfTKse+k1g82fP+G0tDdXct73bPvNG11a6j+vCWjXqq5EFf8JJ6hK5kr2mnNHQzwpnGrn8ZFzcYR5xSiZy0EtlArX7wEx7ljbitfyTglg4Y/lExgXoYqq++DJ9TbZSgBdzx3SAJh+QatQzhL1homArdxZBxx48l82zd7DIZd1IB0uq6quVbHFm2SDeKQG31jTiYjrJp/P+n1QR0+MQwmY0cDuBsNBVQO4NuF+weX9FZ5GaL43isFZDmYgg+v4JSNQ6dmDDstmrzf1cGykr0SoDqHd8YuMjicogQI5wnncl0Nvh9b/7U=) 2025-03-23 20:34:35.939970 | orchestrator | changed: [testbed-manager] => (item=192.168.16.11 ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBIVV44e+EFboep3IODL4s74szI3MSS31x2kEB1LKXMiDDQTJNtPDmR8k6RIa0kvrr0ieeCTz1wJrLGomgP9zdWA=) 2025-03-23 20:34:35.940751 | orchestrator | 2025-03-23 20:34:35.941253 | orchestrator | TASK [osism.commons.known_hosts : Write scanned known_hosts entries] *********** 2025-03-23 20:34:35.941287 | orchestrator | Sunday 23 March 2025 20:34:35 +0000 (0:00:01.159) 0:00:27.548 ********** 2025-03-23 20:34:37.150575 | orchestrator | changed: [testbed-manager] => (item=192.168.16.12 ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDPNTKAGh4uBsEhAmnmhTMpypCqJILpBd1Cs5f05a57OvFCIN/MZbk2zViZUMN4bxvBHYu2Ql0B8IkIDw2CB1KkGE9TY2hs8KhlwQ4AUJ6Nx0r3U5Znec1VGS0cYiFBCPe1uUPGjDjAYKivJDk+4YWpIx1zNKqBV5V6hZQtc9iYgXsoxejI+yQvYYcS3Szib9WuYzdjrvwq++tTJsDCdZHE5nVyvDuU9eVwYItIQ9Z/2XCQdPn2dYOzBsaI1U+ZvIjzsDkUtfrDYJ6Gxaeh0XdjpywhsEDaK5Qo/kU2MFnSazJecdrnYkKj9cv4LjasyQxuGIrJJDbf/19leZEqXdqCOEX5z9UehYXF9a+9+Yq1xQi3eWhjbm7vDGAVGK4uSYsQAqqpR4k2OzV6e/q5nYqKgxaJY8EOglE3AcJnf8/i0eXdNgOJ2BetZ2l5flEQh8a8bt7X9bDcxD1olYfHUeTDkAUiNP1brPl/B/rcVcAgyohTbel3R8VnRY8+jx0KscU=) 2025-03-23 20:34:37.151047 | orchestrator | changed: [testbed-manager] => (item=192.168.16.12 ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBIrJK6YYCOFqWrWjiZPyob07dS/96FpyXcq612jRFOLB/hJpVHvVx8207NWZ+MtxLYLvTso6h8UlMQlwwsipvDY=) 2025-03-23 20:34:37.151335 | orchestrator | changed: [testbed-manager] => (item=192.168.16.12 ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIGZCAhRJzWZb36Fuftm5XHmGrE8FsNXJ4d2EdaYZy2H9) 2025-03-23 20:34:37.151753 | orchestrator | 2025-03-23 20:34:37.151783 | orchestrator | TASK [osism.commons.known_hosts : Write static known_hosts entries] ************ 2025-03-23 20:34:37.152105 | orchestrator | Sunday 23 March 2025 20:34:37 +0000 (0:00:01.209) 0:00:28.757 ********** 2025-03-23 20:34:37.322434 | orchestrator | skipping: [testbed-manager] => (item=testbed-manager)  2025-03-23 20:34:37.322850 | orchestrator | skipping: [testbed-manager] => (item=testbed-node-3)  2025-03-23 20:34:37.323235 | orchestrator | skipping: [testbed-manager] => (item=testbed-node-4)  2025-03-23 20:34:37.323695 | orchestrator | skipping: [testbed-manager] => (item=testbed-node-5)  2025-03-23 20:34:37.324422 | orchestrator | skipping: [testbed-manager] => (item=testbed-node-0)  2025-03-23 20:34:37.324672 | orchestrator | skipping: [testbed-manager] => (item=testbed-node-1)  2025-03-23 20:34:37.324819 | orchestrator | skipping: [testbed-manager] => (item=testbed-node-2)  2025-03-23 20:34:37.325716 | orchestrator | skipping: [testbed-manager] 2025-03-23 20:34:37.326062 | orchestrator | 2025-03-23 20:34:37.326096 | orchestrator | TASK [osism.commons.known_hosts : Write extra known_hosts entries] ************* 2025-03-23 20:34:37.326399 | orchestrator | Sunday 23 March 2025 20:34:37 +0000 (0:00:00.174) 0:00:28.932 ********** 2025-03-23 20:34:37.377756 | orchestrator | skipping: 
[testbed-manager] 2025-03-23 20:34:37.378163 | orchestrator | 2025-03-23 20:34:37.378495 | orchestrator | TASK [osism.commons.known_hosts : Delete known_hosts entries] ****************** 2025-03-23 20:34:37.378904 | orchestrator | Sunday 23 March 2025 20:34:37 +0000 (0:00:00.055) 0:00:28.987 ********** 2025-03-23 20:34:37.438273 | orchestrator | skipping: [testbed-manager] 2025-03-23 20:34:37.440494 | orchestrator | 2025-03-23 20:34:37.440533 | orchestrator | TASK [osism.commons.known_hosts : Set file permissions] ************************ 2025-03-23 20:34:38.340398 | orchestrator | Sunday 23 March 2025 20:34:37 +0000 (0:00:00.060) 0:00:29.048 ********** 2025-03-23 20:34:38.340496 | orchestrator | changed: [testbed-manager] 2025-03-23 20:34:38.340989 | orchestrator | 2025-03-23 20:34:38.341024 | orchestrator | PLAY RECAP ********************************************************************* 2025-03-23 20:34:38.342194 | orchestrator | 2025-03-23 20:34:38 | INFO  | Play has been completed. There may now be a delay until all logs have been written. 2025-03-23 20:34:38.342224 | orchestrator | 2025-03-23 20:34:38 | INFO  | Please wait and do not abort execution. 2025-03-23 20:34:38.342245 | orchestrator | testbed-manager : ok=31  changed=15  unreachable=0 failed=0 skipped=3  rescued=0 ignored=0 2025-03-23 20:34:38.343191 | orchestrator | 2025-03-23 20:34:38.344298 | orchestrator | Sunday 23 March 2025 20:34:38 +0000 (0:00:00.900) 0:00:29.948 ********** 2025-03-23 20:34:38.344327 | orchestrator | =============================================================================== 2025-03-23 20:34:38.344919 | orchestrator | osism.commons.known_hosts : Run ssh-keyscan for all hosts with hostname --- 6.27s 2025-03-23 20:34:38.345706 | orchestrator | osism.commons.known_hosts : Run ssh-keyscan for all hosts with ansible_host --- 5.31s 2025-03-23 20:34:38.346118 | orchestrator | osism.commons.known_hosts : Write scanned known_hosts entries ----------- 1.27s 2025-03-23 20:34:38.347011 | orchestrator | osism.commons.known_hosts : Write scanned known_hosts entries ----------- 1.26s 2025-03-23 20:34:38.347771 | orchestrator | osism.commons.known_hosts : Write scanned known_hosts entries ----------- 1.24s 2025-03-23 20:34:38.348258 | orchestrator | osism.commons.known_hosts : Write scanned known_hosts entries ----------- 1.24s 2025-03-23 20:34:38.349100 | orchestrator | osism.commons.known_hosts : Write scanned known_hosts entries ----------- 1.22s 2025-03-23 20:34:38.349346 | orchestrator | osism.commons.known_hosts : Write scanned known_hosts entries ----------- 1.22s 2025-03-23 20:34:38.349987 | orchestrator | osism.commons.known_hosts : Write scanned known_hosts entries ----------- 1.21s 2025-03-23 20:34:38.350810 | orchestrator | osism.commons.known_hosts : Write scanned known_hosts entries ----------- 1.21s 2025-03-23 20:34:38.351746 | orchestrator | osism.commons.known_hosts : Write scanned known_hosts entries ----------- 1.17s 2025-03-23 20:34:38.351930 | orchestrator | osism.commons.known_hosts : Write scanned known_hosts entries ----------- 1.17s 2025-03-23 20:34:38.352248 | orchestrator | osism.commons.known_hosts : Write scanned known_hosts entries ----------- 1.16s 2025-03-23 20:34:38.352398 | orchestrator | osism.commons.known_hosts : Write scanned known_hosts entries ----------- 1.15s 2025-03-23 20:34:38.353315 | orchestrator | osism.commons.known_hosts : Write scanned known_hosts entries ----------- 1.13s 2025-03-23 20:34:38.353428 | orchestrator | osism.commons.known_hosts : Write scanned 
known_hosts entries ----------- 1.04s 2025-03-23 20:34:38.353867 | orchestrator | osism.commons.known_hosts : Set file permissions ------------------------ 0.90s 2025-03-23 20:34:38.354597 | orchestrator | osism.commons.known_hosts : Write scanned known_hosts entries for all hosts with hostname --- 0.19s 2025-03-23 20:34:38.354644 | orchestrator | osism.commons.known_hosts : Write static known_hosts entries ------------ 0.17s 2025-03-23 20:34:38.355061 | orchestrator | osism.commons.known_hosts : Write scanned known_hosts entries for all hosts with ansible_host --- 0.17s 2025-03-23 20:34:38.909642 | orchestrator | + osism apply squid 2025-03-23 20:34:40.639954 | orchestrator | 2025-03-23 20:34:40 | INFO  | Task 2691c2a7-3332-40ee-a9c6-6a1ae2d12577 (squid) was prepared for execution. 2025-03-23 20:34:44.261764 | orchestrator | 2025-03-23 20:34:40 | INFO  | It takes a moment until task 2691c2a7-3332-40ee-a9c6-6a1ae2d12577 (squid) has been started and output is visible here. 2025-03-23 20:34:44.261896 | orchestrator | 2025-03-23 20:34:44.262876 | orchestrator | PLAY [Apply role squid] ******************************************************** 2025-03-23 20:34:44.264518 | orchestrator | 2025-03-23 20:34:44.266098 | orchestrator | TASK [osism.services.squid : Include install tasks] **************************** 2025-03-23 20:34:44.266133 | orchestrator | Sunday 23 March 2025 20:34:44 +0000 (0:00:00.144) 0:00:00.144 ********** 2025-03-23 20:34:44.389026 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/services/roles/squid/tasks/install-Debian-family.yml for testbed-manager 2025-03-23 20:34:44.390140 | orchestrator | 2025-03-23 20:34:44.390614 | orchestrator | TASK [osism.services.squid : Install required packages] ************************ 2025-03-23 20:34:44.390642 | orchestrator | Sunday 23 March 2025 20:34:44 +0000 (0:00:00.131) 0:00:00.276 ********** 2025-03-23 20:34:46.025661 | orchestrator | ok: [testbed-manager] 2025-03-23 20:34:46.026379 | orchestrator | 2025-03-23 20:34:46.027213 | orchestrator | TASK [osism.services.squid : Create required directories] ********************** 2025-03-23 20:34:46.027779 | orchestrator | Sunday 23 March 2025 20:34:46 +0000 (0:00:01.634) 0:00:01.911 ********** 2025-03-23 20:34:47.312539 | orchestrator | changed: [testbed-manager] => (item=/opt/squid/configuration) 2025-03-23 20:34:47.313310 | orchestrator | changed: [testbed-manager] => (item=/opt/squid/configuration/conf.d) 2025-03-23 20:34:47.314104 | orchestrator | ok: [testbed-manager] => (item=/opt/squid) 2025-03-23 20:34:47.314747 | orchestrator | 2025-03-23 20:34:47.315373 | orchestrator | TASK [osism.services.squid : Copy squid configuration files] ******************* 2025-03-23 20:34:47.316026 | orchestrator | Sunday 23 March 2025 20:34:47 +0000 (0:00:01.287) 0:00:03.198 ********** 2025-03-23 20:34:48.501037 | orchestrator | changed: [testbed-manager] => (item=osism.conf) 2025-03-23 20:34:48.501249 | orchestrator | 2025-03-23 20:34:48.502175 | orchestrator | TASK [osism.services.squid : Remove osism_allow_list.conf configuration file] *** 2025-03-23 20:34:48.502452 | orchestrator | Sunday 23 March 2025 20:34:48 +0000 (0:00:01.185) 0:00:04.384 ********** 2025-03-23 20:34:48.909612 | orchestrator | ok: [testbed-manager] 2025-03-23 20:34:48.910331 | orchestrator | 2025-03-23 20:34:48.911715 | orchestrator | TASK [osism.services.squid : Copy docker-compose.yml file] ********************* 2025-03-23 20:34:48.912748 | orchestrator | Sunday 23 March 2025 20:34:48 +0000 
(0:00:00.409) 0:00:04.794 ********** 2025-03-23 20:34:49.985361 | orchestrator | changed: [testbed-manager] 2025-03-23 20:34:49.985702 | orchestrator | 2025-03-23 20:34:49.987542 | orchestrator | TASK [osism.services.squid : Manage squid service] ***************************** 2025-03-23 20:35:18.217392 | orchestrator | Sunday 23 March 2025 20:34:49 +0000 (0:00:01.076) 0:00:05.871 ********** 2025-03-23 20:35:18.217536 | orchestrator | FAILED - RETRYING: [testbed-manager]: Manage squid service (10 retries left). 2025-03-23 20:35:30.605688 | orchestrator | ok: [testbed-manager] 2025-03-23 20:35:30.605847 | orchestrator | 2025-03-23 20:35:30.605869 | orchestrator | RUNNING HANDLER [osism.services.squid : Restart squid service] ***************** 2025-03-23 20:35:30.605885 | orchestrator | Sunday 23 March 2025 20:35:18 +0000 (0:00:28.230) 0:00:34.101 ********** 2025-03-23 20:35:30.605958 | orchestrator | changed: [testbed-manager] 2025-03-23 20:35:30.607123 | orchestrator | 2025-03-23 20:35:30.608946 | orchestrator | RUNNING HANDLER [osism.services.squid : Wait for squid service to start] ******* 2025-03-23 20:35:30.609896 | orchestrator | Sunday 23 March 2025 20:35:30 +0000 (0:00:12.386) 0:00:46.488 ********** 2025-03-23 20:36:30.691479 | orchestrator | Pausing for 60 seconds 2025-03-23 20:36:30.692369 | orchestrator | changed: [testbed-manager] 2025-03-23 20:36:30.692410 | orchestrator | 2025-03-23 20:36:30.692427 | orchestrator | RUNNING HANDLER [osism.services.squid : Register that squid service was restarted] *** 2025-03-23 20:36:30.692450 | orchestrator | Sunday 23 March 2025 20:36:30 +0000 (0:01:00.085) 0:01:46.573 ********** 2025-03-23 20:36:30.767461 | orchestrator | ok: [testbed-manager] 2025-03-23 20:36:30.767584 | orchestrator | 2025-03-23 20:36:30.768419 | orchestrator | RUNNING HANDLER [osism.services.squid : Wait for an healthy squid service] ***** 2025-03-23 20:36:30.769111 | orchestrator | Sunday 23 March 2025 20:36:30 +0000 (0:00:00.079) 0:01:46.653 ********** 2025-03-23 20:36:31.503212 | orchestrator | changed: [testbed-manager] 2025-03-23 20:36:31.503815 | orchestrator | 2025-03-23 20:36:31.505388 | orchestrator | PLAY RECAP ********************************************************************* 2025-03-23 20:36:31.506067 | orchestrator | 2025-03-23 20:36:31 | INFO  | Play has been completed. There may now be a delay until all logs have been written. 2025-03-23 20:36:31.506356 | orchestrator | 2025-03-23 20:36:31 | INFO  | Please wait and do not abort execution. 
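Aside: the squid role above copies a docker-compose.yml to the manager and then waits on the container's health check, which is why the handlers pause for 60 seconds and then poll for a healthy service. A hedged sketch of that wait, assuming the compose project lives under /opt/squid and the service is named squid (neither name is confirmed by the log):

    docker compose --project-directory /opt/squid up -d
    # Poll until Docker reports the container healthy, mirroring the
    # "Wait for an healthy squid service" handler.
    until docker compose --project-directory /opt/squid ps squid | grep -q '(healthy)'; do
        sleep 5
    done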
2025-03-23 20:36:31.508027 | orchestrator | testbed-manager : ok=11  changed=6  unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 2025-03-23 20:36:31.510290 | orchestrator | 2025-03-23 20:36:31.510765 | orchestrator | Sunday 23 March 2025 20:36:31 +0000 (0:00:00.734) 0:01:47.388 ********** 2025-03-23 20:36:31.511548 | orchestrator | =============================================================================== 2025-03-23 20:36:31.512399 | orchestrator | osism.services.squid : Wait for squid service to start ----------------- 60.09s 2025-03-23 20:36:31.513285 | orchestrator | osism.services.squid : Manage squid service ---------------------------- 28.23s 2025-03-23 20:36:31.514473 | orchestrator | osism.services.squid : Restart squid service --------------------------- 12.39s 2025-03-23 20:36:31.514707 | orchestrator | osism.services.squid : Install required packages ------------------------ 1.63s 2025-03-23 20:36:31.515441 | orchestrator | osism.services.squid : Create required directories ---------------------- 1.29s 2025-03-23 20:36:31.515802 | orchestrator | osism.services.squid : Copy squid configuration files ------------------- 1.19s 2025-03-23 20:36:31.516275 | orchestrator | osism.services.squid : Copy docker-compose.yml file --------------------- 1.08s 2025-03-23 20:36:31.516725 | orchestrator | osism.services.squid : Wait for an healthy squid service ---------------- 0.74s 2025-03-23 20:36:31.517285 | orchestrator | osism.services.squid : Remove osism_allow_list.conf configuration file --- 0.41s 2025-03-23 20:36:31.518177 | orchestrator | osism.services.squid : Include install tasks ---------------------------- 0.13s 2025-03-23 20:36:31.518533 | orchestrator | osism.services.squid : Register that squid service was restarted -------- 0.08s 2025-03-23 20:36:31.999429 | orchestrator | + [[ 8.1.0 != \l\a\t\e\s\t ]] 2025-03-23 20:36:32.008345 | orchestrator | + sed -i 's#docker_namespace: kolla#docker_namespace: kolla/release#' /opt/configuration/inventory/group_vars/all/kolla.yml 2025-03-23 20:36:32.008431 | orchestrator | ++ semver 8.1.0 9.0.0 2025-03-23 20:36:32.072304 | orchestrator | + [[ -1 -lt 0 ]] 2025-03-23 20:36:32.077211 | orchestrator | + [[ 8.1.0 != \l\a\t\e\s\t ]] 2025-03-23 20:36:32.077259 | orchestrator | + sed -i 's|^# \(network_dispatcher_scripts:\)$|\1|g' /opt/configuration/inventory/group_vars/testbed-nodes.yml 2025-03-23 20:36:32.077280 | orchestrator | + sed -i 's|^# \( - src: /opt/configuration/network/vxlan.sh\)$|\1|g' /opt/configuration/inventory/group_vars/testbed-nodes.yml /opt/configuration/inventory/group_vars/testbed-managers.yml 2025-03-23 20:36:32.081720 | orchestrator | + sed -i 's|^# \( dest: routable.d/vxlan.sh\)$|\1|g' /opt/configuration/inventory/group_vars/testbed-nodes.yml /opt/configuration/inventory/group_vars/testbed-managers.yml 2025-03-23 20:36:32.086577 | orchestrator | + osism apply operator -u ubuntu -l testbed-nodes 2025-03-23 20:36:33.757414 | orchestrator | 2025-03-23 20:36:33 | INFO  | Task 908e5a41-bf82-4d89-b62c-764074d69924 (operator) was prepared for execution. 2025-03-23 20:36:37.107613 | orchestrator | 2025-03-23 20:36:33 | INFO  | It takes a moment until task 908e5a41-bf82-4d89-b62c-764074d69924 (operator) has been started and output is visible here. 
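Aside: before the operator play starts, the three sed edits above uncomment the VXLAN network-dispatcher hook in the testbed group_vars. They are repeated here with comments on what each one toggles; the surrounding YAML content is an assumption based on the patterns, and whitespace inside the patterns may have been collapsed by the console:

    # Re-enable the commented-out network_dispatcher_scripts block so the
    # vxlan.sh helper is installed as a networkd-dispatcher script and runs
    # once the interfaces reach the "routable" state.
    sed -i 's|^# \(network_dispatcher_scripts:\)$|\1|g' \
        /opt/configuration/inventory/group_vars/testbed-nodes.yml
    sed -i 's|^# \( - src: /opt/configuration/network/vxlan.sh\)$|\1|g' \
        /opt/configuration/inventory/group_vars/testbed-nodes.yml \
        /opt/configuration/inventory/group_vars/testbed-managers.yml
    sed -i 's|^# \( dest: routable.d/vxlan.sh\)$|\1|g' \
        /opt/configuration/inventory/group_vars/testbed-nodes.yml \
        /opt/configuration/inventory/group_vars/testbed-managers.yml
    # Then create the operator user on all nodes, connecting as "ubuntu".
    osism apply operator -u ubuntu -l testbed-nodes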
2025-03-23 20:36:37.107744 | orchestrator | 2025-03-23 20:36:37.108301 | orchestrator | PLAY [Make ssh pipelining working] ********************************************* 2025-03-23 20:36:37.114439 | orchestrator | 2025-03-23 20:36:37.115148 | orchestrator | TASK [Gathering Facts] ********************************************************* 2025-03-23 20:36:37.116016 | orchestrator | Sunday 23 March 2025 20:36:37 +0000 (0:00:00.118) 0:00:00.118 ********** 2025-03-23 20:36:40.728486 | orchestrator | ok: [testbed-node-4] 2025-03-23 20:36:40.729586 | orchestrator | ok: [testbed-node-1] 2025-03-23 20:36:40.751680 | orchestrator | ok: [testbed-node-2] 2025-03-23 20:36:40.751710 | orchestrator | ok: [testbed-node-3] 2025-03-23 20:36:40.754890 | orchestrator | ok: [testbed-node-0] 2025-03-23 20:36:40.755579 | orchestrator | ok: [testbed-node-5] 2025-03-23 20:36:40.755922 | orchestrator | 2025-03-23 20:36:40.760602 | orchestrator | TASK [Do not require tty for all users] **************************************** 2025-03-23 20:36:40.761394 | orchestrator | Sunday 23 March 2025 20:36:40 +0000 (0:00:03.626) 0:00:03.745 ********** 2025-03-23 20:36:41.705505 | orchestrator | ok: [testbed-node-2] 2025-03-23 20:36:41.706141 | orchestrator | ok: [testbed-node-5] 2025-03-23 20:36:41.708977 | orchestrator | ok: [testbed-node-0] 2025-03-23 20:36:41.710962 | orchestrator | ok: [testbed-node-4] 2025-03-23 20:36:41.710996 | orchestrator | ok: [testbed-node-1] 2025-03-23 20:36:41.714153 | orchestrator | ok: [testbed-node-3] 2025-03-23 20:36:41.716194 | orchestrator | 2025-03-23 20:36:41.717269 | orchestrator | PLAY [Apply role operator] ***************************************************** 2025-03-23 20:36:41.718165 | orchestrator | 2025-03-23 20:36:41.719718 | orchestrator | TASK [osism.commons.operator : Gather variables for each operating system] ***** 2025-03-23 20:36:41.721812 | orchestrator | Sunday 23 March 2025 20:36:41 +0000 (0:00:00.975) 0:00:04.720 ********** 2025-03-23 20:36:41.815197 | orchestrator | ok: [testbed-node-0] 2025-03-23 20:36:41.872143 | orchestrator | ok: [testbed-node-1] 2025-03-23 20:36:41.908400 | orchestrator | ok: [testbed-node-2] 2025-03-23 20:36:41.989326 | orchestrator | ok: [testbed-node-3] 2025-03-23 20:36:41.993021 | orchestrator | ok: [testbed-node-4] 2025-03-23 20:36:42.064742 | orchestrator | ok: [testbed-node-5] 2025-03-23 20:36:42.064774 | orchestrator | 2025-03-23 20:36:42.064790 | orchestrator | TASK [osism.commons.operator : Set operator_groups variable to default value] *** 2025-03-23 20:36:42.064806 | orchestrator | Sunday 23 March 2025 20:36:41 +0000 (0:00:00.283) 0:00:05.004 ********** 2025-03-23 20:36:42.064825 | orchestrator | ok: [testbed-node-0] 2025-03-23 20:36:42.095339 | orchestrator | ok: [testbed-node-1] 2025-03-23 20:36:42.132273 | orchestrator | ok: [testbed-node-2] 2025-03-23 20:36:42.204399 | orchestrator | ok: [testbed-node-3] 2025-03-23 20:36:42.206063 | orchestrator | ok: [testbed-node-4] 2025-03-23 20:36:42.209017 | orchestrator | ok: [testbed-node-5] 2025-03-23 20:36:42.210468 | orchestrator | 2025-03-23 20:36:42.211956 | orchestrator | TASK [osism.commons.operator : Create operator group] ************************** 2025-03-23 20:36:42.212808 | orchestrator | Sunday 23 March 2025 20:36:42 +0000 (0:00:00.216) 0:00:05.220 ********** 2025-03-23 20:36:42.949918 | orchestrator | changed: [testbed-node-2] 2025-03-23 20:36:42.950420 | orchestrator | changed: [testbed-node-1] 2025-03-23 20:36:42.950937 | orchestrator | changed: [testbed-node-3] 2025-03-23 
20:36:42.951399 | orchestrator | changed: [testbed-node-5] 2025-03-23 20:36:42.952376 | orchestrator | changed: [testbed-node-4] 2025-03-23 20:36:42.954576 | orchestrator | changed: [testbed-node-0] 2025-03-23 20:36:42.954685 | orchestrator | 2025-03-23 20:36:42.955657 | orchestrator | TASK [osism.commons.operator : Create user] ************************************ 2025-03-23 20:36:42.956163 | orchestrator | Sunday 23 March 2025 20:36:42 +0000 (0:00:00.739) 0:00:05.960 ********** 2025-03-23 20:36:43.781127 | orchestrator | changed: [testbed-node-3] 2025-03-23 20:36:43.782336 | orchestrator | changed: [testbed-node-0] 2025-03-23 20:36:43.786266 | orchestrator | changed: [testbed-node-1] 2025-03-23 20:36:43.787011 | orchestrator | changed: [testbed-node-4] 2025-03-23 20:36:43.787064 | orchestrator | changed: [testbed-node-5] 2025-03-23 20:36:43.788195 | orchestrator | changed: [testbed-node-2] 2025-03-23 20:36:43.789138 | orchestrator | 2025-03-23 20:36:43.790612 | orchestrator | TASK [osism.commons.operator : Add user to additional groups] ****************** 2025-03-23 20:36:43.791793 | orchestrator | Sunday 23 March 2025 20:36:43 +0000 (0:00:00.835) 0:00:06.795 ********** 2025-03-23 20:36:45.040512 | orchestrator | changed: [testbed-node-3] => (item=adm) 2025-03-23 20:36:45.041134 | orchestrator | changed: [testbed-node-2] => (item=adm) 2025-03-23 20:36:45.044144 | orchestrator | changed: [testbed-node-0] => (item=adm) 2025-03-23 20:36:45.044485 | orchestrator | changed: [testbed-node-1] => (item=adm) 2025-03-23 20:36:45.044512 | orchestrator | changed: [testbed-node-4] => (item=adm) 2025-03-23 20:36:45.044527 | orchestrator | changed: [testbed-node-5] => (item=adm) 2025-03-23 20:36:45.044541 | orchestrator | changed: [testbed-node-2] => (item=sudo) 2025-03-23 20:36:45.044560 | orchestrator | changed: [testbed-node-0] => (item=sudo) 2025-03-23 20:36:45.045798 | orchestrator | changed: [testbed-node-3] => (item=sudo) 2025-03-23 20:36:45.048019 | orchestrator | changed: [testbed-node-4] => (item=sudo) 2025-03-23 20:36:45.048052 | orchestrator | changed: [testbed-node-1] => (item=sudo) 2025-03-23 20:36:45.048641 | orchestrator | changed: [testbed-node-5] => (item=sudo) 2025-03-23 20:36:45.048706 | orchestrator | 2025-03-23 20:36:45.048729 | orchestrator | TASK [osism.commons.operator : Copy user sudoers file] ************************* 2025-03-23 20:36:45.049117 | orchestrator | Sunday 23 March 2025 20:36:45 +0000 (0:00:01.261) 0:00:08.057 ********** 2025-03-23 20:36:46.546461 | orchestrator | changed: [testbed-node-3] 2025-03-23 20:36:46.546851 | orchestrator | changed: [testbed-node-1] 2025-03-23 20:36:46.548327 | orchestrator | changed: [testbed-node-2] 2025-03-23 20:36:46.548676 | orchestrator | changed: [testbed-node-4] 2025-03-23 20:36:46.549648 | orchestrator | changed: [testbed-node-0] 2025-03-23 20:36:46.550556 | orchestrator | changed: [testbed-node-5] 2025-03-23 20:36:46.551045 | orchestrator | 2025-03-23 20:36:46.551661 | orchestrator | TASK [osism.commons.operator : Set language variables in .bashrc configuration file] *** 2025-03-23 20:36:46.552356 | orchestrator | Sunday 23 March 2025 20:36:46 +0000 (0:00:01.505) 0:00:09.562 ********** 2025-03-23 20:36:47.820024 | orchestrator | [WARNING]: Module remote_tmp /root/.ansible/tmp did not exist and was created 2025-03-23 20:36:47.821577 | orchestrator | with a mode of 0700, this may cause issues when running as another user. 
To 2025-03-23 20:36:47.822743 | orchestrator | avoid this, create the remote_tmp dir with the correct permissions manually 2025-03-23 20:36:48.076713 | orchestrator | changed: [testbed-node-3] => (item=export LANGUAGE=C.UTF-8) 2025-03-23 20:36:48.077757 | orchestrator | changed: [testbed-node-0] => (item=export LANGUAGE=C.UTF-8) 2025-03-23 20:36:48.077810 | orchestrator | changed: [testbed-node-1] => (item=export LANGUAGE=C.UTF-8) 2025-03-23 20:36:48.077824 | orchestrator | changed: [testbed-node-2] => (item=export LANGUAGE=C.UTF-8) 2025-03-23 20:36:48.077844 | orchestrator | changed: [testbed-node-4] => (item=export LANGUAGE=C.UTF-8) 2025-03-23 20:36:48.078715 | orchestrator | changed: [testbed-node-5] => (item=export LANGUAGE=C.UTF-8) 2025-03-23 20:36:48.079090 | orchestrator | changed: [testbed-node-3] => (item=export LANG=C.UTF-8) 2025-03-23 20:36:48.079531 | orchestrator | changed: [testbed-node-2] => (item=export LANG=C.UTF-8) 2025-03-23 20:36:48.081231 | orchestrator | changed: [testbed-node-1] => (item=export LANG=C.UTF-8) 2025-03-23 20:36:48.081573 | orchestrator | changed: [testbed-node-0] => (item=export LANG=C.UTF-8) 2025-03-23 20:36:48.081601 | orchestrator | changed: [testbed-node-4] => (item=export LANG=C.UTF-8) 2025-03-23 20:36:48.081977 | orchestrator | changed: [testbed-node-5] => (item=export LANG=C.UTF-8) 2025-03-23 20:36:48.082816 | orchestrator | changed: [testbed-node-3] => (item=export LC_ALL=C.UTF-8) 2025-03-23 20:36:48.083079 | orchestrator | changed: [testbed-node-2] => (item=export LC_ALL=C.UTF-8) 2025-03-23 20:36:48.083791 | orchestrator | changed: [testbed-node-1] => (item=export LC_ALL=C.UTF-8) 2025-03-23 20:36:48.084074 | orchestrator | changed: [testbed-node-4] => (item=export LC_ALL=C.UTF-8) 2025-03-23 20:36:48.084882 | orchestrator | changed: [testbed-node-0] => (item=export LC_ALL=C.UTF-8) 2025-03-23 20:36:48.085309 | orchestrator | changed: [testbed-node-5] => (item=export LC_ALL=C.UTF-8) 2025-03-23 20:36:48.086009 | orchestrator | 2025-03-23 20:36:48.086238 | orchestrator | TASK [osism.commons.operator : Create .ssh directory] ************************** 2025-03-23 20:36:48.086841 | orchestrator | Sunday 23 March 2025 20:36:48 +0000 (0:00:01.528) 0:00:11.091 ********** 2025-03-23 20:36:48.811799 | orchestrator | changed: [testbed-node-1] 2025-03-23 20:36:48.811970 | orchestrator | changed: [testbed-node-0] 2025-03-23 20:36:48.815065 | orchestrator | changed: [testbed-node-3] 2025-03-23 20:36:48.816286 | orchestrator | changed: [testbed-node-2] 2025-03-23 20:36:48.817574 | orchestrator | changed: [testbed-node-5] 2025-03-23 20:36:48.819478 | orchestrator | changed: [testbed-node-4] 2025-03-23 20:36:48.822625 | orchestrator | 2025-03-23 20:36:48.827335 | orchestrator | TASK [osism.commons.operator : Check number of SSH authorized keys] ************ 2025-03-23 20:36:48.830744 | orchestrator | Sunday 23 March 2025 20:36:48 +0000 (0:00:00.736) 0:00:11.827 ********** 2025-03-23 20:36:48.884155 | orchestrator | skipping: [testbed-node-0] 2025-03-23 20:36:48.915668 | orchestrator | skipping: [testbed-node-1] 2025-03-23 20:36:48.946808 | orchestrator | skipping: [testbed-node-2] 2025-03-23 20:36:49.005946 | orchestrator | skipping: [testbed-node-3] 2025-03-23 20:36:49.007774 | orchestrator | skipping: [testbed-node-4] 2025-03-23 20:36:49.009465 | orchestrator | skipping: [testbed-node-5] 2025-03-23 20:36:49.012026 | orchestrator | 2025-03-23 20:36:49.014773 | orchestrator | TASK [osism.commons.operator : Set ssh authorized keys] ************************ 
2025-03-23 20:36:49.984929 | orchestrator | Sunday 23 March 2025 20:36:49 +0000 (0:00:00.194) 0:00:12.022 ********** 2025-03-23 20:36:49.985055 | orchestrator | changed: [testbed-node-1] => (item=None) 2025-03-23 20:36:49.985367 | orchestrator | changed: [testbed-node-1] 2025-03-23 20:36:49.985687 | orchestrator | changed: [testbed-node-0] => (item=None) 2025-03-23 20:36:49.986462 | orchestrator | changed: [testbed-node-0] 2025-03-23 20:36:49.986904 | orchestrator | changed: [testbed-node-2] => (item=None) 2025-03-23 20:36:49.987181 | orchestrator | changed: [testbed-node-2] 2025-03-23 20:36:49.987988 | orchestrator | changed: [testbed-node-4] => (item=None) 2025-03-23 20:36:49.991642 | orchestrator | changed: [testbed-node-4] 2025-03-23 20:36:49.992448 | orchestrator | changed: [testbed-node-5] => (item=None) 2025-03-23 20:36:49.993319 | orchestrator | changed: [testbed-node-5] 2025-03-23 20:36:49.993603 | orchestrator | changed: [testbed-node-3] => (item=None) 2025-03-23 20:36:49.994382 | orchestrator | changed: [testbed-node-3] 2025-03-23 20:36:49.994841 | orchestrator | 2025-03-23 20:36:49.995328 | orchestrator | TASK [osism.commons.operator : Delete ssh authorized keys] ********************* 2025-03-23 20:36:49.995663 | orchestrator | Sunday 23 March 2025 20:36:49 +0000 (0:00:00.976) 0:00:12.998 ********** 2025-03-23 20:36:50.085761 | orchestrator | skipping: [testbed-node-0] 2025-03-23 20:36:50.126609 | orchestrator | skipping: [testbed-node-1] 2025-03-23 20:36:50.160381 | orchestrator | skipping: [testbed-node-2] 2025-03-23 20:36:50.199555 | orchestrator | skipping: [testbed-node-3] 2025-03-23 20:36:50.200288 | orchestrator | skipping: [testbed-node-4] 2025-03-23 20:36:50.200320 | orchestrator | skipping: [testbed-node-5] 2025-03-23 20:36:50.203108 | orchestrator | 2025-03-23 20:36:50.205359 | orchestrator | TASK [osism.commons.operator : Set authorized GitHub accounts] ***************** 2025-03-23 20:36:50.206663 | orchestrator | Sunday 23 March 2025 20:36:50 +0000 (0:00:00.212) 0:00:13.210 ********** 2025-03-23 20:36:50.243393 | orchestrator | skipping: [testbed-node-0] 2025-03-23 20:36:50.279760 | orchestrator | skipping: [testbed-node-1] 2025-03-23 20:36:50.311407 | orchestrator | skipping: [testbed-node-2] 2025-03-23 20:36:50.356578 | orchestrator | skipping: [testbed-node-3] 2025-03-23 20:36:50.400234 | orchestrator | skipping: [testbed-node-4] 2025-03-23 20:36:50.400952 | orchestrator | skipping: [testbed-node-5] 2025-03-23 20:36:50.401743 | orchestrator | 2025-03-23 20:36:50.403453 | orchestrator | TASK [osism.commons.operator : Delete authorized GitHub accounts] ************** 2025-03-23 20:36:50.406487 | orchestrator | Sunday 23 March 2025 20:36:50 +0000 (0:00:00.204) 0:00:13.415 ********** 2025-03-23 20:36:50.463856 | orchestrator | skipping: [testbed-node-0] 2025-03-23 20:36:50.497774 | orchestrator | skipping: [testbed-node-1] 2025-03-23 20:36:50.530318 | orchestrator | skipping: [testbed-node-2] 2025-03-23 20:36:50.567221 | orchestrator | skipping: [testbed-node-3] 2025-03-23 20:36:50.608849 | orchestrator | skipping: [testbed-node-4] 2025-03-23 20:36:50.610473 | orchestrator | skipping: [testbed-node-5] 2025-03-23 20:36:50.610744 | orchestrator | 2025-03-23 20:36:50.611984 | orchestrator | TASK [osism.commons.operator : Set password] *********************************** 2025-03-23 20:36:50.612069 | orchestrator | Sunday 23 March 2025 20:36:50 +0000 (0:00:00.209) 0:00:13.625 ********** 2025-03-23 20:36:51.354167 | orchestrator | changed: [testbed-node-1] 2025-03-23 
20:36:51.354378 | orchestrator | changed: [testbed-node-3] 2025-03-23 20:36:51.354926 | orchestrator | changed: [testbed-node-2] 2025-03-23 20:36:51.355643 | orchestrator | changed: [testbed-node-4] 2025-03-23 20:36:51.356534 | orchestrator | changed: [testbed-node-0] 2025-03-23 20:36:51.356602 | orchestrator | changed: [testbed-node-5] 2025-03-23 20:36:51.357321 | orchestrator | 2025-03-23 20:36:51.358591 | orchestrator | TASK [osism.commons.operator : Unset & lock password] ************************** 2025-03-23 20:36:51.360173 | orchestrator | Sunday 23 March 2025 20:36:51 +0000 (0:00:00.745) 0:00:14.370 ********** 2025-03-23 20:36:51.437216 | orchestrator | skipping: [testbed-node-0] 2025-03-23 20:36:51.468452 | orchestrator | skipping: [testbed-node-1] 2025-03-23 20:36:51.494149 | orchestrator | skipping: [testbed-node-2] 2025-03-23 20:36:51.609154 | orchestrator | skipping: [testbed-node-3] 2025-03-23 20:36:51.609606 | orchestrator | skipping: [testbed-node-4] 2025-03-23 20:36:51.609638 | orchestrator | skipping: [testbed-node-5] 2025-03-23 20:36:51.610957 | orchestrator | 2025-03-23 20:36:51.611970 | orchestrator | PLAY RECAP ********************************************************************* 2025-03-23 20:36:51.612725 | orchestrator | 2025-03-23 20:36:51 | INFO  | Play has been completed. There may now be a delay until all logs have been written. 2025-03-23 20:36:51.613698 | orchestrator | 2025-03-23 20:36:51 | INFO  | Please wait and do not abort execution. 2025-03-23 20:36:51.613734 | orchestrator | testbed-node-0 : ok=12  changed=8  unreachable=0 failed=0 skipped=5  rescued=0 ignored=0 2025-03-23 20:36:51.614334 | orchestrator | testbed-node-1 : ok=12  changed=8  unreachable=0 failed=0 skipped=5  rescued=0 ignored=0 2025-03-23 20:36:51.615880 | orchestrator | testbed-node-2 : ok=12  changed=8  unreachable=0 failed=0 skipped=5  rescued=0 ignored=0 2025-03-23 20:36:51.616977 | orchestrator | testbed-node-3 : ok=12  changed=8  unreachable=0 failed=0 skipped=5  rescued=0 ignored=0 2025-03-23 20:36:51.617483 | orchestrator | testbed-node-4 : ok=12  changed=8  unreachable=0 failed=0 skipped=5  rescued=0 ignored=0 2025-03-23 20:36:51.618962 | orchestrator | testbed-node-5 : ok=12  changed=8  unreachable=0 failed=0 skipped=5  rescued=0 ignored=0 2025-03-23 20:36:51.619219 | orchestrator | 2025-03-23 20:36:51.620029 | orchestrator | Sunday 23 March 2025 20:36:51 +0000 (0:00:00.253) 0:00:14.624 ********** 2025-03-23 20:36:51.620664 | orchestrator | =============================================================================== 2025-03-23 20:36:51.621446 | orchestrator | Gathering Facts --------------------------------------------------------- 3.63s 2025-03-23 20:36:51.622309 | orchestrator | osism.commons.operator : Set language variables in .bashrc configuration file --- 1.53s 2025-03-23 20:36:51.623190 | orchestrator | osism.commons.operator : Copy user sudoers file ------------------------- 1.51s 2025-03-23 20:36:51.623656 | orchestrator | osism.commons.operator : Add user to additional groups ------------------ 1.26s 2025-03-23 20:36:51.624195 | orchestrator | osism.commons.operator : Set ssh authorized keys ------------------------ 0.98s 2025-03-23 20:36:51.625081 | orchestrator | Do not require tty for all users ---------------------------------------- 0.98s 2025-03-23 20:36:51.625579 | orchestrator | osism.commons.operator : Create user ------------------------------------ 0.84s 2025-03-23 20:36:51.625621 | orchestrator | osism.commons.operator : Set password 
----------------------------------- 0.75s 2025-03-23 20:36:51.626157 | orchestrator | osism.commons.operator : Create operator group -------------------------- 0.74s 2025-03-23 20:36:51.626567 | orchestrator | osism.commons.operator : Create .ssh directory -------------------------- 0.74s 2025-03-23 20:36:51.626971 | orchestrator | osism.commons.operator : Gather variables for each operating system ----- 0.28s 2025-03-23 20:36:51.627381 | orchestrator | osism.commons.operator : Unset & lock password -------------------------- 0.25s 2025-03-23 20:36:51.627780 | orchestrator | osism.commons.operator : Set operator_groups variable to default value --- 0.22s 2025-03-23 20:36:51.628237 | orchestrator | osism.commons.operator : Delete ssh authorized keys --------------------- 0.21s 2025-03-23 20:36:51.628625 | orchestrator | osism.commons.operator : Delete authorized GitHub accounts -------------- 0.21s 2025-03-23 20:36:51.628937 | orchestrator | osism.commons.operator : Set authorized GitHub accounts ----------------- 0.20s 2025-03-23 20:36:51.629418 | orchestrator | osism.commons.operator : Check number of SSH authorized keys ------------ 0.19s 2025-03-23 20:36:52.126616 | orchestrator | + osism apply --environment custom facts 2025-03-23 20:36:53.626629 | orchestrator | 2025-03-23 20:36:53 | INFO  | Trying to run play facts in environment custom 2025-03-23 20:36:53.682216 | orchestrator | 2025-03-23 20:36:53 | INFO  | Task fe37cb34-801f-4abc-ba27-352a040dc677 (facts) was prepared for execution. 2025-03-23 20:36:56.979754 | orchestrator | 2025-03-23 20:36:53 | INFO  | It takes a moment until task fe37cb34-801f-4abc-ba27-352a040dc677 (facts) has been started and output is visible here. 2025-03-23 20:36:56.979887 | orchestrator | 2025-03-23 20:36:56.980179 | orchestrator | PLAY [Copy custom network devices fact] **************************************** 2025-03-23 20:36:56.980813 | orchestrator | 2025-03-23 20:36:56.980846 | orchestrator | TASK [Create custom facts directory] ******************************************* 2025-03-23 20:36:56.984214 | orchestrator | Sunday 23 March 2025 20:36:56 +0000 (0:00:00.095) 0:00:00.095 ********** 2025-03-23 20:36:58.277528 | orchestrator | ok: [testbed-manager] 2025-03-23 20:36:59.374376 | orchestrator | changed: [testbed-node-4] 2025-03-23 20:36:59.376023 | orchestrator | changed: [testbed-node-0] 2025-03-23 20:36:59.376061 | orchestrator | changed: [testbed-node-5] 2025-03-23 20:36:59.377794 | orchestrator | changed: [testbed-node-3] 2025-03-23 20:36:59.379827 | orchestrator | changed: [testbed-node-1] 2025-03-23 20:36:59.382181 | orchestrator | changed: [testbed-node-2] 2025-03-23 20:36:59.384051 | orchestrator | 2025-03-23 20:36:59.384469 | orchestrator | TASK [Copy fact file] ********************************************************** 2025-03-23 20:36:59.385515 | orchestrator | Sunday 23 March 2025 20:36:59 +0000 (0:00:02.395) 0:00:02.491 ********** 2025-03-23 20:37:00.613063 | orchestrator | ok: [testbed-manager] 2025-03-23 20:37:01.523399 | orchestrator | changed: [testbed-node-4] 2025-03-23 20:37:01.524096 | orchestrator | changed: [testbed-node-5] 2025-03-23 20:37:01.526484 | orchestrator | changed: [testbed-node-3] 2025-03-23 20:37:01.530544 | orchestrator | changed: [testbed-node-0] 2025-03-23 20:37:01.531229 | orchestrator | changed: [testbed-node-1] 2025-03-23 20:37:01.531842 | orchestrator | changed: [testbed-node-2] 2025-03-23 20:37:01.532239 | orchestrator | 2025-03-23 20:37:01.537537 | orchestrator | PLAY [Copy custom ceph devices facts] 
****************************************** 2025-03-23 20:37:01.537824 | orchestrator | 2025-03-23 20:37:01.538185 | orchestrator | TASK [osism.commons.repository : Gather variables for each operating system] *** 2025-03-23 20:37:01.538588 | orchestrator | Sunday 23 March 2025 20:37:01 +0000 (0:00:02.149) 0:00:04.640 ********** 2025-03-23 20:37:01.595561 | orchestrator | ok: [testbed-node-3] 2025-03-23 20:37:01.656731 | orchestrator | ok: [testbed-node-4] 2025-03-23 20:37:01.656868 | orchestrator | ok: [testbed-node-5] 2025-03-23 20:37:01.658826 | orchestrator | 2025-03-23 20:37:01.658896 | orchestrator | TASK [osism.commons.repository : Set repository_default fact to default value] *** 2025-03-23 20:37:01.659181 | orchestrator | Sunday 23 March 2025 20:37:01 +0000 (0:00:00.137) 0:00:04.777 ********** 2025-03-23 20:37:01.800146 | orchestrator | ok: [testbed-node-3] 2025-03-23 20:37:01.801127 | orchestrator | ok: [testbed-node-4] 2025-03-23 20:37:01.801570 | orchestrator | ok: [testbed-node-5] 2025-03-23 20:37:01.802255 | orchestrator | 2025-03-23 20:37:01.802934 | orchestrator | TASK [osism.commons.repository : Set repositories to default] ****************** 2025-03-23 20:37:01.804021 | orchestrator | Sunday 23 March 2025 20:37:01 +0000 (0:00:00.143) 0:00:04.921 ********** 2025-03-23 20:37:01.934454 | orchestrator | ok: [testbed-node-3] 2025-03-23 20:37:01.935817 | orchestrator | ok: [testbed-node-4] 2025-03-23 20:37:01.939229 | orchestrator | ok: [testbed-node-5] 2025-03-23 20:37:01.939463 | orchestrator | 2025-03-23 20:37:01.939490 | orchestrator | TASK [osism.commons.repository : Include distribution specific repository tasks] *** 2025-03-23 20:37:01.939511 | orchestrator | Sunday 23 March 2025 20:37:01 +0000 (0:00:00.133) 0:00:05.055 ********** 2025-03-23 20:37:02.102579 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/repository/tasks/Ubuntu.yml for testbed-node-3, testbed-node-4, testbed-node-5 2025-03-23 20:37:02.103296 | orchestrator | 2025-03-23 20:37:02.103851 | orchestrator | TASK [osism.commons.repository : Create /etc/apt/sources.list.d directory] ***** 2025-03-23 20:37:02.103882 | orchestrator | Sunday 23 March 2025 20:37:02 +0000 (0:00:00.167) 0:00:05.222 ********** 2025-03-23 20:37:02.589082 | orchestrator | ok: [testbed-node-3] 2025-03-23 20:37:02.589802 | orchestrator | ok: [testbed-node-4] 2025-03-23 20:37:02.592031 | orchestrator | ok: [testbed-node-5] 2025-03-23 20:37:02.708128 | orchestrator | 2025-03-23 20:37:02.708163 | orchestrator | TASK [osism.commons.repository : Include tasks for Ubuntu < 24.04] ************* 2025-03-23 20:37:02.708178 | orchestrator | Sunday 23 March 2025 20:37:02 +0000 (0:00:00.487) 0:00:05.710 ********** 2025-03-23 20:37:02.708198 | orchestrator | skipping: [testbed-node-3] 2025-03-23 20:37:02.708495 | orchestrator | skipping: [testbed-node-4] 2025-03-23 20:37:02.708957 | orchestrator | skipping: [testbed-node-5] 2025-03-23 20:37:02.709341 | orchestrator | 2025-03-23 20:37:02.709751 | orchestrator | TASK [osism.commons.repository : Copy 99osism apt configuration] *************** 2025-03-23 20:37:02.710438 | orchestrator | Sunday 23 March 2025 20:37:02 +0000 (0:00:00.119) 0:00:05.829 ********** 2025-03-23 20:37:03.720965 | orchestrator | changed: [testbed-node-3] 2025-03-23 20:37:03.721706 | orchestrator | changed: [testbed-node-4] 2025-03-23 20:37:03.721755 | orchestrator | changed: [testbed-node-5] 2025-03-23 20:37:03.722160 | orchestrator | 2025-03-23 20:37:03.722938 | orchestrator | TASK 
[osism.commons.repository : Remove sources.list file] ********************* 2025-03-23 20:37:03.724385 | orchestrator | Sunday 23 March 2025 20:37:03 +0000 (0:00:01.009) 0:00:06.839 ********** 2025-03-23 20:37:04.200315 | orchestrator | ok: [testbed-node-4] 2025-03-23 20:37:04.201498 | orchestrator | ok: [testbed-node-5] 2025-03-23 20:37:04.205626 | orchestrator | ok: [testbed-node-3] 2025-03-23 20:37:05.349664 | orchestrator | 2025-03-23 20:37:05.349780 | orchestrator | TASK [osism.commons.repository : Copy ubuntu.sources file] ********************* 2025-03-23 20:37:05.349800 | orchestrator | Sunday 23 March 2025 20:37:04 +0000 (0:00:00.479) 0:00:07.318 ********** 2025-03-23 20:37:05.349830 | orchestrator | changed: [testbed-node-3] 2025-03-23 20:37:05.351067 | orchestrator | changed: [testbed-node-4] 2025-03-23 20:37:05.351106 | orchestrator | changed: [testbed-node-5] 2025-03-23 20:37:05.353174 | orchestrator | 2025-03-23 20:37:05.353447 | orchestrator | TASK [osism.commons.repository : Update package cache] ************************* 2025-03-23 20:37:05.353484 | orchestrator | Sunday 23 March 2025 20:37:05 +0000 (0:00:01.148) 0:00:08.467 ********** 2025-03-23 20:37:19.523625 | orchestrator | changed: [testbed-node-4] 2025-03-23 20:37:19.523864 | orchestrator | changed: [testbed-node-3] 2025-03-23 20:37:19.523906 | orchestrator | changed: [testbed-node-5] 2025-03-23 20:37:19.528092 | orchestrator | 2025-03-23 20:37:19.528639 | orchestrator | TASK [Install required packages (RedHat)] ************************************** 2025-03-23 20:37:19.635472 | orchestrator | Sunday 23 March 2025 20:37:19 +0000 (0:00:14.173) 0:00:22.640 ********** 2025-03-23 20:37:19.635563 | orchestrator | skipping: [testbed-node-3] 2025-03-23 20:37:19.636825 | orchestrator | skipping: [testbed-node-4] 2025-03-23 20:37:19.639401 | orchestrator | skipping: [testbed-node-5] 2025-03-23 20:37:19.639881 | orchestrator | 2025-03-23 20:37:19.641042 | orchestrator | TASK [Install required packages (Debian)] ************************************** 2025-03-23 20:37:19.642499 | orchestrator | Sunday 23 March 2025 20:37:19 +0000 (0:00:00.114) 0:00:22.754 ********** 2025-03-23 20:37:28.331352 | orchestrator | changed: [testbed-node-4] 2025-03-23 20:37:28.333183 | orchestrator | changed: [testbed-node-5] 2025-03-23 20:37:28.334105 | orchestrator | changed: [testbed-node-3] 2025-03-23 20:37:28.336217 | orchestrator | 2025-03-23 20:37:28.337474 | orchestrator | TASK [Create custom facts directory] ******************************************* 2025-03-23 20:37:28.338337 | orchestrator | Sunday 23 March 2025 20:37:28 +0000 (0:00:08.695) 0:00:31.450 ********** 2025-03-23 20:37:28.849583 | orchestrator | ok: [testbed-node-3] 2025-03-23 20:37:28.850238 | orchestrator | ok: [testbed-node-4] 2025-03-23 20:37:28.853316 | orchestrator | ok: [testbed-node-5] 2025-03-23 20:37:28.854102 | orchestrator | 2025-03-23 20:37:28.854141 | orchestrator | TASK [Copy fact files] ********************************************************* 2025-03-23 20:37:28.855419 | orchestrator | Sunday 23 March 2025 20:37:28 +0000 (0:00:00.520) 0:00:31.970 ********** 2025-03-23 20:37:32.763443 | orchestrator | changed: [testbed-node-4] => (item=testbed_ceph_devices) 2025-03-23 20:37:32.765779 | orchestrator | changed: [testbed-node-3] => (item=testbed_ceph_devices) 2025-03-23 20:37:32.767776 | orchestrator | changed: [testbed-node-5] => (item=testbed_ceph_devices) 2025-03-23 20:37:32.768380 | orchestrator | changed: [testbed-node-4] => (item=testbed_ceph_devices_all) 
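The repository handling above (dropping the legacy sources.list and writing a deb822-style ubuntu.sources before refreshing the package cache) could look roughly like this; the mirror URI and suites are illustrative placeholders for Ubuntu 24.04 (noble), not values taken from this job:

- name: Remove sources.list file
  ansible.builtin.file:
    path: /etc/apt/sources.list
    state: absent

- name: Copy ubuntu.sources file
  ansible.builtin.copy:
    dest: /etc/apt/sources.list.d/ubuntu.sources
    mode: "0644"
    content: |
      Types: deb
      URIs: http://archive.ubuntu.com/ubuntu
      Suites: noble noble-updates noble-backports
      Components: main restricted universe multiverse

- name: Update package cache
  ansible.builtin.apt:
    update_cache: true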
2025-03-23 20:37:32.769196 | orchestrator | changed: [testbed-node-3] => (item=testbed_ceph_devices_all) 2025-03-23 20:37:32.770277 | orchestrator | changed: [testbed-node-5] => (item=testbed_ceph_devices_all) 2025-03-23 20:37:32.772447 | orchestrator | changed: [testbed-node-5] => (item=testbed_ceph_osd_devices) 2025-03-23 20:37:32.775518 | orchestrator | changed: [testbed-node-4] => (item=testbed_ceph_osd_devices) 2025-03-23 20:37:32.776002 | orchestrator | changed: [testbed-node-3] => (item=testbed_ceph_osd_devices) 2025-03-23 20:37:32.777421 | orchestrator | changed: [testbed-node-4] => (item=testbed_ceph_osd_devices_all) 2025-03-23 20:37:32.778184 | orchestrator | changed: [testbed-node-3] => (item=testbed_ceph_osd_devices_all) 2025-03-23 20:37:32.779108 | orchestrator | changed: [testbed-node-5] => (item=testbed_ceph_osd_devices_all) 2025-03-23 20:37:32.779859 | orchestrator | 2025-03-23 20:37:32.780405 | orchestrator | RUNNING HANDLER [osism.commons.repository : Force update of package cache] ***** 2025-03-23 20:37:32.781266 | orchestrator | Sunday 23 March 2025 20:37:32 +0000 (0:00:03.910) 0:00:35.880 ********** 2025-03-23 20:37:33.961676 | orchestrator | ok: [testbed-node-3] 2025-03-23 20:37:33.963607 | orchestrator | ok: [testbed-node-4] 2025-03-23 20:37:33.965206 | orchestrator | ok: [testbed-node-5] 2025-03-23 20:37:33.966356 | orchestrator | 2025-03-23 20:37:33.967970 | orchestrator | PLAY [Gather facts for all hosts] ********************************************** 2025-03-23 20:37:33.969151 | orchestrator | 2025-03-23 20:37:33.970202 | orchestrator | TASK [Gathers facts about hosts] *********************************************** 2025-03-23 20:37:33.970401 | orchestrator | Sunday 23 March 2025 20:37:33 +0000 (0:00:01.198) 0:00:37.079 ********** 2025-03-23 20:37:35.794949 | orchestrator | ok: [testbed-node-2] 2025-03-23 20:37:39.170971 | orchestrator | ok: [testbed-node-1] 2025-03-23 20:37:39.171396 | orchestrator | ok: [testbed-node-0] 2025-03-23 20:37:39.171582 | orchestrator | ok: [testbed-manager] 2025-03-23 20:37:39.171835 | orchestrator | ok: [testbed-node-3] 2025-03-23 20:37:39.172698 | orchestrator | ok: [testbed-node-4] 2025-03-23 20:37:39.172801 | orchestrator | ok: [testbed-node-5] 2025-03-23 20:37:39.173771 | orchestrator | 2025-03-23 20:37:39.174510 | orchestrator | PLAY RECAP ********************************************************************* 2025-03-23 20:37:39.175416 | orchestrator | 2025-03-23 20:37:39 | INFO  | Play has been completed. There may now be a delay until all logs have been written. 2025-03-23 20:37:39.176570 | orchestrator | 2025-03-23 20:37:39 | INFO  | Please wait and do not abort execution. 
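For context, the fact files copied above land in Ansible's local-facts directory and become readable under ansible_local once facts are re-gathered, which is why a fact-gathering play follows immediately. A sketch of the mechanism, using Ansible's standard custom-facts path and the file names shown in the loop items above:

- name: Create custom facts directory
  ansible.builtin.file:
    path: /etc/ansible/facts.d
    state: directory
    mode: "0755"

- name: Copy fact files
  ansible.builtin.copy:
    src: "{{ item }}.fact"
    dest: "/etc/ansible/facts.d/{{ item }}.fact"
    mode: "0755"
  loop:
    - testbed_ceph_devices
    - testbed_ceph_devices_all
    - testbed_ceph_osd_devices
    - testbed_ceph_osd_devices_all

# After "Gathers facts about hosts", the values are available as
# ansible_local.testbed_ceph_devices and friends.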
2025-03-23 20:37:39.176605 | orchestrator | testbed-manager : ok=3  changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 2025-03-23 20:37:39.177134 | orchestrator | testbed-node-0 : ok=3  changed=2  unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 2025-03-23 20:37:39.177160 | orchestrator | testbed-node-1 : ok=3  changed=2  unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 2025-03-23 20:37:39.177180 | orchestrator | testbed-node-2 : ok=3  changed=2  unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 2025-03-23 20:37:39.178090 | orchestrator | testbed-node-3 : ok=16  changed=7  unreachable=0 failed=0 skipped=2  rescued=0 ignored=0 2025-03-23 20:37:39.178238 | orchestrator | testbed-node-4 : ok=16  changed=7  unreachable=0 failed=0 skipped=2  rescued=0 ignored=0 2025-03-23 20:37:39.178664 | orchestrator | testbed-node-5 : ok=16  changed=7  unreachable=0 failed=0 skipped=2  rescued=0 ignored=0 2025-03-23 20:37:39.179335 | orchestrator | 2025-03-23 20:37:39.179441 | orchestrator | Sunday 23 March 2025 20:37:39 +0000 (0:00:05.212) 0:00:42.291 ********** 2025-03-23 20:37:39.180194 | orchestrator | =============================================================================== 2025-03-23 20:37:39.180953 | orchestrator | osism.commons.repository : Update package cache ------------------------ 14.17s 2025-03-23 20:37:39.181486 | orchestrator | Install required packages (Debian) -------------------------------------- 8.70s 2025-03-23 20:37:39.181515 | orchestrator | Gathers facts about hosts ----------------------------------------------- 5.21s 2025-03-23 20:37:39.182407 | orchestrator | Copy fact files --------------------------------------------------------- 3.91s 2025-03-23 20:37:39.182753 | orchestrator | Create custom facts directory ------------------------------------------- 2.40s 2025-03-23 20:37:39.183175 | orchestrator | Copy fact file ---------------------------------------------------------- 2.15s 2025-03-23 20:37:39.183672 | orchestrator | osism.commons.repository : Force update of package cache ---------------- 1.20s 2025-03-23 20:37:39.183930 | orchestrator | osism.commons.repository : Copy ubuntu.sources file --------------------- 1.15s 2025-03-23 20:37:39.184400 | orchestrator | osism.commons.repository : Copy 99osism apt configuration --------------- 1.01s 2025-03-23 20:37:39.184939 | orchestrator | Create custom facts directory ------------------------------------------- 0.52s 2025-03-23 20:37:39.185453 | orchestrator | osism.commons.repository : Create /etc/apt/sources.list.d directory ----- 0.49s 2025-03-23 20:37:39.185945 | orchestrator | osism.commons.repository : Remove sources.list file --------------------- 0.48s 2025-03-23 20:37:39.186642 | orchestrator | osism.commons.repository : Include distribution specific repository tasks --- 0.17s 2025-03-23 20:37:39.188087 | orchestrator | osism.commons.repository : Set repository_default fact to default value --- 0.14s 2025-03-23 20:37:39.189107 | orchestrator | osism.commons.repository : Gather variables for each operating system --- 0.14s 2025-03-23 20:37:39.189133 | orchestrator | osism.commons.repository : Set repositories to default ------------------ 0.13s 2025-03-23 20:37:39.189147 | orchestrator | osism.commons.repository : Include tasks for Ubuntu < 24.04 ------------- 0.12s 2025-03-23 20:37:39.189166 | orchestrator | Install required packages (RedHat) -------------------------------------- 0.11s 2025-03-23 20:37:39.736850 | orchestrator | + osism apply bootstrap 2025-03-23 20:37:41.367008 | 
orchestrator | 2025-03-23 20:37:41 | INFO  | Task e3d0fe5a-e08f-4ad7-8cdc-416a95828406 (bootstrap) was prepared for execution. 2025-03-23 20:37:41.367717 | orchestrator | 2025-03-23 20:37:41 | INFO  | It takes a moment until task e3d0fe5a-e08f-4ad7-8cdc-416a95828406 (bootstrap) has been started and output is visible here. 2025-03-23 20:37:45.067120 | orchestrator | 2025-03-23 20:37:45.070103 | orchestrator | PLAY [Group hosts based on state bootstrap] ************************************ 2025-03-23 20:37:45.071012 | orchestrator | 2025-03-23 20:37:45.072115 | orchestrator | TASK [Group hosts based on state bootstrap] ************************************ 2025-03-23 20:37:45.073150 | orchestrator | Sunday 23 March 2025 20:37:45 +0000 (0:00:00.117) 0:00:00.117 ********** 2025-03-23 20:37:45.147403 | orchestrator | ok: [testbed-manager] 2025-03-23 20:37:45.190508 | orchestrator | ok: [testbed-node-3] 2025-03-23 20:37:45.213555 | orchestrator | ok: [testbed-node-4] 2025-03-23 20:37:45.259290 | orchestrator | ok: [testbed-node-5] 2025-03-23 20:37:45.359069 | orchestrator | ok: [testbed-node-0] 2025-03-23 20:37:45.360385 | orchestrator | ok: [testbed-node-1] 2025-03-23 20:37:45.362207 | orchestrator | ok: [testbed-node-2] 2025-03-23 20:37:45.363135 | orchestrator | 2025-03-23 20:37:45.366207 | orchestrator | PLAY [Gather facts for all hosts] ********************************************** 2025-03-23 20:37:49.189046 | orchestrator | 2025-03-23 20:37:49.189171 | orchestrator | TASK [Gathers facts about hosts] *********************************************** 2025-03-23 20:37:49.189192 | orchestrator | Sunday 23 March 2025 20:37:45 +0000 (0:00:00.288) 0:00:00.406 ********** 2025-03-23 20:37:49.189224 | orchestrator | ok: [testbed-node-2] 2025-03-23 20:37:49.189607 | orchestrator | ok: [testbed-node-1] 2025-03-23 20:37:49.190418 | orchestrator | ok: [testbed-node-0] 2025-03-23 20:37:49.191490 | orchestrator | ok: [testbed-node-4] 2025-03-23 20:37:49.192041 | orchestrator | ok: [testbed-node-5] 2025-03-23 20:37:49.193184 | orchestrator | ok: [testbed-node-3] 2025-03-23 20:37:49.194118 | orchestrator | ok: [testbed-manager] 2025-03-23 20:37:49.194437 | orchestrator | 2025-03-23 20:37:49.195218 | orchestrator | PLAY [Gather facts for all hosts (if using --limit)] *************************** 2025-03-23 20:37:49.195576 | orchestrator | 2025-03-23 20:37:49.196004 | orchestrator | TASK [Gathers facts about hosts] *********************************************** 2025-03-23 20:37:49.196479 | orchestrator | Sunday 23 March 2025 20:37:49 +0000 (0:00:03.832) 0:00:04.238 ********** 2025-03-23 20:37:49.322433 | orchestrator | skipping: [testbed-manager] => (item=testbed-manager)  2025-03-23 20:37:49.324338 | orchestrator | skipping: [testbed-node-3] => (item=testbed-manager)  2025-03-23 20:37:49.377830 | orchestrator | skipping: [testbed-manager] => (item=testbed-node-3)  2025-03-23 20:37:49.377914 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-3)  2025-03-23 20:37:49.377930 | orchestrator | skipping: [testbed-node-4] => (item=testbed-manager)  2025-03-23 20:37:49.377945 | orchestrator | skipping: [testbed-manager] => (item=testbed-node-4)  2025-03-23 20:37:49.377958 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-4)  2025-03-23 20:37:49.377985 | orchestrator | skipping: [testbed-node-4] => (item=testbed-node-3)  2025-03-23 20:37:49.378153 | orchestrator | skipping: [testbed-manager] => (item=testbed-node-5)  2025-03-23 20:37:49.378233 | orchestrator | skipping: [testbed-node-3] => 
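The run of skipping entries here comes from the "(if using --limit)" play: it presumably delegates fact gathering to hosts not covered by the current limit, and since this job runs without --limit every item is skipped. A hedged sketch of such a task; the when guard is an assumption about how the play is protected, not the actual condition used:

- name: Gathers facts about hosts
  ansible.builtin.setup:
  delegate_to: "{{ item }}"
  delegate_facts: true
  loop: "{{ groups['all'] }}"
  when: ansible_limit is defined   # assumption; the real play may test something else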
(item=testbed-node-5)  2025-03-23 20:37:49.378654 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-0)  2025-03-23 20:37:49.378954 | orchestrator | skipping: [testbed-node-4] => (item=testbed-node-4)  2025-03-23 20:37:49.379049 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-1)  2025-03-23 20:37:49.380031 | orchestrator | skipping: [testbed-node-4] => (item=testbed-node-5)  2025-03-23 20:37:49.428673 | orchestrator | skipping: [testbed-manager] => (item=testbed-node-0)  2025-03-23 20:37:49.428747 | orchestrator | skipping: [testbed-node-5] => (item=testbed-manager)  2025-03-23 20:37:49.428825 | orchestrator | skipping: [testbed-node-4] => (item=testbed-node-0)  2025-03-23 20:37:49.428987 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-2)  2025-03-23 20:37:49.429393 | orchestrator | skipping: [testbed-manager] => (item=testbed-node-1)  2025-03-23 20:37:49.711508 | orchestrator | skipping: [testbed-node-4] => (item=testbed-node-1)  2025-03-23 20:37:49.712562 | orchestrator | skipping: [testbed-node-5] => (item=testbed-node-3)  2025-03-23 20:37:49.713334 | orchestrator | skipping: [testbed-node-0] => (item=testbed-manager)  2025-03-23 20:37:49.714262 | orchestrator | skipping: [testbed-node-3] 2025-03-23 20:37:49.715167 | orchestrator | skipping: [testbed-manager] => (item=testbed-node-2)  2025-03-23 20:37:49.716131 | orchestrator | skipping: [testbed-manager] 2025-03-23 20:37:49.719150 | orchestrator | skipping: [testbed-node-4] => (item=testbed-node-2)  2025-03-23 20:37:49.719411 | orchestrator | skipping: [testbed-node-4] 2025-03-23 20:37:49.719437 | orchestrator | skipping: [testbed-node-0] => (item=testbed-node-3)  2025-03-23 20:37:49.719457 | orchestrator | skipping: [testbed-node-5] => (item=testbed-node-4)  2025-03-23 20:37:49.720130 | orchestrator | skipping: [testbed-node-0] => (item=testbed-node-4)  2025-03-23 20:37:49.720830 | orchestrator | skipping: [testbed-node-5] => (item=testbed-node-5)  2025-03-23 20:37:49.722110 | orchestrator | skipping: [testbed-node-1] => (item=testbed-manager)  2025-03-23 20:37:49.722477 | orchestrator | skipping: [testbed-node-1] => (item=testbed-node-3)  2025-03-23 20:37:49.723099 | orchestrator | skipping: [testbed-node-2] => (item=testbed-manager)  2025-03-23 20:37:49.723833 | orchestrator | skipping: [testbed-node-5] => (item=testbed-node-0)  2025-03-23 20:37:49.724545 | orchestrator | skipping: [testbed-node-0] => (item=testbed-node-5)  2025-03-23 20:37:49.725175 | orchestrator | skipping: [testbed-node-2] => (item=testbed-node-3)  2025-03-23 20:37:49.726126 | orchestrator | skipping: [testbed-node-0] => (item=testbed-node-0)  2025-03-23 20:37:49.726466 | orchestrator | skipping: [testbed-node-5] => (item=testbed-node-1)  2025-03-23 20:37:49.727365 | orchestrator | skipping: [testbed-node-1] => (item=testbed-node-4)  2025-03-23 20:37:49.728261 | orchestrator | skipping: [testbed-node-5] => (item=testbed-node-2)  2025-03-23 20:37:49.728930 | orchestrator | skipping: [testbed-node-0] => (item=testbed-node-1)  2025-03-23 20:37:49.729943 | orchestrator | skipping: [testbed-node-5] 2025-03-23 20:37:49.730076 | orchestrator | skipping: [testbed-node-2] => (item=testbed-node-4)  2025-03-23 20:37:49.730724 | orchestrator | skipping: [testbed-node-1] => (item=testbed-node-5)  2025-03-23 20:37:49.731202 | orchestrator | skipping: [testbed-node-0] => (item=testbed-node-2)  2025-03-23 20:37:49.731447 | orchestrator | skipping: [testbed-node-0] 2025-03-23 20:37:49.731897 | orchestrator | skipping: 
[testbed-node-2] => (item=testbed-node-5)  2025-03-23 20:37:49.732530 | orchestrator | skipping: [testbed-node-1] => (item=testbed-node-0)  2025-03-23 20:37:49.732856 | orchestrator | skipping: [testbed-node-2] => (item=testbed-node-0)  2025-03-23 20:37:49.733346 | orchestrator | skipping: [testbed-node-1] => (item=testbed-node-1)  2025-03-23 20:37:49.733872 | orchestrator | skipping: [testbed-node-2] => (item=testbed-node-1)  2025-03-23 20:37:49.734340 | orchestrator | skipping: [testbed-node-1] => (item=testbed-node-2)  2025-03-23 20:37:49.734803 | orchestrator | skipping: [testbed-node-1] 2025-03-23 20:37:49.736136 | orchestrator | skipping: [testbed-node-2] => (item=testbed-node-2)  2025-03-23 20:37:49.736600 | orchestrator | skipping: [testbed-node-2] 2025-03-23 20:37:49.736625 | orchestrator | 2025-03-23 20:37:49.736645 | orchestrator | PLAY [Apply bootstrap roles part 1] ******************************************** 2025-03-23 20:37:49.737045 | orchestrator | 2025-03-23 20:37:49.737488 | orchestrator | TASK [osism.commons.hostname : Set hostname_name fact] ************************* 2025-03-23 20:37:49.738355 | orchestrator | Sunday 23 March 2025 20:37:49 +0000 (0:00:00.526) 0:00:04.764 ********** 2025-03-23 20:37:49.806375 | orchestrator | ok: [testbed-manager] 2025-03-23 20:37:49.837119 | orchestrator | ok: [testbed-node-3] 2025-03-23 20:37:49.863288 | orchestrator | ok: [testbed-node-4] 2025-03-23 20:37:49.892787 | orchestrator | ok: [testbed-node-5] 2025-03-23 20:37:49.956581 | orchestrator | ok: [testbed-node-0] 2025-03-23 20:37:49.957229 | orchestrator | ok: [testbed-node-1] 2025-03-23 20:37:49.958239 | orchestrator | ok: [testbed-node-2] 2025-03-23 20:37:49.958592 | orchestrator | 2025-03-23 20:37:49.958972 | orchestrator | TASK [osism.commons.hostname : Set hostname] *********************************** 2025-03-23 20:37:49.959675 | orchestrator | Sunday 23 March 2025 20:37:49 +0000 (0:00:00.244) 0:00:05.009 ********** 2025-03-23 20:37:51.307908 | orchestrator | ok: [testbed-node-5] 2025-03-23 20:37:51.308400 | orchestrator | ok: [testbed-node-2] 2025-03-23 20:37:51.309217 | orchestrator | ok: [testbed-node-4] 2025-03-23 20:37:51.309668 | orchestrator | ok: [testbed-node-3] 2025-03-23 20:37:51.310418 | orchestrator | ok: [testbed-node-1] 2025-03-23 20:37:51.311060 | orchestrator | ok: [testbed-node-0] 2025-03-23 20:37:51.311462 | orchestrator | ok: [testbed-manager] 2025-03-23 20:37:51.312145 | orchestrator | 2025-03-23 20:37:51.313184 | orchestrator | TASK [osism.commons.hostname : Copy /etc/hostname] ***************************** 2025-03-23 20:37:51.313795 | orchestrator | Sunday 23 March 2025 20:37:51 +0000 (0:00:01.349) 0:00:06.359 ********** 2025-03-23 20:37:52.611394 | orchestrator | ok: [testbed-node-2] 2025-03-23 20:37:52.612664 | orchestrator | ok: [testbed-node-1] 2025-03-23 20:37:52.612699 | orchestrator | ok: [testbed-node-5] 2025-03-23 20:37:52.614545 | orchestrator | ok: [testbed-node-4] 2025-03-23 20:37:52.615331 | orchestrator | ok: [testbed-manager] 2025-03-23 20:37:52.616666 | orchestrator | ok: [testbed-node-3] 2025-03-23 20:37:52.617271 | orchestrator | ok: [testbed-node-0] 2025-03-23 20:37:52.618439 | orchestrator | 2025-03-23 20:37:52.618949 | orchestrator | TASK [osism.commons.hosts : Include type specific tasks] *********************** 2025-03-23 20:37:52.620199 | orchestrator | Sunday 23 March 2025 20:37:52 +0000 (0:00:01.301) 0:00:07.660 ********** 2025-03-23 20:37:52.930426 | orchestrator | included: 
/usr/share/ansible/collections/ansible_collections/osism/commons/roles/hosts/tasks/type-template.yml for testbed-manager, testbed-node-3, testbed-node-4, testbed-node-5, testbed-node-0, testbed-node-1, testbed-node-2 2025-03-23 20:37:52.930610 | orchestrator | 2025-03-23 20:37:52.930859 | orchestrator | TASK [osism.commons.hosts : Copy /etc/hosts file] ****************************** 2025-03-23 20:37:52.931328 | orchestrator | Sunday 23 March 2025 20:37:52 +0000 (0:00:00.322) 0:00:07.983 ********** 2025-03-23 20:37:55.434720 | orchestrator | changed: [testbed-node-3] 2025-03-23 20:37:55.437945 | orchestrator | changed: [testbed-manager] 2025-03-23 20:37:55.440108 | orchestrator | changed: [testbed-node-2] 2025-03-23 20:37:55.440346 | orchestrator | changed: [testbed-node-4] 2025-03-23 20:37:55.441503 | orchestrator | changed: [testbed-node-5] 2025-03-23 20:37:55.442693 | orchestrator | changed: [testbed-node-1] 2025-03-23 20:37:55.443375 | orchestrator | changed: [testbed-node-0] 2025-03-23 20:37:55.444155 | orchestrator | 2025-03-23 20:37:55.444824 | orchestrator | TASK [osism.commons.proxy : Include distribution specific tasks] *************** 2025-03-23 20:37:55.445392 | orchestrator | Sunday 23 March 2025 20:37:55 +0000 (0:00:02.500) 0:00:10.484 ********** 2025-03-23 20:37:55.517399 | orchestrator | skipping: [testbed-manager] 2025-03-23 20:37:55.752548 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/proxy/tasks/Debian-family.yml for testbed-node-3, testbed-node-4, testbed-node-5, testbed-node-0, testbed-node-1, testbed-node-2 2025-03-23 20:37:55.753242 | orchestrator | 2025-03-23 20:37:55.754455 | orchestrator | TASK [osism.commons.proxy : Configure proxy parameters for apt] **************** 2025-03-23 20:37:55.755068 | orchestrator | Sunday 23 March 2025 20:37:55 +0000 (0:00:00.320) 0:00:10.804 ********** 2025-03-23 20:37:56.928081 | orchestrator | changed: [testbed-node-4] 2025-03-23 20:37:56.928953 | orchestrator | changed: [testbed-node-3] 2025-03-23 20:37:56.928989 | orchestrator | changed: [testbed-node-5] 2025-03-23 20:37:56.929006 | orchestrator | changed: [testbed-node-1] 2025-03-23 20:37:56.929021 | orchestrator | changed: [testbed-node-0] 2025-03-23 20:37:56.929035 | orchestrator | changed: [testbed-node-2] 2025-03-23 20:37:56.929057 | orchestrator | 2025-03-23 20:37:56.929699 | orchestrator | TASK [osism.commons.proxy : Set system wide settings in environment file] ****** 2025-03-23 20:37:56.930432 | orchestrator | Sunday 23 March 2025 20:37:56 +0000 (0:00:01.170) 0:00:11.974 ********** 2025-03-23 20:37:56.973076 | orchestrator | skipping: [testbed-manager] 2025-03-23 20:37:57.543861 | orchestrator | changed: [testbed-node-3] 2025-03-23 20:37:57.544024 | orchestrator | changed: [testbed-node-2] 2025-03-23 20:37:57.545121 | orchestrator | changed: [testbed-node-4] 2025-03-23 20:37:57.550602 | orchestrator | changed: [testbed-node-1] 2025-03-23 20:37:57.550869 | orchestrator | changed: [testbed-node-5] 2025-03-23 20:37:57.552070 | orchestrator | changed: [testbed-node-0] 2025-03-23 20:37:57.552101 | orchestrator | 2025-03-23 20:37:57.661861 | orchestrator | TASK [osism.commons.proxy : Remove system wide settings in environment file] *** 2025-03-23 20:37:57.661928 | orchestrator | Sunday 23 March 2025 20:37:57 +0000 (0:00:00.620) 0:00:12.595 ********** 2025-03-23 20:37:57.661954 | orchestrator | skipping: [testbed-node-3] 2025-03-23 20:37:57.690844 | orchestrator | skipping: [testbed-node-4] 2025-03-23 20:37:57.723123 | 
orchestrator | skipping: [testbed-node-5] 2025-03-23 20:37:58.083003 | orchestrator | skipping: [testbed-node-0] 2025-03-23 20:37:58.084465 | orchestrator | skipping: [testbed-node-1] 2025-03-23 20:37:58.086442 | orchestrator | skipping: [testbed-node-2] 2025-03-23 20:37:58.087400 | orchestrator | ok: [testbed-manager] 2025-03-23 20:37:58.088742 | orchestrator | 2025-03-23 20:37:58.089813 | orchestrator | TASK [osism.commons.resolvconf : Check minimum and maximum number of name servers] *** 2025-03-23 20:37:58.090362 | orchestrator | Sunday 23 March 2025 20:37:58 +0000 (0:00:00.540) 0:00:13.135 ********** 2025-03-23 20:37:58.160524 | orchestrator | skipping: [testbed-manager] 2025-03-23 20:37:58.193188 | orchestrator | skipping: [testbed-node-3] 2025-03-23 20:37:58.222109 | orchestrator | skipping: [testbed-node-4] 2025-03-23 20:37:58.250273 | orchestrator | skipping: [testbed-node-5] 2025-03-23 20:37:58.315669 | orchestrator | skipping: [testbed-node-0] 2025-03-23 20:37:58.316447 | orchestrator | skipping: [testbed-node-1] 2025-03-23 20:37:58.317123 | orchestrator | skipping: [testbed-node-2] 2025-03-23 20:37:58.318079 | orchestrator | 2025-03-23 20:37:58.318537 | orchestrator | TASK [osism.commons.resolvconf : Include resolvconf tasks] ********************* 2025-03-23 20:37:58.319563 | orchestrator | Sunday 23 March 2025 20:37:58 +0000 (0:00:00.232) 0:00:13.367 ********** 2025-03-23 20:37:58.654049 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/resolvconf/tasks/configure-resolv.yml for testbed-manager, testbed-node-3, testbed-node-4, testbed-node-5, testbed-node-0, testbed-node-1, testbed-node-2 2025-03-23 20:37:58.655368 | orchestrator | 2025-03-23 20:37:58.656094 | orchestrator | TASK [osism.commons.resolvconf : Include distribution specific installation tasks] *** 2025-03-23 20:37:58.657547 | orchestrator | Sunday 23 March 2025 20:37:58 +0000 (0:00:00.337) 0:00:13.704 ********** 2025-03-23 20:37:59.001830 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/resolvconf/tasks/install-Debian-family.yml for testbed-manager, testbed-node-3, testbed-node-4, testbed-node-5, testbed-node-0, testbed-node-1, testbed-node-2 2025-03-23 20:37:59.002092 | orchestrator | 2025-03-23 20:37:59.002577 | orchestrator | TASK [osism.commons.resolvconf : Remove packages configuring /etc/resolv.conf] *** 2025-03-23 20:37:59.003263 | orchestrator | Sunday 23 March 2025 20:37:58 +0000 (0:00:00.350) 0:00:14.054 ********** 2025-03-23 20:38:00.518694 | orchestrator | ok: [testbed-manager] 2025-03-23 20:38:00.519566 | orchestrator | ok: [testbed-node-4] 2025-03-23 20:38:00.521159 | orchestrator | ok: [testbed-node-1] 2025-03-23 20:38:00.521520 | orchestrator | ok: [testbed-node-3] 2025-03-23 20:38:00.522532 | orchestrator | ok: [testbed-node-2] 2025-03-23 20:38:00.523237 | orchestrator | ok: [testbed-node-5] 2025-03-23 20:38:00.524289 | orchestrator | ok: [testbed-node-0] 2025-03-23 20:38:00.524541 | orchestrator | 2025-03-23 20:38:00.525013 | orchestrator | TASK [osism.commons.resolvconf : Install package systemd-resolved] ************* 2025-03-23 20:38:00.526079 | orchestrator | Sunday 23 March 2025 20:38:00 +0000 (0:00:01.513) 0:00:15.568 ********** 2025-03-23 20:38:00.609473 | orchestrator | skipping: [testbed-manager] 2025-03-23 20:38:00.636269 | orchestrator | skipping: [testbed-node-3] 2025-03-23 20:38:00.660609 | orchestrator | skipping: [testbed-node-4] 2025-03-23 20:38:00.688556 | orchestrator | skipping: 
[testbed-node-5] 2025-03-23 20:38:00.747402 | orchestrator | skipping: [testbed-node-0] 2025-03-23 20:38:00.748473 | orchestrator | skipping: [testbed-node-1] 2025-03-23 20:38:00.750086 | orchestrator | skipping: [testbed-node-2] 2025-03-23 20:38:00.750957 | orchestrator | 2025-03-23 20:38:00.752194 | orchestrator | TASK [osism.commons.resolvconf : Retrieve file status of /etc/resolv.conf] ***** 2025-03-23 20:38:00.753067 | orchestrator | Sunday 23 March 2025 20:38:00 +0000 (0:00:00.231) 0:00:15.800 ********** 2025-03-23 20:38:01.337912 | orchestrator | ok: [testbed-manager] 2025-03-23 20:38:01.341547 | orchestrator | ok: [testbed-node-3] 2025-03-23 20:38:01.342641 | orchestrator | ok: [testbed-node-4] 2025-03-23 20:38:01.342675 | orchestrator | ok: [testbed-node-5] 2025-03-23 20:38:01.342692 | orchestrator | ok: [testbed-node-1] 2025-03-23 20:38:01.342712 | orchestrator | ok: [testbed-node-0] 2025-03-23 20:38:01.343111 | orchestrator | ok: [testbed-node-2] 2025-03-23 20:38:01.343840 | orchestrator | 2025-03-23 20:38:01.344215 | orchestrator | TASK [osism.commons.resolvconf : Archive existing file /etc/resolv.conf] ******* 2025-03-23 20:38:01.345232 | orchestrator | Sunday 23 March 2025 20:38:01 +0000 (0:00:00.589) 0:00:16.389 ********** 2025-03-23 20:38:01.430873 | orchestrator | skipping: [testbed-manager] 2025-03-23 20:38:01.459788 | orchestrator | skipping: [testbed-node-3] 2025-03-23 20:38:01.492469 | orchestrator | skipping: [testbed-node-4] 2025-03-23 20:38:01.519923 | orchestrator | skipping: [testbed-node-5] 2025-03-23 20:38:01.601493 | orchestrator | skipping: [testbed-node-0] 2025-03-23 20:38:01.604370 | orchestrator | skipping: [testbed-node-1] 2025-03-23 20:38:01.605293 | orchestrator | skipping: [testbed-node-2] 2025-03-23 20:38:01.605345 | orchestrator | 2025-03-23 20:38:01.605368 | orchestrator | TASK [osism.commons.resolvconf : Link /run/systemd/resolve/stub-resolv.conf to /etc/resolv.conf] *** 2025-03-23 20:38:01.605676 | orchestrator | Sunday 23 March 2025 20:38:01 +0000 (0:00:00.261) 0:00:16.650 ********** 2025-03-23 20:38:02.154625 | orchestrator | ok: [testbed-manager] 2025-03-23 20:38:02.154795 | orchestrator | changed: [testbed-node-4] 2025-03-23 20:38:02.154993 | orchestrator | changed: [testbed-node-3] 2025-03-23 20:38:02.155025 | orchestrator | changed: [testbed-node-5] 2025-03-23 20:38:02.156634 | orchestrator | changed: [testbed-node-0] 2025-03-23 20:38:02.157474 | orchestrator | changed: [testbed-node-1] 2025-03-23 20:38:02.158387 | orchestrator | changed: [testbed-node-2] 2025-03-23 20:38:02.158756 | orchestrator | 2025-03-23 20:38:02.159204 | orchestrator | TASK [osism.commons.resolvconf : Copy configuration files] ********************* 2025-03-23 20:38:03.334979 | orchestrator | Sunday 23 March 2025 20:38:02 +0000 (0:00:00.556) 0:00:17.207 ********** 2025-03-23 20:38:03.335114 | orchestrator | ok: [testbed-manager] 2025-03-23 20:38:03.336349 | orchestrator | changed: [testbed-node-3] 2025-03-23 20:38:03.337335 | orchestrator | changed: [testbed-node-4] 2025-03-23 20:38:03.338541 | orchestrator | changed: [testbed-node-5] 2025-03-23 20:38:03.339607 | orchestrator | changed: [testbed-node-1] 2025-03-23 20:38:03.340393 | orchestrator | changed: [testbed-node-0] 2025-03-23 20:38:03.340766 | orchestrator | changed: [testbed-node-2] 2025-03-23 20:38:03.341423 | orchestrator | 2025-03-23 20:38:03.341919 | orchestrator | TASK [osism.commons.resolvconf : Start/enable systemd-resolved service] ******** 2025-03-23 20:38:03.342636 | orchestrator | Sunday 23 March 2025 
20:38:03 +0000 (0:00:01.177) 0:00:18.385 ********** 2025-03-23 20:38:04.662449 | orchestrator | ok: [testbed-manager] 2025-03-23 20:38:04.662655 | orchestrator | ok: [testbed-node-3] 2025-03-23 20:38:04.663811 | orchestrator | ok: [testbed-node-5] 2025-03-23 20:38:04.665028 | orchestrator | ok: [testbed-node-1] 2025-03-23 20:38:04.666351 | orchestrator | ok: [testbed-node-2] 2025-03-23 20:38:04.668107 | orchestrator | ok: [testbed-node-4] 2025-03-23 20:38:04.668662 | orchestrator | ok: [testbed-node-0] 2025-03-23 20:38:04.669706 | orchestrator | 2025-03-23 20:38:04.670450 | orchestrator | TASK [osism.commons.resolvconf : Include distribution specific configuration tasks] *** 2025-03-23 20:38:04.671093 | orchestrator | Sunday 23 March 2025 20:38:04 +0000 (0:00:01.327) 0:00:19.712 ********** 2025-03-23 20:38:05.030916 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/resolvconf/tasks/configure-Debian-family.yml for testbed-manager, testbed-node-3, testbed-node-4, testbed-node-5, testbed-node-0, testbed-node-1, testbed-node-2 2025-03-23 20:38:05.031848 | orchestrator | 2025-03-23 20:38:05.032905 | orchestrator | TASK [osism.commons.resolvconf : Restart systemd-resolved service] ************* 2025-03-23 20:38:05.033844 | orchestrator | Sunday 23 March 2025 20:38:05 +0000 (0:00:00.370) 0:00:20.082 ********** 2025-03-23 20:38:05.115484 | orchestrator | skipping: [testbed-manager] 2025-03-23 20:38:06.660690 | orchestrator | changed: [testbed-node-4] 2025-03-23 20:38:06.662205 | orchestrator | changed: [testbed-node-1] 2025-03-23 20:38:06.665452 | orchestrator | changed: [testbed-node-2] 2025-03-23 20:38:06.665873 | orchestrator | changed: [testbed-node-0] 2025-03-23 20:38:06.665897 | orchestrator | changed: [testbed-node-5] 2025-03-23 20:38:06.665918 | orchestrator | changed: [testbed-node-3] 2025-03-23 20:38:06.667850 | orchestrator | 2025-03-23 20:38:06.667927 | orchestrator | TASK [osism.commons.repository : Gather variables for each operating system] *** 2025-03-23 20:38:06.668807 | orchestrator | Sunday 23 March 2025 20:38:06 +0000 (0:00:01.627) 0:00:21.710 ********** 2025-03-23 20:38:06.767923 | orchestrator | ok: [testbed-manager] 2025-03-23 20:38:06.798491 | orchestrator | ok: [testbed-node-3] 2025-03-23 20:38:06.831143 | orchestrator | ok: [testbed-node-4] 2025-03-23 20:38:06.857666 | orchestrator | ok: [testbed-node-5] 2025-03-23 20:38:06.921238 | orchestrator | ok: [testbed-node-0] 2025-03-23 20:38:06.922297 | orchestrator | ok: [testbed-node-1] 2025-03-23 20:38:06.923563 | orchestrator | ok: [testbed-node-2] 2025-03-23 20:38:06.923887 | orchestrator | 2025-03-23 20:38:06.924793 | orchestrator | TASK [osism.commons.repository : Set repository_default fact to default value] *** 2025-03-23 20:38:06.925546 | orchestrator | Sunday 23 March 2025 20:38:06 +0000 (0:00:00.262) 0:00:21.973 ********** 2025-03-23 20:38:07.038421 | orchestrator | ok: [testbed-manager] 2025-03-23 20:38:07.065288 | orchestrator | ok: [testbed-node-3] 2025-03-23 20:38:07.103484 | orchestrator | ok: [testbed-node-4] 2025-03-23 20:38:07.173985 | orchestrator | ok: [testbed-node-5] 2025-03-23 20:38:07.175129 | orchestrator | ok: [testbed-node-0] 2025-03-23 20:38:07.175767 | orchestrator | ok: [testbed-node-1] 2025-03-23 20:38:07.177365 | orchestrator | ok: [testbed-node-2] 2025-03-23 20:38:07.178615 | orchestrator | 2025-03-23 20:38:07.179647 | orchestrator | TASK [osism.commons.repository : Set repositories to default] ****************** 2025-03-23 20:38:07.180480 | 
orchestrator | Sunday 23 March 2025 20:38:07 +0000 (0:00:00.253) 0:00:22.226 ********** 2025-03-23 20:38:07.265744 | orchestrator | ok: [testbed-manager] 2025-03-23 20:38:07.292465 | orchestrator | ok: [testbed-node-3] 2025-03-23 20:38:07.328639 | orchestrator | ok: [testbed-node-4] 2025-03-23 20:38:07.355476 | orchestrator | ok: [testbed-node-5] 2025-03-23 20:38:07.427950 | orchestrator | ok: [testbed-node-0] 2025-03-23 20:38:07.428745 | orchestrator | ok: [testbed-node-1] 2025-03-23 20:38:07.430123 | orchestrator | ok: [testbed-node-2] 2025-03-23 20:38:07.430573 | orchestrator | 2025-03-23 20:38:07.431253 | orchestrator | TASK [osism.commons.repository : Include distribution specific repository tasks] *** 2025-03-23 20:38:07.432353 | orchestrator | Sunday 23 March 2025 20:38:07 +0000 (0:00:00.253) 0:00:22.480 ********** 2025-03-23 20:38:07.766405 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/repository/tasks/Ubuntu.yml for testbed-manager, testbed-node-3, testbed-node-4, testbed-node-5, testbed-node-0, testbed-node-1, testbed-node-2 2025-03-23 20:38:07.766727 | orchestrator | 2025-03-23 20:38:07.767649 | orchestrator | TASK [osism.commons.repository : Create /etc/apt/sources.list.d directory] ***** 2025-03-23 20:38:07.768146 | orchestrator | Sunday 23 March 2025 20:38:07 +0000 (0:00:00.337) 0:00:22.818 ********** 2025-03-23 20:38:08.366782 | orchestrator | ok: [testbed-manager] 2025-03-23 20:38:08.367229 | orchestrator | ok: [testbed-node-4] 2025-03-23 20:38:08.367466 | orchestrator | ok: [testbed-node-5] 2025-03-23 20:38:08.367965 | orchestrator | ok: [testbed-node-3] 2025-03-23 20:38:08.368273 | orchestrator | ok: [testbed-node-0] 2025-03-23 20:38:08.368940 | orchestrator | ok: [testbed-node-2] 2025-03-23 20:38:08.371080 | orchestrator | ok: [testbed-node-1] 2025-03-23 20:38:08.371573 | orchestrator | 2025-03-23 20:38:08.372609 | orchestrator | TASK [osism.commons.repository : Include tasks for Ubuntu < 24.04] ************* 2025-03-23 20:38:08.373177 | orchestrator | Sunday 23 March 2025 20:38:08 +0000 (0:00:00.600) 0:00:23.419 ********** 2025-03-23 20:38:08.459171 | orchestrator | skipping: [testbed-manager] 2025-03-23 20:38:08.489653 | orchestrator | skipping: [testbed-node-3] 2025-03-23 20:38:08.525381 | orchestrator | skipping: [testbed-node-4] 2025-03-23 20:38:08.561295 | orchestrator | skipping: [testbed-node-5] 2025-03-23 20:38:08.658501 | orchestrator | skipping: [testbed-node-0] 2025-03-23 20:38:08.658597 | orchestrator | skipping: [testbed-node-1] 2025-03-23 20:38:08.659449 | orchestrator | skipping: [testbed-node-2] 2025-03-23 20:38:08.659672 | orchestrator | 2025-03-23 20:38:08.660156 | orchestrator | TASK [osism.commons.repository : Copy 99osism apt configuration] *************** 2025-03-23 20:38:08.660492 | orchestrator | Sunday 23 March 2025 20:38:08 +0000 (0:00:00.291) 0:00:23.710 ********** 2025-03-23 20:38:09.766626 | orchestrator | ok: [testbed-node-4] 2025-03-23 20:38:09.767062 | orchestrator | ok: [testbed-node-3] 2025-03-23 20:38:09.767090 | orchestrator | ok: [testbed-node-5] 2025-03-23 20:38:09.767111 | orchestrator | changed: [testbed-manager] 2025-03-23 20:38:09.767384 | orchestrator | changed: [testbed-node-0] 2025-03-23 20:38:09.767445 | orchestrator | changed: [testbed-node-1] 2025-03-23 20:38:09.769183 | orchestrator | changed: [testbed-node-2] 2025-03-23 20:38:09.769374 | orchestrator | 2025-03-23 20:38:09.770225 | orchestrator | TASK [osism.commons.repository : Remove sources.list file] 
********************* 2025-03-23 20:38:09.770481 | orchestrator | Sunday 23 March 2025 20:38:09 +0000 (0:00:01.107) 0:00:24.817 ********** 2025-03-23 20:38:10.468787 | orchestrator | ok: [testbed-manager] 2025-03-23 20:38:10.469180 | orchestrator | ok: [testbed-node-3] 2025-03-23 20:38:10.469469 | orchestrator | ok: [testbed-node-4] 2025-03-23 20:38:10.470468 | orchestrator | ok: [testbed-node-5] 2025-03-23 20:38:10.470886 | orchestrator | ok: [testbed-node-0] 2025-03-23 20:38:10.471529 | orchestrator | ok: [testbed-node-1] 2025-03-23 20:38:10.475070 | orchestrator | ok: [testbed-node-2] 2025-03-23 20:38:10.475310 | orchestrator | 2025-03-23 20:38:10.475582 | orchestrator | TASK [osism.commons.repository : Copy ubuntu.sources file] ********************* 2025-03-23 20:38:10.476186 | orchestrator | Sunday 23 March 2025 20:38:10 +0000 (0:00:00.703) 0:00:25.521 ********** 2025-03-23 20:38:11.729973 | orchestrator | ok: [testbed-manager] 2025-03-23 20:38:11.731284 | orchestrator | ok: [testbed-node-3] 2025-03-23 20:38:11.731345 | orchestrator | ok: [testbed-node-5] 2025-03-23 20:38:11.731362 | orchestrator | ok: [testbed-node-4] 2025-03-23 20:38:11.731384 | orchestrator | changed: [testbed-node-2] 2025-03-23 20:38:11.732842 | orchestrator | changed: [testbed-node-1] 2025-03-23 20:38:11.733767 | orchestrator | changed: [testbed-node-0] 2025-03-23 20:38:11.735025 | orchestrator | 2025-03-23 20:38:11.735485 | orchestrator | TASK [osism.commons.repository : Update package cache] ************************* 2025-03-23 20:38:11.735970 | orchestrator | Sunday 23 March 2025 20:38:11 +0000 (0:00:01.258) 0:00:26.779 ********** 2025-03-23 20:38:26.357837 | orchestrator | ok: [testbed-node-3] 2025-03-23 20:38:26.360290 | orchestrator | ok: [testbed-node-4] 2025-03-23 20:38:26.361735 | orchestrator | ok: [testbed-node-5] 2025-03-23 20:38:26.362419 | orchestrator | changed: [testbed-manager] 2025-03-23 20:38:26.362823 | orchestrator | changed: [testbed-node-2] 2025-03-23 20:38:26.363389 | orchestrator | changed: [testbed-node-1] 2025-03-23 20:38:26.363596 | orchestrator | changed: [testbed-node-0] 2025-03-23 20:38:26.364597 | orchestrator | 2025-03-23 20:38:26.367466 | orchestrator | TASK [osism.services.rsyslog : Gather variables for each operating system] ***** 2025-03-23 20:38:26.367714 | orchestrator | Sunday 23 March 2025 20:38:26 +0000 (0:00:14.624) 0:00:41.404 ********** 2025-03-23 20:38:26.439571 | orchestrator | ok: [testbed-manager] 2025-03-23 20:38:26.477921 | orchestrator | ok: [testbed-node-3] 2025-03-23 20:38:26.505828 | orchestrator | ok: [testbed-node-4] 2025-03-23 20:38:26.538231 | orchestrator | ok: [testbed-node-5] 2025-03-23 20:38:26.598949 | orchestrator | ok: [testbed-node-0] 2025-03-23 20:38:26.600456 | orchestrator | ok: [testbed-node-1] 2025-03-23 20:38:26.602651 | orchestrator | ok: [testbed-node-2] 2025-03-23 20:38:26.603395 | orchestrator | 2025-03-23 20:38:26.603430 | orchestrator | TASK [osism.services.rsyslog : Set rsyslog_user variable to default value] ***** 2025-03-23 20:38:26.603955 | orchestrator | Sunday 23 March 2025 20:38:26 +0000 (0:00:00.247) 0:00:41.651 ********** 2025-03-23 20:38:26.679860 | orchestrator | ok: [testbed-manager] 2025-03-23 20:38:26.708537 | orchestrator | ok: [testbed-node-3] 2025-03-23 20:38:26.739533 | orchestrator | ok: [testbed-node-4] 2025-03-23 20:38:26.778688 | orchestrator | ok: [testbed-node-5] 2025-03-23 20:38:26.850453 | orchestrator | ok: [testbed-node-0] 2025-03-23 20:38:26.850587 | orchestrator | ok: [testbed-node-1] 2025-03-23 
20:38:26.850904 | orchestrator | ok: [testbed-node-2] 2025-03-23 20:38:26.850933 | orchestrator | 2025-03-23 20:38:26.852428 | orchestrator | TASK [osism.services.rsyslog : Set rsyslog_workdir variable to default value] *** 2025-03-23 20:38:26.853378 | orchestrator | Sunday 23 March 2025 20:38:26 +0000 (0:00:00.250) 0:00:41.902 ********** 2025-03-23 20:38:26.911198 | orchestrator | ok: [testbed-manager] 2025-03-23 20:38:26.977451 | orchestrator | ok: [testbed-node-3] 2025-03-23 20:38:27.004244 | orchestrator | ok: [testbed-node-4] 2025-03-23 20:38:27.035652 | orchestrator | ok: [testbed-node-5] 2025-03-23 20:38:27.115285 | orchestrator | ok: [testbed-node-0] 2025-03-23 20:38:27.116050 | orchestrator | ok: [testbed-node-1] 2025-03-23 20:38:27.116593 | orchestrator | ok: [testbed-node-2] 2025-03-23 20:38:27.117541 | orchestrator | 2025-03-23 20:38:27.118778 | orchestrator | TASK [osism.services.rsyslog : Include distribution specific install tasks] **** 2025-03-23 20:38:27.121056 | orchestrator | Sunday 23 March 2025 20:38:27 +0000 (0:00:00.264) 0:00:42.167 ********** 2025-03-23 20:38:27.460678 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/services/roles/rsyslog/tasks/install-Debian-family.yml for testbed-manager, testbed-node-3, testbed-node-4, testbed-node-5, testbed-node-0, testbed-node-1, testbed-node-2 2025-03-23 20:38:27.462867 | orchestrator | 2025-03-23 20:38:27.464127 | orchestrator | TASK [osism.services.rsyslog : Install rsyslog package] ************************ 2025-03-23 20:38:27.465279 | orchestrator | Sunday 23 March 2025 20:38:27 +0000 (0:00:00.344) 0:00:42.511 ********** 2025-03-23 20:38:29.357105 | orchestrator | ok: [testbed-manager] 2025-03-23 20:38:29.358483 | orchestrator | ok: [testbed-node-4] 2025-03-23 20:38:29.359742 | orchestrator | ok: [testbed-node-3] 2025-03-23 20:38:29.360504 | orchestrator | ok: [testbed-node-5] 2025-03-23 20:38:29.361693 | orchestrator | ok: [testbed-node-1] 2025-03-23 20:38:29.362487 | orchestrator | ok: [testbed-node-0] 2025-03-23 20:38:29.362950 | orchestrator | ok: [testbed-node-2] 2025-03-23 20:38:29.363265 | orchestrator | 2025-03-23 20:38:29.364464 | orchestrator | TASK [osism.services.rsyslog : Copy rsyslog.conf configuration file] *********** 2025-03-23 20:38:29.364536 | orchestrator | Sunday 23 March 2025 20:38:29 +0000 (0:00:01.897) 0:00:44.408 ********** 2025-03-23 20:38:30.548838 | orchestrator | changed: [testbed-manager] 2025-03-23 20:38:30.551523 | orchestrator | changed: [testbed-node-3] 2025-03-23 20:38:30.551611 | orchestrator | changed: [testbed-node-4] 2025-03-23 20:38:30.554082 | orchestrator | changed: [testbed-node-5] 2025-03-23 20:38:30.555907 | orchestrator | changed: [testbed-node-1] 2025-03-23 20:38:30.556796 | orchestrator | changed: [testbed-node-0] 2025-03-23 20:38:30.557887 | orchestrator | changed: [testbed-node-2] 2025-03-23 20:38:30.559052 | orchestrator | 2025-03-23 20:38:30.559792 | orchestrator | TASK [osism.services.rsyslog : Manage rsyslog service] ************************* 2025-03-23 20:38:30.560688 | orchestrator | Sunday 23 March 2025 20:38:30 +0000 (0:00:01.189) 0:00:45.598 ********** 2025-03-23 20:38:31.684238 | orchestrator | ok: [testbed-manager] 2025-03-23 20:38:31.684430 | orchestrator | ok: [testbed-node-3] 2025-03-23 20:38:31.684925 | orchestrator | ok: [testbed-node-4] 2025-03-23 20:38:31.685866 | orchestrator | ok: [testbed-node-5] 2025-03-23 20:38:31.686482 | orchestrator | ok: [testbed-node-1] 2025-03-23 20:38:31.687014 | orchestrator | ok: 
[testbed-node-0] 2025-03-23 20:38:31.687571 | orchestrator | ok: [testbed-node-2] 2025-03-23 20:38:31.687740 | orchestrator | 2025-03-23 20:38:31.688884 | orchestrator | TASK [osism.services.rsyslog : Include fluentd tasks] ************************** 2025-03-23 20:38:31.689138 | orchestrator | Sunday 23 March 2025 20:38:31 +0000 (0:00:01.137) 0:00:46.735 ********** 2025-03-23 20:38:31.991099 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/services/roles/rsyslog/tasks/fluentd.yml for testbed-manager, testbed-node-3, testbed-node-4, testbed-node-5, testbed-node-0, testbed-node-1, testbed-node-2 2025-03-23 20:38:31.991395 | orchestrator | 2025-03-23 20:38:31.992429 | orchestrator | TASK [osism.services.rsyslog : Forward syslog message to local fluentd daemon] *** 2025-03-23 20:38:31.993401 | orchestrator | Sunday 23 March 2025 20:38:31 +0000 (0:00:00.306) 0:00:47.042 ********** 2025-03-23 20:38:33.079193 | orchestrator | changed: [testbed-manager] 2025-03-23 20:38:33.080487 | orchestrator | changed: [testbed-node-3] 2025-03-23 20:38:33.082256 | orchestrator | changed: [testbed-node-4] 2025-03-23 20:38:33.083456 | orchestrator | changed: [testbed-node-5] 2025-03-23 20:38:33.083930 | orchestrator | changed: [testbed-node-1] 2025-03-23 20:38:33.084380 | orchestrator | changed: [testbed-node-0] 2025-03-23 20:38:33.084616 | orchestrator | changed: [testbed-node-2] 2025-03-23 20:38:33.085473 | orchestrator | 2025-03-23 20:38:33.088166 | orchestrator | TASK [osism.services.rsyslog : Include additional log server tasks] ************ 2025-03-23 20:38:33.088198 | orchestrator | Sunday 23 March 2025 20:38:33 +0000 (0:00:01.085) 0:00:48.128 ********** 2025-03-23 20:38:33.186961 | orchestrator | skipping: [testbed-manager] 2025-03-23 20:38:33.212872 | orchestrator | skipping: [testbed-node-3] 2025-03-23 20:38:33.241245 | orchestrator | skipping: [testbed-node-4] 2025-03-23 20:38:33.425272 | orchestrator | skipping: [testbed-node-5] 2025-03-23 20:38:33.426155 | orchestrator | skipping: [testbed-node-0] 2025-03-23 20:38:33.426976 | orchestrator | skipping: [testbed-node-1] 2025-03-23 20:38:33.428001 | orchestrator | skipping: [testbed-node-2] 2025-03-23 20:38:33.428443 | orchestrator | 2025-03-23 20:38:33.429165 | orchestrator | TASK [osism.commons.systohc : Install util-linux-extra package] **************** 2025-03-23 20:38:33.429662 | orchestrator | Sunday 23 March 2025 20:38:33 +0000 (0:00:00.347) 0:00:48.476 ********** 2025-03-23 20:38:47.328193 | orchestrator | changed: [testbed-node-4] 2025-03-23 20:38:47.328509 | orchestrator | changed: [testbed-node-5] 2025-03-23 20:38:47.329709 | orchestrator | changed: [testbed-node-1] 2025-03-23 20:38:47.330674 | orchestrator | changed: [testbed-node-3] 2025-03-23 20:38:47.332571 | orchestrator | changed: [testbed-node-0] 2025-03-23 20:38:47.333618 | orchestrator | changed: [testbed-node-2] 2025-03-23 20:38:47.334271 | orchestrator | changed: [testbed-manager] 2025-03-23 20:38:47.334806 | orchestrator | 2025-03-23 20:38:47.335561 | orchestrator | TASK [osism.commons.systohc : Sync hardware clock] ***************************** 2025-03-23 20:38:47.336194 | orchestrator | Sunday 23 March 2025 20:38:47 +0000 (0:00:13.900) 0:01:02.376 ********** 2025-03-23 20:38:48.677679 | orchestrator | ok: [testbed-manager] 2025-03-23 20:38:48.677856 | orchestrator | ok: [testbed-node-3] 2025-03-23 20:38:48.678922 | orchestrator | ok: [testbed-node-1] 2025-03-23 20:38:48.679995 | orchestrator | ok: [testbed-node-2] 2025-03-23 20:38:48.680368 | 
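The rsyslog tasks above install the package, replace rsyslog.conf, and add a rule that forwards syslog messages to a local fluentd daemon. A sketch of how such a forwarding rule could be dropped in, assuming fluentd listens on 127.0.0.1:5140 via its syslog input; the port, drop-in file name, and restart step are assumptions, not taken from the role:

- name: Forward syslog message to local fluentd daemon
  ansible.builtin.copy:
    dest: /etc/rsyslog.d/60-fluentd.conf
    mode: "0644"
    content: |
      # Forward all messages to the local fluentd syslog input (assumed port)
      *.* action(type="omfwd" target="127.0.0.1" port="5140" protocol="udp")

- name: Reload rsyslog to apply the forwarding rule
  ansible.builtin.systemd:
    name: rsyslog
    state: restarted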
orchestrator | ok: [testbed-node-4] 2025-03-23 20:38:48.681769 | orchestrator | ok: [testbed-node-5] 2025-03-23 20:38:48.682301 | orchestrator | ok: [testbed-node-0] 2025-03-23 20:38:48.683236 | orchestrator | 2025-03-23 20:38:48.684081 | orchestrator | TASK [osism.commons.configfs : Start sys-kernel-config mount] ****************** 2025-03-23 20:38:48.685061 | orchestrator | Sunday 23 March 2025 20:38:48 +0000 (0:00:01.351) 0:01:03.727 ********** 2025-03-23 20:38:49.770601 | orchestrator | ok: [testbed-manager] 2025-03-23 20:38:49.771497 | orchestrator | ok: [testbed-node-3] 2025-03-23 20:38:49.772499 | orchestrator | ok: [testbed-node-4] 2025-03-23 20:38:49.773429 | orchestrator | ok: [testbed-node-1] 2025-03-23 20:38:49.774497 | orchestrator | ok: [testbed-node-5] 2025-03-23 20:38:49.777365 | orchestrator | ok: [testbed-node-2] 2025-03-23 20:38:49.778090 | orchestrator | ok: [testbed-node-0] 2025-03-23 20:38:49.779910 | orchestrator | 2025-03-23 20:38:49.780525 | orchestrator | TASK [osism.commons.packages : Gather variables for each operating system] ***** 2025-03-23 20:38:49.781399 | orchestrator | Sunday 23 March 2025 20:38:49 +0000 (0:00:01.090) 0:01:04.818 ********** 2025-03-23 20:38:49.851073 | orchestrator | ok: [testbed-manager] 2025-03-23 20:38:49.883523 | orchestrator | ok: [testbed-node-3] 2025-03-23 20:38:49.912590 | orchestrator | ok: [testbed-node-4] 2025-03-23 20:38:49.949294 | orchestrator | ok: [testbed-node-5] 2025-03-23 20:38:50.020601 | orchestrator | ok: [testbed-node-0] 2025-03-23 20:38:50.021997 | orchestrator | ok: [testbed-node-1] 2025-03-23 20:38:50.025238 | orchestrator | ok: [testbed-node-2] 2025-03-23 20:38:50.026154 | orchestrator | 2025-03-23 20:38:50.026950 | orchestrator | TASK [osism.commons.packages : Set required_packages_distribution variable to default value] *** 2025-03-23 20:38:50.027324 | orchestrator | Sunday 23 March 2025 20:38:50 +0000 (0:00:00.254) 0:01:05.072 ********** 2025-03-23 20:38:50.101825 | orchestrator | ok: [testbed-manager] 2025-03-23 20:38:50.134645 | orchestrator | ok: [testbed-node-3] 2025-03-23 20:38:50.167395 | orchestrator | ok: [testbed-node-4] 2025-03-23 20:38:50.195562 | orchestrator | ok: [testbed-node-5] 2025-03-23 20:38:50.279814 | orchestrator | ok: [testbed-node-0] 2025-03-23 20:38:50.280948 | orchestrator | ok: [testbed-node-1] 2025-03-23 20:38:50.281396 | orchestrator | ok: [testbed-node-2] 2025-03-23 20:38:50.286480 | orchestrator | 2025-03-23 20:38:50.288174 | orchestrator | TASK [osism.commons.packages : Include distribution specific package tasks] **** 2025-03-23 20:38:50.288559 | orchestrator | Sunday 23 March 2025 20:38:50 +0000 (0:00:00.260) 0:01:05.332 ********** 2025-03-23 20:38:50.624333 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/packages/tasks/package-Debian-family.yml for testbed-manager, testbed-node-3, testbed-node-4, testbed-node-5, testbed-node-0, testbed-node-1, testbed-node-2 2025-03-23 20:38:50.624560 | orchestrator | 2025-03-23 20:38:50.628285 | orchestrator | TASK [osism.commons.packages : Install needrestart package] ******************** 2025-03-23 20:38:50.628603 | orchestrator | Sunday 23 March 2025 20:38:50 +0000 (0:00:00.342) 0:01:05.674 ********** 2025-03-23 20:38:52.458645 | orchestrator | ok: [testbed-manager] 2025-03-23 20:38:52.459034 | orchestrator | ok: [testbed-node-4] 2025-03-23 20:38:52.459277 | orchestrator | ok: [testbed-node-3] 2025-03-23 20:38:52.459312 | orchestrator | ok: [testbed-node-2] 2025-03-23 20:38:52.460092 | 
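The osism.commons.systohc and osism.commons.configfs steps above sync the hardware clock and make sure configfs is mounted before the package work starts. A rough equivalent, assuming the role simply calls hwclock and starts the stock systemd mount unit; the actual role implementation may differ:

- name: Sync hardware clock
  ansible.builtin.command: hwclock --systohc
  changed_when: false   # reported as "ok" in the run above

- name: Start sys-kernel-config mount
  ansible.builtin.systemd:
    name: sys-kernel-config.mount
    state: started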
orchestrator | ok: [testbed-node-1] 2025-03-23 20:38:52.460767 | orchestrator | ok: [testbed-node-5] 2025-03-23 20:38:52.460963 | orchestrator | ok: [testbed-node-0] 2025-03-23 20:38:52.461408 | orchestrator | 2025-03-23 20:38:52.461713 | orchestrator | TASK [osism.commons.packages : Set needrestart mode] *************************** 2025-03-23 20:38:52.462309 | orchestrator | Sunday 23 March 2025 20:38:52 +0000 (0:00:01.834) 0:01:07.509 ********** 2025-03-23 20:38:53.104758 | orchestrator | changed: [testbed-manager] 2025-03-23 20:38:53.105459 | orchestrator | changed: [testbed-node-3] 2025-03-23 20:38:53.106717 | orchestrator | changed: [testbed-node-4] 2025-03-23 20:38:53.108522 | orchestrator | changed: [testbed-node-1] 2025-03-23 20:38:53.108602 | orchestrator | changed: [testbed-node-2] 2025-03-23 20:38:53.109798 | orchestrator | changed: [testbed-node-5] 2025-03-23 20:38:53.110859 | orchestrator | changed: [testbed-node-0] 2025-03-23 20:38:53.112177 | orchestrator | 2025-03-23 20:38:53.112772 | orchestrator | TASK [osism.commons.packages : Set apt_cache_valid_time variable to default value] *** 2025-03-23 20:38:53.113624 | orchestrator | Sunday 23 March 2025 20:38:53 +0000 (0:00:00.646) 0:01:08.156 ********** 2025-03-23 20:38:53.192232 | orchestrator | ok: [testbed-manager] 2025-03-23 20:38:53.233576 | orchestrator | ok: [testbed-node-3] 2025-03-23 20:38:53.253989 | orchestrator | ok: [testbed-node-4] 2025-03-23 20:38:53.297404 | orchestrator | ok: [testbed-node-5] 2025-03-23 20:38:53.376818 | orchestrator | ok: [testbed-node-0] 2025-03-23 20:38:53.377263 | orchestrator | ok: [testbed-node-1] 2025-03-23 20:38:53.378521 | orchestrator | ok: [testbed-node-2] 2025-03-23 20:38:53.381389 | orchestrator | 2025-03-23 20:38:54.774086 | orchestrator | TASK [osism.commons.packages : Update package cache] *************************** 2025-03-23 20:38:54.774198 | orchestrator | Sunday 23 March 2025 20:38:53 +0000 (0:00:00.273) 0:01:08.429 ********** 2025-03-23 20:38:54.774232 | orchestrator | ok: [testbed-manager] 2025-03-23 20:38:54.774606 | orchestrator | ok: [testbed-node-3] 2025-03-23 20:38:54.776518 | orchestrator | ok: [testbed-node-4] 2025-03-23 20:38:54.777495 | orchestrator | ok: [testbed-node-1] 2025-03-23 20:38:54.777929 | orchestrator | ok: [testbed-node-0] 2025-03-23 20:38:54.778610 | orchestrator | ok: [testbed-node-5] 2025-03-23 20:38:54.779052 | orchestrator | ok: [testbed-node-2] 2025-03-23 20:38:54.779083 | orchestrator | 2025-03-23 20:38:54.779807 | orchestrator | TASK [osism.commons.packages : Download upgrade packages] ********************** 2025-03-23 20:38:54.780124 | orchestrator | Sunday 23 March 2025 20:38:54 +0000 (0:00:01.393) 0:01:09.823 ********** 2025-03-23 20:38:56.963800 | orchestrator | changed: [testbed-manager] 2025-03-23 20:38:56.964618 | orchestrator | changed: [testbed-node-4] 2025-03-23 20:38:56.964662 | orchestrator | changed: [testbed-node-1] 2025-03-23 20:38:56.968166 | orchestrator | changed: [testbed-node-3] 2025-03-23 20:38:56.969513 | orchestrator | changed: [testbed-node-2] 2025-03-23 20:38:56.970183 | orchestrator | changed: [testbed-node-5] 2025-03-23 20:38:56.971178 | orchestrator | changed: [testbed-node-0] 2025-03-23 20:38:56.971509 | orchestrator | 2025-03-23 20:38:56.972339 | orchestrator | TASK [osism.commons.packages : Upgrade packages] ******************************* 2025-03-23 20:38:59.774831 | orchestrator | Sunday 23 March 2025 20:38:56 +0000 (0:00:02.190) 0:01:12.014 ********** 2025-03-23 20:38:59.774988 | orchestrator | ok: 
[testbed-manager] 2025-03-23 20:38:59.775055 | orchestrator | ok: [testbed-node-4] 2025-03-23 20:38:59.775074 | orchestrator | ok: [testbed-node-3] 2025-03-23 20:38:59.775093 | orchestrator | ok: [testbed-node-1] 2025-03-23 20:38:59.775635 | orchestrator | ok: [testbed-node-5] 2025-03-23 20:38:59.775715 | orchestrator | ok: [testbed-node-2] 2025-03-23 20:38:59.776232 | orchestrator | ok: [testbed-node-0] 2025-03-23 20:38:59.777553 | orchestrator | 2025-03-23 20:38:59.777962 | orchestrator | TASK [osism.commons.packages : Download required packages] ********************* 2025-03-23 20:38:59.778437 | orchestrator | Sunday 23 March 2025 20:38:59 +0000 (0:00:02.809) 0:01:14.824 ********** 2025-03-23 20:39:42.059095 | orchestrator | ok: [testbed-manager] 2025-03-23 20:39:42.060346 | orchestrator | ok: [testbed-node-3] 2025-03-23 20:39:42.060414 | orchestrator | ok: [testbed-node-1] 2025-03-23 20:39:42.065281 | orchestrator | ok: [testbed-node-4] 2025-03-23 20:39:42.065471 | orchestrator | ok: [testbed-node-2] 2025-03-23 20:39:42.066073 | orchestrator | ok: [testbed-node-0] 2025-03-23 20:39:42.066808 | orchestrator | ok: [testbed-node-5] 2025-03-23 20:39:42.067600 | orchestrator | 2025-03-23 20:39:42.068079 | orchestrator | TASK [osism.commons.packages : Install required packages] ********************** 2025-03-23 20:39:42.068616 | orchestrator | Sunday 23 March 2025 20:39:42 +0000 (0:00:42.284) 0:01:57.108 ********** 2025-03-23 20:41:00.802861 | orchestrator | changed: [testbed-manager] 2025-03-23 20:41:02.933197 | orchestrator | changed: [testbed-node-1] 2025-03-23 20:41:02.933326 | orchestrator | changed: [testbed-node-3] 2025-03-23 20:41:02.933346 | orchestrator | changed: [testbed-node-4] 2025-03-23 20:41:02.933380 | orchestrator | changed: [testbed-node-2] 2025-03-23 20:41:02.933457 | orchestrator | changed: [testbed-node-5] 2025-03-23 20:41:02.933472 | orchestrator | changed: [testbed-node-0] 2025-03-23 20:41:02.933486 | orchestrator | 2025-03-23 20:41:02.933502 | orchestrator | TASK [osism.commons.packages : Remove useless packages from the cache] ********* 2025-03-23 20:41:02.933518 | orchestrator | Sunday 23 March 2025 20:41:00 +0000 (0:01:18.736) 0:03:15.844 ********** 2025-03-23 20:41:02.933550 | orchestrator | ok: [testbed-manager] 2025-03-23 20:41:02.934086 | orchestrator | ok: [testbed-node-4] 2025-03-23 20:41:02.934139 | orchestrator | ok: [testbed-node-3] 2025-03-23 20:41:02.934340 | orchestrator | ok: [testbed-node-2] 2025-03-23 20:41:02.935499 | orchestrator | ok: [testbed-node-1] 2025-03-23 20:41:02.936783 | orchestrator | ok: [testbed-node-5] 2025-03-23 20:41:02.936813 | orchestrator | ok: [testbed-node-0] 2025-03-23 20:41:02.938676 | orchestrator | 2025-03-23 20:41:02.939435 | orchestrator | TASK [osism.commons.packages : Remove dependencies that are no longer required] *** 2025-03-23 20:41:02.940524 | orchestrator | Sunday 23 March 2025 20:41:02 +0000 (0:00:02.139) 0:03:17.984 ********** 2025-03-23 20:41:16.259139 | orchestrator | ok: [testbed-node-4] 2025-03-23 20:41:16.259320 | orchestrator | ok: [testbed-node-2] 2025-03-23 20:41:16.259348 | orchestrator | ok: [testbed-node-1] 2025-03-23 20:41:16.261288 | orchestrator | ok: [testbed-node-3] 2025-03-23 20:41:16.261754 | orchestrator | ok: [testbed-node-5] 2025-03-23 20:41:16.262957 | orchestrator | ok: [testbed-node-0] 2025-03-23 20:41:16.263455 | orchestrator | changed: [testbed-manager] 2025-03-23 20:41:16.264278 | orchestrator | 2025-03-23 20:41:16.265249 | orchestrator | TASK [osism.commons.sysctl : Include sysctl 
tasks] ***************************** 2025-03-23 20:41:16.265739 | orchestrator | Sunday 23 March 2025 20:41:16 +0000 (0:00:13.322) 0:03:31.306 ********** 2025-03-23 20:41:16.743075 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/sysctl/tasks/sysctl.yml for testbed-manager, testbed-node-3, testbed-node-4, testbed-node-5, testbed-node-0, testbed-node-1, testbed-node-2 => (item={'key': 'elasticsearch', 'value': [{'name': 'vm.max_map_count', 'value': 262144}]}) 2025-03-23 20:41:16.743508 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/sysctl/tasks/sysctl.yml for testbed-manager, testbed-node-3, testbed-node-4, testbed-node-5, testbed-node-0, testbed-node-1, testbed-node-2 => (item={'key': 'rabbitmq', 'value': [{'name': 'net.ipv4.tcp_keepalive_time', 'value': 6}, {'name': 'net.ipv4.tcp_keepalive_intvl', 'value': 3}, {'name': 'net.ipv4.tcp_keepalive_probes', 'value': 3}, {'name': 'net.core.wmem_max', 'value': 16777216}, {'name': 'net.core.rmem_max', 'value': 16777216}, {'name': 'net.ipv4.tcp_fin_timeout', 'value': 20}, {'name': 'net.ipv4.tcp_tw_reuse', 'value': 1}, {'name': 'net.core.somaxconn', 'value': 4096}, {'name': 'net.ipv4.tcp_syncookies', 'value': 0}, {'name': 'net.ipv4.tcp_max_syn_backlog', 'value': 8192}]}) 2025-03-23 20:41:16.744275 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/sysctl/tasks/sysctl.yml for testbed-manager, testbed-node-3, testbed-node-4, testbed-node-5, testbed-node-0, testbed-node-1, testbed-node-2 => (item={'key': 'generic', 'value': [{'name': 'vm.swappiness', 'value': 1}]}) 2025-03-23 20:41:16.745067 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/sysctl/tasks/sysctl.yml for testbed-manager, testbed-node-3, testbed-node-4, testbed-node-5, testbed-node-0, testbed-node-1, testbed-node-2 => (item={'key': 'compute', 'value': [{'name': 'net.netfilter.nf_conntrack_max', 'value': 1048576}]}) 2025-03-23 20:41:16.746015 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/sysctl/tasks/sysctl.yml for testbed-manager, testbed-node-3, testbed-node-4, testbed-node-5, testbed-node-0, testbed-node-1, testbed-node-2 => (item={'key': 'k3s_node', 'value': [{'name': 'fs.inotify.max_user_instances', 'value': 1024}]}) 2025-03-23 20:41:16.746697 | orchestrator | 2025-03-23 20:41:16.747037 | orchestrator | TASK [osism.commons.sysctl : Set sysctl parameters on elasticsearch] *********** 2025-03-23 20:41:16.747785 | orchestrator | Sunday 23 March 2025 20:41:16 +0000 (0:00:00.489) 0:03:31.795 ********** 2025-03-23 20:41:16.805582 | orchestrator | skipping: [testbed-manager] => (item={'name': 'vm.max_map_count', 'value': 262144})  2025-03-23 20:41:16.834776 | orchestrator | skipping: [testbed-manager] 2025-03-23 20:41:16.835301 | orchestrator | skipping: [testbed-node-3] => (item={'name': 'vm.max_map_count', 'value': 262144})  2025-03-23 20:41:16.835791 | orchestrator | skipping: [testbed-node-4] => (item={'name': 'vm.max_map_count', 'value': 262144})  2025-03-23 20:41:16.863662 | orchestrator | skipping: [testbed-node-3] 2025-03-23 20:41:16.894977 | orchestrator | skipping: [testbed-node-5] => (item={'name': 'vm.max_map_count', 'value': 262144})  2025-03-23 20:41:16.896107 | orchestrator | skipping: [testbed-node-4] 2025-03-23 20:41:16.920044 | orchestrator | skipping: [testbed-node-5] 2025-03-23 20:41:17.528935 | orchestrator | changed: 
[testbed-node-1] => (item={'name': 'vm.max_map_count', 'value': 262144}) 2025-03-23 20:41:17.529475 | orchestrator | changed: [testbed-node-2] => (item={'name': 'vm.max_map_count', 'value': 262144}) 2025-03-23 20:41:17.529509 | orchestrator | changed: [testbed-node-0] => (item={'name': 'vm.max_map_count', 'value': 262144}) 2025-03-23 20:41:17.530323 | orchestrator | 2025-03-23 20:41:17.531517 | orchestrator | TASK [osism.commons.sysctl : Set sysctl parameters on rabbitmq] **************** 2025-03-23 20:41:17.531543 | orchestrator | Sunday 23 March 2025 20:41:17 +0000 (0:00:00.784) 0:03:32.580 ********** 2025-03-23 20:41:17.570646 | orchestrator | skipping: [testbed-manager] => (item={'name': 'net.ipv4.tcp_keepalive_time', 'value': 6})  2025-03-23 20:41:17.571641 | orchestrator | skipping: [testbed-manager] => (item={'name': 'net.ipv4.tcp_keepalive_intvl', 'value': 3})  2025-03-23 20:41:17.571764 | orchestrator | skipping: [testbed-manager] => (item={'name': 'net.ipv4.tcp_keepalive_probes', 'value': 3})  2025-03-23 20:41:17.629682 | orchestrator | skipping: [testbed-manager] => (item={'name': 'net.core.wmem_max', 'value': 16777216})  2025-03-23 20:41:17.629777 | orchestrator | skipping: [testbed-manager] => (item={'name': 'net.core.rmem_max', 'value': 16777216})  2025-03-23 20:41:17.629796 | orchestrator | skipping: [testbed-manager] => (item={'name': 'net.ipv4.tcp_fin_timeout', 'value': 20})  2025-03-23 20:41:17.629812 | orchestrator | skipping: [testbed-manager] => (item={'name': 'net.ipv4.tcp_tw_reuse', 'value': 1})  2025-03-23 20:41:17.629828 | orchestrator | skipping: [testbed-manager] => (item={'name': 'net.core.somaxconn', 'value': 4096})  2025-03-23 20:41:17.629848 | orchestrator | skipping: [testbed-manager] => (item={'name': 'net.ipv4.tcp_syncookies', 'value': 0})  2025-03-23 20:41:17.631139 | orchestrator | skipping: [testbed-manager] => (item={'name': 'net.ipv4.tcp_max_syn_backlog', 'value': 8192})  2025-03-23 20:41:17.631866 | orchestrator | skipping: [testbed-node-3] => (item={'name': 'net.ipv4.tcp_keepalive_time', 'value': 6})  2025-03-23 20:41:17.632157 | orchestrator | skipping: [testbed-node-3] => (item={'name': 'net.ipv4.tcp_keepalive_intvl', 'value': 3})  2025-03-23 20:41:17.633111 | orchestrator | skipping: [testbed-node-3] => (item={'name': 'net.ipv4.tcp_keepalive_probes', 'value': 3})  2025-03-23 20:41:17.635303 | orchestrator | skipping: [testbed-node-3] => (item={'name': 'net.core.wmem_max', 'value': 16777216})  2025-03-23 20:41:17.635594 | orchestrator | skipping: [testbed-node-3] => (item={'name': 'net.core.rmem_max', 'value': 16777216})  2025-03-23 20:41:17.635627 | orchestrator | skipping: [testbed-node-3] => (item={'name': 'net.ipv4.tcp_fin_timeout', 'value': 20})  2025-03-23 20:41:17.680365 | orchestrator | skipping: [testbed-node-3] => (item={'name': 'net.ipv4.tcp_tw_reuse', 'value': 1})  2025-03-23 20:41:17.681668 | orchestrator | skipping: [testbed-manager] 2025-03-23 20:41:17.681702 | orchestrator | skipping: [testbed-node-3] => (item={'name': 'net.core.somaxconn', 'value': 4096})  2025-03-23 20:41:17.682517 | orchestrator | skipping: [testbed-node-3] => (item={'name': 'net.ipv4.tcp_syncookies', 'value': 0})  2025-03-23 20:41:17.683372 | orchestrator | skipping: [testbed-node-3] => (item={'name': 'net.ipv4.tcp_max_syn_backlog', 'value': 8192})  2025-03-23 20:41:17.683895 | orchestrator | skipping: [testbed-node-4] => (item={'name': 'net.ipv4.tcp_keepalive_time', 'value': 6})  2025-03-23 20:41:17.684568 | orchestrator | skipping: [testbed-node-4] => 
(item={'name': 'net.ipv4.tcp_keepalive_intvl', 'value': 3})  2025-03-23 20:41:17.685103 | orchestrator | skipping: [testbed-node-4] => (item={'name': 'net.ipv4.tcp_keepalive_probes', 'value': 3})  2025-03-23 20:41:17.685640 | orchestrator | skipping: [testbed-node-4] => (item={'name': 'net.core.wmem_max', 'value': 16777216})  2025-03-23 20:41:17.744365 | orchestrator | skipping: [testbed-node-3] 2025-03-23 20:41:17.744847 | orchestrator | skipping: [testbed-node-4] => (item={'name': 'net.core.rmem_max', 'value': 16777216})  2025-03-23 20:41:17.745571 | orchestrator | skipping: [testbed-node-4] => (item={'name': 'net.ipv4.tcp_fin_timeout', 'value': 20})  2025-03-23 20:41:17.746004 | orchestrator | skipping: [testbed-node-4] => (item={'name': 'net.ipv4.tcp_tw_reuse', 'value': 1})  2025-03-23 20:41:17.746800 | orchestrator | skipping: [testbed-node-4] => (item={'name': 'net.core.somaxconn', 'value': 4096})  2025-03-23 20:41:17.747516 | orchestrator | skipping: [testbed-node-4] => (item={'name': 'net.ipv4.tcp_syncookies', 'value': 0})  2025-03-23 20:41:17.748988 | orchestrator | skipping: [testbed-node-4] => (item={'name': 'net.ipv4.tcp_max_syn_backlog', 'value': 8192})  2025-03-23 20:41:17.749681 | orchestrator | skipping: [testbed-node-5] => (item={'name': 'net.ipv4.tcp_keepalive_time', 'value': 6})  2025-03-23 20:41:17.750473 | orchestrator | skipping: [testbed-node-5] => (item={'name': 'net.ipv4.tcp_keepalive_intvl', 'value': 3})  2025-03-23 20:41:17.753279 | orchestrator | skipping: [testbed-node-5] => (item={'name': 'net.ipv4.tcp_keepalive_probes', 'value': 3})  2025-03-23 20:41:17.753853 | orchestrator | skipping: [testbed-node-5] => (item={'name': 'net.core.wmem_max', 'value': 16777216})  2025-03-23 20:41:17.754088 | orchestrator | skipping: [testbed-node-5] => (item={'name': 'net.core.rmem_max', 'value': 16777216})  2025-03-23 20:41:17.754946 | orchestrator | skipping: [testbed-node-5] => (item={'name': 'net.ipv4.tcp_fin_timeout', 'value': 20})  2025-03-23 20:41:17.757558 | orchestrator | skipping: [testbed-node-5] => (item={'name': 'net.ipv4.tcp_tw_reuse', 'value': 1})  2025-03-23 20:41:17.791531 | orchestrator | skipping: [testbed-node-5] => (item={'name': 'net.core.somaxconn', 'value': 4096})  2025-03-23 20:41:17.792998 | orchestrator | skipping: [testbed-node-4] 2025-03-23 20:41:17.822184 | orchestrator | skipping: [testbed-node-5] => (item={'name': 'net.ipv4.tcp_syncookies', 'value': 0})  2025-03-23 20:41:17.822217 | orchestrator | skipping: [testbed-node-5] => (item={'name': 'net.ipv4.tcp_max_syn_backlog', 'value': 8192})  2025-03-23 20:41:17.822240 | orchestrator | skipping: [testbed-node-5] 2025-03-23 20:41:24.563948 | orchestrator | changed: [testbed-node-0] => (item={'name': 'net.ipv4.tcp_keepalive_time', 'value': 6}) 2025-03-23 20:41:24.564147 | orchestrator | changed: [testbed-node-1] => (item={'name': 'net.ipv4.tcp_keepalive_time', 'value': 6}) 2025-03-23 20:41:24.564179 | orchestrator | changed: [testbed-node-2] => (item={'name': 'net.ipv4.tcp_keepalive_time', 'value': 6}) 2025-03-23 20:41:24.564622 | orchestrator | changed: [testbed-node-1] => (item={'name': 'net.ipv4.tcp_keepalive_intvl', 'value': 3}) 2025-03-23 20:41:24.566903 | orchestrator | changed: [testbed-node-2] => (item={'name': 'net.ipv4.tcp_keepalive_intvl', 'value': 3}) 2025-03-23 20:41:24.567837 | orchestrator | changed: [testbed-node-1] => (item={'name': 'net.ipv4.tcp_keepalive_probes', 'value': 3}) 2025-03-23 20:41:24.568954 | orchestrator | changed: [testbed-node-2] => (item={'name': 
'net.ipv4.tcp_keepalive_probes', 'value': 3}) 2025-03-23 20:41:24.569647 | orchestrator | changed: [testbed-node-1] => (item={'name': 'net.core.wmem_max', 'value': 16777216}) 2025-03-23 20:41:24.571030 | orchestrator | changed: [testbed-node-2] => (item={'name': 'net.core.wmem_max', 'value': 16777216}) 2025-03-23 20:41:24.572051 | orchestrator | changed: [testbed-node-0] => (item={'name': 'net.ipv4.tcp_keepalive_intvl', 'value': 3}) 2025-03-23 20:41:24.573402 | orchestrator | changed: [testbed-node-1] => (item={'name': 'net.core.rmem_max', 'value': 16777216}) 2025-03-23 20:41:24.573777 | orchestrator | changed: [testbed-node-0] => (item={'name': 'net.ipv4.tcp_keepalive_probes', 'value': 3}) 2025-03-23 20:41:24.574713 | orchestrator | changed: [testbed-node-1] => (item={'name': 'net.ipv4.tcp_fin_timeout', 'value': 20}) 2025-03-23 20:41:24.575250 | orchestrator | changed: [testbed-node-0] => (item={'name': 'net.core.wmem_max', 'value': 16777216}) 2025-03-23 20:41:24.576003 | orchestrator | changed: [testbed-node-1] => (item={'name': 'net.ipv4.tcp_tw_reuse', 'value': 1}) 2025-03-23 20:41:24.576338 | orchestrator | changed: [testbed-node-2] => (item={'name': 'net.core.rmem_max', 'value': 16777216}) 2025-03-23 20:41:24.576506 | orchestrator | changed: [testbed-node-0] => (item={'name': 'net.core.rmem_max', 'value': 16777216}) 2025-03-23 20:41:24.577020 | orchestrator | changed: [testbed-node-1] => (item={'name': 'net.core.somaxconn', 'value': 4096}) 2025-03-23 20:41:24.577399 | orchestrator | changed: [testbed-node-2] => (item={'name': 'net.ipv4.tcp_fin_timeout', 'value': 20}) 2025-03-23 20:41:24.577829 | orchestrator | changed: [testbed-node-0] => (item={'name': 'net.ipv4.tcp_fin_timeout', 'value': 20}) 2025-03-23 20:41:24.578925 | orchestrator | changed: [testbed-node-1] => (item={'name': 'net.ipv4.tcp_syncookies', 'value': 0}) 2025-03-23 20:41:24.579575 | orchestrator | changed: [testbed-node-0] => (item={'name': 'net.ipv4.tcp_tw_reuse', 'value': 1}) 2025-03-23 20:41:24.579997 | orchestrator | changed: [testbed-node-1] => (item={'name': 'net.ipv4.tcp_max_syn_backlog', 'value': 8192}) 2025-03-23 20:41:24.580929 | orchestrator | changed: [testbed-node-0] => (item={'name': 'net.core.somaxconn', 'value': 4096}) 2025-03-23 20:41:24.581311 | orchestrator | changed: [testbed-node-0] => (item={'name': 'net.ipv4.tcp_syncookies', 'value': 0}) 2025-03-23 20:41:24.582101 | orchestrator | changed: [testbed-node-2] => (item={'name': 'net.ipv4.tcp_tw_reuse', 'value': 1}) 2025-03-23 20:41:24.582267 | orchestrator | changed: [testbed-node-0] => (item={'name': 'net.ipv4.tcp_max_syn_backlog', 'value': 8192}) 2025-03-23 20:41:24.582568 | orchestrator | changed: [testbed-node-2] => (item={'name': 'net.core.somaxconn', 'value': 4096}) 2025-03-23 20:41:24.583162 | orchestrator | changed: [testbed-node-2] => (item={'name': 'net.ipv4.tcp_syncookies', 'value': 0}) 2025-03-23 20:41:24.584079 | orchestrator | changed: [testbed-node-2] => (item={'name': 'net.ipv4.tcp_max_syn_backlog', 'value': 8192}) 2025-03-23 20:41:24.584314 | orchestrator | 2025-03-23 20:41:24.585000 | orchestrator | TASK [osism.commons.sysctl : Set sysctl parameters on generic] ***************** 2025-03-23 20:41:24.585538 | orchestrator | Sunday 23 March 2025 20:41:24 +0000 (0:00:07.033) 0:03:39.614 ********** 2025-03-23 20:41:26.056057 | orchestrator | changed: [testbed-node-3] => (item={'name': 'vm.swappiness', 'value': 1}) 2025-03-23 20:41:26.056949 | orchestrator | changed: [testbed-node-4] => (item={'name': 'vm.swappiness', 'value': 1}) 
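The sysctl items listed in the include above are applied per host group: the rabbitmq set only changed on testbed-node-0/1/2 and was skipped on the manager and nodes 3 to 5, while vm.swappiness is applied everywhere. A minimal sketch of how one of these groups of parameters could be made persistent, assuming ansible.posix.sysctl and a per-group drop-in file (the file path and group name are assumptions; the values are taken from the task output above, abbreviated here):

- name: Set sysctl parameters on rabbitmq
  ansible.posix.sysctl:
    name: "{{ item.name }}"
    value: "{{ item.value }}"
    sysctl_file: /etc/sysctl.d/99-rabbitmq.conf   # assumed location
    state: present
    reload: true
  loop:
    - { name: net.ipv4.tcp_keepalive_time, value: 6 }
    - { name: net.core.somaxconn, value: 4096 }
    # remaining entries as shown in the task output above
  when: "'rabbitmq' in group_names"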
2025-03-23 20:41:26.056985 | orchestrator | changed: [testbed-node-5] => (item={'name': 'vm.swappiness', 'value': 1}) 2025-03-23 20:41:26.058306 | orchestrator | changed: [testbed-node-1] => (item={'name': 'vm.swappiness', 'value': 1}) 2025-03-23 20:41:26.059682 | orchestrator | changed: [testbed-node-0] => (item={'name': 'vm.swappiness', 'value': 1}) 2025-03-23 20:41:26.060893 | orchestrator | changed: [testbed-node-2] => (item={'name': 'vm.swappiness', 'value': 1}) 2025-03-23 20:41:26.061828 | orchestrator | changed: [testbed-manager] => (item={'name': 'vm.swappiness', 'value': 1}) 2025-03-23 20:41:26.062725 | orchestrator | 2025-03-23 20:41:26.063010 | orchestrator | TASK [osism.commons.sysctl : Set sysctl parameters on compute] ***************** 2025-03-23 20:41:26.063515 | orchestrator | Sunday 23 March 2025 20:41:26 +0000 (0:00:01.491) 0:03:41.106 ********** 2025-03-23 20:41:26.102288 | orchestrator | skipping: [testbed-manager] => (item={'name': 'net.netfilter.nf_conntrack_max', 'value': 1048576})  2025-03-23 20:41:26.140136 | orchestrator | skipping: [testbed-manager] 2025-03-23 20:41:26.280875 | orchestrator | skipping: [testbed-node-0] => (item={'name': 'net.netfilter.nf_conntrack_max', 'value': 1048576})  2025-03-23 20:41:27.568351 | orchestrator | skipping: [testbed-node-0] 2025-03-23 20:41:27.568842 | orchestrator | skipping: [testbed-node-1] => (item={'name': 'net.netfilter.nf_conntrack_max', 'value': 1048576})  2025-03-23 20:41:27.568889 | orchestrator | skipping: [testbed-node-1] 2025-03-23 20:41:27.570157 | orchestrator | skipping: [testbed-node-2] => (item={'name': 'net.netfilter.nf_conntrack_max', 'value': 1048576})  2025-03-23 20:41:27.571600 | orchestrator | skipping: [testbed-node-2] 2025-03-23 20:41:27.573396 | orchestrator | changed: [testbed-node-4] => (item={'name': 'net.netfilter.nf_conntrack_max', 'value': 1048576}) 2025-03-23 20:41:27.573828 | orchestrator | changed: [testbed-node-5] => (item={'name': 'net.netfilter.nf_conntrack_max', 'value': 1048576}) 2025-03-23 20:41:27.574976 | orchestrator | changed: [testbed-node-3] => (item={'name': 'net.netfilter.nf_conntrack_max', 'value': 1048576}) 2025-03-23 20:41:27.575807 | orchestrator | 2025-03-23 20:41:27.578254 | orchestrator | TASK [osism.commons.sysctl : Set sysctl parameters on k3s_node] **************** 2025-03-23 20:41:27.578649 | orchestrator | Sunday 23 March 2025 20:41:27 +0000 (0:00:01.512) 0:03:42.619 ********** 2025-03-23 20:41:27.605344 | orchestrator | skipping: [testbed-manager] => (item={'name': 'fs.inotify.max_user_instances', 'value': 1024})  2025-03-23 20:41:27.631334 | orchestrator | skipping: [testbed-manager] 2025-03-23 20:41:27.753840 | orchestrator | skipping: [testbed-node-0] => (item={'name': 'fs.inotify.max_user_instances', 'value': 1024})  2025-03-23 20:41:27.754858 | orchestrator | skipping: [testbed-node-1] => (item={'name': 'fs.inotify.max_user_instances', 'value': 1024})  2025-03-23 20:41:28.210133 | orchestrator | skipping: [testbed-node-0] 2025-03-23 20:41:28.210308 | orchestrator | skipping: [testbed-node-1] 2025-03-23 20:41:28.210624 | orchestrator | skipping: [testbed-node-2] => (item={'name': 'fs.inotify.max_user_instances', 'value': 1024})  2025-03-23 20:41:28.210656 | orchestrator | skipping: [testbed-node-2] 2025-03-23 20:41:28.211247 | orchestrator | changed: [testbed-node-3] => (item={'name': 'fs.inotify.max_user_instances', 'value': 1024}) 2025-03-23 20:41:28.211759 | orchestrator | changed: [testbed-node-4] => (item={'name': 'fs.inotify.max_user_instances', 'value': 
1024}) 2025-03-23 20:41:28.212057 | orchestrator | changed: [testbed-node-5] => (item={'name': 'fs.inotify.max_user_instances', 'value': 1024}) 2025-03-23 20:41:28.212473 | orchestrator | 2025-03-23 20:41:28.212933 | orchestrator | TASK [osism.commons.limits : Include limits tasks] ***************************** 2025-03-23 20:41:28.213213 | orchestrator | Sunday 23 March 2025 20:41:28 +0000 (0:00:00.642) 0:03:43.261 ********** 2025-03-23 20:41:28.297207 | orchestrator | skipping: [testbed-manager] 2025-03-23 20:41:28.325797 | orchestrator | skipping: [testbed-node-3] 2025-03-23 20:41:28.368573 | orchestrator | skipping: [testbed-node-4] 2025-03-23 20:41:28.398604 | orchestrator | skipping: [testbed-node-5] 2025-03-23 20:41:28.555867 | orchestrator | skipping: [testbed-node-0] 2025-03-23 20:41:28.556527 | orchestrator | skipping: [testbed-node-1] 2025-03-23 20:41:28.556760 | orchestrator | skipping: [testbed-node-2] 2025-03-23 20:41:28.557904 | orchestrator | 2025-03-23 20:41:28.558531 | orchestrator | TASK [osism.commons.services : Populate service facts] ************************* 2025-03-23 20:41:28.559200 | orchestrator | Sunday 23 March 2025 20:41:28 +0000 (0:00:00.345) 0:03:43.607 ********** 2025-03-23 20:41:34.371092 | orchestrator | ok: [testbed-node-3] 2025-03-23 20:41:34.371291 | orchestrator | ok: [testbed-node-4] 2025-03-23 20:41:34.371574 | orchestrator | ok: [testbed-node-1] 2025-03-23 20:41:34.372306 | orchestrator | ok: [testbed-node-5] 2025-03-23 20:41:34.372764 | orchestrator | ok: [testbed-node-2] 2025-03-23 20:41:34.373568 | orchestrator | ok: [testbed-node-0] 2025-03-23 20:41:34.374274 | orchestrator | ok: [testbed-manager] 2025-03-23 20:41:34.374828 | orchestrator | 2025-03-23 20:41:34.376575 | orchestrator | TASK [osism.commons.services : Check services] ********************************* 2025-03-23 20:41:34.376935 | orchestrator | Sunday 23 March 2025 20:41:34 +0000 (0:00:05.815) 0:03:49.422 ********** 2025-03-23 20:41:34.469348 | orchestrator | skipping: [testbed-manager] => (item=nscd)  2025-03-23 20:41:34.469927 | orchestrator | skipping: [testbed-node-3] => (item=nscd)  2025-03-23 20:41:34.519818 | orchestrator | skipping: [testbed-manager] 2025-03-23 20:41:34.520277 | orchestrator | skipping: [testbed-node-4] => (item=nscd)  2025-03-23 20:41:34.579281 | orchestrator | skipping: [testbed-node-3] 2025-03-23 20:41:34.579882 | orchestrator | skipping: [testbed-node-5] => (item=nscd)  2025-03-23 20:41:34.628362 | orchestrator | skipping: [testbed-node-4] 2025-03-23 20:41:34.628563 | orchestrator | skipping: [testbed-node-0] => (item=nscd)  2025-03-23 20:41:34.667761 | orchestrator | skipping: [testbed-node-5] 2025-03-23 20:41:34.668545 | orchestrator | skipping: [testbed-node-1] => (item=nscd)  2025-03-23 20:41:34.763451 | orchestrator | skipping: [testbed-node-0] 2025-03-23 20:41:34.764513 | orchestrator | skipping: [testbed-node-1] 2025-03-23 20:41:34.764817 | orchestrator | skipping: [testbed-node-2] => (item=nscd)  2025-03-23 20:41:34.765806 | orchestrator | skipping: [testbed-node-2] 2025-03-23 20:41:34.766222 | orchestrator | 2025-03-23 20:41:34.767494 | orchestrator | TASK [osism.commons.services : Start/enable required services] ***************** 2025-03-23 20:41:34.768388 | orchestrator | Sunday 23 March 2025 20:41:34 +0000 (0:00:00.391) 0:03:49.815 ********** 2025-03-23 20:41:35.927927 | orchestrator | ok: [testbed-manager] => (item=cron) 2025-03-23 20:41:35.929037 | orchestrator | ok: [testbed-node-3] => (item=cron) 2025-03-23 20:41:35.930224 | orchestrator | 
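The osism.commons.services tasks above gather service facts, check for an nscd service (skipped on every host here), and then make sure cron is running and enabled. A sketch of that pattern, assuming the role keys off ansible_facts.services; the item names mirror the output above:

- name: Populate service facts
  ansible.builtin.service_facts:

- name: Start/enable required services
  ansible.builtin.systemd:
    name: "{{ item }}"
    state: started
    enabled: true
  loop:
    - cron
  when: item + '.service' in ansible_facts.services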
ok: [testbed-node-4] => (item=cron) 2025-03-23 20:41:35.932055 | orchestrator | ok: [testbed-node-5] => (item=cron) 2025-03-23 20:41:35.932974 | orchestrator | ok: [testbed-node-1] => (item=cron) 2025-03-23 20:41:35.933735 | orchestrator | ok: [testbed-node-0] => (item=cron) 2025-03-23 20:41:35.934516 | orchestrator | ok: [testbed-node-2] => (item=cron) 2025-03-23 20:41:35.936212 | orchestrator | 2025-03-23 20:41:35.936830 | orchestrator | TASK [osism.commons.motd : Include distribution specific configure tasks] ****** 2025-03-23 20:41:35.936857 | orchestrator | Sunday 23 March 2025 20:41:35 +0000 (0:00:01.161) 0:03:50.976 ********** 2025-03-23 20:41:36.526769 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/motd/tasks/configure-Debian-family.yml for testbed-manager, testbed-node-3, testbed-node-4, testbed-node-5, testbed-node-0, testbed-node-1, testbed-node-2 2025-03-23 20:41:36.528732 | orchestrator | 2025-03-23 20:41:36.530279 | orchestrator | TASK [osism.commons.motd : Remove update-motd package] ************************* 2025-03-23 20:41:36.532203 | orchestrator | Sunday 23 March 2025 20:41:36 +0000 (0:00:00.597) 0:03:51.574 ********** 2025-03-23 20:41:37.980466 | orchestrator | ok: [testbed-manager] 2025-03-23 20:41:37.982492 | orchestrator | ok: [testbed-node-3] 2025-03-23 20:41:37.982538 | orchestrator | ok: [testbed-node-4] 2025-03-23 20:41:37.983503 | orchestrator | ok: [testbed-node-1] 2025-03-23 20:41:37.983530 | orchestrator | ok: [testbed-node-2] 2025-03-23 20:41:37.984755 | orchestrator | ok: [testbed-node-5] 2025-03-23 20:41:37.986182 | orchestrator | ok: [testbed-node-0] 2025-03-23 20:41:37.988912 | orchestrator | 2025-03-23 20:41:37.990405 | orchestrator | TASK [osism.commons.motd : Check if /etc/default/motd-news exists] ************* 2025-03-23 20:41:37.991387 | orchestrator | Sunday 23 March 2025 20:41:37 +0000 (0:00:01.457) 0:03:53.031 ********** 2025-03-23 20:41:38.732628 | orchestrator | ok: [testbed-manager] 2025-03-23 20:41:38.733273 | orchestrator | ok: [testbed-node-4] 2025-03-23 20:41:38.733748 | orchestrator | ok: [testbed-node-3] 2025-03-23 20:41:38.733770 | orchestrator | ok: [testbed-node-5] 2025-03-23 20:41:38.734611 | orchestrator | ok: [testbed-node-0] 2025-03-23 20:41:38.735283 | orchestrator | ok: [testbed-node-1] 2025-03-23 20:41:38.736244 | orchestrator | ok: [testbed-node-2] 2025-03-23 20:41:38.736532 | orchestrator | 2025-03-23 20:41:38.737190 | orchestrator | TASK [osism.commons.motd : Disable the dynamic motd-news service] ************** 2025-03-23 20:41:38.737718 | orchestrator | Sunday 23 March 2025 20:41:38 +0000 (0:00:00.753) 0:03:53.784 ********** 2025-03-23 20:41:39.391861 | orchestrator | changed: [testbed-manager] 2025-03-23 20:41:39.392278 | orchestrator | changed: [testbed-node-3] 2025-03-23 20:41:39.393078 | orchestrator | changed: [testbed-node-5] 2025-03-23 20:41:39.395551 | orchestrator | changed: [testbed-node-4] 2025-03-23 20:41:39.396486 | orchestrator | changed: [testbed-node-0] 2025-03-23 20:41:39.396525 | orchestrator | changed: [testbed-node-1] 2025-03-23 20:41:39.396700 | orchestrator | changed: [testbed-node-2] 2025-03-23 20:41:39.397293 | orchestrator | 2025-03-23 20:41:39.397779 | orchestrator | TASK [osism.commons.motd : Get all configuration files in /etc/pam.d] ********** 2025-03-23 20:41:39.398102 | orchestrator | Sunday 23 March 2025 20:41:39 +0000 (0:00:00.657) 0:03:54.442 ********** 2025-03-23 20:41:40.092378 | orchestrator | ok: [testbed-manager] 2025-03-23 
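The osism.commons.motd tasks above remove the update-motd package and, since /etc/default/motd-news exists on all hosts, switch the dynamic motd-news service off. On Ubuntu that usually comes down to flipping the ENABLED flag in that file; a sketch assuming lineinfile is used, though the role may instead template the whole file:

- name: Disable the dynamic motd-news service
  ansible.builtin.lineinfile:
    path: /etc/default/motd-news
    regexp: '^ENABLED='
    line: ENABLED=0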
20:41:40.092870 | orchestrator | ok: [testbed-node-3] 2025-03-23 20:41:40.093295 | orchestrator | ok: [testbed-node-1] 2025-03-23 20:41:40.094002 | orchestrator | ok: [testbed-node-0] 2025-03-23 20:41:40.094630 | orchestrator | ok: [testbed-node-4] 2025-03-23 20:41:40.094801 | orchestrator | ok: [testbed-node-2] 2025-03-23 20:41:40.095322 | orchestrator | ok: [testbed-node-5] 2025-03-23 20:41:40.097253 | orchestrator | 2025-03-23 20:41:40.101763 | orchestrator | TASK [osism.commons.motd : Remove pam_motd.so rule] **************************** 2025-03-23 20:41:40.104420 | orchestrator | Sunday 23 March 2025 20:41:40 +0000 (0:00:00.700) 0:03:55.142 ********** 2025-03-23 20:41:41.269893 | orchestrator | changed: [testbed-manager] => (item={'path': '/etc/pam.d/sshd', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 2133, 'inode': 591, 'dev': 2049, 'nlink': 1, 'atime': 1742760752.7155647, 'mtime': 1723170802.0, 'ctime': 1728031288.6324632, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}) 2025-03-23 20:41:41.273573 | orchestrator | changed: [testbed-node-5] => (item={'path': '/etc/pam.d/sshd', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 2133, 'inode': 591, 'dev': 2049, 'nlink': 1, 'atime': 1742760750.8070583, 'mtime': 1723170802.0, 'ctime': 1728031288.6324632, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}) 2025-03-23 20:41:41.273614 | orchestrator | changed: [testbed-node-0] => (item={'path': '/etc/pam.d/sshd', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 2133, 'inode': 591, 'dev': 2049, 'nlink': 1, 'atime': 1742760757.8071737, 'mtime': 1723170802.0, 'ctime': 1728031288.6324632, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}) 2025-03-23 20:41:41.274864 | orchestrator | changed: [testbed-node-4] => (item={'path': '/etc/pam.d/sshd', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 2133, 'inode': 591, 'dev': 2049, 'nlink': 1, 'atime': 1742760767.7834146, 'mtime': 1723170802.0, 'ctime': 1728031288.6324632, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}) 2025-03-23 20:41:41.274915 | orchestrator | changed: [testbed-node-3] => (item={'path': '/etc/pam.d/sshd', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 2133, 'inode': 591, 'dev': 2049, 'nlink': 1, 'atime': 1742760752.4198182, 'mtime': 1723170802.0, 'ctime': 1728031288.6324632, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 
'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}) 2025-03-23 20:41:41.274932 | orchestrator | changed: [testbed-node-1] => (item={'path': '/etc/pam.d/sshd', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 2133, 'inode': 591, 'dev': 2049, 'nlink': 1, 'atime': 1742760760.0276322, 'mtime': 1723170802.0, 'ctime': 1728031288.6324632, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}) 2025-03-23 20:41:41.274947 | orchestrator | changed: [testbed-node-2] => (item={'path': '/etc/pam.d/sshd', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 2133, 'inode': 591, 'dev': 2049, 'nlink': 1, 'atime': 1742760758.7131867, 'mtime': 1723170802.0, 'ctime': 1728031288.6324632, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}) 2025-03-23 20:41:41.274967 | orchestrator | changed: [testbed-manager] => (item={'path': '/etc/pam.d/login', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 4118, 'inode': 577, 'dev': 2049, 'nlink': 1, 'atime': 1742760785.117652, 'mtime': 1712646062.0, 'ctime': 1728031288.6314633, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}) 2025-03-23 20:41:41.275564 | orchestrator | changed: [testbed-node-5] => (item={'path': '/etc/pam.d/login', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 4118, 'inode': 577, 'dev': 2049, 'nlink': 1, 'atime': 1742760692.977968, 'mtime': 1712646062.0, 'ctime': 1728031288.6314633, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}) 2025-03-23 20:41:41.276399 | orchestrator | changed: [testbed-node-3] => (item={'path': '/etc/pam.d/login', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 4118, 'inode': 577, 'dev': 2049, 'nlink': 1, 'atime': 1742760693.9060361, 'mtime': 1712646062.0, 'ctime': 1728031288.6314633, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}) 2025-03-23 20:41:41.276966 | orchestrator | changed: [testbed-node-4] => (item={'path': '/etc/pam.d/login', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 4118, 'inode': 577, 'dev': 2049, 'nlink': 1, 'atime': 1742760707.975933, 'mtime': 1712646062.0, 'ctime': 1728031288.6314633, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 
'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}) 2025-03-23 20:41:41.277468 | orchestrator | changed: [testbed-node-2] => (item={'path': '/etc/pam.d/login', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 4118, 'inode': 577, 'dev': 2049, 'nlink': 1, 'atime': 1742760701.2633028, 'mtime': 1712646062.0, 'ctime': 1728031288.6314633, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}) 2025-03-23 20:41:41.278838 | orchestrator | changed: [testbed-node-1] => (item={'path': '/etc/pam.d/login', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 4118, 'inode': 577, 'dev': 2049, 'nlink': 1, 'atime': 1742760702.4964294, 'mtime': 1712646062.0, 'ctime': 1728031288.6314633, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}) 2025-03-23 20:41:41.279341 | orchestrator | changed: [testbed-node-0] => (item={'path': '/etc/pam.d/login', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 4118, 'inode': 577, 'dev': 2049, 'nlink': 1, 'atime': 1742760698.3369174, 'mtime': 1712646062.0, 'ctime': 1728031288.6314633, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}) 2025-03-23 20:41:41.281467 | orchestrator | 2025-03-23 20:41:41.281745 | orchestrator | TASK [osism.commons.motd : Copy motd file] ************************************* 2025-03-23 20:41:41.281964 | orchestrator | Sunday 23 March 2025 20:41:41 +0000 (0:00:01.176) 0:03:56.319 ********** 2025-03-23 20:41:42.486547 | orchestrator | changed: [testbed-manager] 2025-03-23 20:41:42.487194 | orchestrator | changed: [testbed-node-3] 2025-03-23 20:41:42.488694 | orchestrator | changed: [testbed-node-5] 2025-03-23 20:41:42.489003 | orchestrator | changed: [testbed-node-4] 2025-03-23 20:41:42.489734 | orchestrator | changed: [testbed-node-1] 2025-03-23 20:41:42.490128 | orchestrator | changed: [testbed-node-2] 2025-03-23 20:41:42.490780 | orchestrator | changed: [testbed-node-0] 2025-03-23 20:41:42.491632 | orchestrator | 2025-03-23 20:41:42.491849 | orchestrator | TASK [osism.commons.motd : Copy issue file] ************************************ 2025-03-23 20:41:43.814669 | orchestrator | Sunday 23 March 2025 20:41:42 +0000 (0:00:01.218) 0:03:57.537 ********** 2025-03-23 20:41:43.814801 | orchestrator | changed: [testbed-manager] 2025-03-23 20:41:43.815790 | orchestrator | changed: [testbed-node-3] 2025-03-23 20:41:43.818629 | orchestrator | changed: [testbed-node-4] 2025-03-23 20:41:43.818731 | orchestrator | changed: [testbed-node-5] 2025-03-23 20:41:43.819897 | orchestrator | changed: [testbed-node-1] 2025-03-23 20:41:43.820606 | orchestrator | changed: [testbed-node-0] 2025-03-23 20:41:43.821573 | orchestrator | changed: [testbed-node-2] 2025-03-23 20:41:43.821947 | orchestrator | 2025-03-23 20:41:43.822421 | orchestrator | TASK [osism.commons.motd : Configure SSH to print the 
motd] ******************** 2025-03-23 20:41:43.823124 | orchestrator | Sunday 23 March 2025 20:41:43 +0000 (0:00:01.326) 0:03:58.864 ********** 2025-03-23 20:41:43.936337 | orchestrator | skipping: [testbed-manager] 2025-03-23 20:41:43.972840 | orchestrator | skipping: [testbed-node-3] 2025-03-23 20:41:44.029047 | orchestrator | skipping: [testbed-node-4] 2025-03-23 20:41:44.061178 | orchestrator | skipping: [testbed-node-5] 2025-03-23 20:41:44.123661 | orchestrator | skipping: [testbed-node-0] 2025-03-23 20:41:44.124461 | orchestrator | skipping: [testbed-node-1] 2025-03-23 20:41:44.125724 | orchestrator | skipping: [testbed-node-2] 2025-03-23 20:41:44.126839 | orchestrator | 2025-03-23 20:41:44.128050 | orchestrator | TASK [osism.commons.motd : Configure SSH to not print the motd] **************** 2025-03-23 20:41:44.128515 | orchestrator | Sunday 23 March 2025 20:41:44 +0000 (0:00:00.311) 0:03:59.176 ********** 2025-03-23 20:41:44.941664 | orchestrator | ok: [testbed-manager] 2025-03-23 20:41:44.942853 | orchestrator | ok: [testbed-node-3] 2025-03-23 20:41:44.944140 | orchestrator | ok: [testbed-node-4] 2025-03-23 20:41:44.946324 | orchestrator | ok: [testbed-node-5] 2025-03-23 20:41:44.947294 | orchestrator | ok: [testbed-node-1] 2025-03-23 20:41:44.948258 | orchestrator | ok: [testbed-node-0] 2025-03-23 20:41:44.949625 | orchestrator | ok: [testbed-node-2] 2025-03-23 20:41:44.950638 | orchestrator | 2025-03-23 20:41:44.951222 | orchestrator | TASK [osism.services.rng : Include distribution specific install tasks] ******** 2025-03-23 20:41:44.952208 | orchestrator | Sunday 23 March 2025 20:41:44 +0000 (0:00:00.816) 0:03:59.992 ********** 2025-03-23 20:41:45.408878 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/services/roles/rng/tasks/install-Debian-family.yml for testbed-manager, testbed-node-3, testbed-node-4, testbed-node-5, testbed-node-0, testbed-node-1, testbed-node-2 2025-03-23 20:41:45.409079 | orchestrator | 2025-03-23 20:41:45.410257 | orchestrator | TASK [osism.services.rng : Install rng package] ******************************** 2025-03-23 20:41:45.411773 | orchestrator | Sunday 23 March 2025 20:41:45 +0000 (0:00:00.466) 0:04:00.458 ********** 2025-03-23 20:41:54.776148 | orchestrator | ok: [testbed-manager] 2025-03-23 20:41:54.776464 | orchestrator | changed: [testbed-node-4] 2025-03-23 20:41:54.776972 | orchestrator | changed: [testbed-node-1] 2025-03-23 20:41:54.780847 | orchestrator | changed: [testbed-node-3] 2025-03-23 20:41:54.781426 | orchestrator | changed: [testbed-node-2] 2025-03-23 20:41:54.781685 | orchestrator | changed: [testbed-node-5] 2025-03-23 20:41:54.783061 | orchestrator | changed: [testbed-node-0] 2025-03-23 20:41:54.783474 | orchestrator | 2025-03-23 20:41:54.784415 | orchestrator | TASK [osism.services.rng : Remove haveged package] ***************************** 2025-03-23 20:41:54.784510 | orchestrator | Sunday 23 March 2025 20:41:54 +0000 (0:00:09.368) 0:04:09.826 ********** 2025-03-23 20:41:56.270901 | orchestrator | ok: [testbed-manager] 2025-03-23 20:41:56.271813 | orchestrator | ok: [testbed-node-4] 2025-03-23 20:41:56.271853 | orchestrator | ok: [testbed-node-3] 2025-03-23 20:41:56.272280 | orchestrator | ok: [testbed-node-5] 2025-03-23 20:41:56.273036 | orchestrator | ok: [testbed-node-1] 2025-03-23 20:41:56.275146 | orchestrator | ok: [testbed-node-0] 2025-03-23 20:41:56.275643 | orchestrator | ok: [testbed-node-2] 2025-03-23 20:41:56.276192 | orchestrator | 2025-03-23 20:41:56.276423 | orchestrator | TASK 
[osism.services.rng : Manage rng service] ********************************* 2025-03-23 20:41:56.277042 | orchestrator | Sunday 23 March 2025 20:41:56 +0000 (0:00:01.493) 0:04:11.319 ********** 2025-03-23 20:41:57.424402 | orchestrator | ok: [testbed-manager] 2025-03-23 20:41:57.424602 | orchestrator | ok: [testbed-node-3] 2025-03-23 20:41:57.425467 | orchestrator | ok: [testbed-node-4] 2025-03-23 20:41:57.426105 | orchestrator | ok: [testbed-node-5] 2025-03-23 20:41:57.427251 | orchestrator | ok: [testbed-node-1] 2025-03-23 20:41:57.428558 | orchestrator | ok: [testbed-node-0] 2025-03-23 20:41:57.429223 | orchestrator | ok: [testbed-node-2] 2025-03-23 20:41:57.430117 | orchestrator | 2025-03-23 20:41:57.430905 | orchestrator | TASK [osism.services.smartd : Include distribution specific install tasks] ***** 2025-03-23 20:41:57.431584 | orchestrator | Sunday 23 March 2025 20:41:57 +0000 (0:00:01.155) 0:04:12.475 ********** 2025-03-23 20:41:57.895414 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/services/roles/smartd/tasks/install-Debian-family.yml for testbed-manager, testbed-node-3, testbed-node-4, testbed-node-5, testbed-node-0, testbed-node-1, testbed-node-2 2025-03-23 20:41:57.895676 | orchestrator | 2025-03-23 20:41:57.898678 | orchestrator | TASK [osism.services.smartd : Install smartmontools package] ******************* 2025-03-23 20:42:07.567873 | orchestrator | Sunday 23 March 2025 20:41:57 +0000 (0:00:00.469) 0:04:12.945 ********** 2025-03-23 20:42:07.568079 | orchestrator | changed: [testbed-node-3] 2025-03-23 20:42:07.571677 | orchestrator | changed: [testbed-manager] 2025-03-23 20:42:07.571718 | orchestrator | changed: [testbed-node-1] 2025-03-23 20:42:07.573342 | orchestrator | changed: [testbed-node-4] 2025-03-23 20:42:07.573644 | orchestrator | changed: [testbed-node-2] 2025-03-23 20:42:07.573670 | orchestrator | changed: [testbed-node-5] 2025-03-23 20:42:07.573690 | orchestrator | changed: [testbed-node-0] 2025-03-23 20:42:07.574077 | orchestrator | 2025-03-23 20:42:07.577571 | orchestrator | TASK [osism.services.smartd : Create /var/log/smartd directory] **************** 2025-03-23 20:42:08.256705 | orchestrator | Sunday 23 March 2025 20:42:07 +0000 (0:00:09.668) 0:04:22.613 ********** 2025-03-23 20:42:08.256840 | orchestrator | changed: [testbed-manager] 2025-03-23 20:42:08.258877 | orchestrator | changed: [testbed-node-3] 2025-03-23 20:42:08.259190 | orchestrator | changed: [testbed-node-5] 2025-03-23 20:42:08.259221 | orchestrator | changed: [testbed-node-4] 2025-03-23 20:42:08.259243 | orchestrator | changed: [testbed-node-1] 2025-03-23 20:42:08.260263 | orchestrator | changed: [testbed-node-2] 2025-03-23 20:42:08.260995 | orchestrator | changed: [testbed-node-0] 2025-03-23 20:42:08.263920 | orchestrator | 2025-03-23 20:42:08.265572 | orchestrator | TASK [osism.services.smartd : Copy smartmontools configuration file] *********** 2025-03-23 20:42:08.265897 | orchestrator | Sunday 23 March 2025 20:42:08 +0000 (0:00:00.694) 0:04:23.308 ********** 2025-03-23 20:42:09.490312 | orchestrator | changed: [testbed-manager] 2025-03-23 20:42:09.490537 | orchestrator | changed: [testbed-node-3] 2025-03-23 20:42:09.491586 | orchestrator | changed: [testbed-node-5] 2025-03-23 20:42:09.492942 | orchestrator | changed: [testbed-node-4] 2025-03-23 20:42:09.493178 | orchestrator | changed: [testbed-node-0] 2025-03-23 20:42:09.498638 | orchestrator | changed: [testbed-node-1] 2025-03-23 20:42:10.695314 | orchestrator | changed: [testbed-node-2] 
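The osism.services.smartd tasks above install smartmontools, create /var/log/smartd, and copy a smartd configuration before the service is managed in the next task. A sketch of the directory and configuration steps, assuming a simple DEVICESCAN directive; the smartd.conf actually shipped by the role is not shown in this output:

- name: Create /var/log/smartd directory
  ansible.builtin.file:
    path: /var/log/smartd
    state: directory
    mode: "0755"

- name: Copy smartmontools configuration file
  ansible.builtin.copy:
    dest: /etc/smartd.conf
    mode: "0644"
    content: |
      # Assumed example directive: monitor all devices, weekly short self-test
      DEVICESCAN -a -o on -S on -s (S/../.././02)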
2025-03-23 20:42:10.695488 | orchestrator | 2025-03-23 20:42:10.695512 | orchestrator | TASK [osism.services.smartd : Manage smartd service] *************************** 2025-03-23 20:42:10.695529 | orchestrator | Sunday 23 March 2025 20:42:09 +0000 (0:00:01.233) 0:04:24.541 ********** 2025-03-23 20:42:10.695560 | orchestrator | changed: [testbed-manager] 2025-03-23 20:42:10.697649 | orchestrator | changed: [testbed-node-3] 2025-03-23 20:42:10.697708 | orchestrator | changed: [testbed-node-4] 2025-03-23 20:42:10.699377 | orchestrator | changed: [testbed-node-0] 2025-03-23 20:42:10.699832 | orchestrator | changed: [testbed-node-1] 2025-03-23 20:42:10.700102 | orchestrator | changed: [testbed-node-5] 2025-03-23 20:42:10.700462 | orchestrator | changed: [testbed-node-2] 2025-03-23 20:42:10.701005 | orchestrator | 2025-03-23 20:42:10.702482 | orchestrator | TASK [osism.commons.cleanup : Gather variables for each operating system] ****** 2025-03-23 20:42:10.703671 | orchestrator | Sunday 23 March 2025 20:42:10 +0000 (0:00:01.199) 0:04:25.741 ********** 2025-03-23 20:42:10.829244 | orchestrator | ok: [testbed-manager] 2025-03-23 20:42:10.914690 | orchestrator | ok: [testbed-node-3] 2025-03-23 20:42:10.973657 | orchestrator | ok: [testbed-node-4] 2025-03-23 20:42:11.009965 | orchestrator | ok: [testbed-node-5] 2025-03-23 20:42:11.090948 | orchestrator | ok: [testbed-node-0] 2025-03-23 20:42:11.092257 | orchestrator | ok: [testbed-node-1] 2025-03-23 20:42:11.093028 | orchestrator | ok: [testbed-node-2] 2025-03-23 20:42:11.093290 | orchestrator | 2025-03-23 20:42:11.096593 | orchestrator | TASK [osism.commons.cleanup : Set cleanup_packages_distribution variable to default value] *** 2025-03-23 20:42:11.206947 | orchestrator | Sunday 23 March 2025 20:42:11 +0000 (0:00:00.401) 0:04:26.142 ********** 2025-03-23 20:42:11.207015 | orchestrator | ok: [testbed-manager] 2025-03-23 20:42:11.254260 | orchestrator | ok: [testbed-node-3] 2025-03-23 20:42:11.296251 | orchestrator | ok: [testbed-node-4] 2025-03-23 20:42:11.331292 | orchestrator | ok: [testbed-node-5] 2025-03-23 20:42:11.429094 | orchestrator | ok: [testbed-node-0] 2025-03-23 20:42:11.430255 | orchestrator | ok: [testbed-node-1] 2025-03-23 20:42:11.430706 | orchestrator | ok: [testbed-node-2] 2025-03-23 20:42:11.431609 | orchestrator | 2025-03-23 20:42:11.431774 | orchestrator | TASK [osism.commons.cleanup : Set cleanup_services_distribution variable to default value] *** 2025-03-23 20:42:11.432565 | orchestrator | Sunday 23 March 2025 20:42:11 +0000 (0:00:00.338) 0:04:26.481 ********** 2025-03-23 20:42:11.546993 | orchestrator | ok: [testbed-manager] 2025-03-23 20:42:11.632085 | orchestrator | ok: [testbed-node-3] 2025-03-23 20:42:11.670281 | orchestrator | ok: [testbed-node-4] 2025-03-23 20:42:11.717655 | orchestrator | ok: [testbed-node-5] 2025-03-23 20:42:11.804949 | orchestrator | ok: [testbed-node-0] 2025-03-23 20:42:11.806384 | orchestrator | ok: [testbed-node-1] 2025-03-23 20:42:11.806999 | orchestrator | ok: [testbed-node-2] 2025-03-23 20:42:11.807767 | orchestrator | 2025-03-23 20:42:11.808601 | orchestrator | TASK [osism.commons.cleanup : Populate service facts] ************************** 2025-03-23 20:42:11.809352 | orchestrator | Sunday 23 March 2025 20:42:11 +0000 (0:00:00.376) 0:04:26.858 ********** 2025-03-23 20:42:16.490012 | orchestrator | ok: [testbed-node-4] 2025-03-23 20:42:16.491329 | orchestrator | ok: [testbed-node-3] 2025-03-23 20:42:16.491364 | orchestrator | ok: [testbed-node-1] 2025-03-23 20:42:16.491791 | orchestrator 
| ok: [testbed-node-2] 2025-03-23 20:42:16.492478 | orchestrator | ok: [testbed-node-5] 2025-03-23 20:42:16.493105 | orchestrator | ok: [testbed-node-0] 2025-03-23 20:42:16.493561 | orchestrator | ok: [testbed-manager] 2025-03-23 20:42:16.494272 | orchestrator | 2025-03-23 20:42:16.494912 | orchestrator | TASK [osism.commons.cleanup : Include distribution specific timer tasks] ******* 2025-03-23 20:42:16.495528 | orchestrator | Sunday 23 March 2025 20:42:16 +0000 (0:00:04.682) 0:04:31.541 ********** 2025-03-23 20:42:16.940116 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/cleanup/tasks/timers-Debian-family.yml for testbed-manager, testbed-node-3, testbed-node-4, testbed-node-5, testbed-node-0, testbed-node-1, testbed-node-2 2025-03-23 20:42:16.941578 | orchestrator | 2025-03-23 20:42:16.942192 | orchestrator | TASK [osism.commons.cleanup : Disable apt-daily timers] ************************ 2025-03-23 20:42:16.943648 | orchestrator | Sunday 23 March 2025 20:42:16 +0000 (0:00:00.450) 0:04:31.991 ********** 2025-03-23 20:42:16.980619 | orchestrator | skipping: [testbed-manager] => (item=apt-daily-upgrade)  2025-03-23 20:42:17.027282 | orchestrator | skipping: [testbed-manager] => (item=apt-daily)  2025-03-23 20:42:17.028408 | orchestrator | skipping: [testbed-node-3] => (item=apt-daily-upgrade)  2025-03-23 20:42:17.072145 | orchestrator | skipping: [testbed-node-3] => (item=apt-daily)  2025-03-23 20:42:17.073355 | orchestrator | skipping: [testbed-manager] 2025-03-23 20:42:17.073545 | orchestrator | skipping: [testbed-node-4] => (item=apt-daily-upgrade)  2025-03-23 20:42:17.074214 | orchestrator | skipping: [testbed-node-4] => (item=apt-daily)  2025-03-23 20:42:17.126884 | orchestrator | skipping: [testbed-node-3] 2025-03-23 20:42:17.127205 | orchestrator | skipping: [testbed-node-5] => (item=apt-daily-upgrade)  2025-03-23 20:42:17.127233 | orchestrator | skipping: [testbed-node-5] => (item=apt-daily)  2025-03-23 20:42:17.171685 | orchestrator | skipping: [testbed-node-4] 2025-03-23 20:42:17.171744 | orchestrator | skipping: [testbed-node-0] => (item=apt-daily-upgrade)  2025-03-23 20:42:17.172508 | orchestrator | skipping: [testbed-node-0] => (item=apt-daily)  2025-03-23 20:42:17.214081 | orchestrator | skipping: [testbed-node-5] 2025-03-23 20:42:17.214736 | orchestrator | skipping: [testbed-node-1] => (item=apt-daily-upgrade)  2025-03-23 20:42:17.217314 | orchestrator | skipping: [testbed-node-1] => (item=apt-daily)  2025-03-23 20:42:17.292688 | orchestrator | skipping: [testbed-node-0] 2025-03-23 20:42:17.293781 | orchestrator | skipping: [testbed-node-1] 2025-03-23 20:42:17.294696 | orchestrator | skipping: [testbed-node-2] => (item=apt-daily-upgrade)  2025-03-23 20:42:17.296463 | orchestrator | skipping: [testbed-node-2] => (item=apt-daily)  2025-03-23 20:42:17.297152 | orchestrator | skipping: [testbed-node-2] 2025-03-23 20:42:17.297175 | orchestrator | 2025-03-23 20:42:17.297716 | orchestrator | TASK [osism.commons.cleanup : Include service tasks] *************************** 2025-03-23 20:42:17.298670 | orchestrator | Sunday 23 March 2025 20:42:17 +0000 (0:00:00.354) 0:04:32.345 ********** 2025-03-23 20:42:17.785900 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/cleanup/tasks/services-Debian-family.yml for testbed-manager, testbed-node-3, testbed-node-4, testbed-node-5, testbed-node-0, testbed-node-1, testbed-node-2 2025-03-23 20:42:17.787194 | orchestrator | 2025-03-23 20:42:17.787799 
| orchestrator | TASK [osism.commons.cleanup : Cleanup services] ******************************** 2025-03-23 20:42:17.787832 | orchestrator | Sunday 23 March 2025 20:42:17 +0000 (0:00:00.489) 0:04:32.835 ********** 2025-03-23 20:42:17.826130 | orchestrator | skipping: [testbed-manager] => (item=ModemManager.service)  2025-03-23 20:42:17.875053 | orchestrator | skipping: [testbed-manager] 2025-03-23 20:42:17.875284 | orchestrator | skipping: [testbed-node-3] => (item=ModemManager.service)  2025-03-23 20:42:17.918293 | orchestrator | skipping: [testbed-node-3] 2025-03-23 20:42:17.959794 | orchestrator | skipping: [testbed-node-4] => (item=ModemManager.service)  2025-03-23 20:42:17.999478 | orchestrator | skipping: [testbed-node-4] 2025-03-23 20:42:18.049796 | orchestrator | skipping: [testbed-node-5] => (item=ModemManager.service)  2025-03-23 20:42:18.049866 | orchestrator | skipping: [testbed-node-5] 2025-03-23 20:42:18.049974 | orchestrator | skipping: [testbed-node-0] => (item=ModemManager.service)  2025-03-23 20:42:18.126738 | orchestrator | skipping: [testbed-node-1] => (item=ModemManager.service)  2025-03-23 20:42:18.127330 | orchestrator | skipping: [testbed-node-0] 2025-03-23 20:42:18.127375 | orchestrator | skipping: [testbed-node-1] 2025-03-23 20:42:18.127719 | orchestrator | skipping: [testbed-node-2] => (item=ModemManager.service)  2025-03-23 20:42:18.128574 | orchestrator | skipping: [testbed-node-2] 2025-03-23 20:42:18.128666 | orchestrator | 2025-03-23 20:42:18.129640 | orchestrator | TASK [osism.commons.cleanup : Include packages tasks] ************************** 2025-03-23 20:42:18.131142 | orchestrator | Sunday 23 March 2025 20:42:18 +0000 (0:00:00.344) 0:04:33.180 ********** 2025-03-23 20:42:18.595118 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/cleanup/tasks/packages-Debian-family.yml for testbed-manager, testbed-node-3, testbed-node-4, testbed-node-5, testbed-node-0, testbed-node-1, testbed-node-2 2025-03-23 20:42:18.596096 | orchestrator | 2025-03-23 20:42:18.597123 | orchestrator | TASK [osism.commons.cleanup : Cleanup installed packages] ********************** 2025-03-23 20:42:18.599164 | orchestrator | Sunday 23 March 2025 20:42:18 +0000 (0:00:00.465) 0:04:33.645 ********** 2025-03-23 20:42:52.365765 | orchestrator | changed: [testbed-node-1] 2025-03-23 20:42:52.366097 | orchestrator | changed: [testbed-node-3] 2025-03-23 20:42:52.366425 | orchestrator | changed: [testbed-node-4] 2025-03-23 20:42:52.366480 | orchestrator | changed: [testbed-node-2] 2025-03-23 20:42:52.366497 | orchestrator | changed: [testbed-node-5] 2025-03-23 20:42:52.366517 | orchestrator | changed: [testbed-node-0] 2025-03-23 20:42:52.368676 | orchestrator | changed: [testbed-manager] 2025-03-23 20:42:52.369856 | orchestrator | 2025-03-23 20:42:52.372171 | orchestrator | TASK [osism.commons.cleanup : Remove cloudinit package] ************************ 2025-03-23 20:42:52.373313 | orchestrator | Sunday 23 March 2025 20:42:52 +0000 (0:00:33.766) 0:05:07.411 ********** 2025-03-23 20:43:01.314504 | orchestrator | changed: [testbed-manager] 2025-03-23 20:43:01.315782 | orchestrator | changed: [testbed-node-4] 2025-03-23 20:43:01.316427 | orchestrator | changed: [testbed-node-1] 2025-03-23 20:43:01.317072 | orchestrator | changed: [testbed-node-3] 2025-03-23 20:43:01.319550 | orchestrator | changed: [testbed-node-2] 2025-03-23 20:43:01.319877 | orchestrator | changed: [testbed-node-5] 2025-03-23 20:43:01.319903 | orchestrator | changed: 
[testbed-node-0] 2025-03-23 20:43:01.320639 | orchestrator | 2025-03-23 20:43:01.321111 | orchestrator | TASK [osism.commons.cleanup : Uninstall unattended-upgrades package] *********** 2025-03-23 20:43:01.321803 | orchestrator | Sunday 23 March 2025 20:43:01 +0000 (0:00:08.952) 0:05:16.364 ********** 2025-03-23 20:43:10.049580 | orchestrator | changed: [testbed-node-3] 2025-03-23 20:43:10.049807 | orchestrator | changed: [testbed-node-4] 2025-03-23 20:43:10.050203 | orchestrator | changed: [testbed-node-1] 2025-03-23 20:43:10.051164 | orchestrator | changed: [testbed-node-5] 2025-03-23 20:43:10.051961 | orchestrator | changed: [testbed-manager] 2025-03-23 20:43:10.053795 | orchestrator | changed: [testbed-node-2] 2025-03-23 20:43:10.054246 | orchestrator | changed: [testbed-node-0] 2025-03-23 20:43:10.054595 | orchestrator | 2025-03-23 20:43:10.055090 | orchestrator | TASK [osism.commons.cleanup : Remove useless packages from the cache] ********** 2025-03-23 20:43:10.055633 | orchestrator | Sunday 23 March 2025 20:43:10 +0000 (0:00:08.735) 0:05:25.099 ********** 2025-03-23 20:43:12.008675 | orchestrator | ok: [testbed-node-4] 2025-03-23 20:43:12.008843 | orchestrator | ok: [testbed-node-3] 2025-03-23 20:43:12.008864 | orchestrator | ok: [testbed-node-1] 2025-03-23 20:43:12.008879 | orchestrator | ok: [testbed-node-5] 2025-03-23 20:43:12.008898 | orchestrator | ok: [testbed-node-2] 2025-03-23 20:43:12.009871 | orchestrator | ok: [testbed-manager] 2025-03-23 20:43:12.009905 | orchestrator | ok: [testbed-node-0] 2025-03-23 20:43:12.010098 | orchestrator | 2025-03-23 20:43:12.010126 | orchestrator | TASK [osism.commons.cleanup : Remove dependencies that are no longer required] *** 2025-03-23 20:43:12.010148 | orchestrator | Sunday 23 March 2025 20:43:11 +0000 (0:00:01.955) 0:05:27.055 ********** 2025-03-23 20:43:18.616006 | orchestrator | changed: [testbed-node-4] 2025-03-23 20:43:18.616492 | orchestrator | changed: [testbed-node-1] 2025-03-23 20:43:18.616535 | orchestrator | changed: [testbed-node-3] 2025-03-23 20:43:18.616904 | orchestrator | changed: [testbed-node-5] 2025-03-23 20:43:18.617305 | orchestrator | changed: [testbed-node-2] 2025-03-23 20:43:18.618193 | orchestrator | changed: [testbed-manager] 2025-03-23 20:43:18.618375 | orchestrator | changed: [testbed-node-0] 2025-03-23 20:43:18.619926 | orchestrator | 2025-03-23 20:43:18.619993 | orchestrator | TASK [osism.commons.cleanup : Include cloudinit tasks] ************************* 2025-03-23 20:43:18.620923 | orchestrator | Sunday 23 March 2025 20:43:18 +0000 (0:00:06.609) 0:05:33.665 ********** 2025-03-23 20:43:19.111363 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/cleanup/tasks/cloudinit.yml for testbed-manager, testbed-node-3, testbed-node-4, testbed-node-5, testbed-node-0, testbed-node-1, testbed-node-2 2025-03-23 20:43:19.111544 | orchestrator | 2025-03-23 20:43:19.112002 | orchestrator | TASK [osism.commons.cleanup : Remove cloud-init configuration directory] ******* 2025-03-23 20:43:19.112285 | orchestrator | Sunday 23 March 2025 20:43:19 +0000 (0:00:00.496) 0:05:34.161 ********** 2025-03-23 20:43:19.928968 | orchestrator | changed: [testbed-manager] 2025-03-23 20:43:19.929112 | orchestrator | changed: [testbed-node-3] 2025-03-23 20:43:19.929738 | orchestrator | changed: [testbed-node-5] 2025-03-23 20:43:19.933849 | orchestrator | changed: [testbed-node-4] 2025-03-23 20:43:19.933925 | orchestrator | changed: [testbed-node-0] 2025-03-23 20:43:19.933943 | orchestrator | 
changed: [testbed-node-1] 2025-03-23 20:43:19.933958 | orchestrator | changed: [testbed-node-2] 2025-03-23 20:43:19.933977 | orchestrator | 2025-03-23 20:43:19.934605 | orchestrator | TASK [osism.commons.timezone : Install tzdata package] ************************* 2025-03-23 20:43:19.935487 | orchestrator | Sunday 23 March 2025 20:43:19 +0000 (0:00:00.817) 0:05:34.979 ********** 2025-03-23 20:43:21.754225 | orchestrator | ok: [testbed-manager] 2025-03-23 20:43:21.754600 | orchestrator | ok: [testbed-node-3] 2025-03-23 20:43:21.757060 | orchestrator | ok: [testbed-node-4] 2025-03-23 20:43:21.758437 | orchestrator | ok: [testbed-node-5] 2025-03-23 20:43:21.761448 | orchestrator | ok: [testbed-node-1] 2025-03-23 20:43:21.761500 | orchestrator | ok: [testbed-node-2] 2025-03-23 20:43:21.761516 | orchestrator | ok: [testbed-node-0] 2025-03-23 20:43:21.761548 | orchestrator | 2025-03-23 20:43:21.761569 | orchestrator | TASK [osism.commons.timezone : Set timezone to UTC] **************************** 2025-03-23 20:43:21.762368 | orchestrator | Sunday 23 March 2025 20:43:21 +0000 (0:00:01.824) 0:05:36.804 ********** 2025-03-23 20:43:22.611653 | orchestrator | changed: [testbed-node-3] 2025-03-23 20:43:22.612029 | orchestrator | changed: [testbed-node-1] 2025-03-23 20:43:22.612509 | orchestrator | changed: [testbed-node-4] 2025-03-23 20:43:22.613459 | orchestrator | changed: [testbed-node-5] 2025-03-23 20:43:22.613854 | orchestrator | changed: [testbed-node-2] 2025-03-23 20:43:22.614745 | orchestrator | changed: [testbed-node-0] 2025-03-23 20:43:22.615713 | orchestrator | changed: [testbed-manager] 2025-03-23 20:43:22.615782 | orchestrator | 2025-03-23 20:43:22.616005 | orchestrator | TASK [osism.commons.timezone : Create /etc/adjtime file] *********************** 2025-03-23 20:43:22.617209 | orchestrator | Sunday 23 March 2025 20:43:22 +0000 (0:00:00.857) 0:05:37.661 ********** 2025-03-23 20:43:22.690704 | orchestrator | skipping: [testbed-manager] 2025-03-23 20:43:22.723762 | orchestrator | skipping: [testbed-node-3] 2025-03-23 20:43:22.811676 | orchestrator | skipping: [testbed-node-4] 2025-03-23 20:43:22.846458 | orchestrator | skipping: [testbed-node-5] 2025-03-23 20:43:22.916111 | orchestrator | skipping: [testbed-node-0] 2025-03-23 20:43:22.917497 | orchestrator | skipping: [testbed-node-1] 2025-03-23 20:43:22.917988 | orchestrator | skipping: [testbed-node-2] 2025-03-23 20:43:22.918811 | orchestrator | 2025-03-23 20:43:22.919258 | orchestrator | TASK [osism.commons.timezone : Ensure UTC in /etc/adjtime] ********************* 2025-03-23 20:43:22.921616 | orchestrator | Sunday 23 March 2025 20:43:22 +0000 (0:00:00.305) 0:05:37.967 ********** 2025-03-23 20:43:23.009763 | orchestrator | skipping: [testbed-manager] 2025-03-23 20:43:23.054196 | orchestrator | skipping: [testbed-node-3] 2025-03-23 20:43:23.096260 | orchestrator | skipping: [testbed-node-4] 2025-03-23 20:43:23.133396 | orchestrator | skipping: [testbed-node-5] 2025-03-23 20:43:23.171925 | orchestrator | skipping: [testbed-node-0] 2025-03-23 20:43:23.379697 | orchestrator | skipping: [testbed-node-1] 2025-03-23 20:43:23.380327 | orchestrator | skipping: [testbed-node-2] 2025-03-23 20:43:23.383143 | orchestrator | 2025-03-23 20:43:23.383783 | orchestrator | TASK [osism.services.docker : Gather variables for each operating system] ****** 2025-03-23 20:43:23.385050 | orchestrator | Sunday 23 March 2025 20:43:23 +0000 (0:00:00.462) 0:05:38.430 ********** 2025-03-23 20:43:23.513434 | orchestrator | ok: [testbed-manager] 2025-03-23 
20:43:23.550638 | orchestrator | ok: [testbed-node-3] 2025-03-23 20:43:23.593691 | orchestrator | ok: [testbed-node-4] 2025-03-23 20:43:23.641136 | orchestrator | ok: [testbed-node-5] 2025-03-23 20:43:23.719900 | orchestrator | ok: [testbed-node-0] 2025-03-23 20:43:23.720006 | orchestrator | ok: [testbed-node-1] 2025-03-23 20:43:23.721355 | orchestrator | ok: [testbed-node-2] 2025-03-23 20:43:23.723104 | orchestrator | 2025-03-23 20:43:23.723984 | orchestrator | TASK [osism.services.docker : Set docker_version variable to default value] **** 2025-03-23 20:43:23.725155 | orchestrator | Sunday 23 March 2025 20:43:23 +0000 (0:00:00.342) 0:05:38.772 ********** 2025-03-23 20:43:23.823814 | orchestrator | skipping: [testbed-manager] 2025-03-23 20:43:23.858650 | orchestrator | skipping: [testbed-node-3] 2025-03-23 20:43:23.917518 | orchestrator | skipping: [testbed-node-4] 2025-03-23 20:43:23.966251 | orchestrator | skipping: [testbed-node-5] 2025-03-23 20:43:24.049198 | orchestrator | skipping: [testbed-node-0] 2025-03-23 20:43:24.049851 | orchestrator | skipping: [testbed-node-1] 2025-03-23 20:43:24.050450 | orchestrator | skipping: [testbed-node-2] 2025-03-23 20:43:24.052827 | orchestrator | 2025-03-23 20:43:24.053035 | orchestrator | TASK [osism.services.docker : Set docker_cli_version variable to default value] *** 2025-03-23 20:43:24.053935 | orchestrator | Sunday 23 March 2025 20:43:24 +0000 (0:00:00.327) 0:05:39.100 ********** 2025-03-23 20:43:24.165362 | orchestrator | ok: [testbed-manager] 2025-03-23 20:43:24.215422 | orchestrator | ok: [testbed-node-3] 2025-03-23 20:43:24.257545 | orchestrator | ok: [testbed-node-4] 2025-03-23 20:43:24.309874 | orchestrator | ok: [testbed-node-5] 2025-03-23 20:43:24.382235 | orchestrator | ok: [testbed-node-0] 2025-03-23 20:43:24.382384 | orchestrator | ok: [testbed-node-1] 2025-03-23 20:43:24.382944 | orchestrator | ok: [testbed-node-2] 2025-03-23 20:43:24.383034 | orchestrator | 2025-03-23 20:43:24.383132 | orchestrator | TASK [osism.services.docker : Include block storage tasks] ********************* 2025-03-23 20:43:24.383394 | orchestrator | Sunday 23 March 2025 20:43:24 +0000 (0:00:00.334) 0:05:39.435 ********** 2025-03-23 20:43:24.517531 | orchestrator | skipping: [testbed-manager] 2025-03-23 20:43:24.567826 | orchestrator | skipping: [testbed-node-3] 2025-03-23 20:43:24.608808 | orchestrator | skipping: [testbed-node-4] 2025-03-23 20:43:24.648753 | orchestrator | skipping: [testbed-node-5] 2025-03-23 20:43:24.716833 | orchestrator | skipping: [testbed-node-0] 2025-03-23 20:43:24.718280 | orchestrator | skipping: [testbed-node-1] 2025-03-23 20:43:24.719540 | orchestrator | skipping: [testbed-node-2] 2025-03-23 20:43:24.720177 | orchestrator | 2025-03-23 20:43:24.720947 | orchestrator | TASK [osism.services.docker : Include zram storage tasks] ********************** 2025-03-23 20:43:24.721859 | orchestrator | Sunday 23 March 2025 20:43:24 +0000 (0:00:00.333) 0:05:39.768 ********** 2025-03-23 20:43:24.796764 | orchestrator | skipping: [testbed-manager] 2025-03-23 20:43:24.838308 | orchestrator | skipping: [testbed-node-3] 2025-03-23 20:43:24.876274 | orchestrator | skipping: [testbed-node-4] 2025-03-23 20:43:24.911047 | orchestrator | skipping: [testbed-node-5] 2025-03-23 20:43:24.951366 | orchestrator | skipping: [testbed-node-0] 2025-03-23 20:43:25.017906 | orchestrator | skipping: [testbed-node-1] 2025-03-23 20:43:25.018955 | orchestrator | skipping: [testbed-node-2] 2025-03-23 20:43:25.019174 | orchestrator | 2025-03-23 20:43:25.020120 | 
orchestrator | TASK [osism.services.docker : Include docker install tasks] ******************** 2025-03-23 20:43:25.021000 | orchestrator | Sunday 23 March 2025 20:43:25 +0000 (0:00:00.301) 0:05:40.069 ********** 2025-03-23 20:43:25.683449 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/services/roles/docker/tasks/install-docker-Debian-family.yml for testbed-manager, testbed-node-3, testbed-node-4, testbed-node-5, testbed-node-0, testbed-node-1, testbed-node-2 2025-03-23 20:43:25.683948 | orchestrator | 2025-03-23 20:43:25.684869 | orchestrator | TASK [osism.services.docker : Remove old architecture-dependent repository] **** 2025-03-23 20:43:25.685737 | orchestrator | Sunday 23 March 2025 20:43:25 +0000 (0:00:00.665) 0:05:40.735 ********** 2025-03-23 20:43:26.705185 | orchestrator | ok: [testbed-manager] 2025-03-23 20:43:26.706353 | orchestrator | ok: [testbed-node-4] 2025-03-23 20:43:26.707731 | orchestrator | ok: [testbed-node-5] 2025-03-23 20:43:26.708771 | orchestrator | ok: [testbed-node-1] 2025-03-23 20:43:26.709655 | orchestrator | ok: [testbed-node-3] 2025-03-23 20:43:26.710585 | orchestrator | ok: [testbed-node-0] 2025-03-23 20:43:26.711434 | orchestrator | ok: [testbed-node-2] 2025-03-23 20:43:26.711623 | orchestrator | 2025-03-23 20:43:26.712111 | orchestrator | TASK [osism.services.docker : Gather package facts] **************************** 2025-03-23 20:43:26.712670 | orchestrator | Sunday 23 March 2025 20:43:26 +0000 (0:00:01.018) 0:05:41.753 ********** 2025-03-23 20:43:30.622550 | orchestrator | ok: [testbed-node-1] 2025-03-23 20:43:30.623840 | orchestrator | ok: [testbed-node-4] 2025-03-23 20:43:30.624052 | orchestrator | ok: [testbed-node-5] 2025-03-23 20:43:30.625224 | orchestrator | ok: [testbed-node-3] 2025-03-23 20:43:30.627908 | orchestrator | ok: [testbed-node-0] 2025-03-23 20:43:30.628385 | orchestrator | ok: [testbed-node-2] 2025-03-23 20:43:30.628410 | orchestrator | ok: [testbed-manager] 2025-03-23 20:43:30.628424 | orchestrator | 2025-03-23 20:43:30.628441 | orchestrator | TASK [osism.services.docker : Check whether packages are installed that should not be installed] *** 2025-03-23 20:43:30.628460 | orchestrator | Sunday 23 March 2025 20:43:30 +0000 (0:00:03.919) 0:05:45.673 ********** 2025-03-23 20:43:30.699119 | orchestrator | skipping: [testbed-manager] => (item=containerd)  2025-03-23 20:43:30.809652 | orchestrator | skipping: [testbed-manager] => (item=docker.io)  2025-03-23 20:43:30.810445 | orchestrator | skipping: [testbed-manager] => (item=docker-engine)  2025-03-23 20:43:30.811614 | orchestrator | skipping: [testbed-node-3] => (item=containerd)  2025-03-23 20:43:30.811995 | orchestrator | skipping: [testbed-node-3] => (item=docker.io)  2025-03-23 20:43:30.812403 | orchestrator | skipping: [testbed-node-3] => (item=docker-engine)  2025-03-23 20:43:30.903572 | orchestrator | skipping: [testbed-manager] 2025-03-23 20:43:30.903905 | orchestrator | skipping: [testbed-node-4] => (item=containerd)  2025-03-23 20:43:30.904264 | orchestrator | skipping: [testbed-node-4] => (item=docker.io)  2025-03-23 20:43:30.904290 | orchestrator | skipping: [testbed-node-4] => (item=docker-engine)  2025-03-23 20:43:30.988607 | orchestrator | skipping: [testbed-node-3] 2025-03-23 20:43:30.988990 | orchestrator | skipping: [testbed-node-5] => (item=containerd)  2025-03-23 20:43:30.990216 | orchestrator | skipping: [testbed-node-5] => (item=docker.io)  2025-03-23 20:43:30.991057 | orchestrator | skipping: [testbed-node-5] => 
(item=docker-engine)  2025-03-23 20:43:31.088516 | orchestrator | skipping: [testbed-node-4] 2025-03-23 20:43:31.088986 | orchestrator | skipping: [testbed-node-0] => (item=containerd)  2025-03-23 20:43:31.089023 | orchestrator | skipping: [testbed-node-0] => (item=docker.io)  2025-03-23 20:43:31.090220 | orchestrator | skipping: [testbed-node-0] => (item=docker-engine)  2025-03-23 20:43:31.176147 | orchestrator | skipping: [testbed-node-5] 2025-03-23 20:43:31.177599 | orchestrator | skipping: [testbed-node-1] => (item=containerd)  2025-03-23 20:43:31.178217 | orchestrator | skipping: [testbed-node-1] => (item=docker.io)  2025-03-23 20:43:31.179455 | orchestrator | skipping: [testbed-node-1] => (item=docker-engine)  2025-03-23 20:43:31.316522 | orchestrator | skipping: [testbed-node-0] 2025-03-23 20:43:31.316904 | orchestrator | skipping: [testbed-node-1] 2025-03-23 20:43:31.318454 | orchestrator | skipping: [testbed-node-2] => (item=containerd)  2025-03-23 20:43:31.319166 | orchestrator | skipping: [testbed-node-2] => (item=docker.io)  2025-03-23 20:43:31.320241 | orchestrator | skipping: [testbed-node-2] => (item=docker-engine)  2025-03-23 20:43:31.320780 | orchestrator | skipping: [testbed-node-2] 2025-03-23 20:43:31.321397 | orchestrator | 2025-03-23 20:43:31.322164 | orchestrator | TASK [osism.services.docker : Install apt-transport-https package] ************* 2025-03-23 20:43:31.322622 | orchestrator | Sunday 23 March 2025 20:43:31 +0000 (0:00:00.694) 0:05:46.367 ********** 2025-03-23 20:43:38.965416 | orchestrator | ok: [testbed-manager] 2025-03-23 20:43:38.965691 | orchestrator | changed: [testbed-node-4] 2025-03-23 20:43:38.965726 | orchestrator | changed: [testbed-node-3] 2025-03-23 20:43:38.965932 | orchestrator | changed: [testbed-node-1] 2025-03-23 20:43:38.965959 | orchestrator | changed: [testbed-node-5] 2025-03-23 20:43:38.966087 | orchestrator | changed: [testbed-node-2] 2025-03-23 20:43:38.966372 | orchestrator | changed: [testbed-node-0] 2025-03-23 20:43:38.966619 | orchestrator | 2025-03-23 20:43:38.966693 | orchestrator | TASK [osism.services.docker : Add repository gpg key] ************************** 2025-03-23 20:43:38.967013 | orchestrator | Sunday 23 March 2025 20:43:38 +0000 (0:00:07.648) 0:05:54.015 ********** 2025-03-23 20:43:40.174265 | orchestrator | changed: [testbed-node-4] 2025-03-23 20:43:40.174586 | orchestrator | changed: [testbed-node-3] 2025-03-23 20:43:40.175561 | orchestrator | changed: [testbed-node-5] 2025-03-23 20:43:40.176337 | orchestrator | ok: [testbed-manager] 2025-03-23 20:43:40.177033 | orchestrator | changed: [testbed-node-1] 2025-03-23 20:43:40.177998 | orchestrator | changed: [testbed-node-0] 2025-03-23 20:43:40.178708 | orchestrator | changed: [testbed-node-2] 2025-03-23 20:43:40.179068 | orchestrator | 2025-03-23 20:43:40.179876 | orchestrator | TASK [osism.services.docker : Add repository] ********************************** 2025-03-23 20:43:40.179981 | orchestrator | Sunday 23 March 2025 20:43:40 +0000 (0:00:01.208) 0:05:55.224 ********** 2025-03-23 20:43:49.156732 | orchestrator | ok: [testbed-manager] 2025-03-23 20:43:49.157374 | orchestrator | changed: [testbed-node-3] 2025-03-23 20:43:49.161557 | orchestrator | changed: [testbed-node-5] 2025-03-23 20:43:49.163977 | orchestrator | changed: [testbed-node-4] 2025-03-23 20:43:49.164011 | orchestrator | changed: [testbed-node-1] 2025-03-23 20:43:49.164024 | orchestrator | changed: [testbed-node-2] 2025-03-23 20:43:49.164037 | orchestrator | changed: [testbed-node-0] 2025-03-23 
20:43:49.164055 | orchestrator | 2025-03-23 20:43:49.164304 | orchestrator | TASK [osism.services.docker : Update package cache] **************************** 2025-03-23 20:43:49.164999 | orchestrator | Sunday 23 March 2025 20:43:49 +0000 (0:00:08.980) 0:06:04.204 ********** 2025-03-23 20:43:52.620919 | orchestrator | changed: [testbed-node-3] 2025-03-23 20:43:52.621798 | orchestrator | changed: [testbed-node-4] 2025-03-23 20:43:52.622799 | orchestrator | changed: [testbed-node-5] 2025-03-23 20:43:52.624133 | orchestrator | changed: [testbed-manager] 2025-03-23 20:43:52.625805 | orchestrator | changed: [testbed-node-1] 2025-03-23 20:43:52.626118 | orchestrator | changed: [testbed-node-2] 2025-03-23 20:43:52.626759 | orchestrator | changed: [testbed-node-0] 2025-03-23 20:43:52.627716 | orchestrator | 2025-03-23 20:43:52.628629 | orchestrator | TASK [osism.services.docker : Pin docker package version] ********************** 2025-03-23 20:43:52.629085 | orchestrator | Sunday 23 March 2025 20:43:52 +0000 (0:00:03.466) 0:06:07.671 ********** 2025-03-23 20:43:54.239379 | orchestrator | ok: [testbed-manager] 2025-03-23 20:43:54.243210 | orchestrator | changed: [testbed-node-3] 2025-03-23 20:43:54.243255 | orchestrator | changed: [testbed-node-4] 2025-03-23 20:43:54.244922 | orchestrator | changed: [testbed-node-5] 2025-03-23 20:43:54.244951 | orchestrator | changed: [testbed-node-0] 2025-03-23 20:43:54.244971 | orchestrator | changed: [testbed-node-1] 2025-03-23 20:43:54.246217 | orchestrator | changed: [testbed-node-2] 2025-03-23 20:43:54.247865 | orchestrator | 2025-03-23 20:43:54.248437 | orchestrator | TASK [osism.services.docker : Pin docker-cli package version] ****************** 2025-03-23 20:43:54.249515 | orchestrator | Sunday 23 March 2025 20:43:54 +0000 (0:00:01.617) 0:06:09.288 ********** 2025-03-23 20:43:55.741032 | orchestrator | ok: [testbed-manager] 2025-03-23 20:43:55.743175 | orchestrator | changed: [testbed-node-4] 2025-03-23 20:43:55.743216 | orchestrator | changed: [testbed-node-3] 2025-03-23 20:43:55.744854 | orchestrator | changed: [testbed-node-5] 2025-03-23 20:43:55.747526 | orchestrator | changed: [testbed-node-0] 2025-03-23 20:43:55.748394 | orchestrator | changed: [testbed-node-1] 2025-03-23 20:43:55.749918 | orchestrator | changed: [testbed-node-2] 2025-03-23 20:43:55.750620 | orchestrator | 2025-03-23 20:43:55.750828 | orchestrator | TASK [osism.services.docker : Unlock containerd package] *********************** 2025-03-23 20:43:55.752280 | orchestrator | Sunday 23 March 2025 20:43:55 +0000 (0:00:01.500) 0:06:10.789 ********** 2025-03-23 20:43:55.975522 | orchestrator | skipping: [testbed-node-3] 2025-03-23 20:43:56.064363 | orchestrator | skipping: [testbed-node-4] 2025-03-23 20:43:56.162196 | orchestrator | skipping: [testbed-node-5] 2025-03-23 20:43:56.236255 | orchestrator | skipping: [testbed-node-0] 2025-03-23 20:43:56.468403 | orchestrator | skipping: [testbed-node-1] 2025-03-23 20:43:56.469309 | orchestrator | skipping: [testbed-node-2] 2025-03-23 20:43:56.469953 | orchestrator | changed: [testbed-manager] 2025-03-23 20:43:56.470755 | orchestrator | 2025-03-23 20:43:56.472667 | orchestrator | TASK [osism.services.docker : Install containerd package] ********************** 2025-03-23 20:43:56.473825 | orchestrator | Sunday 23 March 2025 20:43:56 +0000 (0:00:00.728) 0:06:11.518 ********** 2025-03-23 20:44:07.542361 | orchestrator | ok: [testbed-manager] 2025-03-23 20:44:07.542605 | orchestrator | changed: [testbed-node-3] 2025-03-23 20:44:07.542636 | 
orchestrator | changed: [testbed-node-1] 2025-03-23 20:44:07.542651 | orchestrator | changed: [testbed-node-2] 2025-03-23 20:44:07.542672 | orchestrator | changed: [testbed-node-5] 2025-03-23 20:44:07.542773 | orchestrator | changed: [testbed-node-4] 2025-03-23 20:44:07.543164 | orchestrator | changed: [testbed-node-0] 2025-03-23 20:44:07.544119 | orchestrator | 2025-03-23 20:44:07.544817 | orchestrator | TASK [osism.services.docker : Lock containerd package] ************************* 2025-03-23 20:44:07.545922 | orchestrator | Sunday 23 March 2025 20:44:07 +0000 (0:00:11.065) 0:06:22.583 ********** 2025-03-23 20:44:08.574875 | orchestrator | changed: [testbed-manager] 2025-03-23 20:44:08.575044 | orchestrator | changed: [testbed-node-3] 2025-03-23 20:44:08.575071 | orchestrator | changed: [testbed-node-4] 2025-03-23 20:44:08.575883 | orchestrator | changed: [testbed-node-5] 2025-03-23 20:44:08.576408 | orchestrator | changed: [testbed-node-0] 2025-03-23 20:44:08.576714 | orchestrator | changed: [testbed-node-1] 2025-03-23 20:44:08.577760 | orchestrator | changed: [testbed-node-2] 2025-03-23 20:44:08.578130 | orchestrator | 2025-03-23 20:44:08.579345 | orchestrator | TASK [osism.services.docker : Install docker-cli package] ********************** 2025-03-23 20:44:08.580615 | orchestrator | Sunday 23 March 2025 20:44:08 +0000 (0:00:01.039) 0:06:23.623 ********** 2025-03-23 20:44:22.472907 | orchestrator | ok: [testbed-manager] 2025-03-23 20:44:22.473190 | orchestrator | changed: [testbed-node-4] 2025-03-23 20:44:22.474092 | orchestrator | changed: [testbed-node-5] 2025-03-23 20:44:22.474128 | orchestrator | changed: [testbed-node-3] 2025-03-23 20:44:22.474640 | orchestrator | changed: [testbed-node-2] 2025-03-23 20:44:22.474665 | orchestrator | changed: [testbed-node-1] 2025-03-23 20:44:22.474685 | orchestrator | changed: [testbed-node-0] 2025-03-23 20:44:22.475340 | orchestrator | 2025-03-23 20:44:22.475995 | orchestrator | TASK [osism.services.docker : Install docker package] ************************** 2025-03-23 20:44:22.476608 | orchestrator | Sunday 23 March 2025 20:44:22 +0000 (0:00:13.891) 0:06:37.514 ********** 2025-03-23 20:44:35.928723 | orchestrator | ok: [testbed-manager] 2025-03-23 20:44:35.928956 | orchestrator | changed: [testbed-node-1] 2025-03-23 20:44:35.928988 | orchestrator | changed: [testbed-node-3] 2025-03-23 20:44:35.929011 | orchestrator | changed: [testbed-node-5] 2025-03-23 20:44:35.931971 | orchestrator | changed: [testbed-node-4] 2025-03-23 20:44:35.932947 | orchestrator | changed: [testbed-node-2] 2025-03-23 20:44:35.933818 | orchestrator | changed: [testbed-node-0] 2025-03-23 20:44:35.936787 | orchestrator | 2025-03-23 20:44:35.938738 | orchestrator | TASK [osism.services.docker : Unblock installation of python docker packages] *** 2025-03-23 20:44:35.939154 | orchestrator | Sunday 23 March 2025 20:44:35 +0000 (0:00:13.462) 0:06:50.977 ********** 2025-03-23 20:44:36.383836 | orchestrator | ok: [testbed-manager] => (item=python3-docker) 2025-03-23 20:44:37.211583 | orchestrator | ok: [testbed-node-3] => (item=python3-docker) 2025-03-23 20:44:37.211953 | orchestrator | ok: [testbed-node-4] => (item=python3-docker) 2025-03-23 20:44:37.212637 | orchestrator | ok: [testbed-node-5] => (item=python3-docker) 2025-03-23 20:44:37.213758 | orchestrator | ok: [testbed-manager] => (item=python-docker) 2025-03-23 20:44:37.215907 | orchestrator | ok: [testbed-node-0] => (item=python3-docker) 2025-03-23 20:44:37.216241 | orchestrator | ok: [testbed-node-1] => 
(item=python3-docker) 2025-03-23 20:44:37.216484 | orchestrator | ok: [testbed-node-2] => (item=python3-docker) 2025-03-23 20:44:37.217615 | orchestrator | ok: [testbed-node-3] => (item=python-docker) 2025-03-23 20:44:37.218446 | orchestrator | ok: [testbed-node-4] => (item=python-docker) 2025-03-23 20:44:37.219052 | orchestrator | ok: [testbed-node-5] => (item=python-docker) 2025-03-23 20:44:37.220074 | orchestrator | ok: [testbed-node-1] => (item=python-docker) 2025-03-23 20:44:37.220700 | orchestrator | ok: [testbed-node-0] => (item=python-docker) 2025-03-23 20:44:37.221253 | orchestrator | ok: [testbed-node-2] => (item=python-docker) 2025-03-23 20:44:37.222186 | orchestrator | 2025-03-23 20:44:37.222673 | orchestrator | TASK [osism.services.docker : Install python3 docker package] ****************** 2025-03-23 20:44:37.223535 | orchestrator | Sunday 23 March 2025 20:44:37 +0000 (0:00:01.282) 0:06:52.260 ********** 2025-03-23 20:44:37.361272 | orchestrator | skipping: [testbed-manager] 2025-03-23 20:44:37.431363 | orchestrator | skipping: [testbed-node-3] 2025-03-23 20:44:37.524587 | orchestrator | skipping: [testbed-node-4] 2025-03-23 20:44:37.613964 | orchestrator | skipping: [testbed-node-5] 2025-03-23 20:44:37.689227 | orchestrator | skipping: [testbed-node-0] 2025-03-23 20:44:37.824727 | orchestrator | skipping: [testbed-node-1] 2025-03-23 20:44:37.825358 | orchestrator | skipping: [testbed-node-2] 2025-03-23 20:44:37.825391 | orchestrator | 2025-03-23 20:44:37.825725 | orchestrator | TASK [osism.services.docker : Install python3 docker package from Debian Sid] *** 2025-03-23 20:44:37.826130 | orchestrator | Sunday 23 March 2025 20:44:37 +0000 (0:00:00.615) 0:06:52.875 ********** 2025-03-23 20:44:42.060812 | orchestrator | ok: [testbed-manager] 2025-03-23 20:44:42.061007 | orchestrator | changed: [testbed-node-4] 2025-03-23 20:44:42.061525 | orchestrator | changed: [testbed-node-1] 2025-03-23 20:44:42.063123 | orchestrator | changed: [testbed-node-3] 2025-03-23 20:44:42.063866 | orchestrator | changed: [testbed-node-2] 2025-03-23 20:44:42.067791 | orchestrator | changed: [testbed-node-5] 2025-03-23 20:44:42.068442 | orchestrator | changed: [testbed-node-0] 2025-03-23 20:44:42.068856 | orchestrator | 2025-03-23 20:44:42.071050 | orchestrator | TASK [osism.services.docker : Remove python docker packages (install python bindings from pip)] *** 2025-03-23 20:44:42.071655 | orchestrator | Sunday 23 March 2025 20:44:42 +0000 (0:00:04.235) 0:06:57.110 ********** 2025-03-23 20:44:42.210336 | orchestrator | skipping: [testbed-manager] 2025-03-23 20:44:42.291432 | orchestrator | skipping: [testbed-node-3] 2025-03-23 20:44:42.591783 | orchestrator | skipping: [testbed-node-4] 2025-03-23 20:44:42.663758 | orchestrator | skipping: [testbed-node-5] 2025-03-23 20:44:42.729587 | orchestrator | skipping: [testbed-node-0] 2025-03-23 20:44:42.862960 | orchestrator | skipping: [testbed-node-1] 2025-03-23 20:44:42.863137 | orchestrator | skipping: [testbed-node-2] 2025-03-23 20:44:42.863633 | orchestrator | 2025-03-23 20:44:42.864179 | orchestrator | TASK [osism.services.docker : Block installation of python docker packages (install python bindings from pip)] *** 2025-03-23 20:44:42.864539 | orchestrator | Sunday 23 March 2025 20:44:42 +0000 (0:00:00.803) 0:06:57.914 ********** 2025-03-23 20:44:42.938332 | orchestrator | skipping: [testbed-manager] => (item=python3-docker)  2025-03-23 20:44:42.938794 | orchestrator | skipping: [testbed-manager] => (item=python-docker)  2025-03-23 20:44:43.017621 | 
orchestrator | skipping: [testbed-manager] 2025-03-23 20:44:43.019045 | orchestrator | skipping: [testbed-node-3] => (item=python3-docker)  2025-03-23 20:44:43.019637 | orchestrator | skipping: [testbed-node-3] => (item=python-docker)  2025-03-23 20:44:43.112390 | orchestrator | skipping: [testbed-node-3] 2025-03-23 20:44:43.112528 | orchestrator | skipping: [testbed-node-4] => (item=python3-docker)  2025-03-23 20:44:43.113461 | orchestrator | skipping: [testbed-node-4] => (item=python-docker)  2025-03-23 20:44:43.183997 | orchestrator | skipping: [testbed-node-4] 2025-03-23 20:44:43.184842 | orchestrator | skipping: [testbed-node-5] => (item=python3-docker)  2025-03-23 20:44:43.185841 | orchestrator | skipping: [testbed-node-5] => (item=python-docker)  2025-03-23 20:44:43.282667 | orchestrator | skipping: [testbed-node-5] 2025-03-23 20:44:43.283584 | orchestrator | skipping: [testbed-node-0] => (item=python3-docker)  2025-03-23 20:44:43.284120 | orchestrator | skipping: [testbed-node-0] => (item=python-docker)  2025-03-23 20:44:43.362329 | orchestrator | skipping: [testbed-node-0] 2025-03-23 20:44:43.363892 | orchestrator | skipping: [testbed-node-1] => (item=python3-docker)  2025-03-23 20:44:43.365143 | orchestrator | skipping: [testbed-node-1] => (item=python-docker)  2025-03-23 20:44:43.484941 | orchestrator | skipping: [testbed-node-1] 2025-03-23 20:44:43.485413 | orchestrator | skipping: [testbed-node-2] => (item=python3-docker)  2025-03-23 20:44:43.485448 | orchestrator | skipping: [testbed-node-2] => (item=python-docker)  2025-03-23 20:44:43.487859 | orchestrator | skipping: [testbed-node-2] 2025-03-23 20:44:43.490257 | orchestrator | 2025-03-23 20:44:43.490351 | orchestrator | TASK [osism.services.docker : Install python3-pip package (install python bindings from pip)] *** 2025-03-23 20:44:43.490892 | orchestrator | Sunday 23 March 2025 20:44:43 +0000 (0:00:00.619) 0:06:58.534 ********** 2025-03-23 20:44:43.682208 | orchestrator | skipping: [testbed-manager] 2025-03-23 20:44:43.778784 | orchestrator | skipping: [testbed-node-3] 2025-03-23 20:44:43.861085 | orchestrator | skipping: [testbed-node-4] 2025-03-23 20:44:43.940171 | orchestrator | skipping: [testbed-node-5] 2025-03-23 20:44:44.005737 | orchestrator | skipping: [testbed-node-0] 2025-03-23 20:44:44.100685 | orchestrator | skipping: [testbed-node-1] 2025-03-23 20:44:44.101400 | orchestrator | skipping: [testbed-node-2] 2025-03-23 20:44:44.101831 | orchestrator | 2025-03-23 20:44:44.101861 | orchestrator | TASK [osism.services.docker : Install docker packages (install python bindings from pip)] *** 2025-03-23 20:44:44.102524 | orchestrator | Sunday 23 March 2025 20:44:44 +0000 (0:00:00.617) 0:06:59.151 ********** 2025-03-23 20:44:44.264283 | orchestrator | skipping: [testbed-manager] 2025-03-23 20:44:44.343069 | orchestrator | skipping: [testbed-node-3] 2025-03-23 20:44:44.424207 | orchestrator | skipping: [testbed-node-4] 2025-03-23 20:44:44.499366 | orchestrator | skipping: [testbed-node-5] 2025-03-23 20:44:44.579350 | orchestrator | skipping: [testbed-node-0] 2025-03-23 20:44:44.709800 | orchestrator | skipping: [testbed-node-1] 2025-03-23 20:44:44.710120 | orchestrator | skipping: [testbed-node-2] 2025-03-23 20:44:44.710584 | orchestrator | 2025-03-23 20:44:44.711724 | orchestrator | TASK [osism.services.docker : Install packages required by docker login] ******* 2025-03-23 20:44:44.873677 | orchestrator | Sunday 23 March 2025 20:44:44 +0000 (0:00:00.610) 0:06:59.762 ********** 2025-03-23 20:44:44.873751 | orchestrator | 
skipping: [testbed-manager] 2025-03-23 20:44:44.949644 | orchestrator | skipping: [testbed-node-3] 2025-03-23 20:44:45.014931 | orchestrator | skipping: [testbed-node-4] 2025-03-23 20:44:45.102221 | orchestrator | skipping: [testbed-node-5] 2025-03-23 20:44:45.181755 | orchestrator | skipping: [testbed-node-0] 2025-03-23 20:44:45.303586 | orchestrator | skipping: [testbed-node-1] 2025-03-23 20:44:45.304429 | orchestrator | skipping: [testbed-node-2] 2025-03-23 20:44:45.304474 | orchestrator | 2025-03-23 20:44:45.305907 | orchestrator | TASK [osism.services.docker : Ensure that some packages are not installed] ***** 2025-03-23 20:44:45.306101 | orchestrator | Sunday 23 March 2025 20:44:45 +0000 (0:00:00.589) 0:07:00.351 ********** 2025-03-23 20:44:52.584056 | orchestrator | ok: [testbed-manager] 2025-03-23 20:44:52.585052 | orchestrator | changed: [testbed-node-3] 2025-03-23 20:44:52.586249 | orchestrator | changed: [testbed-node-4] 2025-03-23 20:44:52.586421 | orchestrator | changed: [testbed-node-5] 2025-03-23 20:44:52.586449 | orchestrator | changed: [testbed-node-1] 2025-03-23 20:44:52.587696 | orchestrator | changed: [testbed-node-2] 2025-03-23 20:44:52.589400 | orchestrator | changed: [testbed-node-0] 2025-03-23 20:44:52.590060 | orchestrator | 2025-03-23 20:44:52.590097 | orchestrator | TASK [osism.services.docker : Include config tasks] **************************** 2025-03-23 20:44:52.590849 | orchestrator | Sunday 23 March 2025 20:44:52 +0000 (0:00:07.281) 0:07:07.633 ********** 2025-03-23 20:44:53.504467 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/services/roles/docker/tasks/config.yml for testbed-manager, testbed-node-3, testbed-node-4, testbed-node-5, testbed-node-0, testbed-node-1, testbed-node-2 2025-03-23 20:44:53.505076 | orchestrator | 2025-03-23 20:44:53.506140 | orchestrator | TASK [osism.services.docker : Create plugins directory] ************************ 2025-03-23 20:44:53.507275 | orchestrator | Sunday 23 March 2025 20:44:53 +0000 (0:00:00.922) 0:07:08.556 ********** 2025-03-23 20:44:54.002581 | orchestrator | ok: [testbed-manager] 2025-03-23 20:44:54.468999 | orchestrator | changed: [testbed-node-3] 2025-03-23 20:44:54.469185 | orchestrator | changed: [testbed-node-4] 2025-03-23 20:44:54.469245 | orchestrator | changed: [testbed-node-5] 2025-03-23 20:44:54.469986 | orchestrator | changed: [testbed-node-0] 2025-03-23 20:44:54.471163 | orchestrator | changed: [testbed-node-1] 2025-03-23 20:44:54.471728 | orchestrator | changed: [testbed-node-2] 2025-03-23 20:44:54.472184 | orchestrator | 2025-03-23 20:44:54.472411 | orchestrator | TASK [osism.services.docker : Create systemd overlay directory] **************** 2025-03-23 20:44:54.473573 | orchestrator | Sunday 23 March 2025 20:44:54 +0000 (0:00:00.964) 0:07:09.521 ********** 2025-03-23 20:44:54.910307 | orchestrator | ok: [testbed-manager] 2025-03-23 20:44:55.579797 | orchestrator | changed: [testbed-node-3] 2025-03-23 20:44:55.579974 | orchestrator | changed: [testbed-node-4] 2025-03-23 20:44:55.580933 | orchestrator | changed: [testbed-node-5] 2025-03-23 20:44:55.581933 | orchestrator | changed: [testbed-node-0] 2025-03-23 20:44:55.582762 | orchestrator | changed: [testbed-node-1] 2025-03-23 20:44:55.586290 | orchestrator | changed: [testbed-node-2] 2025-03-23 20:44:57.043863 | orchestrator | 2025-03-23 20:44:57.043934 | orchestrator | TASK [osism.services.docker : Copy systemd overlay file] *********************** 2025-03-23 20:44:57.043951 | orchestrator | Sunday 23 March 
2025 20:44:55 +0000 (0:00:01.109) 0:07:10.631 ********** 2025-03-23 20:44:57.043975 | orchestrator | ok: [testbed-manager] 2025-03-23 20:44:57.048014 | orchestrator | changed: [testbed-node-3] 2025-03-23 20:44:57.048743 | orchestrator | changed: [testbed-node-4] 2025-03-23 20:44:57.048766 | orchestrator | changed: [testbed-node-5] 2025-03-23 20:44:57.048781 | orchestrator | changed: [testbed-node-0] 2025-03-23 20:44:57.048795 | orchestrator | changed: [testbed-node-1] 2025-03-23 20:44:57.048809 | orchestrator | changed: [testbed-node-2] 2025-03-23 20:44:57.048847 | orchestrator | 2025-03-23 20:44:57.050230 | orchestrator | TASK [osism.services.docker : Reload systemd daemon if systemd overlay file is changed] *** 2025-03-23 20:44:57.050611 | orchestrator | Sunday 23 March 2025 20:44:57 +0000 (0:00:01.459) 0:07:12.091 ********** 2025-03-23 20:44:57.193364 | orchestrator | skipping: [testbed-manager] 2025-03-23 20:44:58.517123 | orchestrator | ok: [testbed-node-3] 2025-03-23 20:44:58.518140 | orchestrator | ok: [testbed-node-4] 2025-03-23 20:44:58.519250 | orchestrator | ok: [testbed-node-5] 2025-03-23 20:44:58.520430 | orchestrator | ok: [testbed-node-0] 2025-03-23 20:44:58.521720 | orchestrator | ok: [testbed-node-1] 2025-03-23 20:44:58.522589 | orchestrator | ok: [testbed-node-2] 2025-03-23 20:44:58.523467 | orchestrator | 2025-03-23 20:44:58.524201 | orchestrator | TASK [osism.services.docker : Copy limits configuration file] ****************** 2025-03-23 20:44:58.525193 | orchestrator | Sunday 23 March 2025 20:44:58 +0000 (0:00:01.477) 0:07:13.568 ********** 2025-03-23 20:44:59.906826 | orchestrator | ok: [testbed-manager] 2025-03-23 20:44:59.907523 | orchestrator | changed: [testbed-node-3] 2025-03-23 20:44:59.907830 | orchestrator | changed: [testbed-node-4] 2025-03-23 20:44:59.908624 | orchestrator | changed: [testbed-node-5] 2025-03-23 20:44:59.910601 | orchestrator | changed: [testbed-node-1] 2025-03-23 20:44:59.911284 | orchestrator | changed: [testbed-node-0] 2025-03-23 20:44:59.911666 | orchestrator | changed: [testbed-node-2] 2025-03-23 20:44:59.912470 | orchestrator | 2025-03-23 20:44:59.913178 | orchestrator | TASK [osism.services.docker : Copy daemon.json configuration file] ************* 2025-03-23 20:44:59.914846 | orchestrator | Sunday 23 March 2025 20:44:59 +0000 (0:00:01.387) 0:07:14.955 ********** 2025-03-23 20:45:01.465730 | orchestrator | changed: [testbed-manager] 2025-03-23 20:45:01.467088 | orchestrator | changed: [testbed-node-3] 2025-03-23 20:45:01.467132 | orchestrator | changed: [testbed-node-4] 2025-03-23 20:45:01.468966 | orchestrator | changed: [testbed-node-5] 2025-03-23 20:45:01.470639 | orchestrator | changed: [testbed-node-1] 2025-03-23 20:45:01.471723 | orchestrator | changed: [testbed-node-2] 2025-03-23 20:45:01.472858 | orchestrator | changed: [testbed-node-0] 2025-03-23 20:45:01.473621 | orchestrator | 2025-03-23 20:45:01.474449 | orchestrator | TASK [osism.services.docker : Include service tasks] *************************** 2025-03-23 20:45:01.475016 | orchestrator | Sunday 23 March 2025 20:45:01 +0000 (0:00:01.558) 0:07:16.514 ********** 2025-03-23 20:45:02.638418 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/services/roles/docker/tasks/service.yml for testbed-manager, testbed-node-3, testbed-node-4, testbed-node-5, testbed-node-0, testbed-node-1, testbed-node-2 2025-03-23 20:45:02.639523 | orchestrator | 2025-03-23 20:45:02.643922 | orchestrator | TASK [osism.services.docker : Reload systemd daemon] 
*************************** 2025-03-23 20:45:04.356823 | orchestrator | Sunday 23 March 2025 20:45:02 +0000 (0:00:01.173) 0:07:17.687 ********** 2025-03-23 20:45:04.356965 | orchestrator | ok: [testbed-manager] 2025-03-23 20:45:04.358271 | orchestrator | ok: [testbed-node-3] 2025-03-23 20:45:04.358307 | orchestrator | ok: [testbed-node-4] 2025-03-23 20:45:04.358803 | orchestrator | ok: [testbed-node-5] 2025-03-23 20:45:04.360015 | orchestrator | ok: [testbed-node-1] 2025-03-23 20:45:04.360469 | orchestrator | ok: [testbed-node-0] 2025-03-23 20:45:04.361026 | orchestrator | ok: [testbed-node-2] 2025-03-23 20:45:04.361821 | orchestrator | 2025-03-23 20:45:04.363635 | orchestrator | TASK [osism.services.docker : Manage service] ********************************** 2025-03-23 20:45:04.364733 | orchestrator | Sunday 23 March 2025 20:45:04 +0000 (0:00:01.716) 0:07:19.404 ********** 2025-03-23 20:45:05.637651 | orchestrator | ok: [testbed-manager] 2025-03-23 20:45:05.637825 | orchestrator | ok: [testbed-node-4] 2025-03-23 20:45:05.637856 | orchestrator | ok: [testbed-node-5] 2025-03-23 20:45:05.638148 | orchestrator | ok: [testbed-node-3] 2025-03-23 20:45:05.638767 | orchestrator | ok: [testbed-node-1] 2025-03-23 20:45:05.638949 | orchestrator | ok: [testbed-node-2] 2025-03-23 20:45:05.639639 | orchestrator | ok: [testbed-node-0] 2025-03-23 20:45:05.639741 | orchestrator | 2025-03-23 20:45:05.640231 | orchestrator | TASK [osism.services.docker : Manage docker socket service] ******************** 2025-03-23 20:45:05.640378 | orchestrator | Sunday 23 March 2025 20:45:05 +0000 (0:00:01.283) 0:07:20.688 ********** 2025-03-23 20:45:06.867800 | orchestrator | ok: [testbed-manager] 2025-03-23 20:45:06.868174 | orchestrator | ok: [testbed-node-3] 2025-03-23 20:45:06.868532 | orchestrator | ok: [testbed-node-4] 2025-03-23 20:45:06.869655 | orchestrator | ok: [testbed-node-5] 2025-03-23 20:45:06.870463 | orchestrator | ok: [testbed-node-0] 2025-03-23 20:45:06.871084 | orchestrator | ok: [testbed-node-1] 2025-03-23 20:45:06.871111 | orchestrator | ok: [testbed-node-2] 2025-03-23 20:45:06.871553 | orchestrator | 2025-03-23 20:45:06.872670 | orchestrator | TASK [osism.services.docker : Manage containerd service] *********************** 2025-03-23 20:45:06.872818 | orchestrator | Sunday 23 March 2025 20:45:06 +0000 (0:00:01.228) 0:07:21.916 ********** 2025-03-23 20:45:08.329718 | orchestrator | ok: [testbed-manager] 2025-03-23 20:45:08.329951 | orchestrator | ok: [testbed-node-3] 2025-03-23 20:45:08.331961 | orchestrator | ok: [testbed-node-5] 2025-03-23 20:45:08.332025 | orchestrator | ok: [testbed-node-4] 2025-03-23 20:45:08.333636 | orchestrator | ok: [testbed-node-0] 2025-03-23 20:45:08.334300 | orchestrator | ok: [testbed-node-1] 2025-03-23 20:45:08.334881 | orchestrator | ok: [testbed-node-2] 2025-03-23 20:45:08.335964 | orchestrator | 2025-03-23 20:45:08.336635 | orchestrator | TASK [osism.services.docker : Include bootstrap tasks] ************************* 2025-03-23 20:45:08.337187 | orchestrator | Sunday 23 March 2025 20:45:08 +0000 (0:00:01.463) 0:07:23.380 ********** 2025-03-23 20:45:09.603030 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/services/roles/docker/tasks/bootstrap.yml for testbed-manager, testbed-node-3, testbed-node-4, testbed-node-5, testbed-node-0, testbed-node-1, testbed-node-2 2025-03-23 20:45:09.603213 | orchestrator | 2025-03-23 20:45:09.603800 | orchestrator | TASK [osism.services.docker : Flush handlers] ********************************** 
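For readers following the run, the osism.services.docker tasks above follow a common apt pattern on Debian-family hosts: pin the package version, install it, write /etc/docker/daemon.json and a systemd drop-in, then restart the service through a handler once the handlers are flushed. The snippet below is a minimal, hypothetical sketch of that pattern only; it is not the collection's actual code, and the variable names (docker_version, docker_daemon_config), the pin value, and the file modes are assumptions.

# Illustrative sketch only -- not the osism.services.docker role itself.
- hosts: docker_hosts
  become: true
  vars:
    docker_version: "5:27.*"          # assumed pin pattern, not taken from the log
    docker_daemon_config:
      log-driver: "json-file"
      log-opts:
        max-size: "10m"
  tasks:
    - name: Pin docker-ce to the configured version   # analogous to "Pin docker package version"
      ansible.builtin.copy:
        dest: /etc/apt/preferences.d/docker-ce
        content: |
          Package: docker-ce
          Pin: version {{ docker_version }}
          Pin-Priority: 1000
        mode: "0644"

    - name: Install docker-ce from the upstream repository   # analogous to "Install docker package"
      ansible.builtin.apt:
        name: docker-ce
        state: present
        update_cache: true

    - name: Write /etc/docker/daemon.json   # analogous to "Copy daemon.json configuration file"
      ansible.builtin.copy:
        dest: /etc/docker/daemon.json
        content: "{{ docker_daemon_config | to_nice_json }}"
        mode: "0644"
      notify: Restart docker service

  handlers:
    - name: Restart docker service   # fires after the "Flush handlers" steps, as in the log above
      ansible.builtin.service:
        name: docker
        state: restarted

The handler-based restart explains why the manager host reports "skipping" for the restart while the freshly configured nodes report "changed": only hosts whose daemon.json or systemd overlay actually changed notify the handler.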
2025-03-23 20:45:09.604897 | orchestrator | Sunday 23 March 2025 20:45:09 +0000 (0:00:00.963) 0:07:24.344 ********** 2025-03-23 20:45:09.606152 | orchestrator | 2025-03-23 20:45:09.607049 | orchestrator | TASK [osism.services.docker : Flush handlers] ********************************** 2025-03-23 20:45:09.607085 | orchestrator | Sunday 23 March 2025 20:45:09 +0000 (0:00:00.047) 0:07:24.391 ********** 2025-03-23 20:45:09.609010 | orchestrator | 2025-03-23 20:45:09.610143 | orchestrator | TASK [osism.services.docker : Flush handlers] ********************************** 2025-03-23 20:45:09.610960 | orchestrator | Sunday 23 March 2025 20:45:09 +0000 (0:00:00.041) 0:07:24.432 ********** 2025-03-23 20:45:09.611645 | orchestrator | 2025-03-23 20:45:09.612986 | orchestrator | TASK [osism.services.docker : Flush handlers] ********************************** 2025-03-23 20:45:09.614135 | orchestrator | Sunday 23 March 2025 20:45:09 +0000 (0:00:00.049) 0:07:24.482 ********** 2025-03-23 20:45:09.615243 | orchestrator | 2025-03-23 20:45:09.616469 | orchestrator | TASK [osism.services.docker : Flush handlers] ********************************** 2025-03-23 20:45:09.617222 | orchestrator | Sunday 23 March 2025 20:45:09 +0000 (0:00:00.040) 0:07:24.522 ********** 2025-03-23 20:45:09.618118 | orchestrator | 2025-03-23 20:45:09.618892 | orchestrator | TASK [osism.services.docker : Flush handlers] ********************************** 2025-03-23 20:45:09.619831 | orchestrator | Sunday 23 March 2025 20:45:09 +0000 (0:00:00.040) 0:07:24.562 ********** 2025-03-23 20:45:09.620782 | orchestrator | 2025-03-23 20:45:09.621747 | orchestrator | TASK [osism.services.docker : Flush handlers] ********************************** 2025-03-23 20:45:09.622148 | orchestrator | Sunday 23 March 2025 20:45:09 +0000 (0:00:00.047) 0:07:24.609 ********** 2025-03-23 20:45:09.622970 | orchestrator | 2025-03-23 20:45:09.623826 | orchestrator | RUNNING HANDLER [osism.commons.repository : Force update of package cache] ***** 2025-03-23 20:45:09.625578 | orchestrator | Sunday 23 March 2025 20:45:09 +0000 (0:00:00.041) 0:07:24.651 ********** 2025-03-23 20:45:10.859754 | orchestrator | ok: [testbed-node-1] 2025-03-23 20:45:10.859974 | orchestrator | ok: [testbed-node-2] 2025-03-23 20:45:10.860720 | orchestrator | ok: [testbed-node-0] 2025-03-23 20:45:10.860758 | orchestrator | 2025-03-23 20:45:10.861132 | orchestrator | RUNNING HANDLER [osism.services.rsyslog : Restart rsyslog service] ************* 2025-03-23 20:45:10.861711 | orchestrator | Sunday 23 March 2025 20:45:10 +0000 (0:00:01.256) 0:07:25.908 ********** 2025-03-23 20:45:12.723284 | orchestrator | changed: [testbed-manager] 2025-03-23 20:45:12.723703 | orchestrator | changed: [testbed-node-3] 2025-03-23 20:45:12.723745 | orchestrator | changed: [testbed-node-4] 2025-03-23 20:45:12.724395 | orchestrator | changed: [testbed-node-5] 2025-03-23 20:45:12.724666 | orchestrator | changed: [testbed-node-0] 2025-03-23 20:45:12.725758 | orchestrator | changed: [testbed-node-2] 2025-03-23 20:45:12.726127 | orchestrator | changed: [testbed-node-1] 2025-03-23 20:45:12.726156 | orchestrator | 2025-03-23 20:45:12.726178 | orchestrator | RUNNING HANDLER [osism.services.smartd : Restart smartd service] *************** 2025-03-23 20:45:12.729990 | orchestrator | Sunday 23 March 2025 20:45:12 +0000 (0:00:01.864) 0:07:27.773 ********** 2025-03-23 20:45:13.919927 | orchestrator | changed: [testbed-manager] 2025-03-23 20:45:13.923032 | orchestrator | changed: [testbed-node-3] 2025-03-23 20:45:13.923094 | 
orchestrator | changed: [testbed-node-4] 2025-03-23 20:45:13.924383 | orchestrator | changed: [testbed-node-5] 2025-03-23 20:45:13.925653 | orchestrator | changed: [testbed-node-0] 2025-03-23 20:45:13.926433 | orchestrator | changed: [testbed-node-1] 2025-03-23 20:45:13.927410 | orchestrator | changed: [testbed-node-2] 2025-03-23 20:45:13.929173 | orchestrator | 2025-03-23 20:45:13.929720 | orchestrator | RUNNING HANDLER [osism.services.docker : Restart docker service] *************** 2025-03-23 20:45:13.932463 | orchestrator | Sunday 23 March 2025 20:45:13 +0000 (0:00:01.188) 0:07:28.962 ********** 2025-03-23 20:45:14.095318 | orchestrator | skipping: [testbed-manager] 2025-03-23 20:45:16.161559 | orchestrator | changed: [testbed-node-3] 2025-03-23 20:45:16.162183 | orchestrator | changed: [testbed-node-5] 2025-03-23 20:45:16.163685 | orchestrator | changed: [testbed-node-4] 2025-03-23 20:45:16.167087 | orchestrator | changed: [testbed-node-1] 2025-03-23 20:45:16.167485 | orchestrator | changed: [testbed-node-2] 2025-03-23 20:45:16.167795 | orchestrator | changed: [testbed-node-0] 2025-03-23 20:45:16.168593 | orchestrator | 2025-03-23 20:45:16.168772 | orchestrator | RUNNING HANDLER [osism.services.docker : Wait after docker service restart] **** 2025-03-23 20:45:16.168819 | orchestrator | Sunday 23 March 2025 20:45:16 +0000 (0:00:02.247) 0:07:31.209 ********** 2025-03-23 20:45:16.304312 | orchestrator | skipping: [testbed-node-3] 2025-03-23 20:45:16.305328 | orchestrator | 2025-03-23 20:45:16.305900 | orchestrator | TASK [osism.services.docker : Add user to docker group] ************************ 2025-03-23 20:45:16.306993 | orchestrator | Sunday 23 March 2025 20:45:16 +0000 (0:00:00.142) 0:07:31.351 ********** 2025-03-23 20:45:17.408711 | orchestrator | ok: [testbed-manager] 2025-03-23 20:45:17.408908 | orchestrator | changed: [testbed-node-3] 2025-03-23 20:45:17.409250 | orchestrator | changed: [testbed-node-4] 2025-03-23 20:45:17.409999 | orchestrator | changed: [testbed-node-5] 2025-03-23 20:45:17.410744 | orchestrator | changed: [testbed-node-1] 2025-03-23 20:45:17.410824 | orchestrator | changed: [testbed-node-0] 2025-03-23 20:45:17.411312 | orchestrator | changed: [testbed-node-2] 2025-03-23 20:45:17.411670 | orchestrator | 2025-03-23 20:45:17.412014 | orchestrator | TASK [osism.services.docker : Log into private registry and force re-authorization] *** 2025-03-23 20:45:17.412924 | orchestrator | Sunday 23 March 2025 20:45:17 +0000 (0:00:01.104) 0:07:32.456 ********** 2025-03-23 20:45:17.569627 | orchestrator | skipping: [testbed-manager] 2025-03-23 20:45:17.642702 | orchestrator | skipping: [testbed-node-3] 2025-03-23 20:45:17.717684 | orchestrator | skipping: [testbed-node-4] 2025-03-23 20:45:18.020471 | orchestrator | skipping: [testbed-node-5] 2025-03-23 20:45:18.127686 | orchestrator | skipping: [testbed-node-0] 2025-03-23 20:45:18.256692 | orchestrator | skipping: [testbed-node-1] 2025-03-23 20:45:18.256824 | orchestrator | skipping: [testbed-node-2] 2025-03-23 20:45:18.258319 | orchestrator | 2025-03-23 20:45:18.258688 | orchestrator | TASK [osism.services.docker : Include facts tasks] ***************************** 2025-03-23 20:45:18.259543 | orchestrator | Sunday 23 March 2025 20:45:18 +0000 (0:00:00.850) 0:07:33.306 ********** 2025-03-23 20:45:19.228415 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/services/roles/docker/tasks/facts.yml for testbed-manager, testbed-node-3, testbed-node-4, testbed-node-5, testbed-node-0, testbed-node-1, 
testbed-node-2 2025-03-23 20:45:19.229562 | orchestrator | 2025-03-23 20:45:19.229688 | orchestrator | TASK [osism.services.docker : Create facts directory] ************************** 2025-03-23 20:45:19.230544 | orchestrator | Sunday 23 March 2025 20:45:19 +0000 (0:00:00.973) 0:07:34.280 ********** 2025-03-23 20:45:19.676373 | orchestrator | ok: [testbed-manager] 2025-03-23 20:45:20.145378 | orchestrator | ok: [testbed-node-3] 2025-03-23 20:45:20.146133 | orchestrator | ok: [testbed-node-4] 2025-03-23 20:45:20.147578 | orchestrator | ok: [testbed-node-5] 2025-03-23 20:45:20.147932 | orchestrator | ok: [testbed-node-0] 2025-03-23 20:45:20.149273 | orchestrator | ok: [testbed-node-1] 2025-03-23 20:45:20.149777 | orchestrator | ok: [testbed-node-2] 2025-03-23 20:45:20.150471 | orchestrator | 2025-03-23 20:45:20.151571 | orchestrator | TASK [osism.services.docker : Copy docker fact files] ************************** 2025-03-23 20:45:22.958088 | orchestrator | Sunday 23 March 2025 20:45:20 +0000 (0:00:00.914) 0:07:35.195 ********** 2025-03-23 20:45:22.958220 | orchestrator | ok: [testbed-manager] => (item=docker_containers) 2025-03-23 20:45:22.958763 | orchestrator | changed: [testbed-node-3] => (item=docker_containers) 2025-03-23 20:45:22.959193 | orchestrator | changed: [testbed-node-0] => (item=docker_containers) 2025-03-23 20:45:22.961245 | orchestrator | changed: [testbed-node-4] => (item=docker_containers) 2025-03-23 20:45:22.963655 | orchestrator | changed: [testbed-node-5] => (item=docker_containers) 2025-03-23 20:45:22.964572 | orchestrator | changed: [testbed-node-1] => (item=docker_containers) 2025-03-23 20:45:22.965394 | orchestrator | changed: [testbed-node-2] => (item=docker_containers) 2025-03-23 20:45:22.965957 | orchestrator | ok: [testbed-manager] => (item=docker_images) 2025-03-23 20:45:22.966688 | orchestrator | changed: [testbed-node-3] => (item=docker_images) 2025-03-23 20:45:22.967774 | orchestrator | changed: [testbed-node-4] => (item=docker_images) 2025-03-23 20:45:22.968461 | orchestrator | changed: [testbed-node-5] => (item=docker_images) 2025-03-23 20:45:22.968992 | orchestrator | changed: [testbed-node-1] => (item=docker_images) 2025-03-23 20:45:22.969412 | orchestrator | changed: [testbed-node-0] => (item=docker_images) 2025-03-23 20:45:22.970709 | orchestrator | changed: [testbed-node-2] => (item=docker_images) 2025-03-23 20:45:22.972327 | orchestrator | 2025-03-23 20:45:22.973507 | orchestrator | TASK [osism.commons.docker_compose : This install type is not supported] ******* 2025-03-23 20:45:22.974001 | orchestrator | Sunday 23 March 2025 20:45:22 +0000 (0:00:02.812) 0:07:38.007 ********** 2025-03-23 20:45:23.098249 | orchestrator | skipping: [testbed-manager] 2025-03-23 20:45:23.163403 | orchestrator | skipping: [testbed-node-3] 2025-03-23 20:45:23.227893 | orchestrator | skipping: [testbed-node-4] 2025-03-23 20:45:23.313734 | orchestrator | skipping: [testbed-node-5] 2025-03-23 20:45:23.378916 | orchestrator | skipping: [testbed-node-0] 2025-03-23 20:45:23.532062 | orchestrator | skipping: [testbed-node-1] 2025-03-23 20:45:23.534355 | orchestrator | skipping: [testbed-node-2] 2025-03-23 20:45:23.534387 | orchestrator | 2025-03-23 20:45:23.534708 | orchestrator | TASK [osism.commons.docker_compose : Include distribution specific install tasks] *** 2025-03-23 20:45:23.534738 | orchestrator | Sunday 23 March 2025 20:45:23 +0000 (0:00:00.574) 0:07:38.582 ********** 2025-03-23 20:45:24.490621 | orchestrator | included: 
/usr/share/ansible/collections/ansible_collections/osism/commons/roles/docker_compose/tasks/install-Debian-family.yml for testbed-manager, testbed-node-3, testbed-node-4, testbed-node-5, testbed-node-0, testbed-node-1, testbed-node-2 2025-03-23 20:45:24.491005 | orchestrator | 2025-03-23 20:45:24.492233 | orchestrator | TASK [osism.commons.docker_compose : Remove docker-compose apt preferences file] *** 2025-03-23 20:45:24.493674 | orchestrator | Sunday 23 March 2025 20:45:24 +0000 (0:00:00.956) 0:07:39.538 ********** 2025-03-23 20:45:24.975386 | orchestrator | ok: [testbed-manager] 2025-03-23 20:45:25.433350 | orchestrator | ok: [testbed-node-3] 2025-03-23 20:45:25.433570 | orchestrator | ok: [testbed-node-4] 2025-03-23 20:45:25.434689 | orchestrator | ok: [testbed-node-5] 2025-03-23 20:45:25.435768 | orchestrator | ok: [testbed-node-0] 2025-03-23 20:45:25.436115 | orchestrator | ok: [testbed-node-1] 2025-03-23 20:45:25.437631 | orchestrator | ok: [testbed-node-2] 2025-03-23 20:45:25.439070 | orchestrator | 2025-03-23 20:45:25.439932 | orchestrator | TASK [osism.commons.docker_compose : Get checksum of docker-compose file] ****** 2025-03-23 20:45:25.441402 | orchestrator | Sunday 23 March 2025 20:45:25 +0000 (0:00:00.942) 0:07:40.480 ********** 2025-03-23 20:45:25.981696 | orchestrator | ok: [testbed-manager] 2025-03-23 20:45:26.194305 | orchestrator | ok: [testbed-node-3] 2025-03-23 20:45:26.705009 | orchestrator | ok: [testbed-node-4] 2025-03-23 20:45:26.705978 | orchestrator | ok: [testbed-node-5] 2025-03-23 20:45:26.706069 | orchestrator | ok: [testbed-node-0] 2025-03-23 20:45:26.707478 | orchestrator | ok: [testbed-node-1] 2025-03-23 20:45:26.708467 | orchestrator | ok: [testbed-node-2] 2025-03-23 20:45:26.709251 | orchestrator | 2025-03-23 20:45:26.709609 | orchestrator | TASK [osism.commons.docker_compose : Remove docker-compose binary] ************* 2025-03-23 20:45:26.709996 | orchestrator | Sunday 23 March 2025 20:45:26 +0000 (0:00:01.274) 0:07:41.755 ********** 2025-03-23 20:45:26.858166 | orchestrator | skipping: [testbed-manager] 2025-03-23 20:45:26.934411 | orchestrator | skipping: [testbed-node-3] 2025-03-23 20:45:27.019091 | orchestrator | skipping: [testbed-node-4] 2025-03-23 20:45:27.087508 | orchestrator | skipping: [testbed-node-5] 2025-03-23 20:45:27.162448 | orchestrator | skipping: [testbed-node-0] 2025-03-23 20:45:27.275773 | orchestrator | skipping: [testbed-node-1] 2025-03-23 20:45:27.276371 | orchestrator | skipping: [testbed-node-2] 2025-03-23 20:45:27.277277 | orchestrator | 2025-03-23 20:45:27.278203 | orchestrator | TASK [osism.commons.docker_compose : Uninstall docker-compose package] ********* 2025-03-23 20:45:27.279635 | orchestrator | Sunday 23 March 2025 20:45:27 +0000 (0:00:00.568) 0:07:42.324 ********** 2025-03-23 20:45:28.889114 | orchestrator | ok: [testbed-manager] 2025-03-23 20:45:28.890243 | orchestrator | ok: [testbed-node-3] 2025-03-23 20:45:28.891680 | orchestrator | ok: [testbed-node-4] 2025-03-23 20:45:28.892292 | orchestrator | ok: [testbed-node-5] 2025-03-23 20:45:28.893483 | orchestrator | ok: [testbed-node-1] 2025-03-23 20:45:28.894200 | orchestrator | ok: [testbed-node-2] 2025-03-23 20:45:28.895065 | orchestrator | ok: [testbed-node-0] 2025-03-23 20:45:28.896577 | orchestrator | 2025-03-23 20:45:28.897192 | orchestrator | TASK [osism.commons.docker_compose : Copy docker-compose script] *************** 2025-03-23 20:45:28.897440 | orchestrator | Sunday 23 March 2025 20:45:28 +0000 (0:00:01.613) 0:07:43.937 ********** 2025-03-23 
20:45:29.035447 | orchestrator | skipping: [testbed-manager] 2025-03-23 20:45:29.113999 | orchestrator | skipping: [testbed-node-3] 2025-03-23 20:45:29.192000 | orchestrator | skipping: [testbed-node-4] 2025-03-23 20:45:29.260798 | orchestrator | skipping: [testbed-node-5] 2025-03-23 20:45:29.333702 | orchestrator | skipping: [testbed-node-0] 2025-03-23 20:45:29.439021 | orchestrator | skipping: [testbed-node-1] 2025-03-23 20:45:29.443483 | orchestrator | skipping: [testbed-node-2] 2025-03-23 20:45:29.443847 | orchestrator | 2025-03-23 20:45:29.445338 | orchestrator | TASK [osism.commons.docker_compose : Install docker-compose-plugin package] **** 2025-03-23 20:45:29.445438 | orchestrator | Sunday 23 March 2025 20:45:29 +0000 (0:00:00.554) 0:07:44.491 ********** 2025-03-23 20:45:31.732978 | orchestrator | ok: [testbed-manager] 2025-03-23 20:45:31.733815 | orchestrator | ok: [testbed-node-3] 2025-03-23 20:45:31.734450 | orchestrator | ok: [testbed-node-4] 2025-03-23 20:45:31.735557 | orchestrator | ok: [testbed-node-5] 2025-03-23 20:45:31.739485 | orchestrator | ok: [testbed-node-0] 2025-03-23 20:45:31.739764 | orchestrator | ok: [testbed-node-1] 2025-03-23 20:45:31.740608 | orchestrator | ok: [testbed-node-2] 2025-03-23 20:45:31.741612 | orchestrator | 2025-03-23 20:45:31.741859 | orchestrator | TASK [osism.commons.docker_compose : Copy osism.target systemd file] *********** 2025-03-23 20:45:31.741964 | orchestrator | Sunday 23 March 2025 20:45:31 +0000 (0:00:02.291) 0:07:46.783 ********** 2025-03-23 20:45:33.256578 | orchestrator | ok: [testbed-manager] 2025-03-23 20:45:33.257124 | orchestrator | changed: [testbed-node-3] 2025-03-23 20:45:33.258729 | orchestrator | changed: [testbed-node-4] 2025-03-23 20:45:33.259089 | orchestrator | changed: [testbed-node-5] 2025-03-23 20:45:33.260276 | orchestrator | changed: [testbed-node-1] 2025-03-23 20:45:33.261211 | orchestrator | changed: [testbed-node-0] 2025-03-23 20:45:33.261657 | orchestrator | changed: [testbed-node-2] 2025-03-23 20:45:33.262481 | orchestrator | 2025-03-23 20:45:33.263385 | orchestrator | TASK [osism.commons.docker_compose : Enable osism.target] ********************** 2025-03-23 20:45:33.264193 | orchestrator | Sunday 23 March 2025 20:45:33 +0000 (0:00:01.523) 0:07:48.306 ********** 2025-03-23 20:45:35.148262 | orchestrator | ok: [testbed-manager] 2025-03-23 20:45:35.148484 | orchestrator | changed: [testbed-node-3] 2025-03-23 20:45:35.149315 | orchestrator | changed: [testbed-node-4] 2025-03-23 20:45:35.149673 | orchestrator | changed: [testbed-node-5] 2025-03-23 20:45:35.151449 | orchestrator | changed: [testbed-node-0] 2025-03-23 20:45:35.151860 | orchestrator | changed: [testbed-node-1] 2025-03-23 20:45:35.152434 | orchestrator | changed: [testbed-node-2] 2025-03-23 20:45:35.152702 | orchestrator | 2025-03-23 20:45:35.153703 | orchestrator | TASK [osism.commons.docker_compose : Copy docker-compose systemd unit file] **** 2025-03-23 20:45:35.153896 | orchestrator | Sunday 23 March 2025 20:45:35 +0000 (0:00:01.892) 0:07:50.198 ********** 2025-03-23 20:45:36.925639 | orchestrator | ok: [testbed-manager] 2025-03-23 20:45:36.925987 | orchestrator | changed: [testbed-node-3] 2025-03-23 20:45:36.926422 | orchestrator | changed: [testbed-node-4] 2025-03-23 20:45:36.926857 | orchestrator | changed: [testbed-node-5] 2025-03-23 20:45:36.927780 | orchestrator | changed: [testbed-node-0] 2025-03-23 20:45:36.928186 | orchestrator | changed: [testbed-node-1] 2025-03-23 20:45:36.929003 | orchestrator | changed: [testbed-node-2] 
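For reference, the docker_compose tasks above ensure Compose is provided by the docker-compose-plugin package rather than the legacy docker-compose binary/package, and they install an osism.target plus a docker-compose systemd unit so Compose-managed services can be grouped under that target. A manual spot check on one of the nodes might look like the following sketch (illustrative only and not part of the job output; only osism.target is named explicitly in the log, other unit names are not shown):

  # Compose v2 plugin installed by the role
  docker compose version
  # deploy user added to the docker group by the earlier task
  getent group docker
  # osism.target copied and enabled by the role
  systemctl is-enabled osism.target
  systemctl list-dependencies osism.target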
2025-03-23 20:45:36.929669 | orchestrator | 2025-03-23 20:45:36.930071 | orchestrator | TASK [osism.commons.facts : Create custom facts directory] ********************* 2025-03-23 20:45:36.930554 | orchestrator | Sunday 23 March 2025 20:45:36 +0000 (0:00:01.775) 0:07:51.974 ********** 2025-03-23 20:45:37.706912 | orchestrator | ok: [testbed-manager] 2025-03-23 20:45:37.825158 | orchestrator | ok: [testbed-node-3] 2025-03-23 20:45:38.272797 | orchestrator | ok: [testbed-node-4] 2025-03-23 20:45:38.273670 | orchestrator | ok: [testbed-node-5] 2025-03-23 20:45:38.277366 | orchestrator | ok: [testbed-node-0] 2025-03-23 20:45:38.430211 | orchestrator | ok: [testbed-node-1] 2025-03-23 20:45:38.430299 | orchestrator | ok: [testbed-node-2] 2025-03-23 20:45:38.430315 | orchestrator | 2025-03-23 20:45:38.430332 | orchestrator | TASK [osism.commons.facts : Copy fact files] *********************************** 2025-03-23 20:45:38.430348 | orchestrator | Sunday 23 March 2025 20:45:38 +0000 (0:00:01.348) 0:07:53.322 ********** 2025-03-23 20:45:38.430375 | orchestrator | skipping: [testbed-manager] 2025-03-23 20:45:38.504960 | orchestrator | skipping: [testbed-node-3] 2025-03-23 20:45:38.572122 | orchestrator | skipping: [testbed-node-4] 2025-03-23 20:45:38.648450 | orchestrator | skipping: [testbed-node-5] 2025-03-23 20:45:38.725934 | orchestrator | skipping: [testbed-node-0] 2025-03-23 20:45:39.150712 | orchestrator | skipping: [testbed-node-1] 2025-03-23 20:45:39.150839 | orchestrator | skipping: [testbed-node-2] 2025-03-23 20:45:39.151617 | orchestrator | 2025-03-23 20:45:39.151928 | orchestrator | TASK [osism.services.chrony : Check minimum and maximum number of servers] ***** 2025-03-23 20:45:39.152429 | orchestrator | Sunday 23 March 2025 20:45:39 +0000 (0:00:00.879) 0:07:54.201 ********** 2025-03-23 20:45:39.305498 | orchestrator | skipping: [testbed-manager] 2025-03-23 20:45:39.384785 | orchestrator | skipping: [testbed-node-3] 2025-03-23 20:45:39.461836 | orchestrator | skipping: [testbed-node-4] 2025-03-23 20:45:39.531713 | orchestrator | skipping: [testbed-node-5] 2025-03-23 20:45:39.616670 | orchestrator | skipping: [testbed-node-0] 2025-03-23 20:45:39.737648 | orchestrator | skipping: [testbed-node-1] 2025-03-23 20:45:39.738649 | orchestrator | skipping: [testbed-node-2] 2025-03-23 20:45:39.739381 | orchestrator | 2025-03-23 20:45:39.742814 | orchestrator | TASK [osism.services.chrony : Gather variables for each operating system] ****** 2025-03-23 20:45:39.743953 | orchestrator | Sunday 23 March 2025 20:45:39 +0000 (0:00:00.586) 0:07:54.788 ********** 2025-03-23 20:45:39.899500 | orchestrator | ok: [testbed-manager] 2025-03-23 20:45:39.981756 | orchestrator | ok: [testbed-node-3] 2025-03-23 20:45:40.047600 | orchestrator | ok: [testbed-node-4] 2025-03-23 20:45:40.110767 | orchestrator | ok: [testbed-node-5] 2025-03-23 20:45:40.187052 | orchestrator | ok: [testbed-node-0] 2025-03-23 20:45:40.302855 | orchestrator | ok: [testbed-node-1] 2025-03-23 20:45:40.306446 | orchestrator | ok: [testbed-node-2] 2025-03-23 20:45:40.453759 | orchestrator | 2025-03-23 20:45:40.453820 | orchestrator | TASK [osism.services.chrony : Set chrony_conf_file variable to default value] *** 2025-03-23 20:45:40.453836 | orchestrator | Sunday 23 March 2025 20:45:40 +0000 (0:00:00.562) 0:07:55.351 ********** 2025-03-23 20:45:40.453860 | orchestrator | ok: [testbed-manager] 2025-03-23 20:45:40.749708 | orchestrator | ok: [testbed-node-3] 2025-03-23 20:45:40.814511 | orchestrator | ok: [testbed-node-4] 2025-03-23 
20:45:40.937462 | orchestrator | ok: [testbed-node-5] 2025-03-23 20:45:41.021992 | orchestrator | ok: [testbed-node-0] 2025-03-23 20:45:41.143597 | orchestrator | ok: [testbed-node-1] 2025-03-23 20:45:41.144551 | orchestrator | ok: [testbed-node-2] 2025-03-23 20:45:41.145806 | orchestrator | 2025-03-23 20:45:41.146131 | orchestrator | TASK [osism.services.chrony : Set chrony_key_file variable to default value] *** 2025-03-23 20:45:41.147028 | orchestrator | Sunday 23 March 2025 20:45:41 +0000 (0:00:00.843) 0:07:56.194 ********** 2025-03-23 20:45:41.292451 | orchestrator | ok: [testbed-manager] 2025-03-23 20:45:41.367320 | orchestrator | ok: [testbed-node-3] 2025-03-23 20:45:41.442594 | orchestrator | ok: [testbed-node-4] 2025-03-23 20:45:41.525358 | orchestrator | ok: [testbed-node-5] 2025-03-23 20:45:41.604816 | orchestrator | ok: [testbed-node-0] 2025-03-23 20:45:41.737263 | orchestrator | ok: [testbed-node-1] 2025-03-23 20:45:41.737699 | orchestrator | ok: [testbed-node-2] 2025-03-23 20:45:41.738645 | orchestrator | 2025-03-23 20:45:41.744806 | orchestrator | TASK [osism.services.chrony : Populate service facts] ************************** 2025-03-23 20:45:46.833794 | orchestrator | Sunday 23 March 2025 20:45:41 +0000 (0:00:00.592) 0:07:56.787 ********** 2025-03-23 20:45:46.833948 | orchestrator | ok: [testbed-node-3] 2025-03-23 20:45:46.834154 | orchestrator | ok: [testbed-node-4] 2025-03-23 20:45:46.834181 | orchestrator | ok: [testbed-manager] 2025-03-23 20:45:46.834202 | orchestrator | ok: [testbed-node-5] 2025-03-23 20:45:46.835537 | orchestrator | ok: [testbed-node-1] 2025-03-23 20:45:46.838792 | orchestrator | ok: [testbed-node-0] 2025-03-23 20:45:47.003604 | orchestrator | ok: [testbed-node-2] 2025-03-23 20:45:47.003705 | orchestrator | 2025-03-23 20:45:47.003722 | orchestrator | TASK [osism.services.chrony : Manage timesyncd service] ************************ 2025-03-23 20:45:47.003738 | orchestrator | Sunday 23 March 2025 20:45:46 +0000 (0:00:05.097) 0:08:01.884 ********** 2025-03-23 20:45:47.003767 | orchestrator | skipping: [testbed-manager] 2025-03-23 20:45:47.078410 | orchestrator | skipping: [testbed-node-3] 2025-03-23 20:45:47.155514 | orchestrator | skipping: [testbed-node-4] 2025-03-23 20:45:47.242550 | orchestrator | skipping: [testbed-node-5] 2025-03-23 20:45:47.330595 | orchestrator | skipping: [testbed-node-0] 2025-03-23 20:45:47.463911 | orchestrator | skipping: [testbed-node-1] 2025-03-23 20:45:47.464310 | orchestrator | skipping: [testbed-node-2] 2025-03-23 20:45:47.465128 | orchestrator | 2025-03-23 20:45:47.466284 | orchestrator | TASK [osism.services.chrony : Include distribution specific install tasks] ***** 2025-03-23 20:45:47.466453 | orchestrator | Sunday 23 March 2025 20:45:47 +0000 (0:00:00.630) 0:08:02.514 ********** 2025-03-23 20:45:48.663793 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/services/roles/chrony/tasks/install-Debian-family.yml for testbed-manager, testbed-node-3, testbed-node-4, testbed-node-5, testbed-node-0, testbed-node-1, testbed-node-2 2025-03-23 20:45:48.663961 | orchestrator | 2025-03-23 20:45:48.665338 | orchestrator | TASK [osism.services.chrony : Install package] ********************************* 2025-03-23 20:45:48.665410 | orchestrator | Sunday 23 March 2025 20:45:48 +0000 (0:00:01.197) 0:08:03.712 ********** 2025-03-23 20:45:50.674923 | orchestrator | ok: [testbed-manager] 2025-03-23 20:45:50.675089 | orchestrator | ok: [testbed-node-3] 2025-03-23 20:45:50.677377 | orchestrator | ok: 
[testbed-node-4] 2025-03-23 20:45:50.678205 | orchestrator | ok: [testbed-node-1] 2025-03-23 20:45:50.678238 | orchestrator | ok: [testbed-node-2] 2025-03-23 20:45:50.678261 | orchestrator | ok: [testbed-node-5] 2025-03-23 20:45:50.679768 | orchestrator | ok: [testbed-node-0] 2025-03-23 20:45:50.680392 | orchestrator | 2025-03-23 20:45:50.680913 | orchestrator | TASK [osism.services.chrony : Manage chrony service] *************************** 2025-03-23 20:45:50.681705 | orchestrator | Sunday 23 March 2025 20:45:50 +0000 (0:00:02.010) 0:08:05.723 ********** 2025-03-23 20:45:51.993092 | orchestrator | ok: [testbed-manager] 2025-03-23 20:45:51.993427 | orchestrator | ok: [testbed-node-3] 2025-03-23 20:45:51.993467 | orchestrator | ok: [testbed-node-4] 2025-03-23 20:45:51.994277 | orchestrator | ok: [testbed-node-1] 2025-03-23 20:45:51.994391 | orchestrator | ok: [testbed-node-0] 2025-03-23 20:45:51.995374 | orchestrator | ok: [testbed-node-5] 2025-03-23 20:45:51.995717 | orchestrator | ok: [testbed-node-2] 2025-03-23 20:45:51.998544 | orchestrator | 2025-03-23 20:45:52.000822 | orchestrator | TASK [osism.services.chrony : Check if configuration file exists] ************** 2025-03-23 20:45:52.000857 | orchestrator | Sunday 23 March 2025 20:45:51 +0000 (0:00:01.315) 0:08:07.038 ********** 2025-03-23 20:45:52.477226 | orchestrator | ok: [testbed-manager] 2025-03-23 20:45:52.910587 | orchestrator | ok: [testbed-node-3] 2025-03-23 20:45:52.911701 | orchestrator | ok: [testbed-node-4] 2025-03-23 20:45:52.912572 | orchestrator | ok: [testbed-node-5] 2025-03-23 20:45:52.912606 | orchestrator | ok: [testbed-node-1] 2025-03-23 20:45:52.916437 | orchestrator | ok: [testbed-node-0] 2025-03-23 20:45:52.916859 | orchestrator | ok: [testbed-node-2] 2025-03-23 20:45:52.918099 | orchestrator | 2025-03-23 20:45:52.919081 | orchestrator | TASK [osism.services.chrony : Copy configuration file] ************************* 2025-03-23 20:45:52.919608 | orchestrator | Sunday 23 March 2025 20:45:52 +0000 (0:00:00.919) 0:08:07.958 ********** 2025-03-23 20:45:54.967063 | orchestrator | changed: [testbed-manager] => (item=/usr/share/ansible/collections/ansible_collections/osism/services/roles/chrony/templates/chrony.conf.j2) 2025-03-23 20:45:54.969176 | orchestrator | changed: [testbed-node-3] => (item=/usr/share/ansible/collections/ansible_collections/osism/services/roles/chrony/templates/chrony.conf.j2) 2025-03-23 20:45:54.969221 | orchestrator | changed: [testbed-node-4] => (item=/usr/share/ansible/collections/ansible_collections/osism/services/roles/chrony/templates/chrony.conf.j2) 2025-03-23 20:45:54.970776 | orchestrator | changed: [testbed-node-5] => (item=/usr/share/ansible/collections/ansible_collections/osism/services/roles/chrony/templates/chrony.conf.j2) 2025-03-23 20:45:54.976274 | orchestrator | changed: [testbed-node-0] => (item=/usr/share/ansible/collections/ansible_collections/osism/services/roles/chrony/templates/chrony.conf.j2) 2025-03-23 20:45:54.976652 | orchestrator | changed: [testbed-node-1] => (item=/usr/share/ansible/collections/ansible_collections/osism/services/roles/chrony/templates/chrony.conf.j2) 2025-03-23 20:45:54.976679 | orchestrator | changed: [testbed-node-2] => (item=/usr/share/ansible/collections/ansible_collections/osism/services/roles/chrony/templates/chrony.conf.j2) 2025-03-23 20:45:54.976699 | orchestrator | 2025-03-23 20:45:54.977818 | orchestrator | TASK [osism.services.lldpd : Include distribution specific install tasks] ****** 2025-03-23 20:45:54.978150 | orchestrator | 
Sunday 23 March 2025 20:45:54 +0000 (0:00:02.058) 0:08:10.016 ********** 2025-03-23 20:45:55.854610 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/services/roles/lldpd/tasks/install-Debian-family.yml for testbed-manager, testbed-node-3, testbed-node-4, testbed-node-5, testbed-node-0, testbed-node-1, testbed-node-2 2025-03-23 20:45:55.859071 | orchestrator | 2025-03-23 20:46:05.436629 | orchestrator | TASK [osism.services.lldpd : Install lldpd package] **************************** 2025-03-23 20:46:05.436756 | orchestrator | Sunday 23 March 2025 20:45:55 +0000 (0:00:00.886) 0:08:10.903 ********** 2025-03-23 20:46:05.436800 | orchestrator | changed: [testbed-node-2] 2025-03-23 20:46:05.437125 | orchestrator | changed: [testbed-node-3] 2025-03-23 20:46:05.440748 | orchestrator | changed: [testbed-node-1] 2025-03-23 20:46:05.441436 | orchestrator | changed: [testbed-node-4] 2025-03-23 20:46:05.441472 | orchestrator | changed: [testbed-manager] 2025-03-23 20:46:05.442442 | orchestrator | changed: [testbed-node-5] 2025-03-23 20:46:05.443013 | orchestrator | changed: [testbed-node-0] 2025-03-23 20:46:05.443433 | orchestrator | 2025-03-23 20:46:05.443786 | orchestrator | TASK [osism.services.lldpd : Manage lldpd service] ***************************** 2025-03-23 20:46:05.444191 | orchestrator | Sunday 23 March 2025 20:46:05 +0000 (0:00:09.583) 0:08:20.487 ********** 2025-03-23 20:46:07.675934 | orchestrator | ok: [testbed-manager] 2025-03-23 20:46:07.676089 | orchestrator | ok: [testbed-node-3] 2025-03-23 20:46:07.677252 | orchestrator | ok: [testbed-node-4] 2025-03-23 20:46:07.680945 | orchestrator | ok: [testbed-node-5] 2025-03-23 20:46:07.681082 | orchestrator | ok: [testbed-node-1] 2025-03-23 20:46:07.681111 | orchestrator | ok: [testbed-node-0] 2025-03-23 20:46:07.681589 | orchestrator | ok: [testbed-node-2] 2025-03-23 20:46:07.682931 | orchestrator | 2025-03-23 20:46:07.683314 | orchestrator | RUNNING HANDLER [osism.commons.docker_compose : Reload systemd daemon] ********* 2025-03-23 20:46:07.684180 | orchestrator | Sunday 23 March 2025 20:46:07 +0000 (0:00:02.239) 0:08:22.727 ********** 2025-03-23 20:46:09.080123 | orchestrator | ok: [testbed-node-3] 2025-03-23 20:46:09.081223 | orchestrator | ok: [testbed-node-4] 2025-03-23 20:46:09.081877 | orchestrator | ok: [testbed-node-5] 2025-03-23 20:46:09.083925 | orchestrator | ok: [testbed-node-0] 2025-03-23 20:46:09.084896 | orchestrator | ok: [testbed-node-1] 2025-03-23 20:46:09.084927 | orchestrator | ok: [testbed-node-2] 2025-03-23 20:46:09.085663 | orchestrator | 2025-03-23 20:46:09.086179 | orchestrator | RUNNING HANDLER [osism.services.chrony : Restart chrony service] *************** 2025-03-23 20:46:09.086796 | orchestrator | Sunday 23 March 2025 20:46:09 +0000 (0:00:01.403) 0:08:24.130 ********** 2025-03-23 20:46:10.774273 | orchestrator | changed: [testbed-node-3] 2025-03-23 20:46:10.774432 | orchestrator | changed: [testbed-manager] 2025-03-23 20:46:10.774935 | orchestrator | changed: [testbed-node-4] 2025-03-23 20:46:10.774968 | orchestrator | changed: [testbed-node-5] 2025-03-23 20:46:10.775811 | orchestrator | changed: [testbed-node-1] 2025-03-23 20:46:10.775869 | orchestrator | changed: [testbed-node-0] 2025-03-23 20:46:10.777887 | orchestrator | changed: [testbed-node-2] 2025-03-23 20:46:10.778257 | orchestrator | 2025-03-23 20:46:10.778290 | orchestrator | PLAY [Apply bootstrap role part 2] ********************************************* 2025-03-23 20:46:10.778626 | orchestrator | 2025-03-23 
20:46:10.778654 | orchestrator | TASK [Include hardening role] ************************************************** 2025-03-23 20:46:10.778674 | orchestrator | Sunday 23 March 2025 20:46:10 +0000 (0:00:01.694) 0:08:25.825 ********** 2025-03-23 20:46:10.923504 | orchestrator | skipping: [testbed-manager] 2025-03-23 20:46:11.014082 | orchestrator | skipping: [testbed-node-3] 2025-03-23 20:46:11.211330 | orchestrator | skipping: [testbed-node-4] 2025-03-23 20:46:11.234741 | orchestrator | skipping: [testbed-node-5] 2025-03-23 20:46:11.234810 | orchestrator | skipping: [testbed-node-0] 2025-03-23 20:46:11.358656 | orchestrator | skipping: [testbed-node-1] 2025-03-23 20:46:11.359294 | orchestrator | skipping: [testbed-node-2] 2025-03-23 20:46:11.362460 | orchestrator | 2025-03-23 20:46:11.362495 | orchestrator | PLAY [Apply bootstrap roles part 3] ******************************************** 2025-03-23 20:46:11.362590 | orchestrator | 2025-03-23 20:46:11.363168 | orchestrator | TASK [osism.services.journald : Copy configuration file] *********************** 2025-03-23 20:46:11.363964 | orchestrator | Sunday 23 March 2025 20:46:11 +0000 (0:00:00.583) 0:08:26.408 ********** 2025-03-23 20:46:12.840900 | orchestrator | changed: [testbed-manager] 2025-03-23 20:46:12.841231 | orchestrator | changed: [testbed-node-3] 2025-03-23 20:46:12.842313 | orchestrator | changed: [testbed-node-4] 2025-03-23 20:46:12.843445 | orchestrator | changed: [testbed-node-5] 2025-03-23 20:46:12.847564 | orchestrator | changed: [testbed-node-0] 2025-03-23 20:46:12.847667 | orchestrator | changed: [testbed-node-2] 2025-03-23 20:46:12.847690 | orchestrator | changed: [testbed-node-1] 2025-03-23 20:46:12.847704 | orchestrator | 2025-03-23 20:46:12.847720 | orchestrator | TASK [osism.services.journald : Manage journald service] *********************** 2025-03-23 20:46:12.847769 | orchestrator | Sunday 23 March 2025 20:46:12 +0000 (0:00:01.482) 0:08:27.890 ********** 2025-03-23 20:46:14.440935 | orchestrator | ok: [testbed-manager] 2025-03-23 20:46:14.443300 | orchestrator | ok: [testbed-node-3] 2025-03-23 20:46:14.443649 | orchestrator | ok: [testbed-node-4] 2025-03-23 20:46:14.444711 | orchestrator | ok: [testbed-node-5] 2025-03-23 20:46:14.445820 | orchestrator | ok: [testbed-node-0] 2025-03-23 20:46:14.446509 | orchestrator | ok: [testbed-node-1] 2025-03-23 20:46:14.447578 | orchestrator | ok: [testbed-node-2] 2025-03-23 20:46:14.447939 | orchestrator | 2025-03-23 20:46:14.448915 | orchestrator | TASK [Include auditd role] ***************************************************** 2025-03-23 20:46:14.449738 | orchestrator | Sunday 23 March 2025 20:46:14 +0000 (0:00:01.598) 0:08:29.489 ********** 2025-03-23 20:46:14.568848 | orchestrator | skipping: [testbed-manager] 2025-03-23 20:46:14.901580 | orchestrator | skipping: [testbed-node-3] 2025-03-23 20:46:14.964341 | orchestrator | skipping: [testbed-node-4] 2025-03-23 20:46:15.041467 | orchestrator | skipping: [testbed-node-5] 2025-03-23 20:46:15.118135 | orchestrator | skipping: [testbed-node-0] 2025-03-23 20:46:15.544990 | orchestrator | skipping: [testbed-node-1] 2025-03-23 20:46:15.545455 | orchestrator | skipping: [testbed-node-2] 2025-03-23 20:46:15.546178 | orchestrator | 2025-03-23 20:46:15.548585 | orchestrator | RUNNING HANDLER [osism.services.journald : Restart journald service] *********** 2025-03-23 20:46:15.549261 | orchestrator | Sunday 23 March 2025 20:46:15 +0000 (0:00:01.107) 0:08:30.596 ********** 2025-03-23 20:46:16.922630 | orchestrator | changed: 
[testbed-manager] 2025-03-23 20:46:16.922820 | orchestrator | changed: [testbed-node-3] 2025-03-23 20:46:16.924732 | orchestrator | changed: [testbed-node-4] 2025-03-23 20:46:16.924765 | orchestrator | changed: [testbed-node-0] 2025-03-23 20:46:16.926692 | orchestrator | changed: [testbed-node-1] 2025-03-23 20:46:16.927071 | orchestrator | changed: [testbed-node-5] 2025-03-23 20:46:16.928466 | orchestrator | changed: [testbed-node-2] 2025-03-23 20:46:16.928634 | orchestrator | 2025-03-23 20:46:16.929910 | orchestrator | PLAY [Set state bootstrap] ***************************************************** 2025-03-23 20:46:16.930498 | orchestrator | 2025-03-23 20:46:16.931353 | orchestrator | TASK [Set osism.bootstrap.status fact] ***************************************** 2025-03-23 20:46:16.932951 | orchestrator | Sunday 23 March 2025 20:46:16 +0000 (0:00:01.375) 0:08:31.972 ********** 2025-03-23 20:46:17.810069 | orchestrator | included: osism.commons.state for testbed-manager, testbed-node-3, testbed-node-4, testbed-node-5, testbed-node-0, testbed-node-1, testbed-node-2 2025-03-23 20:46:17.810900 | orchestrator | 2025-03-23 20:46:17.812856 | orchestrator | TASK [osism.commons.state : Create custom facts directory] ********************* 2025-03-23 20:46:17.813654 | orchestrator | Sunday 23 March 2025 20:46:17 +0000 (0:00:00.887) 0:08:32.859 ********** 2025-03-23 20:46:18.316876 | orchestrator | ok: [testbed-manager] 2025-03-23 20:46:18.987591 | orchestrator | ok: [testbed-node-3] 2025-03-23 20:46:18.988076 | orchestrator | ok: [testbed-node-4] 2025-03-23 20:46:18.990305 | orchestrator | ok: [testbed-node-5] 2025-03-23 20:46:18.990630 | orchestrator | ok: [testbed-node-0] 2025-03-23 20:46:18.990658 | orchestrator | ok: [testbed-node-1] 2025-03-23 20:46:18.992005 | orchestrator | ok: [testbed-node-2] 2025-03-23 20:46:18.992765 | orchestrator | 2025-03-23 20:46:18.994106 | orchestrator | TASK [osism.commons.state : Write state into file] ***************************** 2025-03-23 20:46:18.994907 | orchestrator | Sunday 23 March 2025 20:46:18 +0000 (0:00:01.178) 0:08:34.038 ********** 2025-03-23 20:46:20.216386 | orchestrator | changed: [testbed-node-4] 2025-03-23 20:46:20.216691 | orchestrator | changed: [testbed-manager] 2025-03-23 20:46:20.218237 | orchestrator | changed: [testbed-node-3] 2025-03-23 20:46:20.219898 | orchestrator | changed: [testbed-node-5] 2025-03-23 20:46:20.219932 | orchestrator | changed: [testbed-node-0] 2025-03-23 20:46:20.220844 | orchestrator | changed: [testbed-node-1] 2025-03-23 20:46:20.222312 | orchestrator | changed: [testbed-node-2] 2025-03-23 20:46:20.222766 | orchestrator | 2025-03-23 20:46:20.224067 | orchestrator | TASK [Set osism.bootstrap.timestamp fact] ************************************** 2025-03-23 20:46:20.224598 | orchestrator | Sunday 23 March 2025 20:46:20 +0000 (0:00:01.227) 0:08:35.265 ********** 2025-03-23 20:46:21.339031 | orchestrator | included: osism.commons.state for testbed-manager, testbed-node-3, testbed-node-4, testbed-node-5, testbed-node-0, testbed-node-1, testbed-node-2 2025-03-23 20:46:21.339944 | orchestrator | 2025-03-23 20:46:21.341891 | orchestrator | TASK [osism.commons.state : Create custom facts directory] ********************* 2025-03-23 20:46:21.797885 | orchestrator | Sunday 23 March 2025 20:46:21 +0000 (0:00:01.123) 0:08:36.388 ********** 2025-03-23 20:46:21.798004 | orchestrator | ok: [testbed-manager] 2025-03-23 20:46:22.263824 | orchestrator | ok: [testbed-node-3] 2025-03-23 20:46:22.264430 | orchestrator | ok: 
[testbed-node-4] 2025-03-23 20:46:22.266625 | orchestrator | ok: [testbed-node-5] 2025-03-23 20:46:22.267722 | orchestrator | ok: [testbed-node-0] 2025-03-23 20:46:22.268637 | orchestrator | ok: [testbed-node-1] 2025-03-23 20:46:22.269406 | orchestrator | ok: [testbed-node-2] 2025-03-23 20:46:22.269833 | orchestrator | 2025-03-23 20:46:22.270792 | orchestrator | TASK [osism.commons.state : Write state into file] ***************************** 2025-03-23 20:46:22.271407 | orchestrator | Sunday 23 March 2025 20:46:22 +0000 (0:00:00.924) 0:08:37.313 ********** 2025-03-23 20:46:23.601746 | orchestrator | changed: [testbed-manager] 2025-03-23 20:46:23.602279 | orchestrator | changed: [testbed-node-3] 2025-03-23 20:46:23.603192 | orchestrator | changed: [testbed-node-4] 2025-03-23 20:46:23.604233 | orchestrator | changed: [testbed-node-5] 2025-03-23 20:46:23.604950 | orchestrator | changed: [testbed-node-1] 2025-03-23 20:46:23.605935 | orchestrator | changed: [testbed-node-0] 2025-03-23 20:46:23.607228 | orchestrator | changed: [testbed-node-2] 2025-03-23 20:46:23.609432 | orchestrator | 2025-03-23 20:46:23.611152 | orchestrator | PLAY RECAP ********************************************************************* 2025-03-23 20:46:23.611620 | orchestrator | 2025-03-23 20:46:23 | INFO  | Play has been completed. There may now be a delay until all logs have been written. 2025-03-23 20:46:23.612314 | orchestrator | 2025-03-23 20:46:23 | INFO  | Please wait and do not abort execution. 2025-03-23 20:46:23.613255 | orchestrator | testbed-manager : ok=160  changed=38  unreachable=0 failed=0 skipped=41  rescued=0 ignored=0 2025-03-23 20:46:23.614732 | orchestrator | testbed-node-0 : ok=168  changed=65  unreachable=0 failed=0 skipped=36  rescued=0 ignored=0 2025-03-23 20:46:23.615813 | orchestrator | testbed-node-1 : ok=168  changed=65  unreachable=0 failed=0 skipped=36  rescued=0 ignored=0 2025-03-23 20:46:23.616600 | orchestrator | testbed-node-2 : ok=168  changed=65  unreachable=0 failed=0 skipped=36  rescued=0 ignored=0 2025-03-23 20:46:23.617496 | orchestrator | testbed-node-3 : ok=167  changed=62  unreachable=0 failed=0 skipped=37  rescued=0 ignored=0 2025-03-23 20:46:23.618193 | orchestrator | testbed-node-4 : ok=167  changed=62  unreachable=0 failed=0 skipped=36  rescued=0 ignored=0 2025-03-23 20:46:23.618758 | orchestrator | testbed-node-5 : ok=167  changed=62  unreachable=0 failed=0 skipped=36  rescued=0 ignored=0 2025-03-23 20:46:23.619927 | orchestrator | 2025-03-23 20:46:23.622109 | orchestrator | Sunday 23 March 2025 20:46:23 +0000 (0:00:01.339) 0:08:38.652 ********** 2025-03-23 20:46:23.622757 | orchestrator | =============================================================================== 2025-03-23 20:46:23.623307 | orchestrator | osism.commons.packages : Install required packages --------------------- 78.74s 2025-03-23 20:46:23.623821 | orchestrator | osism.commons.packages : Download required packages -------------------- 42.28s 2025-03-23 20:46:23.624327 | orchestrator | osism.commons.cleanup : Cleanup installed packages --------------------- 33.77s 2025-03-23 20:46:23.625010 | orchestrator | osism.commons.repository : Update package cache ------------------------ 14.62s 2025-03-23 20:46:23.625557 | orchestrator | osism.commons.systohc : Install util-linux-extra package --------------- 13.90s 2025-03-23 20:46:23.626012 | orchestrator | osism.services.docker : Install docker-cli package --------------------- 13.89s 2025-03-23 20:46:23.626694 | orchestrator | osism.services.docker : Install 
docker package ------------------------- 13.46s 2025-03-23 20:46:23.627151 | orchestrator | osism.commons.packages : Remove dependencies that are no longer required -- 13.32s 2025-03-23 20:46:23.627960 | orchestrator | osism.services.docker : Install containerd package --------------------- 11.07s 2025-03-23 20:46:23.629579 | orchestrator | osism.services.smartd : Install smartmontools package ------------------- 9.67s 2025-03-23 20:46:23.630279 | orchestrator | osism.services.lldpd : Install lldpd package ---------------------------- 9.58s 2025-03-23 20:46:23.630967 | orchestrator | osism.services.rng : Install rng package -------------------------------- 9.37s 2025-03-23 20:46:23.631930 | orchestrator | osism.services.docker : Add repository ---------------------------------- 8.98s 2025-03-23 20:46:23.632445 | orchestrator | osism.commons.cleanup : Remove cloudinit package ------------------------ 8.95s 2025-03-23 20:46:23.634814 | orchestrator | osism.commons.cleanup : Uninstall unattended-upgrades package ----------- 8.74s 2025-03-23 20:46:23.635723 | orchestrator | osism.services.docker : Install apt-transport-https package ------------- 7.65s 2025-03-23 20:46:23.636797 | orchestrator | osism.services.docker : Ensure that some packages are not installed ----- 7.28s 2025-03-23 20:46:23.637837 | orchestrator | osism.commons.sysctl : Set sysctl parameters on rabbitmq ---------------- 7.03s 2025-03-23 20:46:23.638651 | orchestrator | osism.commons.cleanup : Remove dependencies that are no longer required --- 6.61s 2025-03-23 20:46:23.639868 | orchestrator | osism.commons.services : Populate service facts ------------------------- 5.82s 2025-03-23 20:46:24.492362 | orchestrator | + [[ -e /etc/redhat-release ]] 2025-03-23 20:46:26.909632 | orchestrator | + osism apply network 2025-03-23 20:46:26.909763 | orchestrator | 2025-03-23 20:46:26 | INFO  | Task 9e59ec9d-27b8-4578-8a95-dcc25937933d (network) was prepared for execution. 2025-03-23 20:46:30.835469 | orchestrator | 2025-03-23 20:46:26 | INFO  | It takes a moment until task 9e59ec9d-27b8-4578-8a95-dcc25937933d (network) has been started and output is visible here. 
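The network play that starts here is netplan-based on these Debian-family hosts (the role includes netplan-Debian-family.yml below): it renders a netplan file into /etc/netplan, installs networkd-dispatcher together with per-interface scripts such as routable.d/vxlan.sh, and removes the leftover cloud-init netplan file. As a hedged sketch of how one might inspect the result on a node afterwards (not part of the job; the networkd-dispatcher base directory is assumed to be the distribution default):

  # merged netplan configuration rendered by the role
  netplan get
  # 01-osism.yaml should remain, 50-cloud-init.yaml is removed by the cleanup task
  ls -l /etc/netplan/
  # link state as seen by systemd-networkd
  networkctl status
  # dispatcher scripts copied by the role (base path assumed)
  ls -l /etc/networkd-dispatcher/routable.d/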
2025-03-23 20:46:30.835659 | orchestrator | 2025-03-23 20:46:30.835741 | orchestrator | PLAY [Apply role network] ****************************************************** 2025-03-23 20:46:30.836661 | orchestrator | 2025-03-23 20:46:30.837588 | orchestrator | TASK [osism.commons.network : Gather variables for each operating system] ****** 2025-03-23 20:46:30.839347 | orchestrator | Sunday 23 March 2025 20:46:30 +0000 (0:00:00.255) 0:00:00.255 ********** 2025-03-23 20:46:31.076184 | orchestrator | ok: [testbed-manager] 2025-03-23 20:46:31.167583 | orchestrator | ok: [testbed-node-0] 2025-03-23 20:46:31.285641 | orchestrator | ok: [testbed-node-1] 2025-03-23 20:46:31.384362 | orchestrator | ok: [testbed-node-2] 2025-03-23 20:46:31.463792 | orchestrator | ok: [testbed-node-3] 2025-03-23 20:46:31.723914 | orchestrator | ok: [testbed-node-4] 2025-03-23 20:46:31.724096 | orchestrator | ok: [testbed-node-5] 2025-03-23 20:46:31.724986 | orchestrator | 2025-03-23 20:46:31.725184 | orchestrator | TASK [osism.commons.network : Include type specific tasks] ********************* 2025-03-23 20:46:31.725325 | orchestrator | Sunday 23 March 2025 20:46:31 +0000 (0:00:00.891) 0:00:01.147 ********** 2025-03-23 20:46:32.977687 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/network/tasks/netplan-Debian-family.yml for testbed-manager, testbed-node-0, testbed-node-1, testbed-node-2, testbed-node-3, testbed-node-4, testbed-node-5 2025-03-23 20:46:32.979337 | orchestrator | 2025-03-23 20:46:32.980107 | orchestrator | TASK [osism.commons.network : Install required packages] *********************** 2025-03-23 20:46:32.984359 | orchestrator | Sunday 23 March 2025 20:46:32 +0000 (0:00:01.250) 0:00:02.397 ********** 2025-03-23 20:46:35.057227 | orchestrator | ok: [testbed-manager] 2025-03-23 20:46:35.057368 | orchestrator | ok: [testbed-node-1] 2025-03-23 20:46:35.057393 | orchestrator | ok: [testbed-node-3] 2025-03-23 20:46:35.057527 | orchestrator | ok: [testbed-node-2] 2025-03-23 20:46:35.057916 | orchestrator | ok: [testbed-node-0] 2025-03-23 20:46:35.058354 | orchestrator | ok: [testbed-node-4] 2025-03-23 20:46:35.058744 | orchestrator | ok: [testbed-node-5] 2025-03-23 20:46:35.061746 | orchestrator | 2025-03-23 20:46:35.061899 | orchestrator | TASK [osism.commons.network : Remove ifupdown package] ************************* 2025-03-23 20:46:36.957767 | orchestrator | Sunday 23 March 2025 20:46:35 +0000 (0:00:02.080) 0:00:04.478 ********** 2025-03-23 20:46:36.957915 | orchestrator | ok: [testbed-manager] 2025-03-23 20:46:36.957994 | orchestrator | ok: [testbed-node-0] 2025-03-23 20:46:36.962834 | orchestrator | ok: [testbed-node-1] 2025-03-23 20:46:36.963804 | orchestrator | ok: [testbed-node-2] 2025-03-23 20:46:36.964030 | orchestrator | ok: [testbed-node-3] 2025-03-23 20:46:36.965257 | orchestrator | ok: [testbed-node-4] 2025-03-23 20:46:36.965464 | orchestrator | ok: [testbed-node-5] 2025-03-23 20:46:36.967923 | orchestrator | 2025-03-23 20:46:36.968699 | orchestrator | TASK [osism.commons.network : Create required directories] ********************* 2025-03-23 20:46:36.969235 | orchestrator | Sunday 23 March 2025 20:46:36 +0000 (0:00:01.898) 0:00:06.376 ********** 2025-03-23 20:46:37.530437 | orchestrator | ok: [testbed-manager] => (item=/etc/netplan) 2025-03-23 20:46:37.533880 | orchestrator | ok: [testbed-node-0] => (item=/etc/netplan) 2025-03-23 20:46:38.230446 | orchestrator | ok: [testbed-node-1] => (item=/etc/netplan) 2025-03-23 20:46:38.230530 | orchestrator 
| ok: [testbed-node-2] => (item=/etc/netplan) 2025-03-23 20:46:38.231194 | orchestrator | ok: [testbed-node-3] => (item=/etc/netplan) 2025-03-23 20:46:38.231917 | orchestrator | ok: [testbed-node-4] => (item=/etc/netplan) 2025-03-23 20:46:38.233908 | orchestrator | ok: [testbed-node-5] => (item=/etc/netplan) 2025-03-23 20:46:38.234141 | orchestrator | 2025-03-23 20:46:38.234173 | orchestrator | TASK [osism.commons.network : Prepare netplan configuration template] ********** 2025-03-23 20:46:38.235103 | orchestrator | Sunday 23 March 2025 20:46:38 +0000 (0:00:01.274) 0:00:07.651 ********** 2025-03-23 20:46:40.106150 | orchestrator | ok: [testbed-manager -> localhost] 2025-03-23 20:46:40.107274 | orchestrator | ok: [testbed-node-1 -> localhost] 2025-03-23 20:46:40.107374 | orchestrator | ok: [testbed-node-3 -> localhost] 2025-03-23 20:46:40.107974 | orchestrator | ok: [testbed-node-0 -> localhost] 2025-03-23 20:46:40.109584 | orchestrator | ok: [testbed-node-4 -> localhost] 2025-03-23 20:46:40.109723 | orchestrator | ok: [testbed-node-2 -> localhost] 2025-03-23 20:46:40.109746 | orchestrator | ok: [testbed-node-5 -> localhost] 2025-03-23 20:46:40.109766 | orchestrator | 2025-03-23 20:46:40.110417 | orchestrator | TASK [osism.commons.network : Copy netplan configuration] ********************** 2025-03-23 20:46:40.110627 | orchestrator | Sunday 23 March 2025 20:46:40 +0000 (0:00:01.872) 0:00:09.524 ********** 2025-03-23 20:46:41.958895 | orchestrator | changed: [testbed-manager] 2025-03-23 20:46:41.959769 | orchestrator | changed: [testbed-node-0] 2025-03-23 20:46:41.960932 | orchestrator | changed: [testbed-node-1] 2025-03-23 20:46:41.961678 | orchestrator | changed: [testbed-node-2] 2025-03-23 20:46:41.963110 | orchestrator | changed: [testbed-node-3] 2025-03-23 20:46:41.965708 | orchestrator | changed: [testbed-node-4] 2025-03-23 20:46:41.965739 | orchestrator | changed: [testbed-node-5] 2025-03-23 20:46:41.966347 | orchestrator | 2025-03-23 20:46:41.966658 | orchestrator | TASK [osism.commons.network : Remove netplan configuration template] *********** 2025-03-23 20:46:41.967800 | orchestrator | Sunday 23 March 2025 20:46:41 +0000 (0:00:01.854) 0:00:11.378 ********** 2025-03-23 20:46:42.452978 | orchestrator | ok: [testbed-manager -> localhost] 2025-03-23 20:46:43.205090 | orchestrator | ok: [testbed-node-1 -> localhost] 2025-03-23 20:46:43.205717 | orchestrator | ok: [testbed-node-0 -> localhost] 2025-03-23 20:46:43.208214 | orchestrator | ok: [testbed-node-3 -> localhost] 2025-03-23 20:46:43.209128 | orchestrator | ok: [testbed-node-2 -> localhost] 2025-03-23 20:46:43.210443 | orchestrator | ok: [testbed-node-4 -> localhost] 2025-03-23 20:46:43.211460 | orchestrator | ok: [testbed-node-5 -> localhost] 2025-03-23 20:46:43.212694 | orchestrator | 2025-03-23 20:46:43.214215 | orchestrator | TASK [osism.commons.network : Check if path for interface file exists] ********* 2025-03-23 20:46:43.214905 | orchestrator | Sunday 23 March 2025 20:46:43 +0000 (0:00:01.248) 0:00:12.627 ********** 2025-03-23 20:46:43.703797 | orchestrator | ok: [testbed-manager] 2025-03-23 20:46:43.801378 | orchestrator | ok: [testbed-node-0] 2025-03-23 20:46:44.472052 | orchestrator | ok: [testbed-node-1] 2025-03-23 20:46:44.472227 | orchestrator | ok: [testbed-node-2] 2025-03-23 20:46:44.472257 | orchestrator | ok: [testbed-node-3] 2025-03-23 20:46:44.473150 | orchestrator | ok: [testbed-node-4] 2025-03-23 20:46:44.473258 | orchestrator | ok: [testbed-node-5] 2025-03-23 20:46:44.474084 | orchestrator | 2025-03-23 
20:46:44.474583 | orchestrator | TASK [osism.commons.network : Copy interfaces file] **************************** 2025-03-23 20:46:44.474961 | orchestrator | Sunday 23 March 2025 20:46:44 +0000 (0:00:01.264) 0:00:13.892 ********** 2025-03-23 20:46:44.667480 | orchestrator | skipping: [testbed-manager] 2025-03-23 20:46:44.755235 | orchestrator | skipping: [testbed-node-0] 2025-03-23 20:46:44.845673 | orchestrator | skipping: [testbed-node-1] 2025-03-23 20:46:44.937151 | orchestrator | skipping: [testbed-node-2] 2025-03-23 20:46:45.025417 | orchestrator | skipping: [testbed-node-3] 2025-03-23 20:46:45.379877 | orchestrator | skipping: [testbed-node-4] 2025-03-23 20:46:45.380712 | orchestrator | skipping: [testbed-node-5] 2025-03-23 20:46:45.381617 | orchestrator | 2025-03-23 20:46:45.383122 | orchestrator | TASK [osism.commons.network : Install package networkd-dispatcher] ************* 2025-03-23 20:46:45.385105 | orchestrator | Sunday 23 March 2025 20:46:45 +0000 (0:00:00.908) 0:00:14.801 ********** 2025-03-23 20:46:47.435243 | orchestrator | ok: [testbed-manager] 2025-03-23 20:46:47.435408 | orchestrator | ok: [testbed-node-1] 2025-03-23 20:46:47.435434 | orchestrator | ok: [testbed-node-2] 2025-03-23 20:46:47.437615 | orchestrator | ok: [testbed-node-3] 2025-03-23 20:46:47.438106 | orchestrator | ok: [testbed-node-0] 2025-03-23 20:46:47.438479 | orchestrator | ok: [testbed-node-4] 2025-03-23 20:46:47.439083 | orchestrator | ok: [testbed-node-5] 2025-03-23 20:46:47.439924 | orchestrator | 2025-03-23 20:46:47.440289 | orchestrator | TASK [osism.commons.network : Copy dispatcher scripts] ************************* 2025-03-23 20:46:47.440854 | orchestrator | Sunday 23 March 2025 20:46:47 +0000 (0:00:02.055) 0:00:16.856 ********** 2025-03-23 20:46:48.308131 | orchestrator | changed: [testbed-manager] => (item={'dest': 'routable.d/iptables.sh', 'src': '/opt/configuration/network/iptables.sh'}) 2025-03-23 20:46:49.503407 | orchestrator | changed: [testbed-node-0] => (item={'dest': 'routable.d/vxlan.sh', 'src': '/opt/configuration/network/vxlan.sh'}) 2025-03-23 20:46:49.504377 | orchestrator | changed: [testbed-node-1] => (item={'dest': 'routable.d/vxlan.sh', 'src': '/opt/configuration/network/vxlan.sh'}) 2025-03-23 20:46:49.506121 | orchestrator | changed: [testbed-node-2] => (item={'dest': 'routable.d/vxlan.sh', 'src': '/opt/configuration/network/vxlan.sh'}) 2025-03-23 20:46:49.507359 | orchestrator | changed: [testbed-node-3] => (item={'dest': 'routable.d/vxlan.sh', 'src': '/opt/configuration/network/vxlan.sh'}) 2025-03-23 20:46:49.509379 | orchestrator | changed: [testbed-node-4] => (item={'dest': 'routable.d/vxlan.sh', 'src': '/opt/configuration/network/vxlan.sh'}) 2025-03-23 20:46:49.511527 | orchestrator | changed: [testbed-manager] => (item={'dest': 'routable.d/vxlan.sh', 'src': '/opt/configuration/network/vxlan.sh'}) 2025-03-23 20:46:49.513216 | orchestrator | changed: [testbed-node-5] => (item={'dest': 'routable.d/vxlan.sh', 'src': '/opt/configuration/network/vxlan.sh'}) 2025-03-23 20:46:49.513956 | orchestrator | 2025-03-23 20:46:49.514988 | orchestrator | TASK [osism.commons.network : Manage service networkd-dispatcher] ************** 2025-03-23 20:46:49.516591 | orchestrator | Sunday 23 March 2025 20:46:49 +0000 (0:00:02.065) 0:00:18.922 ********** 2025-03-23 20:46:51.230743 | orchestrator | ok: [testbed-manager] 2025-03-23 20:46:51.231169 | orchestrator | changed: [testbed-node-1] 2025-03-23 20:46:51.231214 | orchestrator | changed: [testbed-node-3] 2025-03-23 20:46:51.231524 | 
orchestrator | changed: [testbed-node-2] 2025-03-23 20:46:51.232391 | orchestrator | changed: [testbed-node-0] 2025-03-23 20:46:51.233052 | orchestrator | changed: [testbed-node-4] 2025-03-23 20:46:51.233533 | orchestrator | changed: [testbed-node-5] 2025-03-23 20:46:51.236822 | orchestrator | 2025-03-23 20:46:52.767749 | orchestrator | TASK [osism.commons.network : Include cleanup tasks] *************************** 2025-03-23 20:46:52.767878 | orchestrator | Sunday 23 March 2025 20:46:51 +0000 (0:00:01.731) 0:00:20.654 ********** 2025-03-23 20:46:52.767931 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/network/tasks/cleanup-netplan.yml for testbed-manager, testbed-node-0, testbed-node-1, testbed-node-2, testbed-node-3, testbed-node-4, testbed-node-5 2025-03-23 20:46:52.768035 | orchestrator | 2025-03-23 20:46:52.768159 | orchestrator | TASK [osism.commons.network : List existing configuration files] *************** 2025-03-23 20:46:52.768304 | orchestrator | Sunday 23 March 2025 20:46:52 +0000 (0:00:01.534) 0:00:22.188 ********** 2025-03-23 20:46:53.350601 | orchestrator | ok: [testbed-manager] 2025-03-23 20:46:54.899620 | orchestrator | ok: [testbed-node-1] 2025-03-23 20:46:54.902763 | orchestrator | ok: [testbed-node-2] 2025-03-23 20:46:54.905396 | orchestrator | ok: [testbed-node-3] 2025-03-23 20:46:54.905747 | orchestrator | ok: [testbed-node-4] 2025-03-23 20:46:54.907142 | orchestrator | ok: [testbed-node-0] 2025-03-23 20:46:54.907414 | orchestrator | ok: [testbed-node-5] 2025-03-23 20:46:54.908711 | orchestrator | 2025-03-23 20:46:54.909153 | orchestrator | TASK [osism.commons.network : Set network_configured_files fact] *************** 2025-03-23 20:46:54.909195 | orchestrator | Sunday 23 March 2025 20:46:54 +0000 (0:00:02.133) 0:00:24.322 ********** 2025-03-23 20:46:55.067314 | orchestrator | ok: [testbed-manager] 2025-03-23 20:46:55.153010 | orchestrator | ok: [testbed-node-0] 2025-03-23 20:46:55.430992 | orchestrator | ok: [testbed-node-1] 2025-03-23 20:46:55.524385 | orchestrator | ok: [testbed-node-2] 2025-03-23 20:46:55.613807 | orchestrator | ok: [testbed-node-3] 2025-03-23 20:46:55.775215 | orchestrator | ok: [testbed-node-4] 2025-03-23 20:46:55.776333 | orchestrator | ok: [testbed-node-5] 2025-03-23 20:46:55.777444 | orchestrator | 2025-03-23 20:46:55.778367 | orchestrator | TASK [osism.commons.network : Remove unused configuration files] *************** 2025-03-23 20:46:55.779256 | orchestrator | Sunday 23 March 2025 20:46:55 +0000 (0:00:00.873) 0:00:25.195 ********** 2025-03-23 20:46:56.237863 | orchestrator | changed: [testbed-manager] => (item=/etc/netplan/50-cloud-init.yaml) 2025-03-23 20:46:56.238327 | orchestrator | skipping: [testbed-manager] => (item=/etc/netplan/01-osism.yaml)  2025-03-23 20:46:56.338183 | orchestrator | changed: [testbed-node-0] => (item=/etc/netplan/50-cloud-init.yaml) 2025-03-23 20:46:56.341888 | orchestrator | skipping: [testbed-node-0] => (item=/etc/netplan/01-osism.yaml)  2025-03-23 20:46:56.834944 | orchestrator | changed: [testbed-node-1] => (item=/etc/netplan/50-cloud-init.yaml) 2025-03-23 20:46:56.836445 | orchestrator | skipping: [testbed-node-1] => (item=/etc/netplan/01-osism.yaml)  2025-03-23 20:46:56.836540 | orchestrator | changed: [testbed-node-2] => (item=/etc/netplan/50-cloud-init.yaml) 2025-03-23 20:46:56.836661 | orchestrator | skipping: [testbed-node-2] => (item=/etc/netplan/01-osism.yaml)  2025-03-23 20:46:56.837398 | orchestrator | changed: [testbed-node-3] => 
(item=/etc/netplan/50-cloud-init.yaml) 2025-03-23 20:46:56.840969 | orchestrator | skipping: [testbed-node-3] => (item=/etc/netplan/01-osism.yaml)  2025-03-23 20:46:57.273582 | orchestrator | changed: [testbed-node-4] => (item=/etc/netplan/50-cloud-init.yaml) 2025-03-23 20:46:57.273657 | orchestrator | skipping: [testbed-node-4] => (item=/etc/netplan/01-osism.yaml)  2025-03-23 20:46:57.273673 | orchestrator | changed: [testbed-node-5] => (item=/etc/netplan/50-cloud-init.yaml) 2025-03-23 20:46:57.273687 | orchestrator | skipping: [testbed-node-5] => (item=/etc/netplan/01-osism.yaml)  2025-03-23 20:46:57.273701 | orchestrator | 2025-03-23 20:46:57.273716 | orchestrator | TASK [osism.commons.network : Include dummy interfaces] ************************ 2025-03-23 20:46:57.273730 | orchestrator | Sunday 23 March 2025 20:46:56 +0000 (0:00:01.064) 0:00:26.260 ********** 2025-03-23 20:46:57.273755 | orchestrator | skipping: [testbed-manager] 2025-03-23 20:46:57.366742 | orchestrator | skipping: [testbed-node-0] 2025-03-23 20:46:57.457747 | orchestrator | skipping: [testbed-node-1] 2025-03-23 20:46:57.550928 | orchestrator | skipping: [testbed-node-2] 2025-03-23 20:46:57.658770 | orchestrator | skipping: [testbed-node-3] 2025-03-23 20:46:59.039496 | orchestrator | skipping: [testbed-node-4] 2025-03-23 20:46:59.040730 | orchestrator | skipping: [testbed-node-5] 2025-03-23 20:46:59.042579 | orchestrator | 2025-03-23 20:46:59.043744 | orchestrator | RUNNING HANDLER [osism.commons.network : Netplan configuration changed] ******** 2025-03-23 20:46:59.044689 | orchestrator | Sunday 23 March 2025 20:46:59 +0000 (0:00:02.197) 0:00:28.458 ********** 2025-03-23 20:46:59.239670 | orchestrator | skipping: [testbed-manager] 2025-03-23 20:46:59.346901 | orchestrator | skipping: [testbed-node-0] 2025-03-23 20:46:59.649501 | orchestrator | skipping: [testbed-node-1] 2025-03-23 20:46:59.755450 | orchestrator | skipping: [testbed-node-2] 2025-03-23 20:46:59.833063 | orchestrator | skipping: [testbed-node-3] 2025-03-23 20:46:59.869804 | orchestrator | skipping: [testbed-node-4] 2025-03-23 20:46:59.870326 | orchestrator | skipping: [testbed-node-5] 2025-03-23 20:46:59.871483 | orchestrator | 2025-03-23 20:46:59.872170 | orchestrator | PLAY RECAP ********************************************************************* 2025-03-23 20:46:59.872490 | orchestrator | 2025-03-23 20:46:59 | INFO  | Play has been completed. There may now be a delay until all logs have been written. 2025-03-23 20:46:59.872815 | orchestrator | 2025-03-23 20:46:59 | INFO  | Please wait and do not abort execution. 
2025-03-23 20:46:59.873030 | orchestrator | testbed-manager : ok=16  changed=3  unreachable=0 failed=0 skipped=3  rescued=0 ignored=0 2025-03-23 20:46:59.874439 | orchestrator | testbed-node-0 : ok=16  changed=4  unreachable=0 failed=0 skipped=3  rescued=0 ignored=0 2025-03-23 20:46:59.874736 | orchestrator | testbed-node-1 : ok=16  changed=4  unreachable=0 failed=0 skipped=3  rescued=0 ignored=0 2025-03-23 20:46:59.875852 | orchestrator | testbed-node-2 : ok=16  changed=4  unreachable=0 failed=0 skipped=3  rescued=0 ignored=0 2025-03-23 20:46:59.876255 | orchestrator | testbed-node-3 : ok=16  changed=4  unreachable=0 failed=0 skipped=3  rescued=0 ignored=0 2025-03-23 20:46:59.876798 | orchestrator | testbed-node-4 : ok=16  changed=4  unreachable=0 failed=0 skipped=3  rescued=0 ignored=0 2025-03-23 20:46:59.877065 | orchestrator | testbed-node-5 : ok=16  changed=4  unreachable=0 failed=0 skipped=3  rescued=0 ignored=0 2025-03-23 20:46:59.877403 | orchestrator | 2025-03-23 20:46:59.877897 | orchestrator | Sunday 23 March 2025 20:46:59 +0000 (0:00:00.836) 0:00:29.294 ********** 2025-03-23 20:46:59.878530 | orchestrator | =============================================================================== 2025-03-23 20:46:59.879698 | orchestrator | osism.commons.network : Include dummy interfaces ------------------------ 2.20s 2025-03-23 20:46:59.880179 | orchestrator | osism.commons.network : List existing configuration files --------------- 2.13s 2025-03-23 20:46:59.880537 | orchestrator | osism.commons.network : Install required packages ----------------------- 2.08s 2025-03-23 20:46:59.881262 | orchestrator | osism.commons.network : Copy dispatcher scripts ------------------------- 2.07s 2025-03-23 20:46:59.881955 | orchestrator | osism.commons.network : Install package networkd-dispatcher ------------- 2.06s 2025-03-23 20:46:59.882702 | orchestrator | osism.commons.network : Remove ifupdown package ------------------------- 1.90s 2025-03-23 20:46:59.883028 | orchestrator | osism.commons.network : Prepare netplan configuration template ---------- 1.87s 2025-03-23 20:46:59.883569 | orchestrator | osism.commons.network : Copy netplan configuration ---------------------- 1.85s 2025-03-23 20:46:59.884067 | orchestrator | osism.commons.network : Manage service networkd-dispatcher -------------- 1.73s 2025-03-23 20:46:59.884519 | orchestrator | osism.commons.network : Include cleanup tasks --------------------------- 1.53s 2025-03-23 20:46:59.884955 | orchestrator | osism.commons.network : Create required directories --------------------- 1.27s 2025-03-23 20:46:59.885365 | orchestrator | osism.commons.network : Check if path for interface file exists --------- 1.26s 2025-03-23 20:46:59.885839 | orchestrator | osism.commons.network : Include type specific tasks --------------------- 1.25s 2025-03-23 20:46:59.886473 | orchestrator | osism.commons.network : Remove netplan configuration template ----------- 1.25s 2025-03-23 20:46:59.886595 | orchestrator | osism.commons.network : Remove unused configuration files --------------- 1.06s 2025-03-23 20:46:59.887130 | orchestrator | osism.commons.network : Copy interfaces file ---------------------------- 0.91s 2025-03-23 20:46:59.887538 | orchestrator | osism.commons.network : Gather variables for each operating system ------ 0.89s 2025-03-23 20:46:59.887936 | orchestrator | osism.commons.network : Set network_configured_files fact --------------- 0.87s 2025-03-23 20:46:59.888365 | orchestrator | osism.commons.network : Netplan configuration changed 
------------------- 0.84s 2025-03-23 20:47:00.564761 | orchestrator | + osism apply wireguard 2025-03-23 20:47:02.167079 | orchestrator | 2025-03-23 20:47:02 | INFO  | Task 667d8374-3324-402e-a3af-840007cb150c (wireguard) was prepared for execution. 2025-03-23 20:47:05.765682 | orchestrator | 2025-03-23 20:47:02 | INFO  | It takes a moment until task 667d8374-3324-402e-a3af-840007cb150c (wireguard) has been started and output is visible here. 2025-03-23 20:47:05.765817 | orchestrator | 2025-03-23 20:47:05.765892 | orchestrator | PLAY [Apply role wireguard] **************************************************** 2025-03-23 20:47:05.766330 | orchestrator | 2025-03-23 20:47:05.766612 | orchestrator | TASK [osism.services.wireguard : Install iptables package] ********************* 2025-03-23 20:47:05.766833 | orchestrator | Sunday 23 March 2025 20:47:05 +0000 (0:00:00.185) 0:00:00.185 ********** 2025-03-23 20:47:07.527249 | orchestrator | ok: [testbed-manager] 2025-03-23 20:47:07.527948 | orchestrator | 2025-03-23 20:47:07.528450 | orchestrator | TASK [osism.services.wireguard : Install wireguard package] ******************** 2025-03-23 20:47:14.868365 | orchestrator | Sunday 23 March 2025 20:47:07 +0000 (0:00:01.761) 0:00:01.947 ********** 2025-03-23 20:47:14.868494 | orchestrator | changed: [testbed-manager] 2025-03-23 20:47:14.868832 | orchestrator | 2025-03-23 20:47:14.869094 | orchestrator | TASK [osism.services.wireguard : Create public and private key - server] ******* 2025-03-23 20:47:14.869119 | orchestrator | Sunday 23 March 2025 20:47:14 +0000 (0:00:07.339) 0:00:09.286 ********** 2025-03-23 20:47:15.434969 | orchestrator | changed: [testbed-manager] 2025-03-23 20:47:15.435488 | orchestrator | 2025-03-23 20:47:15.435735 | orchestrator | TASK [osism.services.wireguard : Create preshared key] ************************* 2025-03-23 20:47:15.436199 | orchestrator | Sunday 23 March 2025 20:47:15 +0000 (0:00:00.571) 0:00:09.858 ********** 2025-03-23 20:47:15.922895 | orchestrator | changed: [testbed-manager] 2025-03-23 20:47:15.923191 | orchestrator | 2025-03-23 20:47:15.924148 | orchestrator | TASK [osism.services.wireguard : Get preshared key] **************************** 2025-03-23 20:47:15.925034 | orchestrator | Sunday 23 March 2025 20:47:15 +0000 (0:00:00.485) 0:00:10.343 ********** 2025-03-23 20:47:16.477622 | orchestrator | ok: [testbed-manager] 2025-03-23 20:47:16.479529 | orchestrator | 2025-03-23 20:47:16.480672 | orchestrator | TASK [osism.services.wireguard : Get public key - server] ********************** 2025-03-23 20:47:16.482578 | orchestrator | Sunday 23 March 2025 20:47:16 +0000 (0:00:00.555) 0:00:10.899 ********** 2025-03-23 20:47:17.071012 | orchestrator | ok: [testbed-manager] 2025-03-23 20:47:17.071844 | orchestrator | 2025-03-23 20:47:17.073367 | orchestrator | TASK [osism.services.wireguard : Get private key - server] ********************* 2025-03-23 20:47:17.073656 | orchestrator | Sunday 23 March 2025 20:47:17 +0000 (0:00:00.593) 0:00:11.493 ********** 2025-03-23 20:47:17.498157 | orchestrator | ok: [testbed-manager] 2025-03-23 20:47:17.498278 | orchestrator | 2025-03-23 20:47:17.498362 | orchestrator | TASK [osism.services.wireguard : Copy wg0.conf configuration file] ************* 2025-03-23 20:47:17.498737 | orchestrator | Sunday 23 March 2025 20:47:17 +0000 (0:00:00.427) 0:00:11.921 ********** 2025-03-23 20:47:18.773278 | orchestrator | changed: [testbed-manager] 2025-03-23 20:47:18.774367 | orchestrator | 2025-03-23 20:47:18.776228 | orchestrator | TASK 
[osism.services.wireguard : Copy client configuration files] ************** 2025-03-23 20:47:19.724981 | orchestrator | Sunday 23 March 2025 20:47:18 +0000 (0:00:01.273) 0:00:13.195 ********** 2025-03-23 20:47:19.725117 | orchestrator | changed: [testbed-manager] => (item=None) 2025-03-23 20:47:19.725194 | orchestrator | changed: [testbed-manager] 2025-03-23 20:47:19.726330 | orchestrator | 2025-03-23 20:47:19.728102 | orchestrator | TASK [osism.services.wireguard : Manage wg-quick@wg0.service service] ********** 2025-03-23 20:47:19.728939 | orchestrator | Sunday 23 March 2025 20:47:19 +0000 (0:00:00.950) 0:00:14.145 ********** 2025-03-23 20:47:21.545004 | orchestrator | changed: [testbed-manager] 2025-03-23 20:47:21.545941 | orchestrator | 2025-03-23 20:47:21.546937 | orchestrator | RUNNING HANDLER [osism.services.wireguard : Restart wg0 service] *************** 2025-03-23 20:47:21.547420 | orchestrator | Sunday 23 March 2025 20:47:21 +0000 (0:00:01.820) 0:00:15.966 ********** 2025-03-23 20:47:22.509243 | orchestrator | changed: [testbed-manager] 2025-03-23 20:47:22.510367 | orchestrator | 2025-03-23 20:47:22.510485 | orchestrator | PLAY RECAP ********************************************************************* 2025-03-23 20:47:22.510544 | orchestrator | 2025-03-23 20:47:22 | INFO  | Play has been completed. There may now be a delay until all logs have been written. 2025-03-23 20:47:22.510729 | orchestrator | 2025-03-23 20:47:22 | INFO  | Please wait and do not abort execution. 2025-03-23 20:47:22.512304 | orchestrator | testbed-manager : ok=11  changed=7  unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 2025-03-23 20:47:22.512714 | orchestrator | 2025-03-23 20:47:22.512866 | orchestrator | Sunday 23 March 2025 20:47:22 +0000 (0:00:00.966) 0:00:16.933 ********** 2025-03-23 20:47:22.513285 | orchestrator | =============================================================================== 2025-03-23 20:47:22.513601 | orchestrator | osism.services.wireguard : Install wireguard package -------------------- 7.34s 2025-03-23 20:47:22.513977 | orchestrator | osism.services.wireguard : Manage wg-quick@wg0.service service ---------- 1.82s 2025-03-23 20:47:22.514690 | orchestrator | osism.services.wireguard : Install iptables package --------------------- 1.76s 2025-03-23 20:47:22.515152 | orchestrator | osism.services.wireguard : Copy wg0.conf configuration file ------------- 1.27s 2025-03-23 20:47:22.515245 | orchestrator | osism.services.wireguard : Restart wg0 service -------------------------- 0.97s 2025-03-23 20:47:22.515721 | orchestrator | osism.services.wireguard : Copy client configuration files -------------- 0.95s 2025-03-23 20:47:22.516140 | orchestrator | osism.services.wireguard : Get public key - server ---------------------- 0.59s 2025-03-23 20:47:22.516402 | orchestrator | osism.services.wireguard : Create public and private key - server ------- 0.57s 2025-03-23 20:47:22.516810 | orchestrator | osism.services.wireguard : Get preshared key ---------------------------- 0.56s 2025-03-23 20:47:22.517232 | orchestrator | osism.services.wireguard : Create preshared key ------------------------- 0.49s 2025-03-23 20:47:22.517673 | orchestrator | osism.services.wireguard : Get private key - server --------------------- 0.43s 2025-03-23 20:47:23.178274 | orchestrator | + sh -c /opt/configuration/scripts/prepare-wireguard-configuration.sh 2025-03-23 20:47:23.217901 | orchestrator | % Total % Received % Xferd Average Speed Time Time Time Current 2025-03-23 20:47:23.305370 | orchestrator | 
Dload Upload Total Spent Left Speed 2025-03-23 20:47:23.305518 | orchestrator | 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0 100 15 100 15 0 0 170 0 --:--:-- --:--:-- --:--:-- 170 100 15 100 15 0 0 170 0 --:--:-- --:--:-- --:--:-- 170 2025-03-23 20:47:23.321407 | orchestrator | + osism apply --environment custom workarounds 2025-03-23 20:47:24.925328 | orchestrator | 2025-03-23 20:47:24 | INFO  | Trying to run play workarounds in environment custom 2025-03-23 20:47:24.975644 | orchestrator | 2025-03-23 20:47:24 | INFO  | Task 221ade73-ef5b-4f8d-b9af-c90798a64363 (workarounds) was prepared for execution. 2025-03-23 20:47:28.584818 | orchestrator | 2025-03-23 20:47:24 | INFO  | It takes a moment until task 221ade73-ef5b-4f8d-b9af-c90798a64363 (workarounds) has been started and output is visible here. 2025-03-23 20:47:28.585531 | orchestrator | 2025-03-23 20:47:28.585952 | orchestrator | PLAY [Group hosts based on configuration] ************************************** 2025-03-23 20:47:28.585990 | orchestrator | 2025-03-23 20:47:28.590481 | orchestrator | TASK [Group hosts based on virtualization_role] ******************************** 2025-03-23 20:47:28.590746 | orchestrator | Sunday 23 March 2025 20:47:28 +0000 (0:00:00.155) 0:00:00.155 ********** 2025-03-23 20:47:28.785353 | orchestrator | changed: [testbed-manager] => (item=virtualization_role_guest) 2025-03-23 20:47:28.877702 | orchestrator | changed: [testbed-node-3] => (item=virtualization_role_guest) 2025-03-23 20:47:28.975920 | orchestrator | changed: [testbed-node-4] => (item=virtualization_role_guest) 2025-03-23 20:47:29.066852 | orchestrator | changed: [testbed-node-5] => (item=virtualization_role_guest) 2025-03-23 20:47:29.156052 | orchestrator | changed: [testbed-node-0] => (item=virtualization_role_guest) 2025-03-23 20:47:29.459245 | orchestrator | changed: [testbed-node-1] => (item=virtualization_role_guest) 2025-03-23 20:47:29.460463 | orchestrator | changed: [testbed-node-2] => (item=virtualization_role_guest) 2025-03-23 20:47:29.460500 | orchestrator | 2025-03-23 20:47:29.462786 | orchestrator | PLAY [Apply netplan configuration on the manager node] ************************* 2025-03-23 20:47:29.464340 | orchestrator | 2025-03-23 20:47:32.395225 | orchestrator | TASK [Apply netplan configuration] ********************************************* 2025-03-23 20:47:32.395375 | orchestrator | Sunday 23 March 2025 20:47:29 +0000 (0:00:00.873) 0:00:01.029 ********** 2025-03-23 20:47:32.395415 | orchestrator | ok: [testbed-manager] 2025-03-23 20:47:32.397429 | orchestrator | 2025-03-23 20:47:32.403799 | orchestrator | PLAY [Apply netplan configuration on all other nodes] ************************** 2025-03-23 20:47:32.406272 | orchestrator | 2025-03-23 20:47:32.406758 | orchestrator | TASK [Apply netplan configuration] ********************************************* 2025-03-23 20:47:32.407708 | orchestrator | Sunday 23 March 2025 20:47:32 +0000 (0:00:02.933) 0:00:03.962 ********** 2025-03-23 20:47:34.380079 | orchestrator | ok: [testbed-node-3] 2025-03-23 20:47:34.382967 | orchestrator | ok: [testbed-node-4] 2025-03-23 20:47:34.383172 | orchestrator | ok: [testbed-node-5] 2025-03-23 20:47:34.383201 | orchestrator | ok: [testbed-node-1] 2025-03-23 20:47:34.383215 | orchestrator | ok: [testbed-node-0] 2025-03-23 20:47:34.383230 | orchestrator | ok: [testbed-node-2] 2025-03-23 20:47:34.383249 | orchestrator | 2025-03-23 20:47:34.383437 | orchestrator | PLAY [Add custom CA certificates to non-manager nodes] ************************* 2025-03-23 
20:47:34.384370 | orchestrator | 2025-03-23 20:47:34.385091 | orchestrator | TASK [Copy custom CA certificates] ********************************************* 2025-03-23 20:47:34.385790 | orchestrator | Sunday 23 March 2025 20:47:34 +0000 (0:00:01.985) 0:00:05.948 ********** 2025-03-23 20:47:35.906390 | orchestrator | changed: [testbed-node-3] => (item=/opt/configuration/environments/kolla/certificates/ca/testbed.crt) 2025-03-23 20:47:35.907716 | orchestrator | changed: [testbed-node-4] => (item=/opt/configuration/environments/kolla/certificates/ca/testbed.crt) 2025-03-23 20:47:35.910002 | orchestrator | changed: [testbed-node-5] => (item=/opt/configuration/environments/kolla/certificates/ca/testbed.crt) 2025-03-23 20:47:35.910894 | orchestrator | changed: [testbed-node-0] => (item=/opt/configuration/environments/kolla/certificates/ca/testbed.crt) 2025-03-23 20:47:35.910931 | orchestrator | changed: [testbed-node-1] => (item=/opt/configuration/environments/kolla/certificates/ca/testbed.crt) 2025-03-23 20:47:35.912106 | orchestrator | changed: [testbed-node-2] => (item=/opt/configuration/environments/kolla/certificates/ca/testbed.crt) 2025-03-23 20:47:35.914661 | orchestrator | 2025-03-23 20:47:35.919285 | orchestrator | TASK [Run update-ca-certificates] ********************************************** 2025-03-23 20:47:35.921323 | orchestrator | Sunday 23 March 2025 20:47:35 +0000 (0:00:01.525) 0:00:07.474 ********** 2025-03-23 20:47:39.154177 | orchestrator | changed: [testbed-node-3] 2025-03-23 20:47:39.154372 | orchestrator | changed: [testbed-node-4] 2025-03-23 20:47:39.156256 | orchestrator | changed: [testbed-node-5] 2025-03-23 20:47:39.156727 | orchestrator | changed: [testbed-node-1] 2025-03-23 20:47:39.160022 | orchestrator | changed: [testbed-node-2] 2025-03-23 20:47:39.160729 | orchestrator | changed: [testbed-node-0] 2025-03-23 20:47:39.162357 | orchestrator | 2025-03-23 20:47:39.163053 | orchestrator | TASK [Run update-ca-trust] ***************************************************** 2025-03-23 20:47:39.163925 | orchestrator | Sunday 23 March 2025 20:47:39 +0000 (0:00:03.251) 0:00:10.725 ********** 2025-03-23 20:47:39.328858 | orchestrator | skipping: [testbed-node-3] 2025-03-23 20:47:39.426733 | orchestrator | skipping: [testbed-node-4] 2025-03-23 20:47:39.513963 | orchestrator | skipping: [testbed-node-5] 2025-03-23 20:47:39.798793 | orchestrator | skipping: [testbed-node-0] 2025-03-23 20:47:39.954462 | orchestrator | skipping: [testbed-node-1] 2025-03-23 20:47:39.954902 | orchestrator | skipping: [testbed-node-2] 2025-03-23 20:47:39.955986 | orchestrator | 2025-03-23 20:47:39.956304 | orchestrator | PLAY [Add a workaround service] ************************************************ 2025-03-23 20:47:39.959090 | orchestrator | 2025-03-23 20:47:39.959829 | orchestrator | TASK [Copy workarounds.sh scripts] ********************************************* 2025-03-23 20:47:39.960408 | orchestrator | Sunday 23 March 2025 20:47:39 +0000 (0:00:00.800) 0:00:11.526 ********** 2025-03-23 20:47:41.837503 | orchestrator | changed: [testbed-manager] 2025-03-23 20:47:41.837718 | orchestrator | changed: [testbed-node-4] 2025-03-23 20:47:41.838967 | orchestrator | changed: [testbed-node-3] 2025-03-23 20:47:41.839810 | orchestrator | changed: [testbed-node-5] 2025-03-23 20:47:41.841895 | orchestrator | changed: [testbed-node-0] 2025-03-23 20:47:41.842386 | orchestrator | changed: [testbed-node-1] 2025-03-23 20:47:41.843546 | orchestrator | changed: [testbed-node-2] 2025-03-23 20:47:41.844053 | 
orchestrator | 2025-03-23 20:47:41.844607 | orchestrator | TASK [Copy workarounds systemd unit file] ************************************** 2025-03-23 20:47:41.845096 | orchestrator | Sunday 23 March 2025 20:47:41 +0000 (0:00:01.882) 0:00:13.408 ********** 2025-03-23 20:47:43.600100 | orchestrator | changed: [testbed-manager] 2025-03-23 20:47:43.600844 | orchestrator | changed: [testbed-node-3] 2025-03-23 20:47:43.602295 | orchestrator | changed: [testbed-node-4] 2025-03-23 20:47:43.603412 | orchestrator | changed: [testbed-node-5] 2025-03-23 20:47:43.603789 | orchestrator | changed: [testbed-node-0] 2025-03-23 20:47:43.604738 | orchestrator | changed: [testbed-node-1] 2025-03-23 20:47:43.605395 | orchestrator | changed: [testbed-node-2] 2025-03-23 20:47:43.605939 | orchestrator | 2025-03-23 20:47:43.606399 | orchestrator | TASK [Reload systemd daemon] *************************************************** 2025-03-23 20:47:43.607308 | orchestrator | Sunday 23 March 2025 20:47:43 +0000 (0:00:01.756) 0:00:15.166 ********** 2025-03-23 20:47:45.267435 | orchestrator | ok: [testbed-node-4] 2025-03-23 20:47:45.267733 | orchestrator | ok: [testbed-node-3] 2025-03-23 20:47:45.268519 | orchestrator | ok: [testbed-node-5] 2025-03-23 20:47:45.270747 | orchestrator | ok: [testbed-node-0] 2025-03-23 20:47:45.273142 | orchestrator | ok: [testbed-manager] 2025-03-23 20:47:45.274093 | orchestrator | ok: [testbed-node-1] 2025-03-23 20:47:45.274977 | orchestrator | ok: [testbed-node-2] 2025-03-23 20:47:45.275805 | orchestrator | 2025-03-23 20:47:45.276679 | orchestrator | TASK [Enable workarounds.service (Debian)] ************************************* 2025-03-23 20:47:45.277440 | orchestrator | Sunday 23 March 2025 20:47:45 +0000 (0:00:01.672) 0:00:16.839 ********** 2025-03-23 20:47:47.194271 | orchestrator | changed: [testbed-manager] 2025-03-23 20:47:47.197010 | orchestrator | changed: [testbed-node-3] 2025-03-23 20:47:47.197162 | orchestrator | changed: [testbed-node-4] 2025-03-23 20:47:47.197923 | orchestrator | changed: [testbed-node-5] 2025-03-23 20:47:47.199184 | orchestrator | changed: [testbed-node-1] 2025-03-23 20:47:47.199966 | orchestrator | changed: [testbed-node-0] 2025-03-23 20:47:47.200879 | orchestrator | changed: [testbed-node-2] 2025-03-23 20:47:47.202891 | orchestrator | 2025-03-23 20:47:47.203738 | orchestrator | TASK [Enable and start workarounds.service (RedHat)] *************************** 2025-03-23 20:47:47.204986 | orchestrator | Sunday 23 March 2025 20:47:47 +0000 (0:00:01.925) 0:00:18.765 ********** 2025-03-23 20:47:47.368960 | orchestrator | skipping: [testbed-manager] 2025-03-23 20:47:47.462094 | orchestrator | skipping: [testbed-node-3] 2025-03-23 20:47:47.547036 | orchestrator | skipping: [testbed-node-4] 2025-03-23 20:47:47.629414 | orchestrator | skipping: [testbed-node-5] 2025-03-23 20:47:47.927043 | orchestrator | skipping: [testbed-node-0] 2025-03-23 20:47:48.099467 | orchestrator | skipping: [testbed-node-1] 2025-03-23 20:47:48.100224 | orchestrator | skipping: [testbed-node-2] 2025-03-23 20:47:48.101355 | orchestrator | 2025-03-23 20:47:48.102401 | orchestrator | PLAY [On Ubuntu 24.04 install python3-docker from Debian Sid] ****************** 2025-03-23 20:47:48.103726 | orchestrator | 2025-03-23 20:47:48.104429 | orchestrator | TASK [Install python3-docker] ************************************************** 2025-03-23 20:47:48.105319 | orchestrator | Sunday 23 March 2025 20:47:48 +0000 (0:00:00.905) 0:00:19.670 ********** 2025-03-23 20:47:51.245217 | orchestrator | ok: 
[testbed-node-3] 2025-03-23 20:47:51.246635 | orchestrator | ok: [testbed-node-4] 2025-03-23 20:47:51.246683 | orchestrator | ok: [testbed-manager] 2025-03-23 20:47:51.247010 | orchestrator | ok: [testbed-node-1] 2025-03-23 20:47:51.249065 | orchestrator | ok: [testbed-node-5] 2025-03-23 20:47:51.249760 | orchestrator | ok: [testbed-node-2] 2025-03-23 20:47:51.250597 | orchestrator | ok: [testbed-node-0] 2025-03-23 20:47:51.251142 | orchestrator | 2025-03-23 20:47:51.251998 | orchestrator | PLAY RECAP ********************************************************************* 2025-03-23 20:47:51.252444 | orchestrator | 2025-03-23 20:47:51 | INFO  | Play has been completed. There may now be a delay until all logs have been written. 2025-03-23 20:47:51.252514 | orchestrator | 2025-03-23 20:47:51 | INFO  | Please wait and do not abort execution. 2025-03-23 20:47:51.253445 | orchestrator | testbed-manager : ok=7  changed=4  unreachable=0 failed=0 skipped=1  rescued=0 ignored=0 2025-03-23 20:47:51.255027 | orchestrator | testbed-node-0 : ok=9  changed=6  unreachable=0 failed=0 skipped=2  rescued=0 ignored=0 2025-03-23 20:47:51.255819 | orchestrator | testbed-node-1 : ok=9  changed=6  unreachable=0 failed=0 skipped=2  rescued=0 ignored=0 2025-03-23 20:47:51.256517 | orchestrator | testbed-node-2 : ok=9  changed=6  unreachable=0 failed=0 skipped=2  rescued=0 ignored=0 2025-03-23 20:47:51.257530 | orchestrator | testbed-node-3 : ok=9  changed=6  unreachable=0 failed=0 skipped=2  rescued=0 ignored=0 2025-03-23 20:47:51.258011 | orchestrator | testbed-node-4 : ok=9  changed=6  unreachable=0 failed=0 skipped=2  rescued=0 ignored=0 2025-03-23 20:47:51.258750 | orchestrator | testbed-node-5 : ok=9  changed=6  unreachable=0 failed=0 skipped=2  rescued=0 ignored=0 2025-03-23 20:47:51.259387 | orchestrator | 2025-03-23 20:47:51.259854 | orchestrator | Sunday 23 March 2025 20:47:51 +0000 (0:00:03.145) 0:00:22.816 ********** 2025-03-23 20:47:51.260586 | orchestrator | =============================================================================== 2025-03-23 20:47:51.261098 | orchestrator | Run update-ca-certificates ---------------------------------------------- 3.25s 2025-03-23 20:47:51.261875 | orchestrator | Install python3-docker -------------------------------------------------- 3.15s 2025-03-23 20:47:51.262128 | orchestrator | Apply netplan configuration --------------------------------------------- 2.93s 2025-03-23 20:47:51.262601 | orchestrator | Apply netplan configuration --------------------------------------------- 1.99s 2025-03-23 20:47:51.262873 | orchestrator | Enable workarounds.service (Debian) ------------------------------------- 1.93s 2025-03-23 20:47:51.263596 | orchestrator | Copy workarounds.sh scripts --------------------------------------------- 1.88s 2025-03-23 20:47:51.264264 | orchestrator | Copy workarounds systemd unit file -------------------------------------- 1.76s 2025-03-23 20:47:51.264683 | orchestrator | Reload systemd daemon --------------------------------------------------- 1.67s 2025-03-23 20:47:51.265207 | orchestrator | Copy custom CA certificates --------------------------------------------- 1.53s 2025-03-23 20:47:51.265744 | orchestrator | Enable and start workarounds.service (RedHat) --------------------------- 0.91s 2025-03-23 20:47:51.266217 | orchestrator | Group hosts based on virtualization_role -------------------------------- 0.87s 2025-03-23 20:47:51.267272 | orchestrator | Run update-ca-trust ----------------------------------------------------- 0.80s 
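The workarounds play above boils down to two Debian-family operations per node: dropping the testbed CA into the system trust store, and registering a one-shot workarounds unit with systemd. A minimal shell sketch of the same steps, assuming the certificate is placed under /usr/local/share/ca-certificates/ (the destination path and the unit file contents are not shown in this log):

    # Debian/Ubuntu: add the testbed CA and rebuild the trust store
    sudo cp /opt/configuration/environments/kolla/certificates/ca/testbed.crt \
        /usr/local/share/ca-certificates/testbed.crt
    sudo update-ca-certificates

    # register and enable the workarounds service (unit contents are playbook-specific)
    sudo systemctl daemon-reload
    sudo systemctl enable workarounds.service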
2025-03-23 20:47:51.894341 | orchestrator | + osism apply reboot -l testbed-nodes -e ireallymeanit=yes 2025-03-23 20:47:53.439875 | orchestrator | 2025-03-23 20:47:53 | INFO  | Task 62d7b368-68ab-4d59-bca7-6f8b3573808b (reboot) was prepared for execution. 2025-03-23 20:47:56.753989 | orchestrator | 2025-03-23 20:47:53 | INFO  | It takes a moment until task 62d7b368-68ab-4d59-bca7-6f8b3573808b (reboot) has been started and output is visible here. 2025-03-23 20:47:56.754210 | orchestrator | 2025-03-23 20:47:56.754374 | orchestrator | PLAY [Reboot systems] ********************************************************** 2025-03-23 20:47:56.755808 | orchestrator | 2025-03-23 20:47:56.756538 | orchestrator | TASK [Exit playbook, if user did not mean to reboot systems] ******************* 2025-03-23 20:47:56.757400 | orchestrator | Sunday 23 March 2025 20:47:56 +0000 (0:00:00.155) 0:00:00.155 ********** 2025-03-23 20:47:56.866903 | orchestrator | skipping: [testbed-node-0] 2025-03-23 20:47:56.867853 | orchestrator | 2025-03-23 20:47:56.870882 | orchestrator | TASK [Reboot system - do not wait for the reboot to complete] ****************** 2025-03-23 20:47:56.872344 | orchestrator | Sunday 23 March 2025 20:47:56 +0000 (0:00:00.115) 0:00:00.270 ********** 2025-03-23 20:47:57.900009 | orchestrator | changed: [testbed-node-0] 2025-03-23 20:47:57.900626 | orchestrator | 2025-03-23 20:47:57.901089 | orchestrator | TASK [Reboot system - wait for the reboot to complete] ************************* 2025-03-23 20:47:57.901120 | orchestrator | Sunday 23 March 2025 20:47:57 +0000 (0:00:01.030) 0:00:01.301 ********** 2025-03-23 20:47:58.018217 | orchestrator | skipping: [testbed-node-0] 2025-03-23 20:47:58.018349 | orchestrator | 2025-03-23 20:47:58.018676 | orchestrator | PLAY [Reboot systems] ********************************************************** 2025-03-23 20:47:58.019650 | orchestrator | 2025-03-23 20:47:58.020219 | orchestrator | TASK [Exit playbook, if user did not mean to reboot systems] ******************* 2025-03-23 20:47:58.020696 | orchestrator | Sunday 23 March 2025 20:47:58 +0000 (0:00:00.119) 0:00:01.420 ********** 2025-03-23 20:47:58.118873 | orchestrator | skipping: [testbed-node-1] 2025-03-23 20:47:58.119610 | orchestrator | 2025-03-23 20:47:58.119649 | orchestrator | TASK [Reboot system - do not wait for the reboot to complete] ****************** 2025-03-23 20:47:58.773207 | orchestrator | Sunday 23 March 2025 20:47:58 +0000 (0:00:00.103) 0:00:01.523 ********** 2025-03-23 20:47:58.773296 | orchestrator | changed: [testbed-node-1] 2025-03-23 20:47:58.773887 | orchestrator | 2025-03-23 20:47:58.775107 | orchestrator | TASK [Reboot system - wait for the reboot to complete] ************************* 2025-03-23 20:47:58.775832 | orchestrator | Sunday 23 March 2025 20:47:58 +0000 (0:00:00.652) 0:00:02.176 ********** 2025-03-23 20:47:58.891026 | orchestrator | skipping: [testbed-node-1] 2025-03-23 20:47:58.892056 | orchestrator | 2025-03-23 20:47:58.893251 | orchestrator | PLAY [Reboot systems] ********************************************************** 2025-03-23 20:47:58.893756 | orchestrator | 2025-03-23 20:47:58.894721 | orchestrator | TASK [Exit playbook, if user did not mean to reboot systems] ******************* 2025-03-23 20:47:58.895039 | orchestrator | Sunday 23 March 2025 20:47:58 +0000 (0:00:00.116) 0:00:02.293 ********** 2025-03-23 20:47:58.982472 | orchestrator | skipping: [testbed-node-2] 2025-03-23 20:47:58.982643 | orchestrator | 2025-03-23 20:47:58.983417 | orchestrator | TASK [Reboot 
system - do not wait for the reboot to complete] ****************** 2025-03-23 20:47:58.984156 | orchestrator | Sunday 23 March 2025 20:47:58 +0000 (0:00:00.093) 0:00:02.386 ********** 2025-03-23 20:47:59.877356 | orchestrator | changed: [testbed-node-2] 2025-03-23 20:47:59.877496 | orchestrator | 2025-03-23 20:47:59.877519 | orchestrator | TASK [Reboot system - wait for the reboot to complete] ************************* 2025-03-23 20:47:59.877881 | orchestrator | Sunday 23 March 2025 20:47:59 +0000 (0:00:00.891) 0:00:03.278 ********** 2025-03-23 20:47:59.993357 | orchestrator | skipping: [testbed-node-2] 2025-03-23 20:47:59.993679 | orchestrator | 2025-03-23 20:47:59.994862 | orchestrator | PLAY [Reboot systems] ********************************************************** 2025-03-23 20:47:59.995345 | orchestrator | 2025-03-23 20:47:59.997176 | orchestrator | TASK [Exit playbook, if user did not mean to reboot systems] ******************* 2025-03-23 20:47:59.998114 | orchestrator | Sunday 23 March 2025 20:47:59 +0000 (0:00:00.117) 0:00:03.395 ********** 2025-03-23 20:48:00.095905 | orchestrator | skipping: [testbed-node-3] 2025-03-23 20:48:00.096108 | orchestrator | 2025-03-23 20:48:00.098081 | orchestrator | TASK [Reboot system - do not wait for the reboot to complete] ****************** 2025-03-23 20:48:00.923526 | orchestrator | Sunday 23 March 2025 20:48:00 +0000 (0:00:00.103) 0:00:03.499 ********** 2025-03-23 20:48:00.923719 | orchestrator | changed: [testbed-node-3] 2025-03-23 20:48:00.924860 | orchestrator | 2025-03-23 20:48:00.925279 | orchestrator | TASK [Reboot system - wait for the reboot to complete] ************************* 2025-03-23 20:48:00.926142 | orchestrator | Sunday 23 March 2025 20:48:00 +0000 (0:00:00.825) 0:00:04.324 ********** 2025-03-23 20:48:01.052247 | orchestrator | skipping: [testbed-node-3] 2025-03-23 20:48:01.053953 | orchestrator | 2025-03-23 20:48:01.054661 | orchestrator | PLAY [Reboot systems] ********************************************************** 2025-03-23 20:48:01.056441 | orchestrator | 2025-03-23 20:48:01.057276 | orchestrator | TASK [Exit playbook, if user did not mean to reboot systems] ******************* 2025-03-23 20:48:01.057307 | orchestrator | Sunday 23 March 2025 20:48:01 +0000 (0:00:00.127) 0:00:04.451 ********** 2025-03-23 20:48:01.171829 | orchestrator | skipping: [testbed-node-4] 2025-03-23 20:48:01.172895 | orchestrator | 2025-03-23 20:48:01.173243 | orchestrator | TASK [Reboot system - do not wait for the reboot to complete] ****************** 2025-03-23 20:48:01.173274 | orchestrator | Sunday 23 March 2025 20:48:01 +0000 (0:00:00.123) 0:00:04.575 ********** 2025-03-23 20:48:01.947693 | orchestrator | changed: [testbed-node-4] 2025-03-23 20:48:01.952063 | orchestrator | 2025-03-23 20:48:02.101815 | orchestrator | TASK [Reboot system - wait for the reboot to complete] ************************* 2025-03-23 20:48:02.101911 | orchestrator | Sunday 23 March 2025 20:48:01 +0000 (0:00:00.773) 0:00:05.349 ********** 2025-03-23 20:48:02.101939 | orchestrator | skipping: [testbed-node-4] 2025-03-23 20:48:02.102002 | orchestrator | 2025-03-23 20:48:02.102930 | orchestrator | PLAY [Reboot systems] ********************************************************** 2025-03-23 20:48:02.103475 | orchestrator | 2025-03-23 20:48:02.104178 | orchestrator | TASK [Exit playbook, if user did not mean to reboot systems] ******************* 2025-03-23 20:48:02.107091 | orchestrator | Sunday 23 March 2025 20:48:02 +0000 (0:00:00.152) 0:00:05.502 ********** 
2025-03-23 20:48:02.207720 | orchestrator | skipping: [testbed-node-5] 2025-03-23 20:48:02.208279 | orchestrator | 2025-03-23 20:48:02.209655 | orchestrator | TASK [Reboot system - do not wait for the reboot to complete] ****************** 2025-03-23 20:48:02.869923 | orchestrator | Sunday 23 March 2025 20:48:02 +0000 (0:00:00.110) 0:00:05.612 ********** 2025-03-23 20:48:02.870074 | orchestrator | changed: [testbed-node-5] 2025-03-23 20:48:02.870134 | orchestrator | 2025-03-23 20:48:02.870153 | orchestrator | TASK [Reboot system - wait for the reboot to complete] ************************* 2025-03-23 20:48:02.871178 | orchestrator | Sunday 23 March 2025 20:48:02 +0000 (0:00:00.661) 0:00:06.274 ********** 2025-03-23 20:48:02.910312 | orchestrator | skipping: [testbed-node-5] 2025-03-23 20:48:02.910373 | orchestrator | 2025-03-23 20:48:02.911104 | orchestrator | PLAY RECAP ********************************************************************* 2025-03-23 20:48:02.911378 | orchestrator | 2025-03-23 20:48:02 | INFO  | Play has been completed. There may now be a delay until all logs have been written. 2025-03-23 20:48:02.911516 | orchestrator | 2025-03-23 20:48:02 | INFO  | Please wait and do not abort execution. 2025-03-23 20:48:02.912074 | orchestrator | testbed-node-0 : ok=1  changed=1  unreachable=0 failed=0 skipped=2  rescued=0 ignored=0 2025-03-23 20:48:02.912731 | orchestrator | testbed-node-1 : ok=1  changed=1  unreachable=0 failed=0 skipped=2  rescued=0 ignored=0 2025-03-23 20:48:02.913186 | orchestrator | testbed-node-2 : ok=1  changed=1  unreachable=0 failed=0 skipped=2  rescued=0 ignored=0 2025-03-23 20:48:02.913693 | orchestrator | testbed-node-3 : ok=1  changed=1  unreachable=0 failed=0 skipped=2  rescued=0 ignored=0 2025-03-23 20:48:02.915159 | orchestrator | testbed-node-4 : ok=1  changed=1  unreachable=0 failed=0 skipped=2  rescued=0 ignored=0 2025-03-23 20:48:02.915552 | orchestrator | testbed-node-5 : ok=1  changed=1  unreachable=0 failed=0 skipped=2  rescued=0 ignored=0 2025-03-23 20:48:02.916135 | orchestrator | 2025-03-23 20:48:02.916665 | orchestrator | Sunday 23 March 2025 20:48:02 +0000 (0:00:00.041) 0:00:06.315 ********** 2025-03-23 20:48:02.917659 | orchestrator | =============================================================================== 2025-03-23 20:48:02.918413 | orchestrator | Reboot system - do not wait for the reboot to complete ------------------ 4.84s 2025-03-23 20:48:02.919303 | orchestrator | Reboot system - wait for the reboot to complete ------------------------- 0.68s 2025-03-23 20:48:02.919951 | orchestrator | Exit playbook, if user did not mean to reboot systems ------------------- 0.65s 2025-03-23 20:48:03.572531 | orchestrator | + osism apply wait-for-connection -l testbed-nodes -e ireallymeanit=yes 2025-03-23 20:48:05.219235 | orchestrator | 2025-03-23 20:48:05 | INFO  | Task 14d79e56-6e75-462f-a5c1-c082bcde6deb (wait-for-connection) was prepared for execution. 2025-03-23 20:48:08.867199 | orchestrator | 2025-03-23 20:48:05 | INFO  | It takes a moment until task 14d79e56-6e75-462f-a5c1-c082bcde6deb (wait-for-connection) has been started and output is visible here. 
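The reboot play deliberately does not block: the "do not wait for the reboot to complete" task runs and the waiting variant is skipped, and the separate wait-for-connection play that follows polls until every node answers again. Outside of Ansible the same check could be approximated with a small SSH polling loop; wait_for_ssh is a hypothetical helper name, not something taken from the playbooks:

    wait_for_ssh() {
        local host="$1" timeout="${2:-600}" start
        start=$(date +%s)
        # retry until an SSH connection succeeds or the timeout is exceeded
        until ssh -o ConnectTimeout=5 -o BatchMode=yes "$host" true 2>/dev/null; do
            if (( $(date +%s) - start > timeout )); then
                echo "timed out waiting for $host" >&2
                return 1
            fi
            sleep 5
        done
    }

    for node in testbed-node-{0..5}; do
        wait_for_ssh "$node" &
    done
    wait   # all six nodes must be reachable before the job continues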
2025-03-23 20:48:08.867455 | orchestrator | 2025-03-23 20:48:08.867549 | orchestrator | PLAY [Wait until remote systems are reachable] ********************************* 2025-03-23 20:48:08.868869 | orchestrator | 2025-03-23 20:48:08.869279 | orchestrator | TASK [Wait until remote system is reachable] *********************************** 2025-03-23 20:48:08.874216 | orchestrator | Sunday 23 March 2025 20:48:08 +0000 (0:00:00.207) 0:00:00.207 ********** 2025-03-23 20:48:21.813157 | orchestrator | ok: [testbed-node-1] 2025-03-23 20:48:21.813344 | orchestrator | ok: [testbed-node-2] 2025-03-23 20:48:21.813386 | orchestrator | ok: [testbed-node-0] 2025-03-23 20:48:21.813927 | orchestrator | ok: [testbed-node-3] 2025-03-23 20:48:21.813966 | orchestrator | ok: [testbed-node-4] 2025-03-23 20:48:21.813980 | orchestrator | ok: [testbed-node-5] 2025-03-23 20:48:21.813994 | orchestrator | 2025-03-23 20:48:21.814066 | orchestrator | PLAY RECAP ********************************************************************* 2025-03-23 20:48:21.815033 | orchestrator | 2025-03-23 20:48:21 | INFO  | Play has been completed. There may now be a delay until all logs have been written. 2025-03-23 20:48:21.815320 | orchestrator | 2025-03-23 20:48:21 | INFO  | Please wait and do not abort execution. 2025-03-23 20:48:21.816780 | orchestrator | testbed-node-0 : ok=1  changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 2025-03-23 20:48:21.817524 | orchestrator | testbed-node-1 : ok=1  changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 2025-03-23 20:48:21.818597 | orchestrator | testbed-node-2 : ok=1  changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 2025-03-23 20:48:21.818829 | orchestrator | testbed-node-3 : ok=1  changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 2025-03-23 20:48:21.819173 | orchestrator | testbed-node-4 : ok=1  changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 2025-03-23 20:48:21.819428 | orchestrator | testbed-node-5 : ok=1  changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 2025-03-23 20:48:21.819809 | orchestrator | 2025-03-23 20:48:21.820165 | orchestrator | Sunday 23 March 2025 20:48:21 +0000 (0:00:12.941) 0:00:13.148 ********** 2025-03-23 20:48:21.820504 | orchestrator | =============================================================================== 2025-03-23 20:48:21.820858 | orchestrator | Wait until remote system is reachable ---------------------------------- 12.94s 2025-03-23 20:48:22.447053 | orchestrator | + osism apply hddtemp 2025-03-23 20:48:24.035403 | orchestrator | 2025-03-23 20:48:24 | INFO  | Task f15e653b-a31e-4f08-aedc-b89fcd77717e (hddtemp) was prepared for execution. 2025-03-23 20:48:27.628253 | orchestrator | 2025-03-23 20:48:24 | INFO  | It takes a moment until task f15e653b-a31e-4f08-aedc-b89fcd77717e (hddtemp) has been started and output is visible here. 
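The hddtemp play that starts below replaces the deprecated hddtemp daemon with the in-kernel drivetemp hwmon module plus lm-sensors. A rough manual equivalent on a single Debian-family node, assuming the module is made persistent via /etc/modules-load.d/ (the exact mechanism used by the role is not visible in the log):

    sudo apt-get remove -y hddtemp || true             # old daemon, no longer needed
    echo drivetemp | sudo tee /etc/modules-load.d/drivetemp.conf
    sudo modprobe drivetemp                            # expose SATA drive temperatures via hwmon
    sudo apt-get install -y lm-sensors
    sudo systemctl enable --now lm-sensors.service     # read out the new hwmon sensors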
2025-03-23 20:48:27.628390 | orchestrator | 2025-03-23 20:48:27.628769 | orchestrator | PLAY [Apply role hddtemp] ****************************************************** 2025-03-23 20:48:27.628800 | orchestrator | 2025-03-23 20:48:27.628821 | orchestrator | TASK [osism.services.hddtemp : Gather variables for each operating system] ***** 2025-03-23 20:48:27.630565 | orchestrator | Sunday 23 March 2025 20:48:27 +0000 (0:00:00.265) 0:00:00.265 ********** 2025-03-23 20:48:27.791016 | orchestrator | ok: [testbed-manager] 2025-03-23 20:48:27.885982 | orchestrator | ok: [testbed-node-0] 2025-03-23 20:48:27.978648 | orchestrator | ok: [testbed-node-1] 2025-03-23 20:48:28.073087 | orchestrator | ok: [testbed-node-2] 2025-03-23 20:48:28.157395 | orchestrator | ok: [testbed-node-3] 2025-03-23 20:48:28.405847 | orchestrator | ok: [testbed-node-4] 2025-03-23 20:48:28.407002 | orchestrator | ok: [testbed-node-5] 2025-03-23 20:48:28.407711 | orchestrator | 2025-03-23 20:48:28.408470 | orchestrator | TASK [osism.services.hddtemp : Include distribution specific install tasks] **** 2025-03-23 20:48:28.409322 | orchestrator | Sunday 23 March 2025 20:48:28 +0000 (0:00:00.782) 0:00:01.047 ********** 2025-03-23 20:48:29.789365 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/services/roles/hddtemp/tasks/install-Debian-family.yml for testbed-manager, testbed-node-0, testbed-node-1, testbed-node-2, testbed-node-3, testbed-node-4, testbed-node-5 2025-03-23 20:48:29.790649 | orchestrator | 2025-03-23 20:48:29.792371 | orchestrator | TASK [osism.services.hddtemp : Remove hddtemp package] ************************* 2025-03-23 20:48:29.793085 | orchestrator | Sunday 23 March 2025 20:48:29 +0000 (0:00:01.378) 0:00:02.426 ********** 2025-03-23 20:48:32.116052 | orchestrator | ok: [testbed-manager] 2025-03-23 20:48:32.117311 | orchestrator | ok: [testbed-node-1] 2025-03-23 20:48:32.117519 | orchestrator | ok: [testbed-node-0] 2025-03-23 20:48:32.119523 | orchestrator | ok: [testbed-node-2] 2025-03-23 20:48:32.121412 | orchestrator | ok: [testbed-node-3] 2025-03-23 20:48:32.121796 | orchestrator | ok: [testbed-node-4] 2025-03-23 20:48:32.122436 | orchestrator | ok: [testbed-node-5] 2025-03-23 20:48:32.123510 | orchestrator | 2025-03-23 20:48:32.124344 | orchestrator | TASK [osism.services.hddtemp : Enable Kernel Module drivetemp] ***************** 2025-03-23 20:48:32.125222 | orchestrator | Sunday 23 March 2025 20:48:32 +0000 (0:00:02.330) 0:00:04.756 ********** 2025-03-23 20:48:32.784235 | orchestrator | changed: [testbed-manager] 2025-03-23 20:48:32.876440 | orchestrator | changed: [testbed-node-0] 2025-03-23 20:48:33.339064 | orchestrator | changed: [testbed-node-1] 2025-03-23 20:48:33.340162 | orchestrator | changed: [testbed-node-2] 2025-03-23 20:48:33.341238 | orchestrator | changed: [testbed-node-3] 2025-03-23 20:48:33.342614 | orchestrator | changed: [testbed-node-4] 2025-03-23 20:48:33.343765 | orchestrator | changed: [testbed-node-5] 2025-03-23 20:48:33.344898 | orchestrator | 2025-03-23 20:48:33.346761 | orchestrator | TASK [osism.services.hddtemp : Check if drivetemp module is available] ********* 2025-03-23 20:48:33.348500 | orchestrator | Sunday 23 March 2025 20:48:33 +0000 (0:00:01.216) 0:00:05.973 ********** 2025-03-23 20:48:34.785402 | orchestrator | ok: [testbed-node-0] 2025-03-23 20:48:34.786274 | orchestrator | ok: [testbed-node-1] 2025-03-23 20:48:34.787860 | orchestrator | ok: [testbed-node-2] 2025-03-23 20:48:34.788700 | orchestrator | ok: [testbed-node-3] 2025-03-23 
20:48:34.789470 | orchestrator | ok: [testbed-node-4] 2025-03-23 20:48:34.790629 | orchestrator | ok: [testbed-node-5] 2025-03-23 20:48:34.791691 | orchestrator | ok: [testbed-manager] 2025-03-23 20:48:34.792164 | orchestrator | 2025-03-23 20:48:34.793246 | orchestrator | TASK [osism.services.hddtemp : Load Kernel Module drivetemp] ******************* 2025-03-23 20:48:34.793632 | orchestrator | Sunday 23 March 2025 20:48:34 +0000 (0:00:01.452) 0:00:07.425 ********** 2025-03-23 20:48:35.082183 | orchestrator | skipping: [testbed-node-0] 2025-03-23 20:48:35.187829 | orchestrator | skipping: [testbed-node-1] 2025-03-23 20:48:35.277969 | orchestrator | skipping: [testbed-node-2] 2025-03-23 20:48:35.395421 | orchestrator | skipping: [testbed-node-3] 2025-03-23 20:48:35.533636 | orchestrator | skipping: [testbed-node-4] 2025-03-23 20:48:35.534192 | orchestrator | changed: [testbed-manager] 2025-03-23 20:48:35.534842 | orchestrator | skipping: [testbed-node-5] 2025-03-23 20:48:35.535541 | orchestrator | 2025-03-23 20:48:35.536705 | orchestrator | TASK [osism.services.hddtemp : Install lm-sensors] ***************************** 2025-03-23 20:48:35.537224 | orchestrator | Sunday 23 March 2025 20:48:35 +0000 (0:00:00.752) 0:00:08.178 ********** 2025-03-23 20:48:49.242561 | orchestrator | changed: [testbed-manager] 2025-03-23 20:48:49.242937 | orchestrator | changed: [testbed-node-1] 2025-03-23 20:48:49.242980 | orchestrator | changed: [testbed-node-3] 2025-03-23 20:48:49.244966 | orchestrator | changed: [testbed-node-2] 2025-03-23 20:48:49.245996 | orchestrator | changed: [testbed-node-4] 2025-03-23 20:48:49.246678 | orchestrator | changed: [testbed-node-5] 2025-03-23 20:48:49.247344 | orchestrator | changed: [testbed-node-0] 2025-03-23 20:48:49.248190 | orchestrator | 2025-03-23 20:48:49.248682 | orchestrator | TASK [osism.services.hddtemp : Include distribution specific service tasks] **** 2025-03-23 20:48:49.249401 | orchestrator | Sunday 23 March 2025 20:48:49 +0000 (0:00:13.700) 0:00:21.878 ********** 2025-03-23 20:48:50.557993 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/services/roles/hddtemp/tasks/service-Debian-family.yml for testbed-manager, testbed-node-0, testbed-node-1, testbed-node-2, testbed-node-3, testbed-node-4, testbed-node-5 2025-03-23 20:48:50.561229 | orchestrator | 2025-03-23 20:48:52.544222 | orchestrator | TASK [osism.services.hddtemp : Manage lm-sensors service] ********************** 2025-03-23 20:48:52.544341 | orchestrator | Sunday 23 March 2025 20:48:50 +0000 (0:00:01.317) 0:00:23.196 ********** 2025-03-23 20:48:52.544375 | orchestrator | changed: [testbed-manager] 2025-03-23 20:48:52.545069 | orchestrator | changed: [testbed-node-1] 2025-03-23 20:48:52.546388 | orchestrator | changed: [testbed-node-0] 2025-03-23 20:48:52.546422 | orchestrator | changed: [testbed-node-2] 2025-03-23 20:48:52.547217 | orchestrator | changed: [testbed-node-3] 2025-03-23 20:48:52.548915 | orchestrator | changed: [testbed-node-4] 2025-03-23 20:48:52.549835 | orchestrator | changed: [testbed-node-5] 2025-03-23 20:48:52.551160 | orchestrator | 2025-03-23 20:48:52.552283 | orchestrator | PLAY RECAP ********************************************************************* 2025-03-23 20:48:52.553867 | orchestrator | 2025-03-23 20:48:52 | INFO  | Play has been completed. There may now be a delay until all logs have been written. 2025-03-23 20:48:52.555116 | orchestrator | 2025-03-23 20:48:52 | INFO  | Please wait and do not abort execution. 
2025-03-23 20:48:52.555177 | orchestrator | testbed-manager : ok=9  changed=4  unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 2025-03-23 20:48:52.555919 | orchestrator | testbed-node-0 : ok=8  changed=3  unreachable=0 failed=0 skipped=1  rescued=0 ignored=0 2025-03-23 20:48:52.556775 | orchestrator | testbed-node-1 : ok=8  changed=3  unreachable=0 failed=0 skipped=1  rescued=0 ignored=0 2025-03-23 20:48:52.557608 | orchestrator | testbed-node-2 : ok=8  changed=3  unreachable=0 failed=0 skipped=1  rescued=0 ignored=0 2025-03-23 20:48:52.559132 | orchestrator | testbed-node-3 : ok=8  changed=3  unreachable=0 failed=0 skipped=1  rescued=0 ignored=0 2025-03-23 20:48:52.559454 | orchestrator | testbed-node-4 : ok=8  changed=3  unreachable=0 failed=0 skipped=1  rescued=0 ignored=0 2025-03-23 20:48:52.561559 | orchestrator | testbed-node-5 : ok=8  changed=3  unreachable=0 failed=0 skipped=1  rescued=0 ignored=0 2025-03-23 20:48:52.562456 | orchestrator | 2025-03-23 20:48:52.563387 | orchestrator | Sunday 23 March 2025 20:48:52 +0000 (0:00:01.991) 0:00:25.187 ********** 2025-03-23 20:48:52.563793 | orchestrator | =============================================================================== 2025-03-23 20:48:52.564562 | orchestrator | osism.services.hddtemp : Install lm-sensors ---------------------------- 13.70s 2025-03-23 20:48:52.565348 | orchestrator | osism.services.hddtemp : Remove hddtemp package ------------------------- 2.33s 2025-03-23 20:48:52.566212 | orchestrator | osism.services.hddtemp : Manage lm-sensors service ---------------------- 1.99s 2025-03-23 20:48:52.566957 | orchestrator | osism.services.hddtemp : Check if drivetemp module is available --------- 1.45s 2025-03-23 20:48:52.567602 | orchestrator | osism.services.hddtemp : Include distribution specific install tasks ---- 1.38s 2025-03-23 20:48:52.568569 | orchestrator | osism.services.hddtemp : Include distribution specific service tasks ---- 1.32s 2025-03-23 20:48:52.569459 | orchestrator | osism.services.hddtemp : Enable Kernel Module drivetemp ----------------- 1.22s 2025-03-23 20:48:52.570325 | orchestrator | osism.services.hddtemp : Gather variables for each operating system ----- 0.78s 2025-03-23 20:48:52.571030 | orchestrator | osism.services.hddtemp : Load Kernel Module drivetemp ------------------- 0.75s 2025-03-23 20:48:53.263663 | orchestrator | + sudo systemctl restart docker-compose@manager 2025-03-23 20:48:54.571676 | orchestrator | + [[ ceph-ansible == \c\e\p\h\-\a\n\s\i\b\l\e ]] 2025-03-23 20:48:54.571866 | orchestrator | + wait_for_container_healthy 60 ceph-ansible 2025-03-23 20:48:54.571891 | orchestrator | + local max_attempts=60 2025-03-23 20:48:54.571907 | orchestrator | + local name=ceph-ansible 2025-03-23 20:48:54.571922 | orchestrator | + local attempt_num=1 2025-03-23 20:48:54.571942 | orchestrator | ++ /usr/bin/docker inspect -f '{{.State.Health.Status}}' ceph-ansible 2025-03-23 20:48:54.604404 | orchestrator | + [[ healthy == \h\e\a\l\t\h\y ]] 2025-03-23 20:48:54.605479 | orchestrator | + wait_for_container_healthy 60 kolla-ansible 2025-03-23 20:48:54.605508 | orchestrator | + local max_attempts=60 2025-03-23 20:48:54.605523 | orchestrator | + local name=kolla-ansible 2025-03-23 20:48:54.605536 | orchestrator | + local attempt_num=1 2025-03-23 20:48:54.605555 | orchestrator | ++ /usr/bin/docker inspect -f '{{.State.Health.Status}}' kolla-ansible 2025-03-23 20:48:54.633877 | orchestrator | + [[ healthy == \h\e\a\l\t\h\y ]] 2025-03-23 20:48:54.635416 | orchestrator | + 
wait_for_container_healthy 60 osism-ansible 2025-03-23 20:48:54.635445 | orchestrator | + local max_attempts=60 2025-03-23 20:48:54.635460 | orchestrator | + local name=osism-ansible 2025-03-23 20:48:54.635475 | orchestrator | + local attempt_num=1 2025-03-23 20:48:54.635495 | orchestrator | ++ /usr/bin/docker inspect -f '{{.State.Health.Status}}' osism-ansible 2025-03-23 20:48:54.672040 | orchestrator | + [[ healthy == \h\e\a\l\t\h\y ]] 2025-03-23 20:48:55.067344 | orchestrator | + [[ true == \t\r\u\e ]] 2025-03-23 20:48:55.067448 | orchestrator | + sh -c /opt/configuration/scripts/disable-ara.sh 2025-03-23 20:48:55.067475 | orchestrator | ARA in ceph-ansible already disabled. 2025-03-23 20:48:55.464219 | orchestrator | ARA in kolla-ansible already disabled. 2025-03-23 20:48:55.754848 | orchestrator | ARA in osism-ansible already disabled. 2025-03-23 20:48:56.104868 | orchestrator | ARA in osism-kubernetes already disabled. 2025-03-23 20:48:56.105041 | orchestrator | + osism apply gather-facts 2025-03-23 20:48:57.694324 | orchestrator | 2025-03-23 20:48:57 | INFO  | Task 91d81048-b7a3-4a19-907a-08639785b5e8 (gather-facts) was prepared for execution. 2025-03-23 20:49:00.990939 | orchestrator | 2025-03-23 20:48:57 | INFO  | It takes a moment until task 91d81048-b7a3-4a19-907a-08639785b5e8 (gather-facts) has been started and output is visible here. 2025-03-23 20:49:00.991083 | orchestrator | 2025-03-23 20:49:00.991202 | orchestrator | PLAY [Gather facts for all hosts] ********************************************** 2025-03-23 20:49:00.992683 | orchestrator | 2025-03-23 20:49:00.993516 | orchestrator | TASK [Gathers facts about hosts] *********************************************** 2025-03-23 20:49:00.994103 | orchestrator | Sunday 23 March 2025 20:49:00 +0000 (0:00:00.187) 0:00:00.187 ********** 2025-03-23 20:49:06.119735 | orchestrator | ok: [testbed-node-1] 2025-03-23 20:49:06.120623 | orchestrator | ok: [testbed-node-2] 2025-03-23 20:49:06.120667 | orchestrator | ok: [testbed-node-0] 2025-03-23 20:49:06.123099 | orchestrator | ok: [testbed-manager] 2025-03-23 20:49:06.125516 | orchestrator | ok: [testbed-node-4] 2025-03-23 20:49:06.126398 | orchestrator | ok: [testbed-node-3] 2025-03-23 20:49:06.127140 | orchestrator | ok: [testbed-node-5] 2025-03-23 20:49:06.127792 | orchestrator | 2025-03-23 20:49:06.128202 | orchestrator | PLAY [Gather facts for all hosts if using --limit] ***************************** 2025-03-23 20:49:06.128742 | orchestrator | 2025-03-23 20:49:06.131405 | orchestrator | TASK [Gather facts for all hosts] ********************************************** 2025-03-23 20:49:06.131833 | orchestrator | Sunday 23 March 2025 20:49:06 +0000 (0:00:05.133) 0:00:05.320 ********** 2025-03-23 20:49:06.306066 | orchestrator | skipping: [testbed-manager] 2025-03-23 20:49:06.401790 | orchestrator | skipping: [testbed-node-0] 2025-03-23 20:49:06.494201 | orchestrator | skipping: [testbed-node-1] 2025-03-23 20:49:06.578111 | orchestrator | skipping: [testbed-node-2] 2025-03-23 20:49:06.658137 | orchestrator | skipping: [testbed-node-3] 2025-03-23 20:49:06.701822 | orchestrator | skipping: [testbed-node-4] 2025-03-23 20:49:06.702319 | orchestrator | skipping: [testbed-node-5] 2025-03-23 20:49:06.702924 | orchestrator | 2025-03-23 20:49:06.704210 | orchestrator | PLAY RECAP ********************************************************************* 2025-03-23 20:49:06.705250 | orchestrator | 2025-03-23 20:49:06 | INFO  | Play has been completed. 
There may now be a delay until all logs have been written. 2025-03-23 20:49:06.705714 | orchestrator | 2025-03-23 20:49:06 | INFO  | Please wait and do not abort execution. 2025-03-23 20:49:06.706914 | orchestrator | testbed-manager : ok=1  changed=0 unreachable=0 failed=0 skipped=1  rescued=0 ignored=0 2025-03-23 20:49:06.707688 | orchestrator | testbed-node-0 : ok=1  changed=0 unreachable=0 failed=0 skipped=1  rescued=0 ignored=0 2025-03-23 20:49:06.708570 | orchestrator | testbed-node-1 : ok=1  changed=0 unreachable=0 failed=0 skipped=1  rescued=0 ignored=0 2025-03-23 20:49:06.709376 | orchestrator | testbed-node-2 : ok=1  changed=0 unreachable=0 failed=0 skipped=1  rescued=0 ignored=0 2025-03-23 20:49:06.710264 | orchestrator | testbed-node-3 : ok=1  changed=0 unreachable=0 failed=0 skipped=1  rescued=0 ignored=0 2025-03-23 20:49:06.711116 | orchestrator | testbed-node-4 : ok=1  changed=0 unreachable=0 failed=0 skipped=1  rescued=0 ignored=0 2025-03-23 20:49:06.712220 | orchestrator | testbed-node-5 : ok=1  changed=0 unreachable=0 failed=0 skipped=1  rescued=0 ignored=0 2025-03-23 20:49:06.712917 | orchestrator | 2025-03-23 20:49:06.713870 | orchestrator | Sunday 23 March 2025 20:49:06 +0000 (0:00:00.583) 0:00:05.904 ********** 2025-03-23 20:49:06.714728 | orchestrator | =============================================================================== 2025-03-23 20:49:06.715601 | orchestrator | Gathers facts about hosts ----------------------------------------------- 5.13s 2025-03-23 20:49:06.716505 | orchestrator | Gather facts for all hosts ---------------------------------------------- 0.58s 2025-03-23 20:49:07.362009 | orchestrator | + sudo ln -sf /opt/configuration/scripts/deploy/001-helpers.sh /usr/local/bin/deploy-helper 2025-03-23 20:49:07.382703 | orchestrator | + sudo ln -sf /opt/configuration/scripts/deploy/500-kubernetes.sh /usr/local/bin/deploy-kubernetes 2025-03-23 20:49:07.403666 | orchestrator | + sudo ln -sf /opt/configuration/scripts/deploy/510-clusterapi.sh /usr/local/bin/deploy-kubernetes-clusterapi 2025-03-23 20:49:07.428059 | orchestrator | + sudo ln -sf /opt/configuration/scripts/deploy/100-ceph-with-ansible.sh /usr/local/bin/deploy-ceph-with-ansible 2025-03-23 20:49:07.447559 | orchestrator | + sudo ln -sf /opt/configuration/scripts/deploy/100-ceph-with-rook.sh /usr/local/bin/deploy-ceph-with-rook 2025-03-23 20:49:07.467999 | orchestrator | + sudo ln -sf /opt/configuration/scripts/deploy/200-infrastructure.sh /usr/local/bin/deploy-infrastructure 2025-03-23 20:49:07.497759 | orchestrator | + sudo ln -sf /opt/configuration/scripts/deploy/300-openstack.sh /usr/local/bin/deploy-openstack 2025-03-23 20:49:07.518706 | orchestrator | + sudo ln -sf /opt/configuration/scripts/deploy/400-monitoring.sh /usr/local/bin/deploy-monitoring 2025-03-23 20:49:07.539209 | orchestrator | + sudo ln -sf /opt/configuration/scripts/upgrade/500-kubernetes.sh /usr/local/bin/upgrade-kubernetes 2025-03-23 20:49:07.558761 | orchestrator | + sudo ln -sf /opt/configuration/scripts/upgrade/510-clusterapi.sh /usr/local/bin/upgrade-kubernetes-clusterapi 2025-03-23 20:49:07.577126 | orchestrator | + sudo ln -sf /opt/configuration/scripts/upgrade/100-ceph-with-ansible.sh /usr/local/bin/upgrade-ceph-with-ansible 2025-03-23 20:49:07.597944 | orchestrator | + sudo ln -sf /opt/configuration/scripts/upgrade/100-ceph-with-rook.sh /usr/local/bin/upgrade-ceph-with-rook 2025-03-23 20:49:07.619227 | orchestrator | + sudo ln -sf /opt/configuration/scripts/upgrade/200-infrastructure.sh 
/usr/local/bin/upgrade-infrastructure 2025-03-23 20:49:07.635308 | orchestrator | + sudo ln -sf /opt/configuration/scripts/upgrade/300-openstack.sh /usr/local/bin/upgrade-openstack 2025-03-23 20:49:07.649225 | orchestrator | + sudo ln -sf /opt/configuration/scripts/upgrade/400-monitoring.sh /usr/local/bin/upgrade-monitoring 2025-03-23 20:49:07.663392 | orchestrator | + sudo ln -sf /opt/configuration/scripts/bootstrap/300-openstack.sh /usr/local/bin/bootstrap-openstack 2025-03-23 20:49:07.678942 | orchestrator | + sudo ln -sf /opt/configuration/scripts/bootstrap/301-openstack-octavia-amhpora-image.sh /usr/local/bin/bootstrap-octavia 2025-03-23 20:49:07.695322 | orchestrator | + sudo ln -sf /opt/configuration/scripts/bootstrap/302-openstack-k8s-clusterapi-images.sh /usr/local/bin/bootstrap-clusterapi 2025-03-23 20:49:07.709245 | orchestrator | + sudo ln -sf /opt/configuration/scripts/disable-local-registry.sh /usr/local/bin/disable-local-registry 2025-03-23 20:49:07.728521 | orchestrator | + sudo ln -sf /opt/configuration/scripts/pull-images.sh /usr/local/bin/pull-images 2025-03-23 20:49:07.741879 | orchestrator | + [[ false == \t\r\u\e ]] 2025-03-23 20:49:07.900650 | orchestrator | changed 2025-03-23 20:49:07.953994 | 2025-03-23 20:49:07.954144 | TASK [Deploy services] 2025-03-23 20:49:08.080861 | orchestrator | skipping: Conditional result was False 2025-03-23 20:49:08.095369 | 2025-03-23 20:49:08.095478 | TASK [Deploy in a nutshell] 2025-03-23 20:49:08.770010 | orchestrator | + set -e 2025-03-23 20:49:08.770203 | orchestrator | + source /opt/configuration/scripts/include.sh 2025-03-23 20:49:08.770222 | orchestrator | ++ export INTERACTIVE=false 2025-03-23 20:49:08.770234 | orchestrator | ++ INTERACTIVE=false 2025-03-23 20:49:08.770274 | orchestrator | ++ export OSISM_APPLY_RETRY=1 2025-03-23 20:49:08.770286 | orchestrator | ++ OSISM_APPLY_RETRY=1 2025-03-23 20:49:08.770297 | orchestrator | + source /opt/manager-vars.sh 2025-03-23 20:49:08.770317 | orchestrator | ++ export NUMBER_OF_NODES=6 2025-03-23 20:49:08.770419 | orchestrator | ++ NUMBER_OF_NODES=6 2025-03-23 20:49:08.770432 | orchestrator | ++ export CEPH_VERSION=quincy 2025-03-23 20:49:08.770441 | orchestrator | ++ CEPH_VERSION=quincy 2025-03-23 20:49:08.770450 | orchestrator | ++ export CONFIGURATION_VERSION=main 2025-03-23 20:49:08.770459 | orchestrator | ++ CONFIGURATION_VERSION=main 2025-03-23 20:49:08.770481 | orchestrator | ++ export MANAGER_VERSION=8.1.0 2025-03-23 20:49:08.770490 | orchestrator | ++ MANAGER_VERSION=8.1.0 2025-03-23 20:49:08.770500 | orchestrator | ++ export OPENSTACK_VERSION=2024.1 2025-03-23 20:49:08.770509 | orchestrator | ++ OPENSTACK_VERSION=2024.1 2025-03-23 20:49:08.770518 | orchestrator | ++ export ARA=false 2025-03-23 20:49:08.770527 | orchestrator | ++ ARA=false 2025-03-23 20:49:08.770548 | orchestrator | ++ export TEMPEST=false 2025-03-23 20:49:08.770557 | orchestrator | ++ TEMPEST=false 2025-03-23 20:49:08.770565 | orchestrator | ++ export IS_ZUUL=true 2025-03-23 20:49:08.770573 | orchestrator | ++ IS_ZUUL=true 2025-03-23 20:49:08.770596 | orchestrator | ++ export MANAGER_PUBLIC_IP_ADDRESS=81.163.193.116 2025-03-23 20:49:08.770610 | orchestrator | ++ MANAGER_PUBLIC_IP_ADDRESS=81.163.193.116 2025-03-23 20:49:08.770808 | orchestrator | ++ export EXTERNAL_API=false 2025-03-23 20:49:08.770827 | orchestrator | ++ EXTERNAL_API=false 2025-03-23 20:49:08.770837 | orchestrator | ++ export IMAGE_USER=ubuntu 2025-03-23 20:49:08.770846 | orchestrator | ++ IMAGE_USER=ubuntu 2025-03-23 20:49:08.770855 | 
orchestrator | ++ export IMAGE_NODE_USER=ubuntu 2025-03-23 20:49:08.770865 | orchestrator | ++ IMAGE_NODE_USER=ubuntu 2025-03-23 20:49:08.770875 | orchestrator | ++ export CEPH_STACK=ceph-ansible 2025-03-23 20:49:08.770890 | orchestrator | ++ CEPH_STACK=ceph-ansible 2025-03-23 20:49:08.770904 | orchestrator | 2025-03-23 20:49:08.772838 | orchestrator | # PULL IMAGES 2025-03-23 20:49:08.772878 | orchestrator | 2025-03-23 20:49:08.772887 | orchestrator | + echo 2025-03-23 20:49:08.772897 | orchestrator | + echo '# PULL IMAGES' 2025-03-23 20:49:08.772906 | orchestrator | + echo 2025-03-23 20:49:08.772923 | orchestrator | ++ semver 8.1.0 7.0.0 2025-03-23 20:49:08.833290 | orchestrator | + [[ 1 -ge 0 ]] 2025-03-23 20:49:10.391177 | orchestrator | + osism apply -r 2 -e custom pull-images 2025-03-23 20:49:10.391438 | orchestrator | 2025-03-23 20:49:10 | INFO  | Trying to run play pull-images in environment custom 2025-03-23 20:49:10.444627 | orchestrator | 2025-03-23 20:49:10 | INFO  | Task 9afae151-19db-43d2-a45a-278eebc9a1ff (pull-images) was prepared for execution. 2025-03-23 20:49:13.567796 | orchestrator | 2025-03-23 20:49:10 | INFO  | It takes a moment until task 9afae151-19db-43d2-a45a-278eebc9a1ff (pull-images) has been started and output is visible here. 2025-03-23 20:49:13.568061 | orchestrator | 2025-03-23 20:49:13.571100 | orchestrator | PLAY [Pull images] ************************************************************* 2025-03-23 20:49:13.571176 | orchestrator | 2025-03-23 20:49:13.571998 | orchestrator | TASK [Pull keystone image] ***************************************************** 2025-03-23 20:49:13.572840 | orchestrator | Sunday 23 March 2025 20:49:13 +0000 (0:00:00.165) 0:00:00.165 ********** 2025-03-23 20:49:54.917446 | orchestrator | changed: [testbed-manager] 2025-03-23 20:50:48.929722 | orchestrator | 2025-03-23 20:50:48.929861 | orchestrator | TASK [Pull other images] ******************************************************* 2025-03-23 20:50:48.929929 | orchestrator | Sunday 23 March 2025 20:49:54 +0000 (0:00:41.351) 0:00:41.517 ********** 2025-03-23 20:50:48.929960 | orchestrator | changed: [testbed-manager] => (item=aodh) 2025-03-23 20:50:48.932567 | orchestrator | changed: [testbed-manager] => (item=barbican) 2025-03-23 20:50:48.932596 | orchestrator | changed: [testbed-manager] => (item=ceilometer) 2025-03-23 20:50:48.934123 | orchestrator | changed: [testbed-manager] => (item=cinder) 2025-03-23 20:50:48.934372 | orchestrator | changed: [testbed-manager] => (item=common) 2025-03-23 20:50:48.935093 | orchestrator | changed: [testbed-manager] => (item=designate) 2025-03-23 20:50:48.935518 | orchestrator | changed: [testbed-manager] => (item=glance) 2025-03-23 20:50:48.936977 | orchestrator | changed: [testbed-manager] => (item=grafana) 2025-03-23 20:50:48.937251 | orchestrator | changed: [testbed-manager] => (item=horizon) 2025-03-23 20:50:48.937273 | orchestrator | changed: [testbed-manager] => (item=ironic) 2025-03-23 20:50:48.937294 | orchestrator | changed: [testbed-manager] => (item=loadbalancer) 2025-03-23 20:50:48.937885 | orchestrator | changed: [testbed-manager] => (item=magnum) 2025-03-23 20:50:48.938686 | orchestrator | changed: [testbed-manager] => (item=mariadb) 2025-03-23 20:50:48.938909 | orchestrator | changed: [testbed-manager] => (item=memcached) 2025-03-23 20:50:48.939498 | orchestrator | changed: [testbed-manager] => (item=neutron) 2025-03-23 20:50:48.940357 | orchestrator | changed: [testbed-manager] => (item=nova) 2025-03-23 20:50:48.940679 | 
orchestrator | changed: [testbed-manager] => (item=octavia) 2025-03-23 20:50:48.941545 | orchestrator | changed: [testbed-manager] => (item=opensearch) 2025-03-23 20:50:48.941824 | orchestrator | changed: [testbed-manager] => (item=openvswitch) 2025-03-23 20:50:48.942119 | orchestrator | changed: [testbed-manager] => (item=ovn) 2025-03-23 20:50:48.942526 | orchestrator | changed: [testbed-manager] => (item=placement) 2025-03-23 20:50:48.942929 | orchestrator | changed: [testbed-manager] => (item=rabbitmq) 2025-03-23 20:50:48.943196 | orchestrator | changed: [testbed-manager] => (item=redis) 2025-03-23 20:50:48.943911 | orchestrator | changed: [testbed-manager] => (item=skyline) 2025-03-23 20:50:48.944119 | orchestrator | 2025-03-23 20:50:48.944149 | orchestrator | PLAY RECAP ********************************************************************* 2025-03-23 20:50:48.944329 | orchestrator | 2025-03-23 20:50:48 | INFO  | Play has been completed. There may now be a delay until all logs have been written. 2025-03-23 20:50:48.944444 | orchestrator | 2025-03-23 20:50:48 | INFO  | Please wait and do not abort execution. 2025-03-23 20:50:48.944827 | orchestrator | testbed-manager : ok=2  changed=2  unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 2025-03-23 20:50:48.945146 | orchestrator | 2025-03-23 20:50:48.945434 | orchestrator | Sunday 23 March 2025 20:50:48 +0000 (0:00:54.009) 0:01:35.526 ********** 2025-03-23 20:50:48.945629 | orchestrator | =============================================================================== 2025-03-23 20:50:48.945982 | orchestrator | Pull other images ------------------------------------------------------ 54.01s 2025-03-23 20:50:48.946314 | orchestrator | Pull keystone image ---------------------------------------------------- 41.35s 2025-03-23 20:50:51.334349 | orchestrator | 2025-03-23 20:50:51 | INFO  | Trying to run play wipe-partitions in environment custom 2025-03-23 20:50:51.388171 | orchestrator | 2025-03-23 20:50:51 | INFO  | Task 4eb11f68-39b2-4d7e-87ac-00e261d816ee (wipe-partitions) was prepared for execution. 2025-03-23 20:50:54.840158 | orchestrator | 2025-03-23 20:50:51 | INFO  | It takes a moment until task 4eb11f68-39b2-4d7e-87ac-00e261d816ee (wipe-partitions) has been started and output is visible here. 
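The pull-images play above warms the manager's image cache before anything is deployed: the keystone image first, then the remaining Kolla service images in one loop. The items shown are service groups rather than literal image names, so a plain docker pull per item is only an approximation of what the playbook does; REGISTRY and TAG are placeholders because the actual image references are not printed in this log:

    REGISTRY=registry.example.com/kolla   # placeholder, not from the log
    TAG=2024.1                            # assumed to follow OPENSTACK_VERSION
    for svc in keystone cinder glance neutron nova; do   # subset of the items listed above
        docker pull "${REGISTRY}/${svc}:${TAG}" || echo "pull failed for ${svc}" >&2
    done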
2025-03-23 20:50:54.840230 | orchestrator | 2025-03-23 20:50:54.840459 | orchestrator | PLAY [Wipe partitions] ********************************************************* 2025-03-23 20:50:54.840911 | orchestrator | 2025-03-23 20:50:54.846846 | orchestrator | TASK [Find all logical devices owned by UID 167] ******************************* 2025-03-23 20:50:54.846933 | orchestrator | Sunday 23 March 2025 20:50:54 +0000 (0:00:00.134) 0:00:00.134 ********** 2025-03-23 20:50:55.544124 | orchestrator | changed: [testbed-node-3] 2025-03-23 20:50:55.544447 | orchestrator | changed: [testbed-node-4] 2025-03-23 20:50:55.544482 | orchestrator | changed: [testbed-node-5] 2025-03-23 20:50:55.545132 | orchestrator | 2025-03-23 20:50:55.547004 | orchestrator | TASK [Remove all rook related logical devices] ********************************* 2025-03-23 20:50:55.547244 | orchestrator | Sunday 23 March 2025 20:50:55 +0000 (0:00:00.701) 0:00:00.836 ********** 2025-03-23 20:50:55.736007 | orchestrator | skipping: [testbed-node-3] 2025-03-23 20:50:55.831988 | orchestrator | skipping: [testbed-node-4] 2025-03-23 20:50:55.837010 | orchestrator | skipping: [testbed-node-5] 2025-03-23 20:50:55.837365 | orchestrator | 2025-03-23 20:50:55.837419 | orchestrator | TASK [Find all logical devices with prefix ceph] ******************************* 2025-03-23 20:50:55.837838 | orchestrator | Sunday 23 March 2025 20:50:55 +0000 (0:00:00.293) 0:00:01.129 ********** 2025-03-23 20:50:56.647940 | orchestrator | ok: [testbed-node-3] 2025-03-23 20:50:56.648364 | orchestrator | ok: [testbed-node-4] 2025-03-23 20:50:56.649254 | orchestrator | ok: [testbed-node-5] 2025-03-23 20:50:56.649696 | orchestrator | 2025-03-23 20:50:56.650366 | orchestrator | TASK [Remove all ceph related logical devices] ********************************* 2025-03-23 20:50:56.650818 | orchestrator | Sunday 23 March 2025 20:50:56 +0000 (0:00:00.813) 0:00:01.943 ********** 2025-03-23 20:50:56.839437 | orchestrator | skipping: [testbed-node-3] 2025-03-23 20:50:56.961741 | orchestrator | skipping: [testbed-node-4] 2025-03-23 20:50:56.962587 | orchestrator | skipping: [testbed-node-5] 2025-03-23 20:50:56.962667 | orchestrator | 2025-03-23 20:50:56.964161 | orchestrator | TASK [Check device availability] *********************************************** 2025-03-23 20:50:56.967986 | orchestrator | Sunday 23 March 2025 20:50:56 +0000 (0:00:00.315) 0:00:02.258 ********** 2025-03-23 20:50:58.239942 | orchestrator | changed: [testbed-node-3] => (item=/dev/sdb) 2025-03-23 20:50:58.244329 | orchestrator | changed: [testbed-node-4] => (item=/dev/sdb) 2025-03-23 20:50:58.245121 | orchestrator | changed: [testbed-node-5] => (item=/dev/sdb) 2025-03-23 20:50:58.245174 | orchestrator | changed: [testbed-node-3] => (item=/dev/sdc) 2025-03-23 20:50:58.247223 | orchestrator | changed: [testbed-node-4] => (item=/dev/sdc) 2025-03-23 20:50:58.248908 | orchestrator | changed: [testbed-node-5] => (item=/dev/sdc) 2025-03-23 20:50:58.249957 | orchestrator | changed: [testbed-node-3] => (item=/dev/sdd) 2025-03-23 20:50:58.251354 | orchestrator | changed: [testbed-node-4] => (item=/dev/sdd) 2025-03-23 20:50:58.251690 | orchestrator | changed: [testbed-node-5] => (item=/dev/sdd) 2025-03-23 20:50:58.252380 | orchestrator | 2025-03-23 20:50:58.252815 | orchestrator | TASK [Wipe partitions with wipefs] ********************************************* 2025-03-23 20:50:58.253245 | orchestrator | Sunday 23 March 2025 20:50:58 +0000 (0:00:01.277) 0:00:03.536 ********** 2025-03-23 20:50:59.675041 | 
orchestrator | ok: [testbed-node-3] => (item=/dev/sdb) 2025-03-23 20:50:59.675746 | orchestrator | ok: [testbed-node-4] => (item=/dev/sdb) 2025-03-23 20:50:59.676549 | orchestrator | ok: [testbed-node-5] => (item=/dev/sdb) 2025-03-23 20:50:59.679681 | orchestrator | ok: [testbed-node-3] => (item=/dev/sdc) 2025-03-23 20:50:59.680008 | orchestrator | ok: [testbed-node-4] => (item=/dev/sdc) 2025-03-23 20:50:59.680565 | orchestrator | ok: [testbed-node-5] => (item=/dev/sdc) 2025-03-23 20:50:59.681296 | orchestrator | ok: [testbed-node-3] => (item=/dev/sdd) 2025-03-23 20:50:59.682263 | orchestrator | ok: [testbed-node-4] => (item=/dev/sdd) 2025-03-23 20:50:59.683233 | orchestrator | ok: [testbed-node-5] => (item=/dev/sdd) 2025-03-23 20:50:59.684006 | orchestrator | 2025-03-23 20:50:59.686700 | orchestrator | TASK [Overwrite first 32M with zeros] ****************************************** 2025-03-23 20:50:59.687056 | orchestrator | Sunday 23 March 2025 20:50:59 +0000 (0:00:01.431) 0:00:04.968 ********** 2025-03-23 20:51:02.247237 | orchestrator | changed: [testbed-node-3] => (item=/dev/sdb) 2025-03-23 20:51:02.250462 | orchestrator | changed: [testbed-node-4] => (item=/dev/sdb) 2025-03-23 20:51:02.251444 | orchestrator | changed: [testbed-node-5] => (item=/dev/sdb) 2025-03-23 20:51:02.251869 | orchestrator | changed: [testbed-node-3] => (item=/dev/sdc) 2025-03-23 20:51:02.252447 | orchestrator | changed: [testbed-node-4] => (item=/dev/sdc) 2025-03-23 20:51:02.252808 | orchestrator | changed: [testbed-node-5] => (item=/dev/sdc) 2025-03-23 20:51:02.253238 | orchestrator | changed: [testbed-node-3] => (item=/dev/sdd) 2025-03-23 20:51:02.253652 | orchestrator | changed: [testbed-node-4] => (item=/dev/sdd) 2025-03-23 20:51:02.254187 | orchestrator | changed: [testbed-node-5] => (item=/dev/sdd) 2025-03-23 20:51:02.254929 | orchestrator | 2025-03-23 20:51:02.255247 | orchestrator | TASK [Reload udev rules] ******************************************************* 2025-03-23 20:51:02.255280 | orchestrator | Sunday 23 March 2025 20:51:02 +0000 (0:00:02.573) 0:00:07.541 ********** 2025-03-23 20:51:02.908985 | orchestrator | changed: [testbed-node-3] 2025-03-23 20:51:02.909123 | orchestrator | changed: [testbed-node-4] 2025-03-23 20:51:02.909590 | orchestrator | changed: [testbed-node-5] 2025-03-23 20:51:02.910075 | orchestrator | 2025-03-23 20:51:02.910572 | orchestrator | TASK [Request device events from the kernel] *********************************** 2025-03-23 20:51:02.912004 | orchestrator | Sunday 23 March 2025 20:51:02 +0000 (0:00:00.664) 0:00:08.206 ********** 2025-03-23 20:51:03.643293 | orchestrator | changed: [testbed-node-3] 2025-03-23 20:51:03.643452 | orchestrator | changed: [testbed-node-4] 2025-03-23 20:51:03.644062 | orchestrator | changed: [testbed-node-5] 2025-03-23 20:51:03.644771 | orchestrator | 2025-03-23 20:51:03.645973 | orchestrator | PLAY RECAP ********************************************************************* 2025-03-23 20:51:03.646086 | orchestrator | 2025-03-23 20:51:03 | INFO  | Play has been completed. There may now be a delay until all logs have been written. 2025-03-23 20:51:03.646113 | orchestrator | 2025-03-23 20:51:03 | INFO  | Please wait and do not abort execution. 
2025-03-23 20:51:03.646920 | orchestrator | testbed-node-3 : ok=7  changed=5  unreachable=0 failed=0 skipped=2  rescued=0 ignored=0 2025-03-23 20:51:03.647419 | orchestrator | testbed-node-4 : ok=7  changed=5  unreachable=0 failed=0 skipped=2  rescued=0 ignored=0 2025-03-23 20:51:03.648901 | orchestrator | testbed-node-5 : ok=7  changed=5  unreachable=0 failed=0 skipped=2  rescued=0 ignored=0 2025-03-23 20:51:03.649262 | orchestrator | 2025-03-23 20:51:03.649532 | orchestrator | Sunday 23 March 2025 20:51:03 +0000 (0:00:00.735) 0:00:08.941 ********** 2025-03-23 20:51:03.650391 | orchestrator | =============================================================================== 2025-03-23 20:51:03.650724 | orchestrator | Overwrite first 32M with zeros ------------------------------------------ 2.57s 2025-03-23 20:51:03.650943 | orchestrator | Wipe partitions with wipefs --------------------------------------------- 1.43s 2025-03-23 20:51:03.654146 | orchestrator | Check device availability ----------------------------------------------- 1.28s 2025-03-23 20:51:05.975988 | orchestrator | Find all logical devices with prefix ceph ------------------------------- 0.81s 2025-03-23 20:51:05.976086 | orchestrator | Request device events from the kernel ----------------------------------- 0.74s 2025-03-23 20:51:05.976105 | orchestrator | Find all logical devices owned by UID 167 ------------------------------- 0.70s 2025-03-23 20:51:05.976119 | orchestrator | Reload udev rules ------------------------------------------------------- 0.66s 2025-03-23 20:51:05.976133 | orchestrator | Remove all ceph related logical devices --------------------------------- 0.32s 2025-03-23 20:51:05.976148 | orchestrator | Remove all rook related logical devices --------------------------------- 0.29s 2025-03-23 20:51:05.976176 | orchestrator | 2025-03-23 20:51:05 | INFO  | Task 94377220-db43-491e-bc4a-95ef5930d429 (facts) was prepared for execution. 2025-03-23 20:51:10.749148 | orchestrator | 2025-03-23 20:51:05 | INFO  | It takes a moment until task 94377220-db43-491e-bc4a-95ef5930d429 (facts) has been started and output is visible here. 
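The wipe-partitions play that just finished only touches /dev/sdb, /dev/sdc and /dev/sdd on testbed-node-3/4/5 (the storage nodes). A rough shell equivalent of the four changed tasks, with the exact flags as assumptions since the task arguments themselves are not printed in this log:

    # Per OSD candidate device on each storage node:
    for dev in /dev/sdb /dev/sdc /dev/sdd; do
        test -b "$dev"                                # Check device availability
        wipefs --all "$dev"                           # Wipe partitions with wipefs
        dd if=/dev/zero of="$dev" bs=1M count=32      # Overwrite first 32M with zeros
    done
    udevadm control --reload-rules                    # Reload udev rules
    udevadm trigger                                   # Request device events from the kernel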
2025-03-23 20:51:10.749266 | orchestrator | 2025-03-23 20:51:10.749661 | orchestrator | PLAY [Apply role facts] ******************************************************** 2025-03-23 20:51:10.751594 | orchestrator | 2025-03-23 20:51:10.756774 | orchestrator | TASK [osism.commons.facts : Create custom facts directory] ********************* 2025-03-23 20:51:10.756808 | orchestrator | Sunday 23 March 2025 20:51:10 +0000 (0:00:00.260) 0:00:00.260 ********** 2025-03-23 20:51:11.991600 | orchestrator | ok: [testbed-manager] 2025-03-23 20:51:11.992718 | orchestrator | ok: [testbed-node-0] 2025-03-23 20:51:11.993885 | orchestrator | ok: [testbed-node-1] 2025-03-23 20:51:11.995689 | orchestrator | ok: [testbed-node-2] 2025-03-23 20:51:11.996509 | orchestrator | ok: [testbed-node-3] 2025-03-23 20:51:11.997007 | orchestrator | ok: [testbed-node-4] 2025-03-23 20:51:11.999159 | orchestrator | ok: [testbed-node-5] 2025-03-23 20:51:11.999968 | orchestrator | 2025-03-23 20:51:12.000946 | orchestrator | TASK [osism.commons.facts : Copy fact files] *********************************** 2025-03-23 20:51:12.001646 | orchestrator | Sunday 23 March 2025 20:51:11 +0000 (0:00:01.239) 0:00:01.500 ********** 2025-03-23 20:51:12.236804 | orchestrator | skipping: [testbed-manager] 2025-03-23 20:51:12.353169 | orchestrator | skipping: [testbed-node-0] 2025-03-23 20:51:12.454737 | orchestrator | skipping: [testbed-node-1] 2025-03-23 20:51:12.559223 | orchestrator | skipping: [testbed-node-2] 2025-03-23 20:51:12.669208 | orchestrator | skipping: [testbed-node-3] 2025-03-23 20:51:13.909020 | orchestrator | skipping: [testbed-node-4] 2025-03-23 20:51:13.909466 | orchestrator | skipping: [testbed-node-5] 2025-03-23 20:51:13.910968 | orchestrator | 2025-03-23 20:51:13.918349 | orchestrator | PLAY [Gather facts for all hosts] ********************************************** 2025-03-23 20:51:13.922123 | orchestrator | 2025-03-23 20:51:13.923707 | orchestrator | TASK [Gathers facts about hosts] *********************************************** 2025-03-23 20:51:13.925853 | orchestrator | Sunday 23 March 2025 20:51:13 +0000 (0:00:01.925) 0:00:03.425 ********** 2025-03-23 20:51:18.778831 | orchestrator | ok: [testbed-node-1] 2025-03-23 20:51:18.779561 | orchestrator | ok: [testbed-node-2] 2025-03-23 20:51:18.782162 | orchestrator | ok: [testbed-node-0] 2025-03-23 20:51:18.782553 | orchestrator | ok: [testbed-manager] 2025-03-23 20:51:18.783785 | orchestrator | ok: [testbed-node-3] 2025-03-23 20:51:18.784818 | orchestrator | ok: [testbed-node-4] 2025-03-23 20:51:18.785383 | orchestrator | ok: [testbed-node-5] 2025-03-23 20:51:18.786673 | orchestrator | 2025-03-23 20:51:18.787601 | orchestrator | PLAY [Gather facts for all hosts if using --limit] ***************************** 2025-03-23 20:51:18.788815 | orchestrator | 2025-03-23 20:51:18.790496 | orchestrator | TASK [Gather facts for all hosts] ********************************************** 2025-03-23 20:51:18.791478 | orchestrator | Sunday 23 March 2025 20:51:18 +0000 (0:00:04.873) 0:00:08.299 ********** 2025-03-23 20:51:19.046481 | orchestrator | skipping: [testbed-manager] 2025-03-23 20:51:19.114373 | orchestrator | skipping: [testbed-node-0] 2025-03-23 20:51:19.186933 | orchestrator | skipping: [testbed-node-1] 2025-03-23 20:51:19.265537 | orchestrator | skipping: [testbed-node-2] 2025-03-23 20:51:19.350202 | orchestrator | skipping: [testbed-node-3] 2025-03-23 20:51:19.385732 | orchestrator | skipping: [testbed-node-4] 2025-03-23 20:51:19.385890 | orchestrator | skipping: 
[testbed-node-5] 2025-03-23 20:51:19.386545 | orchestrator | 2025-03-23 20:51:19.387775 | orchestrator | PLAY RECAP ********************************************************************* 2025-03-23 20:51:19.388400 | orchestrator | 2025-03-23 20:51:19 | INFO  | Play has been completed. There may now be a delay until all logs have been written. 2025-03-23 20:51:19.390687 | orchestrator | 2025-03-23 20:51:19 | INFO  | Please wait and do not abort execution. 2025-03-23 20:51:19.390721 | orchestrator | testbed-manager : ok=2  changed=0 unreachable=0 failed=0 skipped=2  rescued=0 ignored=0 2025-03-23 20:51:19.392800 | orchestrator | testbed-node-0 : ok=2  changed=0 unreachable=0 failed=0 skipped=2  rescued=0 ignored=0 2025-03-23 20:51:19.393452 | orchestrator | testbed-node-1 : ok=2  changed=0 unreachable=0 failed=0 skipped=2  rescued=0 ignored=0 2025-03-23 20:51:19.394068 | orchestrator | testbed-node-2 : ok=2  changed=0 unreachable=0 failed=0 skipped=2  rescued=0 ignored=0 2025-03-23 20:51:19.395320 | orchestrator | testbed-node-3 : ok=2  changed=0 unreachable=0 failed=0 skipped=2  rescued=0 ignored=0 2025-03-23 20:51:19.396865 | orchestrator | testbed-node-4 : ok=2  changed=0 unreachable=0 failed=0 skipped=2  rescued=0 ignored=0 2025-03-23 20:51:19.397298 | orchestrator | testbed-node-5 : ok=2  changed=0 unreachable=0 failed=0 skipped=2  rescued=0 ignored=0 2025-03-23 20:51:19.397829 | orchestrator | 2025-03-23 20:51:19.398133 | orchestrator | Sunday 23 March 2025 20:51:19 +0000 (0:00:00.607) 0:00:08.906 ********** 2025-03-23 20:51:19.398704 | orchestrator | =============================================================================== 2025-03-23 20:51:19.399299 | orchestrator | Gathers facts about hosts ----------------------------------------------- 4.87s 2025-03-23 20:51:19.399433 | orchestrator | osism.commons.facts : Copy fact files ----------------------------------- 1.93s 2025-03-23 20:51:19.399974 | orchestrator | osism.commons.facts : Create custom facts directory --------------------- 1.24s 2025-03-23 20:51:19.400204 | orchestrator | Gather facts for all hosts ---------------------------------------------- 0.61s 2025-03-23 20:51:21.326932 | orchestrator | 2025-03-23 20:51:21 | INFO  | Task 1d71b2ff-433a-42f1-8792-0771a9900e43 (ceph-configure-lvm-volumes) was prepared for execution. 2025-03-23 20:51:25.913589 | orchestrator | 2025-03-23 20:51:21 | INFO  | It takes a moment until task 1d71b2ff-433a-42f1-8792-0771a9900e43 (ceph-configure-lvm-volumes) has been started and output is visible here. 
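The facts play above only prepares custom facts directories and refreshes the Ansible fact cache for all seven hosts; nothing is changed. A short sketch of how this step is typically triggered and verified, hedged because neither the invoking command nor the directory path appears verbatim in this log:

    # Presumably invoked by the deploy script as:
    osism apply facts

    # On any node, the osism.commons.facts role's custom facts directory
    # (Ansible default location, an assumption here) can be inspected with:
    ls /etc/ansible/facts.d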
2025-03-23 20:51:25.913760 | orchestrator | [WARNING]: Collection osism.commons does not support Ansible version 2.15.12 2025-03-23 20:51:26.549768 | orchestrator | 2025-03-23 20:51:26.552245 | orchestrator | PLAY [Ceph configure LVM] ****************************************************** 2025-03-23 20:51:26.553121 | orchestrator | 2025-03-23 20:51:26.554047 | orchestrator | TASK [Get extra vars for Ceph configuration] *********************************** 2025-03-23 20:51:26.557357 | orchestrator | Sunday 23 March 2025 20:51:26 +0000 (0:00:00.523) 0:00:00.523 ********** 2025-03-23 20:51:26.850519 | orchestrator | ok: [testbed-node-3 -> testbed-manager(192.168.16.5)] 2025-03-23 20:51:26.854154 | orchestrator | 2025-03-23 20:51:26.854887 | orchestrator | TASK [Get initial list of available block devices] ***************************** 2025-03-23 20:51:26.855943 | orchestrator | Sunday 23 March 2025 20:51:26 +0000 (0:00:00.303) 0:00:00.826 ********** 2025-03-23 20:51:27.122092 | orchestrator | ok: [testbed-node-3] 2025-03-23 20:51:27.123745 | orchestrator | 2025-03-23 20:51:27.124656 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-03-23 20:51:27.125475 | orchestrator | Sunday 23 March 2025 20:51:27 +0000 (0:00:00.270) 0:00:01.096 ********** 2025-03-23 20:51:27.836871 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-3 => (item=loop0) 2025-03-23 20:51:27.837073 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-3 => (item=loop1) 2025-03-23 20:51:27.837483 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-3 => (item=loop2) 2025-03-23 20:51:27.840326 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-3 => (item=loop3) 2025-03-23 20:51:27.840731 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-3 => (item=loop4) 2025-03-23 20:51:27.840756 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-3 => (item=loop5) 2025-03-23 20:51:27.840770 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-3 => (item=loop6) 2025-03-23 20:51:27.840790 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-3 => (item=loop7) 2025-03-23 20:51:27.841343 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-3 => (item=sda) 2025-03-23 20:51:27.844358 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-3 => (item=sdb) 2025-03-23 20:51:27.844867 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-3 => (item=sdc) 2025-03-23 20:51:27.844892 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-3 => (item=sdd) 2025-03-23 20:51:27.844907 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-3 => (item=sr0) 2025-03-23 20:51:27.844926 | orchestrator | 2025-03-23 20:51:27.845889 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-03-23 20:51:27.846181 | orchestrator | Sunday 23 March 2025 20:51:27 +0000 (0:00:00.718) 0:00:01.815 ********** 2025-03-23 20:51:28.062444 | orchestrator | skipping: [testbed-node-3] 2025-03-23 20:51:28.064081 | orchestrator | 2025-03-23 20:51:28.067048 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-03-23 20:51:28.345683 | orchestrator | Sunday 23 March 2025 20:51:28 +0000 
(0:00:00.225) 0:00:02.040 ********** 2025-03-23 20:51:28.345786 | orchestrator | skipping: [testbed-node-3] 2025-03-23 20:51:28.347866 | orchestrator | 2025-03-23 20:51:28.579525 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-03-23 20:51:28.579588 | orchestrator | Sunday 23 March 2025 20:51:28 +0000 (0:00:00.279) 0:00:02.320 ********** 2025-03-23 20:51:28.579655 | orchestrator | skipping: [testbed-node-3] 2025-03-23 20:51:28.580395 | orchestrator | 2025-03-23 20:51:28.580421 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-03-23 20:51:28.580442 | orchestrator | Sunday 23 March 2025 20:51:28 +0000 (0:00:00.233) 0:00:02.553 ********** 2025-03-23 20:51:28.797860 | orchestrator | skipping: [testbed-node-3] 2025-03-23 20:51:28.799362 | orchestrator | 2025-03-23 20:51:28.800350 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-03-23 20:51:28.803216 | orchestrator | Sunday 23 March 2025 20:51:28 +0000 (0:00:00.220) 0:00:02.774 ********** 2025-03-23 20:51:29.052992 | orchestrator | skipping: [testbed-node-3] 2025-03-23 20:51:29.053555 | orchestrator | 2025-03-23 20:51:29.053824 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-03-23 20:51:29.054790 | orchestrator | Sunday 23 March 2025 20:51:29 +0000 (0:00:00.257) 0:00:03.031 ********** 2025-03-23 20:51:29.354145 | orchestrator | skipping: [testbed-node-3] 2025-03-23 20:51:29.354406 | orchestrator | 2025-03-23 20:51:29.354731 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-03-23 20:51:29.354962 | orchestrator | Sunday 23 March 2025 20:51:29 +0000 (0:00:00.298) 0:00:03.329 ********** 2025-03-23 20:51:29.600299 | orchestrator | skipping: [testbed-node-3] 2025-03-23 20:51:29.602145 | orchestrator | 2025-03-23 20:51:29.606245 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-03-23 20:51:29.608805 | orchestrator | Sunday 23 March 2025 20:51:29 +0000 (0:00:00.249) 0:00:03.579 ********** 2025-03-23 20:51:29.837896 | orchestrator | skipping: [testbed-node-3] 2025-03-23 20:51:29.838202 | orchestrator | 2025-03-23 20:51:29.841736 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-03-23 20:51:29.842259 | orchestrator | Sunday 23 March 2025 20:51:29 +0000 (0:00:00.235) 0:00:03.814 ********** 2025-03-23 20:51:30.662253 | orchestrator | ok: [testbed-node-3] => (item=scsi-0QEMU_QEMU_HARDDISK_dcc46cc2-9048-4c81-bc2f-465e491970df) 2025-03-23 20:51:30.665332 | orchestrator | ok: [testbed-node-3] => (item=scsi-SQEMU_QEMU_HARDDISK_dcc46cc2-9048-4c81-bc2f-465e491970df) 2025-03-23 20:51:30.665596 | orchestrator | 2025-03-23 20:51:30.665675 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-03-23 20:51:30.666072 | orchestrator | Sunday 23 March 2025 20:51:30 +0000 (0:00:00.823) 0:00:04.638 ********** 2025-03-23 20:51:32.044243 | orchestrator | ok: [testbed-node-3] => (item=scsi-0QEMU_QEMU_HARDDISK_ed5da3c7-1374-4bfc-b341-605cae6b6ed5) 2025-03-23 20:51:32.045585 | orchestrator | ok: [testbed-node-3] => (item=scsi-SQEMU_QEMU_HARDDISK_ed5da3c7-1374-4bfc-b341-605cae6b6ed5) 2025-03-23 20:51:32.047227 | orchestrator | 2025-03-23 20:51:32.047529 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-03-23 
20:51:32.047881 | orchestrator | Sunday 23 March 2025 20:51:32 +0000 (0:00:01.382) 0:00:06.020 ********** 2025-03-23 20:51:32.818261 | orchestrator | ok: [testbed-node-3] => (item=scsi-0QEMU_QEMU_HARDDISK_ce372efe-ce53-40a7-9ab4-8764278391af) 2025-03-23 20:51:32.823507 | orchestrator | ok: [testbed-node-3] => (item=scsi-SQEMU_QEMU_HARDDISK_ce372efe-ce53-40a7-9ab4-8764278391af) 2025-03-23 20:51:32.823718 | orchestrator | 2025-03-23 20:51:32.823751 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-03-23 20:51:32.823833 | orchestrator | Sunday 23 March 2025 20:51:32 +0000 (0:00:00.776) 0:00:06.797 ********** 2025-03-23 20:51:33.341223 | orchestrator | ok: [testbed-node-3] => (item=scsi-0QEMU_QEMU_HARDDISK_456a620e-4d8a-4da0-8fad-68a9dae98a07) 2025-03-23 20:51:33.344950 | orchestrator | ok: [testbed-node-3] => (item=scsi-SQEMU_QEMU_HARDDISK_456a620e-4d8a-4da0-8fad-68a9dae98a07) 2025-03-23 20:51:33.347504 | orchestrator | 2025-03-23 20:51:33.347852 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-03-23 20:51:33.348879 | orchestrator | Sunday 23 March 2025 20:51:33 +0000 (0:00:00.521) 0:00:07.318 ********** 2025-03-23 20:51:33.996957 | orchestrator | ok: [testbed-node-3] => (item=ata-QEMU_DVD-ROM_QM00001) 2025-03-23 20:51:33.999313 | orchestrator | 2025-03-23 20:51:33.999776 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-03-23 20:51:33.999816 | orchestrator | Sunday 23 March 2025 20:51:33 +0000 (0:00:00.655) 0:00:07.974 ********** 2025-03-23 20:51:34.534342 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-3 => (item=loop0) 2025-03-23 20:51:34.536740 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-3 => (item=loop1) 2025-03-23 20:51:34.538269 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-3 => (item=loop2) 2025-03-23 20:51:34.540274 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-3 => (item=loop3) 2025-03-23 20:51:34.540310 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-3 => (item=loop4) 2025-03-23 20:51:34.543423 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-3 => (item=loop5) 2025-03-23 20:51:34.547722 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-3 => (item=loop6) 2025-03-23 20:51:34.548074 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-3 => (item=loop7) 2025-03-23 20:51:34.548891 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-3 => (item=sda) 2025-03-23 20:51:34.552713 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-3 => (item=sdb) 2025-03-23 20:51:34.554390 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-3 => (item=sdc) 2025-03-23 20:51:34.555033 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-3 => (item=sdd) 2025-03-23 20:51:34.555915 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-3 => (item=sr0) 2025-03-23 20:51:34.556901 | orchestrator | 2025-03-23 20:51:34.557501 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-03-23 20:51:34.557986 | orchestrator | Sunday 23 March 2025 20:51:34 +0000 
(0:00:00.532) 0:00:08.506 ********** 2025-03-23 20:51:34.859662 | orchestrator | skipping: [testbed-node-3] 2025-03-23 20:51:34.860513 | orchestrator | 2025-03-23 20:51:34.861741 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-03-23 20:51:34.863476 | orchestrator | Sunday 23 March 2025 20:51:34 +0000 (0:00:00.326) 0:00:08.833 ********** 2025-03-23 20:51:35.081267 | orchestrator | skipping: [testbed-node-3] 2025-03-23 20:51:35.083666 | orchestrator | 2025-03-23 20:51:35.083792 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-03-23 20:51:35.083820 | orchestrator | Sunday 23 March 2025 20:51:35 +0000 (0:00:00.225) 0:00:09.058 ********** 2025-03-23 20:51:35.376705 | orchestrator | skipping: [testbed-node-3] 2025-03-23 20:51:35.376856 | orchestrator | 2025-03-23 20:51:35.377444 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-03-23 20:51:35.378710 | orchestrator | Sunday 23 March 2025 20:51:35 +0000 (0:00:00.295) 0:00:09.353 ********** 2025-03-23 20:51:35.676701 | orchestrator | skipping: [testbed-node-3] 2025-03-23 20:51:35.678993 | orchestrator | 2025-03-23 20:51:35.679047 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-03-23 20:51:35.679555 | orchestrator | Sunday 23 March 2025 20:51:35 +0000 (0:00:00.299) 0:00:09.653 ********** 2025-03-23 20:51:36.314537 | orchestrator | skipping: [testbed-node-3] 2025-03-23 20:51:36.316530 | orchestrator | 2025-03-23 20:51:36.317475 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-03-23 20:51:36.317993 | orchestrator | Sunday 23 March 2025 20:51:36 +0000 (0:00:00.640) 0:00:10.293 ********** 2025-03-23 20:51:36.591307 | orchestrator | skipping: [testbed-node-3] 2025-03-23 20:51:36.592140 | orchestrator | 2025-03-23 20:51:36.592388 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-03-23 20:51:36.592806 | orchestrator | Sunday 23 March 2025 20:51:36 +0000 (0:00:00.269) 0:00:10.562 ********** 2025-03-23 20:51:36.830903 | orchestrator | skipping: [testbed-node-3] 2025-03-23 20:51:36.831802 | orchestrator | 2025-03-23 20:51:36.834712 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-03-23 20:51:36.836082 | orchestrator | Sunday 23 March 2025 20:51:36 +0000 (0:00:00.246) 0:00:10.809 ********** 2025-03-23 20:51:37.048802 | orchestrator | skipping: [testbed-node-3] 2025-03-23 20:51:37.048947 | orchestrator | 2025-03-23 20:51:37.049120 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-03-23 20:51:37.049513 | orchestrator | Sunday 23 March 2025 20:51:37 +0000 (0:00:00.217) 0:00:11.027 ********** 2025-03-23 20:51:37.869075 | orchestrator | ok: [testbed-node-3] => (item=sda1) 2025-03-23 20:51:37.874169 | orchestrator | ok: [testbed-node-3] => (item=sda14) 2025-03-23 20:51:37.874882 | orchestrator | ok: [testbed-node-3] => (item=sda15) 2025-03-23 20:51:37.876729 | orchestrator | ok: [testbed-node-3] => (item=sda16) 2025-03-23 20:51:37.876959 | orchestrator | 2025-03-23 20:51:37.878224 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-03-23 20:51:37.879322 | orchestrator | Sunday 23 March 2025 20:51:37 +0000 (0:00:00.816) 0:00:11.844 ********** 2025-03-23 20:51:38.100784 | orchestrator | 
skipping: [testbed-node-3] 2025-03-23 20:51:38.101739 | orchestrator | 2025-03-23 20:51:38.102264 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-03-23 20:51:38.105173 | orchestrator | Sunday 23 March 2025 20:51:38 +0000 (0:00:00.235) 0:00:12.079 ********** 2025-03-23 20:51:38.349432 | orchestrator | skipping: [testbed-node-3] 2025-03-23 20:51:38.352892 | orchestrator | 2025-03-23 20:51:38.355712 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-03-23 20:51:38.355779 | orchestrator | Sunday 23 March 2025 20:51:38 +0000 (0:00:00.247) 0:00:12.327 ********** 2025-03-23 20:51:38.601967 | orchestrator | skipping: [testbed-node-3] 2025-03-23 20:51:38.603753 | orchestrator | 2025-03-23 20:51:38.604002 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-03-23 20:51:38.609800 | orchestrator | Sunday 23 March 2025 20:51:38 +0000 (0:00:00.250) 0:00:12.577 ********** 2025-03-23 20:51:38.838879 | orchestrator | skipping: [testbed-node-3] 2025-03-23 20:51:38.839580 | orchestrator | 2025-03-23 20:51:38.842336 | orchestrator | TASK [Set UUIDs for OSD VGs/LVs] *********************************************** 2025-03-23 20:51:38.842718 | orchestrator | Sunday 23 March 2025 20:51:38 +0000 (0:00:00.240) 0:00:12.818 ********** 2025-03-23 20:51:39.018156 | orchestrator | ok: [testbed-node-3] => (item={'key': 'sdb', 'value': None}) 2025-03-23 20:51:39.019263 | orchestrator | ok: [testbed-node-3] => (item={'key': 'sdc', 'value': None}) 2025-03-23 20:51:39.021281 | orchestrator | 2025-03-23 20:51:39.023291 | orchestrator | TASK [Generate WAL VG names] *************************************************** 2025-03-23 20:51:39.023760 | orchestrator | Sunday 23 March 2025 20:51:39 +0000 (0:00:00.178) 0:00:12.996 ********** 2025-03-23 20:51:39.323201 | orchestrator | skipping: [testbed-node-3] 2025-03-23 20:51:39.325079 | orchestrator | 2025-03-23 20:51:39.330075 | orchestrator | TASK [Generate DB VG names] **************************************************** 2025-03-23 20:51:39.332521 | orchestrator | Sunday 23 March 2025 20:51:39 +0000 (0:00:00.305) 0:00:13.302 ********** 2025-03-23 20:51:39.474186 | orchestrator | skipping: [testbed-node-3] 2025-03-23 20:51:39.474722 | orchestrator | 2025-03-23 20:51:39.475091 | orchestrator | TASK [Generate shared DB/WAL VG names] ***************************************** 2025-03-23 20:51:39.475446 | orchestrator | Sunday 23 March 2025 20:51:39 +0000 (0:00:00.148) 0:00:13.450 ********** 2025-03-23 20:51:39.610421 | orchestrator | skipping: [testbed-node-3] 2025-03-23 20:51:39.613521 | orchestrator | 2025-03-23 20:51:39.613565 | orchestrator | TASK [Define lvm_volumes structures] ******************************************* 2025-03-23 20:51:39.810731 | orchestrator | Sunday 23 March 2025 20:51:39 +0000 (0:00:00.136) 0:00:13.586 ********** 2025-03-23 20:51:39.810843 | orchestrator | ok: [testbed-node-3] 2025-03-23 20:51:39.810913 | orchestrator | 2025-03-23 20:51:39.812884 | orchestrator | TASK [Generate lvm_volumes structure (block only)] ***************************** 2025-03-23 20:51:39.813235 | orchestrator | Sunday 23 March 2025 20:51:39 +0000 (0:00:00.203) 0:00:13.790 ********** 2025-03-23 20:51:40.072432 | orchestrator | ok: [testbed-node-3] => (item={'key': 'sdb', 'value': {'osd_lvm_uuid': 'a8bbe70e-ef9b-5e78-a477-05274116adef'}}) 2025-03-23 20:51:40.073851 | orchestrator | ok: [testbed-node-3] => (item={'key': 'sdc', 
'value': {'osd_lvm_uuid': '7ba69cd0-48cc-55e3-9649-a43d5cfb8428'}}) 2025-03-23 20:51:40.075413 | orchestrator | 2025-03-23 20:51:40.075785 | orchestrator | TASK [Generate lvm_volumes structure (block + db)] ***************************** 2025-03-23 20:51:40.076199 | orchestrator | Sunday 23 March 2025 20:51:40 +0000 (0:00:00.262) 0:00:14.052 ********** 2025-03-23 20:51:40.232401 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'sdb', 'value': {'osd_lvm_uuid': 'a8bbe70e-ef9b-5e78-a477-05274116adef'}})  2025-03-23 20:51:40.233392 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'sdc', 'value': {'osd_lvm_uuid': '7ba69cd0-48cc-55e3-9649-a43d5cfb8428'}})  2025-03-23 20:51:40.233846 | orchestrator | skipping: [testbed-node-3] 2025-03-23 20:51:40.236266 | orchestrator | 2025-03-23 20:51:40.382712 | orchestrator | TASK [Generate lvm_volumes structure (block + wal)] **************************** 2025-03-23 20:51:40.382777 | orchestrator | Sunday 23 March 2025 20:51:40 +0000 (0:00:00.159) 0:00:14.211 ********** 2025-03-23 20:51:40.382801 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'sdb', 'value': {'osd_lvm_uuid': 'a8bbe70e-ef9b-5e78-a477-05274116adef'}})  2025-03-23 20:51:40.383020 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'sdc', 'value': {'osd_lvm_uuid': '7ba69cd0-48cc-55e3-9649-a43d5cfb8428'}})  2025-03-23 20:51:40.383521 | orchestrator | skipping: [testbed-node-3] 2025-03-23 20:51:40.383893 | orchestrator | 2025-03-23 20:51:40.384294 | orchestrator | TASK [Generate lvm_volumes structure (block + db + wal)] *********************** 2025-03-23 20:51:40.385342 | orchestrator | Sunday 23 March 2025 20:51:40 +0000 (0:00:00.150) 0:00:14.361 ********** 2025-03-23 20:51:40.584770 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'sdb', 'value': {'osd_lvm_uuid': 'a8bbe70e-ef9b-5e78-a477-05274116adef'}})  2025-03-23 20:51:40.585608 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'sdc', 'value': {'osd_lvm_uuid': '7ba69cd0-48cc-55e3-9649-a43d5cfb8428'}})  2025-03-23 20:51:40.587316 | orchestrator | skipping: [testbed-node-3] 2025-03-23 20:51:40.590895 | orchestrator | 2025-03-23 20:51:40.594303 | orchestrator | TASK [Compile lvm_volumes] ***************************************************** 2025-03-23 20:51:40.594579 | orchestrator | Sunday 23 March 2025 20:51:40 +0000 (0:00:00.199) 0:00:14.561 ********** 2025-03-23 20:51:40.717604 | orchestrator | ok: [testbed-node-3] 2025-03-23 20:51:40.719314 | orchestrator | 2025-03-23 20:51:40.720590 | orchestrator | TASK [Set OSD devices config data] ********************************************* 2025-03-23 20:51:40.721466 | orchestrator | Sunday 23 March 2025 20:51:40 +0000 (0:00:00.135) 0:00:14.696 ********** 2025-03-23 20:51:40.866967 | orchestrator | ok: [testbed-node-3] 2025-03-23 20:51:40.868297 | orchestrator | 2025-03-23 20:51:40.868330 | orchestrator | TASK [Set DB devices config data] ********************************************** 2025-03-23 20:51:40.869842 | orchestrator | Sunday 23 March 2025 20:51:40 +0000 (0:00:00.149) 0:00:14.845 ********** 2025-03-23 20:51:41.003523 | orchestrator | skipping: [testbed-node-3] 2025-03-23 20:51:41.008149 | orchestrator | 2025-03-23 20:51:41.130885 | orchestrator | TASK [Set WAL devices config data] ********************************************* 2025-03-23 20:51:41.130982 | orchestrator | Sunday 23 March 2025 20:51:40 +0000 (0:00:00.136) 0:00:14.981 ********** 2025-03-23 20:51:41.131014 | orchestrator | skipping: [testbed-node-3] 2025-03-23 
20:51:41.131317 | orchestrator | 2025-03-23 20:51:41.132401 | orchestrator | TASK [Set DB+WAL devices config data] ****************************************** 2025-03-23 20:51:41.135554 | orchestrator | Sunday 23 March 2025 20:51:41 +0000 (0:00:00.129) 0:00:15.111 ********** 2025-03-23 20:51:41.401191 | orchestrator | skipping: [testbed-node-3] 2025-03-23 20:51:41.402075 | orchestrator | 2025-03-23 20:51:41.405246 | orchestrator | TASK [Print ceph_osd_devices] ************************************************** 2025-03-23 20:51:41.405544 | orchestrator | Sunday 23 March 2025 20:51:41 +0000 (0:00:00.267) 0:00:15.378 ********** 2025-03-23 20:51:41.590533 | orchestrator | ok: [testbed-node-3] => { 2025-03-23 20:51:41.591106 | orchestrator |  "ceph_osd_devices": { 2025-03-23 20:51:41.591140 | orchestrator |  "sdb": { 2025-03-23 20:51:41.592214 | orchestrator |  "osd_lvm_uuid": "a8bbe70e-ef9b-5e78-a477-05274116adef" 2025-03-23 20:51:41.593152 | orchestrator |  }, 2025-03-23 20:51:41.593890 | orchestrator |  "sdc": { 2025-03-23 20:51:41.595553 | orchestrator |  "osd_lvm_uuid": "7ba69cd0-48cc-55e3-9649-a43d5cfb8428" 2025-03-23 20:51:41.596285 | orchestrator |  } 2025-03-23 20:51:41.597039 | orchestrator |  } 2025-03-23 20:51:41.598125 | orchestrator | } 2025-03-23 20:51:41.598741 | orchestrator | 2025-03-23 20:51:41.599880 | orchestrator | TASK [Print WAL devices] ******************************************************* 2025-03-23 20:51:41.600382 | orchestrator | Sunday 23 March 2025 20:51:41 +0000 (0:00:00.187) 0:00:15.566 ********** 2025-03-23 20:51:41.701408 | orchestrator | skipping: [testbed-node-3] 2025-03-23 20:51:41.703246 | orchestrator | 2025-03-23 20:51:41.703416 | orchestrator | TASK [Print DB devices] ******************************************************** 2025-03-23 20:51:41.703447 | orchestrator | Sunday 23 March 2025 20:51:41 +0000 (0:00:00.113) 0:00:15.680 ********** 2025-03-23 20:51:41.848659 | orchestrator | skipping: [testbed-node-3] 2025-03-23 20:51:41.848793 | orchestrator | 2025-03-23 20:51:41.867372 | orchestrator | TASK [Print shared DB/WAL devices] ********************************************* 2025-03-23 20:51:42.031903 | orchestrator | Sunday 23 March 2025 20:51:41 +0000 (0:00:00.144) 0:00:15.824 ********** 2025-03-23 20:51:42.031967 | orchestrator | skipping: [testbed-node-3] 2025-03-23 20:51:42.032918 | orchestrator | 2025-03-23 20:51:42.034674 | orchestrator | TASK [Print configuration data] ************************************************ 2025-03-23 20:51:42.037206 | orchestrator | Sunday 23 March 2025 20:51:42 +0000 (0:00:00.184) 0:00:16.008 ********** 2025-03-23 20:51:42.421003 | orchestrator | changed: [testbed-node-3] => { 2025-03-23 20:51:42.430060 | orchestrator |  "_ceph_configure_lvm_config_data": { 2025-03-23 20:51:42.430304 | orchestrator |  "ceph_osd_devices": { 2025-03-23 20:51:42.430328 | orchestrator |  "sdb": { 2025-03-23 20:51:42.430344 | orchestrator |  "osd_lvm_uuid": "a8bbe70e-ef9b-5e78-a477-05274116adef" 2025-03-23 20:51:42.430362 | orchestrator |  }, 2025-03-23 20:51:42.430378 | orchestrator |  "sdc": { 2025-03-23 20:51:42.430397 | orchestrator |  "osd_lvm_uuid": "7ba69cd0-48cc-55e3-9649-a43d5cfb8428" 2025-03-23 20:51:42.431205 | orchestrator |  } 2025-03-23 20:51:42.432251 | orchestrator |  }, 2025-03-23 20:51:42.432720 | orchestrator |  "lvm_volumes": [ 2025-03-23 20:51:42.433538 | orchestrator |  { 2025-03-23 20:51:42.434538 | orchestrator |  "data": "osd-block-a8bbe70e-ef9b-5e78-a477-05274116adef", 2025-03-23 20:51:42.435453 | orchestrator |  
"data_vg": "ceph-a8bbe70e-ef9b-5e78-a477-05274116adef" 2025-03-23 20:51:42.435933 | orchestrator |  }, 2025-03-23 20:51:42.436654 | orchestrator |  { 2025-03-23 20:51:42.437309 | orchestrator |  "data": "osd-block-7ba69cd0-48cc-55e3-9649-a43d5cfb8428", 2025-03-23 20:51:42.437813 | orchestrator |  "data_vg": "ceph-7ba69cd0-48cc-55e3-9649-a43d5cfb8428" 2025-03-23 20:51:42.438551 | orchestrator |  } 2025-03-23 20:51:42.439141 | orchestrator |  ] 2025-03-23 20:51:42.439867 | orchestrator |  } 2025-03-23 20:51:42.440720 | orchestrator | } 2025-03-23 20:51:42.441099 | orchestrator | 2025-03-23 20:51:42.441700 | orchestrator | RUNNING HANDLER [Write configuration file] ************************************* 2025-03-23 20:51:42.442335 | orchestrator | Sunday 23 March 2025 20:51:42 +0000 (0:00:00.388) 0:00:16.397 ********** 2025-03-23 20:51:45.005008 | orchestrator | changed: [testbed-node-3 -> testbed-manager(192.168.16.5)] 2025-03-23 20:51:45.013833 | orchestrator | 2025-03-23 20:51:45.013868 | orchestrator | PLAY [Ceph configure LVM] ****************************************************** 2025-03-23 20:51:45.013891 | orchestrator | 2025-03-23 20:51:45.019252 | orchestrator | TASK [Get extra vars for Ceph configuration] *********************************** 2025-03-23 20:51:45.398459 | orchestrator | Sunday 23 March 2025 20:51:44 +0000 (0:00:02.580) 0:00:18.978 ********** 2025-03-23 20:51:45.398545 | orchestrator | ok: [testbed-node-4 -> testbed-manager(192.168.16.5)] 2025-03-23 20:51:45.400392 | orchestrator | 2025-03-23 20:51:45.400960 | orchestrator | TASK [Get initial list of available block devices] ***************************** 2025-03-23 20:51:45.403839 | orchestrator | Sunday 23 March 2025 20:51:45 +0000 (0:00:00.394) 0:00:19.372 ********** 2025-03-23 20:51:45.671841 | orchestrator | ok: [testbed-node-4] 2025-03-23 20:51:45.674365 | orchestrator | 2025-03-23 20:51:46.204095 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-03-23 20:51:46.204199 | orchestrator | Sunday 23 March 2025 20:51:45 +0000 (0:00:00.273) 0:00:19.645 ********** 2025-03-23 20:51:46.204232 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-4 => (item=loop0) 2025-03-23 20:51:46.206296 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-4 => (item=loop1) 2025-03-23 20:51:46.209482 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-4 => (item=loop2) 2025-03-23 20:51:46.209751 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-4 => (item=loop3) 2025-03-23 20:51:46.209777 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-4 => (item=loop4) 2025-03-23 20:51:46.211931 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-4 => (item=loop5) 2025-03-23 20:51:46.212543 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-4 => (item=loop6) 2025-03-23 20:51:46.218673 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-4 => (item=loop7) 2025-03-23 20:51:46.220750 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-4 => (item=sda) 2025-03-23 20:51:46.220773 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-4 => (item=sdb) 2025-03-23 20:51:46.220788 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-4 => (item=sdc) 2025-03-23 20:51:46.220807 | orchestrator | included: 
/ansible/tasks/_add-device-links.yml for testbed-node-4 => (item=sdd) 2025-03-23 20:51:46.461122 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-4 => (item=sr0) 2025-03-23 20:51:46.461207 | orchestrator | 2025-03-23 20:51:46.461223 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-03-23 20:51:46.461240 | orchestrator | Sunday 23 March 2025 20:51:46 +0000 (0:00:00.535) 0:00:20.181 ********** 2025-03-23 20:51:46.461265 | orchestrator | skipping: [testbed-node-4] 2025-03-23 20:51:46.462221 | orchestrator | 2025-03-23 20:51:46.463184 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-03-23 20:51:46.463819 | orchestrator | Sunday 23 March 2025 20:51:46 +0000 (0:00:00.257) 0:00:20.438 ********** 2025-03-23 20:51:46.753259 | orchestrator | skipping: [testbed-node-4] 2025-03-23 20:51:46.756052 | orchestrator | 2025-03-23 20:51:46.756337 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-03-23 20:51:46.759476 | orchestrator | Sunday 23 March 2025 20:51:46 +0000 (0:00:00.289) 0:00:20.728 ********** 2025-03-23 20:51:47.535783 | orchestrator | skipping: [testbed-node-4] 2025-03-23 20:51:47.536911 | orchestrator | 2025-03-23 20:51:47.537197 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-03-23 20:51:47.538943 | orchestrator | Sunday 23 March 2025 20:51:47 +0000 (0:00:00.783) 0:00:21.512 ********** 2025-03-23 20:51:47.795007 | orchestrator | skipping: [testbed-node-4] 2025-03-23 20:51:47.796166 | orchestrator | 2025-03-23 20:51:47.796896 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-03-23 20:51:47.799283 | orchestrator | Sunday 23 March 2025 20:51:47 +0000 (0:00:00.258) 0:00:21.770 ********** 2025-03-23 20:51:48.114977 | orchestrator | skipping: [testbed-node-4] 2025-03-23 20:51:48.115441 | orchestrator | 2025-03-23 20:51:48.116785 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-03-23 20:51:48.118354 | orchestrator | Sunday 23 March 2025 20:51:48 +0000 (0:00:00.321) 0:00:22.092 ********** 2025-03-23 20:51:48.401208 | orchestrator | skipping: [testbed-node-4] 2025-03-23 20:51:48.401787 | orchestrator | 2025-03-23 20:51:48.402388 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-03-23 20:51:48.402955 | orchestrator | Sunday 23 March 2025 20:51:48 +0000 (0:00:00.287) 0:00:22.379 ********** 2025-03-23 20:51:48.658410 | orchestrator | skipping: [testbed-node-4] 2025-03-23 20:51:48.659101 | orchestrator | 2025-03-23 20:51:48.660481 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-03-23 20:51:48.901385 | orchestrator | Sunday 23 March 2025 20:51:48 +0000 (0:00:00.254) 0:00:22.634 ********** 2025-03-23 20:51:48.901472 | orchestrator | skipping: [testbed-node-4] 2025-03-23 20:51:48.901525 | orchestrator | 2025-03-23 20:51:48.901542 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-03-23 20:51:48.901559 | orchestrator | Sunday 23 March 2025 20:51:48 +0000 (0:00:00.243) 0:00:22.877 ********** 2025-03-23 20:51:49.355215 | orchestrator | ok: [testbed-node-4] => (item=scsi-0QEMU_QEMU_HARDDISK_4f3230c7-420c-46b1-a455-5c4b6f068dba) 2025-03-23 20:51:49.355351 | orchestrator | ok: [testbed-node-4] => 
(item=scsi-SQEMU_QEMU_HARDDISK_4f3230c7-420c-46b1-a455-5c4b6f068dba) 2025-03-23 20:51:49.355373 | orchestrator | 2025-03-23 20:51:49.357996 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-03-23 20:51:49.360036 | orchestrator | Sunday 23 March 2025 20:51:49 +0000 (0:00:00.455) 0:00:23.333 ********** 2025-03-23 20:51:49.801922 | orchestrator | ok: [testbed-node-4] => (item=scsi-0QEMU_QEMU_HARDDISK_ed33c331-9d6a-419f-a2ff-c44c346902af) 2025-03-23 20:51:49.803941 | orchestrator | ok: [testbed-node-4] => (item=scsi-SQEMU_QEMU_HARDDISK_ed33c331-9d6a-419f-a2ff-c44c346902af) 2025-03-23 20:51:49.803988 | orchestrator | 2025-03-23 20:51:49.804008 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-03-23 20:51:50.233230 | orchestrator | Sunday 23 March 2025 20:51:49 +0000 (0:00:00.442) 0:00:23.776 ********** 2025-03-23 20:51:50.233344 | orchestrator | ok: [testbed-node-4] => (item=scsi-0QEMU_QEMU_HARDDISK_50c8fc43-f5ea-4ff6-9b1c-3a82fc0b92db) 2025-03-23 20:51:50.234305 | orchestrator | ok: [testbed-node-4] => (item=scsi-SQEMU_QEMU_HARDDISK_50c8fc43-f5ea-4ff6-9b1c-3a82fc0b92db) 2025-03-23 20:51:50.234995 | orchestrator | 2025-03-23 20:51:50.236801 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-03-23 20:51:50.240151 | orchestrator | Sunday 23 March 2025 20:51:50 +0000 (0:00:00.434) 0:00:24.210 ********** 2025-03-23 20:51:50.998006 | orchestrator | ok: [testbed-node-4] => (item=scsi-0QEMU_QEMU_HARDDISK_f00ea6e7-d866-4241-9ba8-a6f4135a6384) 2025-03-23 20:51:50.998251 | orchestrator | ok: [testbed-node-4] => (item=scsi-SQEMU_QEMU_HARDDISK_f00ea6e7-d866-4241-9ba8-a6f4135a6384) 2025-03-23 20:51:50.998276 | orchestrator | 2025-03-23 20:51:50.998800 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-03-23 20:51:50.998827 | orchestrator | Sunday 23 March 2025 20:51:50 +0000 (0:00:00.765) 0:00:24.976 ********** 2025-03-23 20:51:51.838881 | orchestrator | ok: [testbed-node-4] => (item=ata-QEMU_DVD-ROM_QM00001) 2025-03-23 20:51:51.842476 | orchestrator | 2025-03-23 20:51:51.842516 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-03-23 20:51:52.281885 | orchestrator | Sunday 23 March 2025 20:51:51 +0000 (0:00:00.836) 0:00:25.812 ********** 2025-03-23 20:51:52.282000 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-4 => (item=loop0) 2025-03-23 20:51:52.282881 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-4 => (item=loop1) 2025-03-23 20:51:52.286592 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-4 => (item=loop2) 2025-03-23 20:51:52.286849 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-4 => (item=loop3) 2025-03-23 20:51:52.286876 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-4 => (item=loop4) 2025-03-23 20:51:52.288648 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-4 => (item=loop5) 2025-03-23 20:51:52.289910 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-4 => (item=loop6) 2025-03-23 20:51:52.291324 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-4 => (item=loop7) 2025-03-23 20:51:52.292141 | orchestrator | included: 
/ansible/tasks/_add-device-partitions.yml for testbed-node-4 => (item=sda) 2025-03-23 20:51:52.293071 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-4 => (item=sdb) 2025-03-23 20:51:52.293382 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-4 => (item=sdc) 2025-03-23 20:51:52.294060 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-4 => (item=sdd) 2025-03-23 20:51:52.294542 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-4 => (item=sr0) 2025-03-23 20:51:52.294998 | orchestrator | 2025-03-23 20:51:52.295478 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-03-23 20:51:52.295891 | orchestrator | Sunday 23 March 2025 20:51:52 +0000 (0:00:00.446) 0:00:26.259 ********** 2025-03-23 20:51:52.498083 | orchestrator | skipping: [testbed-node-4] 2025-03-23 20:51:52.498770 | orchestrator | 2025-03-23 20:51:52.498811 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-03-23 20:51:52.499731 | orchestrator | Sunday 23 March 2025 20:51:52 +0000 (0:00:00.214) 0:00:26.474 ********** 2025-03-23 20:51:52.707765 | orchestrator | skipping: [testbed-node-4] 2025-03-23 20:51:52.708670 | orchestrator | 2025-03-23 20:51:52.708990 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-03-23 20:51:52.709875 | orchestrator | Sunday 23 March 2025 20:51:52 +0000 (0:00:00.210) 0:00:26.684 ********** 2025-03-23 20:51:52.947191 | orchestrator | skipping: [testbed-node-4] 2025-03-23 20:51:52.947552 | orchestrator | 2025-03-23 20:51:52.949411 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-03-23 20:51:52.949669 | orchestrator | Sunday 23 March 2025 20:51:52 +0000 (0:00:00.239) 0:00:26.924 ********** 2025-03-23 20:51:53.176081 | orchestrator | skipping: [testbed-node-4] 2025-03-23 20:51:53.177422 | orchestrator | 2025-03-23 20:51:53.177560 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-03-23 20:51:53.178587 | orchestrator | Sunday 23 March 2025 20:51:53 +0000 (0:00:00.226) 0:00:27.150 ********** 2025-03-23 20:51:53.387890 | orchestrator | skipping: [testbed-node-4] 2025-03-23 20:51:53.387999 | orchestrator | 2025-03-23 20:51:53.388852 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-03-23 20:51:53.390079 | orchestrator | Sunday 23 March 2025 20:51:53 +0000 (0:00:00.214) 0:00:27.365 ********** 2025-03-23 20:51:53.586402 | orchestrator | skipping: [testbed-node-4] 2025-03-23 20:51:53.587002 | orchestrator | 2025-03-23 20:51:53.588991 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-03-23 20:51:53.589020 | orchestrator | Sunday 23 March 2025 20:51:53 +0000 (0:00:00.196) 0:00:27.562 ********** 2025-03-23 20:51:53.810186 | orchestrator | skipping: [testbed-node-4] 2025-03-23 20:51:53.810852 | orchestrator | 2025-03-23 20:51:53.811503 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-03-23 20:51:53.812210 | orchestrator | Sunday 23 March 2025 20:51:53 +0000 (0:00:00.224) 0:00:27.786 ********** 2025-03-23 20:51:54.050959 | orchestrator | skipping: [testbed-node-4] 2025-03-23 20:51:54.051481 | orchestrator | 2025-03-23 20:51:54.052408 | orchestrator | TASK [Add known 
partitions to the list of available block devices] ************* 2025-03-23 20:51:54.053493 | orchestrator | Sunday 23 March 2025 20:51:54 +0000 (0:00:00.241) 0:00:28.027 ********** 2025-03-23 20:51:55.003853 | orchestrator | ok: [testbed-node-4] => (item=sda1) 2025-03-23 20:51:55.005455 | orchestrator | ok: [testbed-node-4] => (item=sda14) 2025-03-23 20:51:55.006094 | orchestrator | ok: [testbed-node-4] => (item=sda15) 2025-03-23 20:51:55.007556 | orchestrator | ok: [testbed-node-4] => (item=sda16) 2025-03-23 20:51:55.008039 | orchestrator | 2025-03-23 20:51:55.008838 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-03-23 20:51:55.009118 | orchestrator | Sunday 23 March 2025 20:51:54 +0000 (0:00:00.950) 0:00:28.978 ********** 2025-03-23 20:51:55.269607 | orchestrator | skipping: [testbed-node-4] 2025-03-23 20:51:55.269992 | orchestrator | 2025-03-23 20:51:55.270076 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-03-23 20:51:55.270810 | orchestrator | Sunday 23 March 2025 20:51:55 +0000 (0:00:00.268) 0:00:29.247 ********** 2025-03-23 20:51:55.558262 | orchestrator | skipping: [testbed-node-4] 2025-03-23 20:51:55.559210 | orchestrator | 2025-03-23 20:51:55.561440 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-03-23 20:51:55.562755 | orchestrator | Sunday 23 March 2025 20:51:55 +0000 (0:00:00.289) 0:00:29.536 ********** 2025-03-23 20:51:55.820214 | orchestrator | skipping: [testbed-node-4] 2025-03-23 20:51:55.820494 | orchestrator | 2025-03-23 20:51:55.822273 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-03-23 20:51:55.823505 | orchestrator | Sunday 23 March 2025 20:51:55 +0000 (0:00:00.260) 0:00:29.797 ********** 2025-03-23 20:51:56.042000 | orchestrator | skipping: [testbed-node-4] 2025-03-23 20:51:56.042534 | orchestrator | 2025-03-23 20:51:56.043781 | orchestrator | TASK [Set UUIDs for OSD VGs/LVs] *********************************************** 2025-03-23 20:51:56.044509 | orchestrator | Sunday 23 March 2025 20:51:56 +0000 (0:00:00.223) 0:00:30.020 ********** 2025-03-23 20:51:56.234538 | orchestrator | ok: [testbed-node-4] => (item={'key': 'sdb', 'value': None}) 2025-03-23 20:51:56.234927 | orchestrator | ok: [testbed-node-4] => (item={'key': 'sdc', 'value': None}) 2025-03-23 20:51:56.235236 | orchestrator | 2025-03-23 20:51:56.235263 | orchestrator | TASK [Generate WAL VG names] *************************************************** 2025-03-23 20:51:56.235422 | orchestrator | Sunday 23 March 2025 20:51:56 +0000 (0:00:00.192) 0:00:30.212 ********** 2025-03-23 20:51:56.384904 | orchestrator | skipping: [testbed-node-4] 2025-03-23 20:51:56.385829 | orchestrator | 2025-03-23 20:51:56.385858 | orchestrator | TASK [Generate DB VG names] **************************************************** 2025-03-23 20:51:56.386471 | orchestrator | Sunday 23 March 2025 20:51:56 +0000 (0:00:00.149) 0:00:30.362 ********** 2025-03-23 20:51:56.567029 | orchestrator | skipping: [testbed-node-4] 2025-03-23 20:51:56.567492 | orchestrator | 2025-03-23 20:51:56.568737 | orchestrator | TASK [Generate shared DB/WAL VG names] ***************************************** 2025-03-23 20:51:56.569387 | orchestrator | Sunday 23 March 2025 20:51:56 +0000 (0:00:00.181) 0:00:30.544 ********** 2025-03-23 20:51:56.730307 | orchestrator | skipping: [testbed-node-4] 2025-03-23 20:51:56.730836 | orchestrator | 2025-03-23 
20:51:56.731941 | orchestrator | TASK [Define lvm_volumes structures] ******************************************* 2025-03-23 20:51:56.733130 | orchestrator | Sunday 23 March 2025 20:51:56 +0000 (0:00:00.162) 0:00:30.706 ********** 2025-03-23 20:51:56.887703 | orchestrator | ok: [testbed-node-4] 2025-03-23 20:51:56.888980 | orchestrator | 2025-03-23 20:51:56.890320 | orchestrator | TASK [Generate lvm_volumes structure (block only)] ***************************** 2025-03-23 20:51:56.892231 | orchestrator | Sunday 23 March 2025 20:51:56 +0000 (0:00:00.157) 0:00:30.864 ********** 2025-03-23 20:51:57.098783 | orchestrator | ok: [testbed-node-4] => (item={'key': 'sdb', 'value': {'osd_lvm_uuid': 'e8e09cee-3c3a-56a6-a7d5-937e6639e1b3'}}) 2025-03-23 20:51:57.099479 | orchestrator | ok: [testbed-node-4] => (item={'key': 'sdc', 'value': {'osd_lvm_uuid': '62e3fac0-87ec-50cc-8f44-41551697f65b'}}) 2025-03-23 20:51:57.100206 | orchestrator | 2025-03-23 20:51:57.100841 | orchestrator | TASK [Generate lvm_volumes structure (block + db)] ***************************** 2025-03-23 20:51:57.100922 | orchestrator | Sunday 23 March 2025 20:51:57 +0000 (0:00:00.211) 0:00:31.076 ********** 2025-03-23 20:51:57.470863 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'sdb', 'value': {'osd_lvm_uuid': 'e8e09cee-3c3a-56a6-a7d5-937e6639e1b3'}})  2025-03-23 20:51:57.471427 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'sdc', 'value': {'osd_lvm_uuid': '62e3fac0-87ec-50cc-8f44-41551697f65b'}})  2025-03-23 20:51:57.472533 | orchestrator | skipping: [testbed-node-4] 2025-03-23 20:51:57.473376 | orchestrator | 2025-03-23 20:51:57.474383 | orchestrator | TASK [Generate lvm_volumes structure (block + wal)] **************************** 2025-03-23 20:51:57.475610 | orchestrator | Sunday 23 March 2025 20:51:57 +0000 (0:00:00.372) 0:00:31.448 ********** 2025-03-23 20:51:57.658132 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'sdb', 'value': {'osd_lvm_uuid': 'e8e09cee-3c3a-56a6-a7d5-937e6639e1b3'}})  2025-03-23 20:51:57.659864 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'sdc', 'value': {'osd_lvm_uuid': '62e3fac0-87ec-50cc-8f44-41551697f65b'}})  2025-03-23 20:51:57.659904 | orchestrator | skipping: [testbed-node-4] 2025-03-23 20:51:57.660164 | orchestrator | 2025-03-23 20:51:57.660195 | orchestrator | TASK [Generate lvm_volumes structure (block + db + wal)] *********************** 2025-03-23 20:51:57.662545 | orchestrator | Sunday 23 March 2025 20:51:57 +0000 (0:00:00.186) 0:00:31.635 ********** 2025-03-23 20:51:57.875334 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'sdb', 'value': {'osd_lvm_uuid': 'e8e09cee-3c3a-56a6-a7d5-937e6639e1b3'}})  2025-03-23 20:51:57.876457 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'sdc', 'value': {'osd_lvm_uuid': '62e3fac0-87ec-50cc-8f44-41551697f65b'}})  2025-03-23 20:51:57.876877 | orchestrator | skipping: [testbed-node-4] 2025-03-23 20:51:57.878655 | orchestrator | 2025-03-23 20:51:57.879195 | orchestrator | TASK [Compile lvm_volumes] ***************************************************** 2025-03-23 20:51:57.879501 | orchestrator | Sunday 23 March 2025 20:51:57 +0000 (0:00:00.211) 0:00:31.847 ********** 2025-03-23 20:51:58.018427 | orchestrator | ok: [testbed-node-4] 2025-03-23 20:51:58.018894 | orchestrator | 2025-03-23 20:51:58.020758 | orchestrator | TASK [Set OSD devices config data] ********************************************* 2025-03-23 20:51:58.022217 | orchestrator | Sunday 23 March 2025 20:51:58 +0000 
(0:00:00.149) 0:00:31.996 ********** 2025-03-23 20:51:58.174176 | orchestrator | ok: [testbed-node-4] 2025-03-23 20:51:58.174366 | orchestrator | 2025-03-23 20:51:58.174818 | orchestrator | TASK [Set DB devices config data] ********************************************** 2025-03-23 20:51:58.175236 | orchestrator | Sunday 23 March 2025 20:51:58 +0000 (0:00:00.152) 0:00:32.149 ********** 2025-03-23 20:51:58.300852 | orchestrator | skipping: [testbed-node-4] 2025-03-23 20:51:58.300980 | orchestrator | 2025-03-23 20:51:58.301005 | orchestrator | TASK [Set WAL devices config data] ********************************************* 2025-03-23 20:51:58.301769 | orchestrator | Sunday 23 March 2025 20:51:58 +0000 (0:00:00.128) 0:00:32.278 ********** 2025-03-23 20:51:58.453809 | orchestrator | skipping: [testbed-node-4] 2025-03-23 20:51:58.457433 | orchestrator | 2025-03-23 20:51:58.599565 | orchestrator | TASK [Set DB+WAL devices config data] ****************************************** 2025-03-23 20:51:58.599697 | orchestrator | Sunday 23 March 2025 20:51:58 +0000 (0:00:00.152) 0:00:32.431 ********** 2025-03-23 20:51:58.599728 | orchestrator | skipping: [testbed-node-4] 2025-03-23 20:51:58.602317 | orchestrator | 2025-03-23 20:51:58.760272 | orchestrator | TASK [Print ceph_osd_devices] ************************************************** 2025-03-23 20:51:58.760354 | orchestrator | Sunday 23 March 2025 20:51:58 +0000 (0:00:00.145) 0:00:32.577 ********** 2025-03-23 20:51:58.760382 | orchestrator | ok: [testbed-node-4] => { 2025-03-23 20:51:58.760924 | orchestrator |  "ceph_osd_devices": { 2025-03-23 20:51:58.761689 | orchestrator |  "sdb": { 2025-03-23 20:51:58.762567 | orchestrator |  "osd_lvm_uuid": "e8e09cee-3c3a-56a6-a7d5-937e6639e1b3" 2025-03-23 20:51:58.763512 | orchestrator |  }, 2025-03-23 20:51:58.765750 | orchestrator |  "sdc": { 2025-03-23 20:51:58.769112 | orchestrator |  "osd_lvm_uuid": "62e3fac0-87ec-50cc-8f44-41551697f65b" 2025-03-23 20:51:58.769150 | orchestrator |  } 2025-03-23 20:51:58.769235 | orchestrator |  } 2025-03-23 20:51:58.770061 | orchestrator | } 2025-03-23 20:51:58.770370 | orchestrator | 2025-03-23 20:51:58.770753 | orchestrator | TASK [Print WAL devices] ******************************************************* 2025-03-23 20:51:58.771169 | orchestrator | Sunday 23 March 2025 20:51:58 +0000 (0:00:00.161) 0:00:32.738 ********** 2025-03-23 20:51:58.910167 | orchestrator | skipping: [testbed-node-4] 2025-03-23 20:51:58.911017 | orchestrator | 2025-03-23 20:51:58.911181 | orchestrator | TASK [Print DB devices] ******************************************************** 2025-03-23 20:51:58.912801 | orchestrator | Sunday 23 March 2025 20:51:58 +0000 (0:00:00.149) 0:00:32.888 ********** 2025-03-23 20:51:59.069608 | orchestrator | skipping: [testbed-node-4] 2025-03-23 20:51:59.070846 | orchestrator | 2025-03-23 20:51:59.071964 | orchestrator | TASK [Print shared DB/WAL devices] ********************************************* 2025-03-23 20:51:59.073050 | orchestrator | Sunday 23 March 2025 20:51:59 +0000 (0:00:00.160) 0:00:33.048 ********** 2025-03-23 20:51:59.510392 | orchestrator | skipping: [testbed-node-4] 2025-03-23 20:51:59.510859 | orchestrator | 2025-03-23 20:51:59.511431 | orchestrator | TASK [Print configuration data] ************************************************ 2025-03-23 20:51:59.511798 | orchestrator | Sunday 23 March 2025 20:51:59 +0000 (0:00:00.438) 0:00:33.487 ********** 2025-03-23 20:51:59.810327 | orchestrator | changed: [testbed-node-4] => { 2025-03-23 20:51:59.813869 | 
orchestrator |  "_ceph_configure_lvm_config_data": { 2025-03-23 20:51:59.815196 | orchestrator |  "ceph_osd_devices": { 2025-03-23 20:51:59.816906 | orchestrator |  "sdb": { 2025-03-23 20:51:59.817883 | orchestrator |  "osd_lvm_uuid": "e8e09cee-3c3a-56a6-a7d5-937e6639e1b3" 2025-03-23 20:51:59.818297 | orchestrator |  }, 2025-03-23 20:51:59.819317 | orchestrator |  "sdc": { 2025-03-23 20:51:59.820068 | orchestrator |  "osd_lvm_uuid": "62e3fac0-87ec-50cc-8f44-41551697f65b" 2025-03-23 20:51:59.820574 | orchestrator |  } 2025-03-23 20:51:59.821339 | orchestrator |  }, 2025-03-23 20:51:59.822145 | orchestrator |  "lvm_volumes": [ 2025-03-23 20:51:59.822726 | orchestrator |  { 2025-03-23 20:51:59.824407 | orchestrator |  "data": "osd-block-e8e09cee-3c3a-56a6-a7d5-937e6639e1b3", 2025-03-23 20:51:59.824472 | orchestrator |  "data_vg": "ceph-e8e09cee-3c3a-56a6-a7d5-937e6639e1b3" 2025-03-23 20:51:59.825274 | orchestrator |  }, 2025-03-23 20:51:59.825300 | orchestrator |  { 2025-03-23 20:51:59.825318 | orchestrator |  "data": "osd-block-62e3fac0-87ec-50cc-8f44-41551697f65b", 2025-03-23 20:51:59.825339 | orchestrator |  "data_vg": "ceph-62e3fac0-87ec-50cc-8f44-41551697f65b" 2025-03-23 20:51:59.826260 | orchestrator |  } 2025-03-23 20:51:59.826541 | orchestrator |  ] 2025-03-23 20:51:59.827189 | orchestrator |  } 2025-03-23 20:51:59.827714 | orchestrator | } 2025-03-23 20:51:59.827984 | orchestrator | 2025-03-23 20:51:59.828436 | orchestrator | RUNNING HANDLER [Write configuration file] ************************************* 2025-03-23 20:51:59.828959 | orchestrator | Sunday 23 March 2025 20:51:59 +0000 (0:00:00.300) 0:00:33.788 ********** 2025-03-23 20:52:01.299074 | orchestrator | changed: [testbed-node-4 -> testbed-manager(192.168.16.5)] 2025-03-23 20:52:01.299325 | orchestrator | 2025-03-23 20:52:01.300610 | orchestrator | PLAY [Ceph configure LVM] ****************************************************** 2025-03-23 20:52:01.301096 | orchestrator | 2025-03-23 20:52:01.301972 | orchestrator | TASK [Get extra vars for Ceph configuration] *********************************** 2025-03-23 20:52:01.302687 | orchestrator | Sunday 23 March 2025 20:52:01 +0000 (0:00:01.487) 0:00:35.275 ********** 2025-03-23 20:52:01.999132 | orchestrator | ok: [testbed-node-5 -> testbed-manager(192.168.16.5)] 2025-03-23 20:52:02.000708 | orchestrator | 2025-03-23 20:52:02.003914 | orchestrator | TASK [Get initial list of available block devices] ***************************** 2025-03-23 20:52:02.004419 | orchestrator | Sunday 23 March 2025 20:52:01 +0000 (0:00:00.701) 0:00:35.977 ********** 2025-03-23 20:52:02.326575 | orchestrator | ok: [testbed-node-5] 2025-03-23 20:52:02.327928 | orchestrator | 2025-03-23 20:52:02.328370 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-03-23 20:52:02.329437 | orchestrator | Sunday 23 March 2025 20:52:02 +0000 (0:00:00.326) 0:00:36.303 ********** 2025-03-23 20:52:02.752868 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-5 => (item=loop0) 2025-03-23 20:52:02.753535 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-5 => (item=loop1) 2025-03-23 20:52:02.758150 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-5 => (item=loop2) 2025-03-23 20:52:02.759456 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-5 => (item=loop3) 2025-03-23 20:52:02.759766 | orchestrator | included: /ansible/tasks/_add-device-links.yml for 
testbed-node-5 => (item=loop4) 2025-03-23 20:52:02.760811 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-5 => (item=loop5) 2025-03-23 20:52:02.761610 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-5 => (item=loop6) 2025-03-23 20:52:02.763496 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-5 => (item=loop7) 2025-03-23 20:52:02.763809 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-5 => (item=sda) 2025-03-23 20:52:02.765068 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-5 => (item=sdb) 2025-03-23 20:52:02.765418 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-5 => (item=sdc) 2025-03-23 20:52:02.766916 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-5 => (item=sdd) 2025-03-23 20:52:02.767155 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-5 => (item=sr0) 2025-03-23 20:52:02.767873 | orchestrator | 2025-03-23 20:52:02.768864 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-03-23 20:52:02.769898 | orchestrator | Sunday 23 March 2025 20:52:02 +0000 (0:00:00.426) 0:00:36.730 ********** 2025-03-23 20:52:02.989986 | orchestrator | skipping: [testbed-node-5] 2025-03-23 20:52:02.990935 | orchestrator | 2025-03-23 20:52:02.992156 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-03-23 20:52:02.993116 | orchestrator | Sunday 23 March 2025 20:52:02 +0000 (0:00:00.234) 0:00:36.965 ********** 2025-03-23 20:52:03.210134 | orchestrator | skipping: [testbed-node-5] 2025-03-23 20:52:03.211500 | orchestrator | 2025-03-23 20:52:03.214150 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-03-23 20:52:03.215248 | orchestrator | Sunday 23 March 2025 20:52:03 +0000 (0:00:00.222) 0:00:37.187 ********** 2025-03-23 20:52:03.433919 | orchestrator | skipping: [testbed-node-5] 2025-03-23 20:52:03.434668 | orchestrator | 2025-03-23 20:52:03.436790 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-03-23 20:52:03.437608 | orchestrator | Sunday 23 March 2025 20:52:03 +0000 (0:00:00.218) 0:00:37.406 ********** 2025-03-23 20:52:03.666570 | orchestrator | skipping: [testbed-node-5] 2025-03-23 20:52:03.667974 | orchestrator | 2025-03-23 20:52:03.674410 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-03-23 20:52:03.675779 | orchestrator | Sunday 23 March 2025 20:52:03 +0000 (0:00:00.238) 0:00:37.645 ********** 2025-03-23 20:52:03.903179 | orchestrator | skipping: [testbed-node-5] 2025-03-23 20:52:03.904353 | orchestrator | 2025-03-23 20:52:03.908336 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-03-23 20:52:03.908551 | orchestrator | Sunday 23 March 2025 20:52:03 +0000 (0:00:00.234) 0:00:37.880 ********** 2025-03-23 20:52:04.166885 | orchestrator | skipping: [testbed-node-5] 2025-03-23 20:52:04.169867 | orchestrator | 2025-03-23 20:52:04.172854 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-03-23 20:52:04.173521 | orchestrator | Sunday 23 March 2025 20:52:04 +0000 (0:00:00.258) 0:00:38.138 ********** 2025-03-23 20:52:04.430745 | orchestrator | skipping: [testbed-node-5] 2025-03-23 20:52:04.431826 
| orchestrator | 2025-03-23 20:52:04.433675 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-03-23 20:52:04.435322 | orchestrator | Sunday 23 March 2025 20:52:04 +0000 (0:00:00.269) 0:00:38.408 ********** 2025-03-23 20:52:05.334283 | orchestrator | skipping: [testbed-node-5] 2025-03-23 20:52:05.335156 | orchestrator | 2025-03-23 20:52:05.338072 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-03-23 20:52:05.338773 | orchestrator | Sunday 23 March 2025 20:52:05 +0000 (0:00:00.894) 0:00:39.302 ********** 2025-03-23 20:52:05.772512 | orchestrator | ok: [testbed-node-5] => (item=scsi-0QEMU_QEMU_HARDDISK_2c3245a0-bb3b-48a6-962d-1b5b9b49262d) 2025-03-23 20:52:05.783259 | orchestrator | ok: [testbed-node-5] => (item=scsi-SQEMU_QEMU_HARDDISK_2c3245a0-bb3b-48a6-962d-1b5b9b49262d) 2025-03-23 20:52:05.786592 | orchestrator | 2025-03-23 20:52:06.273224 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-03-23 20:52:06.273329 | orchestrator | Sunday 23 March 2025 20:52:05 +0000 (0:00:00.443) 0:00:39.746 ********** 2025-03-23 20:52:06.273365 | orchestrator | ok: [testbed-node-5] => (item=scsi-0QEMU_QEMU_HARDDISK_1aae855b-0878-455b-beee-c51e17b854da) 2025-03-23 20:52:06.274302 | orchestrator | ok: [testbed-node-5] => (item=scsi-SQEMU_QEMU_HARDDISK_1aae855b-0878-455b-beee-c51e17b854da) 2025-03-23 20:52:06.275324 | orchestrator | 2025-03-23 20:52:06.276330 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-03-23 20:52:06.277367 | orchestrator | Sunday 23 March 2025 20:52:06 +0000 (0:00:00.502) 0:00:40.249 ********** 2025-03-23 20:52:06.733898 | orchestrator | ok: [testbed-node-5] => (item=scsi-0QEMU_QEMU_HARDDISK_4ae64758-af7a-4ee3-835e-8ab2b9979c52) 2025-03-23 20:52:06.735052 | orchestrator | ok: [testbed-node-5] => (item=scsi-SQEMU_QEMU_HARDDISK_4ae64758-af7a-4ee3-835e-8ab2b9979c52) 2025-03-23 20:52:06.735088 | orchestrator | 2025-03-23 20:52:06.739837 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-03-23 20:52:07.238081 | orchestrator | Sunday 23 March 2025 20:52:06 +0000 (0:00:00.461) 0:00:40.710 ********** 2025-03-23 20:52:07.238194 | orchestrator | ok: [testbed-node-5] => (item=scsi-0QEMU_QEMU_HARDDISK_558eae64-1e92-4eba-9b7a-c9f2592aca3c) 2025-03-23 20:52:07.239231 | orchestrator | ok: [testbed-node-5] => (item=scsi-SQEMU_QEMU_HARDDISK_558eae64-1e92-4eba-9b7a-c9f2592aca3c) 2025-03-23 20:52:07.240108 | orchestrator | 2025-03-23 20:52:07.585619 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-03-23 20:52:07.585766 | orchestrator | Sunday 23 March 2025 20:52:07 +0000 (0:00:00.503) 0:00:41.214 ********** 2025-03-23 20:52:07.585797 | orchestrator | ok: [testbed-node-5] => (item=ata-QEMU_DVD-ROM_QM00001) 2025-03-23 20:52:07.587647 | orchestrator | 2025-03-23 20:52:07.590329 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-03-23 20:52:08.054928 | orchestrator | Sunday 23 March 2025 20:52:07 +0000 (0:00:00.347) 0:00:41.561 ********** 2025-03-23 20:52:08.055033 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-5 => (item=loop0) 2025-03-23 20:52:08.055462 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-5 => (item=loop1) 2025-03-23 20:52:08.056725 | orchestrator | 
included: /ansible/tasks/_add-device-partitions.yml for testbed-node-5 => (item=loop2) 2025-03-23 20:52:08.057166 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-5 => (item=loop3) 2025-03-23 20:52:08.060925 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-5 => (item=loop4) 2025-03-23 20:52:08.061157 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-5 => (item=loop5) 2025-03-23 20:52:08.061796 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-5 => (item=loop6) 2025-03-23 20:52:08.062704 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-5 => (item=loop7) 2025-03-23 20:52:08.063276 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-5 => (item=sda) 2025-03-23 20:52:08.065897 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-5 => (item=sdb) 2025-03-23 20:52:08.066376 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-5 => (item=sdc) 2025-03-23 20:52:08.066994 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-5 => (item=sdd) 2025-03-23 20:52:08.069740 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-5 => (item=sr0) 2025-03-23 20:52:08.070338 | orchestrator | 2025-03-23 20:52:08.070458 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-03-23 20:52:08.071173 | orchestrator | Sunday 23 March 2025 20:52:08 +0000 (0:00:00.469) 0:00:42.031 ********** 2025-03-23 20:52:08.313844 | orchestrator | skipping: [testbed-node-5] 2025-03-23 20:52:08.314924 | orchestrator | 2025-03-23 20:52:08.315795 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-03-23 20:52:08.316350 | orchestrator | Sunday 23 March 2025 20:52:08 +0000 (0:00:00.260) 0:00:42.291 ********** 2025-03-23 20:52:08.515979 | orchestrator | skipping: [testbed-node-5] 2025-03-23 20:52:08.516165 | orchestrator | 2025-03-23 20:52:08.517348 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-03-23 20:52:08.517871 | orchestrator | Sunday 23 March 2025 20:52:08 +0000 (0:00:00.201) 0:00:42.493 ********** 2025-03-23 20:52:09.307990 | orchestrator | skipping: [testbed-node-5] 2025-03-23 20:52:09.309078 | orchestrator | 2025-03-23 20:52:09.309578 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-03-23 20:52:09.309615 | orchestrator | Sunday 23 March 2025 20:52:09 +0000 (0:00:00.792) 0:00:43.285 ********** 2025-03-23 20:52:09.537440 | orchestrator | skipping: [testbed-node-5] 2025-03-23 20:52:09.542470 | orchestrator | 2025-03-23 20:52:09.542717 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-03-23 20:52:09.543679 | orchestrator | Sunday 23 March 2025 20:52:09 +0000 (0:00:00.229) 0:00:43.514 ********** 2025-03-23 20:52:09.778476 | orchestrator | skipping: [testbed-node-5] 2025-03-23 20:52:09.779188 | orchestrator | 2025-03-23 20:52:09.779301 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-03-23 20:52:09.779381 | orchestrator | Sunday 23 March 2025 20:52:09 +0000 (0:00:00.240) 0:00:43.755 ********** 2025-03-23 20:52:10.025235 | orchestrator | skipping: [testbed-node-5] 2025-03-23 
20:52:10.025468 | orchestrator | 2025-03-23 20:52:10.028783 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-03-23 20:52:10.249536 | orchestrator | Sunday 23 March 2025 20:52:10 +0000 (0:00:00.243) 0:00:43.998 ********** 2025-03-23 20:52:10.249610 | orchestrator | skipping: [testbed-node-5] 2025-03-23 20:52:10.250726 | orchestrator | 2025-03-23 20:52:10.252351 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-03-23 20:52:10.259323 | orchestrator | Sunday 23 March 2025 20:52:10 +0000 (0:00:00.228) 0:00:44.227 ********** 2025-03-23 20:52:10.565205 | orchestrator | skipping: [testbed-node-5] 2025-03-23 20:52:10.566115 | orchestrator | 2025-03-23 20:52:10.566897 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-03-23 20:52:10.568067 | orchestrator | Sunday 23 March 2025 20:52:10 +0000 (0:00:00.315) 0:00:44.542 ********** 2025-03-23 20:52:11.426320 | orchestrator | ok: [testbed-node-5] => (item=sda1) 2025-03-23 20:52:11.426696 | orchestrator | ok: [testbed-node-5] => (item=sda14) 2025-03-23 20:52:11.427415 | orchestrator | ok: [testbed-node-5] => (item=sda15) 2025-03-23 20:52:11.427905 | orchestrator | ok: [testbed-node-5] => (item=sda16) 2025-03-23 20:52:11.432022 | orchestrator | 2025-03-23 20:52:11.643202 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-03-23 20:52:11.643317 | orchestrator | Sunday 23 March 2025 20:52:11 +0000 (0:00:00.861) 0:00:45.403 ********** 2025-03-23 20:52:11.643353 | orchestrator | skipping: [testbed-node-5] 2025-03-23 20:52:11.644799 | orchestrator | 2025-03-23 20:52:11.648658 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-03-23 20:52:11.861091 | orchestrator | Sunday 23 March 2025 20:52:11 +0000 (0:00:00.216) 0:00:45.620 ********** 2025-03-23 20:52:11.861190 | orchestrator | skipping: [testbed-node-5] 2025-03-23 20:52:11.862069 | orchestrator | 2025-03-23 20:52:11.863295 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-03-23 20:52:11.863789 | orchestrator | Sunday 23 March 2025 20:52:11 +0000 (0:00:00.218) 0:00:45.839 ********** 2025-03-23 20:52:12.092789 | orchestrator | skipping: [testbed-node-5] 2025-03-23 20:52:12.096094 | orchestrator | 2025-03-23 20:52:12.318320 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-03-23 20:52:12.318408 | orchestrator | Sunday 23 March 2025 20:52:12 +0000 (0:00:00.227) 0:00:46.066 ********** 2025-03-23 20:52:12.318440 | orchestrator | skipping: [testbed-node-5] 2025-03-23 20:52:12.318508 | orchestrator | 2025-03-23 20:52:12.319125 | orchestrator | TASK [Set UUIDs for OSD VGs/LVs] *********************************************** 2025-03-23 20:52:12.321303 | orchestrator | Sunday 23 March 2025 20:52:12 +0000 (0:00:00.230) 0:00:46.296 ********** 2025-03-23 20:52:12.783660 | orchestrator | ok: [testbed-node-5] => (item={'key': 'sdb', 'value': None}) 2025-03-23 20:52:12.784742 | orchestrator | ok: [testbed-node-5] => (item={'key': 'sdc', 'value': None}) 2025-03-23 20:52:12.787324 | orchestrator | 2025-03-23 20:52:12.788938 | orchestrator | TASK [Generate WAL VG names] *************************************************** 2025-03-23 20:52:12.789791 | orchestrator | Sunday 23 March 2025 20:52:12 +0000 (0:00:00.463) 0:00:46.759 ********** 2025-03-23 20:52:12.952153 | 
orchestrator | skipping: [testbed-node-5] 2025-03-23 20:52:12.952704 | orchestrator | 2025-03-23 20:52:12.953193 | orchestrator | TASK [Generate DB VG names] **************************************************** 2025-03-23 20:52:12.954108 | orchestrator | Sunday 23 March 2025 20:52:12 +0000 (0:00:00.169) 0:00:46.929 ********** 2025-03-23 20:52:13.130840 | orchestrator | skipping: [testbed-node-5] 2025-03-23 20:52:13.135343 | orchestrator | 2025-03-23 20:52:13.136412 | orchestrator | TASK [Generate shared DB/WAL VG names] ***************************************** 2025-03-23 20:52:13.137739 | orchestrator | Sunday 23 March 2025 20:52:13 +0000 (0:00:00.178) 0:00:47.108 ********** 2025-03-23 20:52:13.307650 | orchestrator | skipping: [testbed-node-5] 2025-03-23 20:52:13.308825 | orchestrator | 2025-03-23 20:52:13.310010 | orchestrator | TASK [Define lvm_volumes structures] ******************************************* 2025-03-23 20:52:13.310606 | orchestrator | Sunday 23 March 2025 20:52:13 +0000 (0:00:00.176) 0:00:47.285 ********** 2025-03-23 20:52:13.472702 | orchestrator | ok: [testbed-node-5] 2025-03-23 20:52:13.476975 | orchestrator | 2025-03-23 20:52:13.478524 | orchestrator | TASK [Generate lvm_volumes structure (block only)] ***************************** 2025-03-23 20:52:13.479730 | orchestrator | Sunday 23 March 2025 20:52:13 +0000 (0:00:00.162) 0:00:47.447 ********** 2025-03-23 20:52:13.673616 | orchestrator | ok: [testbed-node-5] => (item={'key': 'sdb', 'value': {'osd_lvm_uuid': '1b64d1ac-aa1a-58dd-947f-80f4fb53d79b'}}) 2025-03-23 20:52:13.674796 | orchestrator | ok: [testbed-node-5] => (item={'key': 'sdc', 'value': {'osd_lvm_uuid': 'a4d93e20-7e7b-5457-96cd-ba2e435e9438'}}) 2025-03-23 20:52:13.674824 | orchestrator | 2025-03-23 20:52:13.674845 | orchestrator | TASK [Generate lvm_volumes structure (block + db)] ***************************** 2025-03-23 20:52:13.676129 | orchestrator | Sunday 23 March 2025 20:52:13 +0000 (0:00:00.197) 0:00:47.645 ********** 2025-03-23 20:52:13.840495 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'sdb', 'value': {'osd_lvm_uuid': '1b64d1ac-aa1a-58dd-947f-80f4fb53d79b'}})  2025-03-23 20:52:13.841257 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'sdc', 'value': {'osd_lvm_uuid': 'a4d93e20-7e7b-5457-96cd-ba2e435e9438'}})  2025-03-23 20:52:13.841921 | orchestrator | skipping: [testbed-node-5] 2025-03-23 20:52:13.842832 | orchestrator | 2025-03-23 20:52:13.843499 | orchestrator | TASK [Generate lvm_volumes structure (block + wal)] **************************** 2025-03-23 20:52:13.844001 | orchestrator | Sunday 23 March 2025 20:52:13 +0000 (0:00:00.173) 0:00:47.818 ********** 2025-03-23 20:52:14.106097 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'sdb', 'value': {'osd_lvm_uuid': '1b64d1ac-aa1a-58dd-947f-80f4fb53d79b'}})  2025-03-23 20:52:14.107110 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'sdc', 'value': {'osd_lvm_uuid': 'a4d93e20-7e7b-5457-96cd-ba2e435e9438'}})  2025-03-23 20:52:14.107729 | orchestrator | skipping: [testbed-node-5] 2025-03-23 20:52:14.108848 | orchestrator | 2025-03-23 20:52:14.110134 | orchestrator | TASK [Generate lvm_volumes structure (block + db + wal)] *********************** 2025-03-23 20:52:14.110693 | orchestrator | Sunday 23 March 2025 20:52:14 +0000 (0:00:00.263) 0:00:48.082 ********** 2025-03-23 20:52:14.287561 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'sdb', 'value': {'osd_lvm_uuid': '1b64d1ac-aa1a-58dd-947f-80f4fb53d79b'}})  2025-03-23 20:52:14.288858 
| orchestrator | skipping: [testbed-node-5] => (item={'key': 'sdc', 'value': {'osd_lvm_uuid': 'a4d93e20-7e7b-5457-96cd-ba2e435e9438'}})  2025-03-23 20:52:14.291956 | orchestrator | skipping: [testbed-node-5] 2025-03-23 20:52:14.293214 | orchestrator | 2025-03-23 20:52:14.294435 | orchestrator | TASK [Compile lvm_volumes] ***************************************************** 2025-03-23 20:52:14.295918 | orchestrator | Sunday 23 March 2025 20:52:14 +0000 (0:00:00.181) 0:00:48.264 ********** 2025-03-23 20:52:14.465679 | orchestrator | ok: [testbed-node-5] 2025-03-23 20:52:14.465856 | orchestrator | 2025-03-23 20:52:14.466431 | orchestrator | TASK [Set OSD devices config data] ********************************************* 2025-03-23 20:52:14.466897 | orchestrator | Sunday 23 March 2025 20:52:14 +0000 (0:00:00.179) 0:00:48.443 ********** 2025-03-23 20:52:14.621481 | orchestrator | ok: [testbed-node-5] 2025-03-23 20:52:14.621618 | orchestrator | 2025-03-23 20:52:14.622246 | orchestrator | TASK [Set DB devices config data] ********************************************** 2025-03-23 20:52:14.623383 | orchestrator | Sunday 23 March 2025 20:52:14 +0000 (0:00:00.154) 0:00:48.597 ********** 2025-03-23 20:52:15.019781 | orchestrator | skipping: [testbed-node-5] 2025-03-23 20:52:15.020592 | orchestrator | 2025-03-23 20:52:15.020849 | orchestrator | TASK [Set WAL devices config data] ********************************************* 2025-03-23 20:52:15.022160 | orchestrator | Sunday 23 March 2025 20:52:15 +0000 (0:00:00.399) 0:00:48.997 ********** 2025-03-23 20:52:15.178767 | orchestrator | skipping: [testbed-node-5] 2025-03-23 20:52:15.182534 | orchestrator | 2025-03-23 20:52:15.183539 | orchestrator | TASK [Set DB+WAL devices config data] ****************************************** 2025-03-23 20:52:15.184360 | orchestrator | Sunday 23 March 2025 20:52:15 +0000 (0:00:00.157) 0:00:49.155 ********** 2025-03-23 20:52:15.367187 | orchestrator | skipping: [testbed-node-5] 2025-03-23 20:52:15.367674 | orchestrator | 2025-03-23 20:52:15.367926 | orchestrator | TASK [Print ceph_osd_devices] ************************************************** 2025-03-23 20:52:15.368681 | orchestrator | Sunday 23 March 2025 20:52:15 +0000 (0:00:00.189) 0:00:49.344 ********** 2025-03-23 20:52:15.555926 | orchestrator | ok: [testbed-node-5] => { 2025-03-23 20:52:15.559030 | orchestrator |  "ceph_osd_devices": { 2025-03-23 20:52:15.559804 | orchestrator |  "sdb": { 2025-03-23 20:52:15.560680 | orchestrator |  "osd_lvm_uuid": "1b64d1ac-aa1a-58dd-947f-80f4fb53d79b" 2025-03-23 20:52:15.562747 | orchestrator |  }, 2025-03-23 20:52:15.563024 | orchestrator |  "sdc": { 2025-03-23 20:52:15.563821 | orchestrator |  "osd_lvm_uuid": "a4d93e20-7e7b-5457-96cd-ba2e435e9438" 2025-03-23 20:52:15.564384 | orchestrator |  } 2025-03-23 20:52:15.564755 | orchestrator |  } 2025-03-23 20:52:15.564974 | orchestrator | } 2025-03-23 20:52:15.565415 | orchestrator | 2025-03-23 20:52:15.565813 | orchestrator | TASK [Print WAL devices] ******************************************************* 2025-03-23 20:52:15.566750 | orchestrator | Sunday 23 March 2025 20:52:15 +0000 (0:00:00.190) 0:00:49.535 ********** 2025-03-23 20:52:15.699682 | orchestrator | skipping: [testbed-node-5] 2025-03-23 20:52:15.699795 | orchestrator | 2025-03-23 20:52:15.700215 | orchestrator | TASK [Print DB devices] ******************************************************** 2025-03-23 20:52:15.700331 | orchestrator | Sunday 23 March 2025 20:52:15 +0000 (0:00:00.142) 0:00:49.677 ********** 2025-03-23 
20:52:15.841484 | orchestrator | skipping: [testbed-node-5] 2025-03-23 20:52:15.841974 | orchestrator | 2025-03-23 20:52:15.843151 | orchestrator | TASK [Print shared DB/WAL devices] ********************************************* 2025-03-23 20:52:15.843973 | orchestrator | Sunday 23 March 2025 20:52:15 +0000 (0:00:00.141) 0:00:49.819 ********** 2025-03-23 20:52:15.985148 | orchestrator | skipping: [testbed-node-5] 2025-03-23 20:52:15.985681 | orchestrator | 2025-03-23 20:52:15.985726 | orchestrator | TASK [Print configuration data] ************************************************ 2025-03-23 20:52:15.986148 | orchestrator | Sunday 23 March 2025 20:52:15 +0000 (0:00:00.143) 0:00:49.963 ********** 2025-03-23 20:52:16.278922 | orchestrator | changed: [testbed-node-5] => { 2025-03-23 20:52:16.280060 | orchestrator |  "_ceph_configure_lvm_config_data": { 2025-03-23 20:52:16.281589 | orchestrator |  "ceph_osd_devices": { 2025-03-23 20:52:16.282516 | orchestrator |  "sdb": { 2025-03-23 20:52:16.283842 | orchestrator |  "osd_lvm_uuid": "1b64d1ac-aa1a-58dd-947f-80f4fb53d79b" 2025-03-23 20:52:16.284800 | orchestrator |  }, 2025-03-23 20:52:16.285776 | orchestrator |  "sdc": { 2025-03-23 20:52:16.286908 | orchestrator |  "osd_lvm_uuid": "a4d93e20-7e7b-5457-96cd-ba2e435e9438" 2025-03-23 20:52:16.288602 | orchestrator |  } 2025-03-23 20:52:16.288666 | orchestrator |  }, 2025-03-23 20:52:16.289314 | orchestrator |  "lvm_volumes": [ 2025-03-23 20:52:16.290062 | orchestrator |  { 2025-03-23 20:52:16.290685 | orchestrator |  "data": "osd-block-1b64d1ac-aa1a-58dd-947f-80f4fb53d79b", 2025-03-23 20:52:16.291110 | orchestrator |  "data_vg": "ceph-1b64d1ac-aa1a-58dd-947f-80f4fb53d79b" 2025-03-23 20:52:16.291673 | orchestrator |  }, 2025-03-23 20:52:16.292388 | orchestrator |  { 2025-03-23 20:52:16.292670 | orchestrator |  "data": "osd-block-a4d93e20-7e7b-5457-96cd-ba2e435e9438", 2025-03-23 20:52:16.293316 | orchestrator |  "data_vg": "ceph-a4d93e20-7e7b-5457-96cd-ba2e435e9438" 2025-03-23 20:52:16.293785 | orchestrator |  } 2025-03-23 20:52:16.294300 | orchestrator |  ] 2025-03-23 20:52:16.294802 | orchestrator |  } 2025-03-23 20:52:16.295339 | orchestrator | } 2025-03-23 20:52:16.295948 | orchestrator | 2025-03-23 20:52:16.296620 | orchestrator | RUNNING HANDLER [Write configuration file] ************************************* 2025-03-23 20:52:16.296919 | orchestrator | Sunday 23 March 2025 20:52:16 +0000 (0:00:00.291) 0:00:50.254 ********** 2025-03-23 20:52:17.761144 | orchestrator | changed: [testbed-node-5 -> testbed-manager(192.168.16.5)] 2025-03-23 20:52:17.764761 | orchestrator | 2025-03-23 20:52:17.764813 | orchestrator | PLAY RECAP ********************************************************************* 2025-03-23 20:52:17.764827 | orchestrator | 2025-03-23 20:52:17 | INFO  | Play has been completed. There may now be a delay until all logs have been written. 2025-03-23 20:52:17.765706 | orchestrator | 2025-03-23 20:52:17 | INFO  | Please wait and do not abort execution. 
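
Editor's note: the "Print configuration data" output above makes the naming convention of this play visible: each entry in ceph_osd_devices carries an osd_lvm_uuid, and the derived lvm_volumes list names the logical volume "osd-block-<uuid>" inside a volume group named "ceph-<uuid>". The following minimal Python sketch only reproduces that mapping from the data printed for testbed-node-5; it is an illustration of the rule seen in the log, not the playbook source.

    # Sketch (not the playbook source): rebuild the lvm_volumes list shown by
    # "Print configuration data" from the ceph_osd_devices mapping, assuming the
    # naming rule visible in the log ("osd-block-<uuid>" / "ceph-<uuid>").
    ceph_osd_devices = {
        "sdb": {"osd_lvm_uuid": "1b64d1ac-aa1a-58dd-947f-80f4fb53d79b"},
        "sdc": {"osd_lvm_uuid": "a4d93e20-7e7b-5457-96cd-ba2e435e9438"},
    }

    lvm_volumes = [
        {
            "data": f"osd-block-{cfg['osd_lvm_uuid']}",
            "data_vg": f"ceph-{cfg['osd_lvm_uuid']}",
        }
        for cfg in ceph_osd_devices.values()
    ]

    for volume in lvm_volumes:
        print(volume)  # matches the lvm_volumes entries printed above
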
2025-03-23 20:52:17.765724 | orchestrator | testbed-node-3 : ok=42  changed=2  unreachable=0 failed=0 skipped=32  rescued=0 ignored=0 2025-03-23 20:52:17.766748 | orchestrator | testbed-node-4 : ok=42  changed=2  unreachable=0 failed=0 skipped=32  rescued=0 ignored=0 2025-03-23 20:52:17.767686 | orchestrator | testbed-node-5 : ok=42  changed=2  unreachable=0 failed=0 skipped=32  rescued=0 ignored=0 2025-03-23 20:52:17.768590 | orchestrator | 2025-03-23 20:52:17.769165 | orchestrator | 2025-03-23 20:52:17.769826 | orchestrator | 2025-03-23 20:52:17.770475 | orchestrator | TASKS RECAP ******************************************************************** 2025-03-23 20:52:17.771289 | orchestrator | Sunday 23 March 2025 20:52:17 +0000 (0:00:01.483) 0:00:51.737 ********** 2025-03-23 20:52:17.771553 | orchestrator | =============================================================================== 2025-03-23 20:52:17.772900 | orchestrator | Write configuration file ------------------------------------------------ 5.55s 2025-03-23 20:52:17.773435 | orchestrator | Add known links to the list of available block devices ------------------ 1.68s 2025-03-23 20:52:17.773730 | orchestrator | Add known partitions to the list of available block devices ------------- 1.45s 2025-03-23 20:52:17.774394 | orchestrator | Get extra vars for Ceph configuration ----------------------------------- 1.40s 2025-03-23 20:52:17.774932 | orchestrator | Add known links to the list of available block devices ------------------ 1.38s 2025-03-23 20:52:17.775468 | orchestrator | Print configuration data ------------------------------------------------ 0.98s 2025-03-23 20:52:17.775720 | orchestrator | Add known partitions to the list of available block devices ------------- 0.95s 2025-03-23 20:52:17.776118 | orchestrator | Add known links to the list of available block devices ------------------ 0.89s 2025-03-23 20:52:17.776476 | orchestrator | Get initial list of available block devices ----------------------------- 0.87s 2025-03-23 20:52:17.776754 | orchestrator | Add known partitions to the list of available block devices ------------- 0.86s 2025-03-23 20:52:17.777133 | orchestrator | Add known links to the list of available block devices ------------------ 0.84s 2025-03-23 20:52:17.777601 | orchestrator | Set UUIDs for OSD VGs/LVs ----------------------------------------------- 0.83s 2025-03-23 20:52:17.777912 | orchestrator | Add known links to the list of available block devices ------------------ 0.82s 2025-03-23 20:52:17.778166 | orchestrator | Add known partitions to the list of available block devices ------------- 0.82s 2025-03-23 20:52:17.778638 | orchestrator | Add known partitions to the list of available block devices ------------- 0.79s 2025-03-23 20:52:17.778765 | orchestrator | Add known links to the list of available block devices ------------------ 0.78s 2025-03-23 20:52:17.779119 | orchestrator | Add known links to the list of available block devices ------------------ 0.78s 2025-03-23 20:52:17.779416 | orchestrator | Print shared DB/WAL devices --------------------------------------------- 0.77s 2025-03-23 20:52:17.779873 | orchestrator | Add known links to the list of available block devices ------------------ 0.77s 2025-03-23 20:52:17.780187 | orchestrator | Generate lvm_volumes structure (block + db) ----------------------------- 0.70s 2025-03-23 20:52:30.097130 | orchestrator | 2025-03-23 20:52:30 | INFO  | Task b1f83a38-4ac9-4f81-a53c-5f78861df4b2 is running in background. Output coming soon. 
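
Editor's note: the tasks recap above ends the configuration run; the ceph-create-lvm-devices task that follows consumes the generated lvm_volumes and, as the "Create block VGs" and "Create block LVs" tasks below show, creates one volume group and one logical volume per OSD device. The sketch below prints the approximate plain-LVM commands this amounts to. Assumptions: each VG sits on the matching raw device (sdb/sdc, as in the "Create dict of block VGs -> PVs" task below) and each block LV fills its VG; the actual play drives LVM through Ansible modules, so treat this purely as an illustration.

    # Illustrative sketch only: approximate LVM commands behind the
    # "Create block VGs" / "Create block LVs" tasks in the output below.
    # The real playbook uses Ansible modules; device-to-VG pairing and the
    # "LV fills the whole VG" choice are assumptions for this sketch.
    block_devices = {
        "/dev/sdb": "a8bbe70e-ef9b-5e78-a477-05274116adef",
        "/dev/sdc": "7ba69cd0-48cc-55e3-9649-a43d5cfb8428",
    }

    commands = []
    for device, uuid in block_devices.items():
        vg = f"ceph-{uuid}"
        lv = f"osd-block-{uuid}"
        commands.append(["vgcreate", vg, device])                      # block VG on the raw device
        commands.append(["lvcreate", "-l", "100%FREE", "-n", lv, vg])  # block LV filling the VG

    for cmd in commands:
        print(" ".join(cmd))  # print only; running these requires root and the real devices
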
2025-03-23 21:52:32.421426 | orchestrator | 2025-03-23 21:52:32 | INFO  | Task dc37fb0c-9cfa-473f-b099-3bb7d74a47e1 (ceph-create-lvm-devices) was prepared for execution. 2025-03-23 21:52:35.542138 | orchestrator | 2025-03-23 21:52:32 | INFO  | It takes a moment until task dc37fb0c-9cfa-473f-b099-3bb7d74a47e1 (ceph-create-lvm-devices) has been started and output is visible here. 2025-03-23 21:52:35.542252 | orchestrator | [WARNING]: Collection osism.commons does not support Ansible version 2.15.12 2025-03-23 21:52:36.050716 | orchestrator | 2025-03-23 21:52:36.050972 | orchestrator | PLAY [Ceph create LVM devices] ************************************************* 2025-03-23 21:52:36.051722 | orchestrator | 2025-03-23 21:52:36.052445 | orchestrator | TASK [Get extra vars for Ceph configuration] *********************************** 2025-03-23 21:52:36.053074 | orchestrator | Sunday 23 March 2025 21:52:36 +0000 (0:00:00.439) 0:00:00.439 ********** 2025-03-23 21:52:36.321748 | orchestrator | ok: [testbed-node-3 -> testbed-manager(192.168.16.5)] 2025-03-23 21:52:36.323254 | orchestrator | 2025-03-23 21:52:36.323736 | orchestrator | TASK [Get initial list of available block devices] ***************************** 2025-03-23 21:52:36.325657 | orchestrator | Sunday 23 March 2025 21:52:36 +0000 (0:00:00.270) 0:00:00.710 ********** 2025-03-23 21:52:36.580436 | orchestrator | ok: [testbed-node-3] 2025-03-23 21:52:36.581879 | orchestrator | 2025-03-23 21:52:36.585201 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-03-23 21:52:36.586674 | orchestrator | Sunday 23 March 2025 21:52:36 +0000 (0:00:00.259) 0:00:00.969 ********** 2025-03-23 21:52:37.353722 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-3 => (item=loop0) 2025-03-23 21:52:37.354197 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-3 => (item=loop1) 2025-03-23 21:52:37.355272 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-3 => (item=loop2) 2025-03-23 21:52:37.356591 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-3 => (item=loop3) 2025-03-23 21:52:37.358230 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-3 => (item=loop4) 2025-03-23 21:52:37.358609 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-3 => (item=loop5) 2025-03-23 21:52:37.359777 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-3 => (item=loop6) 2025-03-23 21:52:37.360132 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-3 => (item=loop7) 2025-03-23 21:52:37.361127 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-3 => (item=sda) 2025-03-23 21:52:37.362165 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-3 => (item=sdb) 2025-03-23 21:52:37.362596 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-3 => (item=sdc) 2025-03-23 21:52:37.363040 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-3 => (item=sdd) 2025-03-23 21:52:37.363662 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-3 => (item=sr0) 2025-03-23 21:52:37.364254 | orchestrator | 2025-03-23 21:52:37.364715 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-03-23 21:52:37.365034 | orchestrator | Sunday 23 March 2025 
21:52:37 +0000 (0:00:00.773) 0:00:01.743 ********** 2025-03-23 21:52:37.564703 | orchestrator | skipping: [testbed-node-3] 2025-03-23 21:52:37.566526 | orchestrator | 2025-03-23 21:52:37.566637 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-03-23 21:52:37.567335 | orchestrator | Sunday 23 March 2025 21:52:37 +0000 (0:00:00.210) 0:00:01.954 ********** 2025-03-23 21:52:37.781927 | orchestrator | skipping: [testbed-node-3] 2025-03-23 21:52:37.782753 | orchestrator | 2025-03-23 21:52:37.783443 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-03-23 21:52:37.784162 | orchestrator | Sunday 23 March 2025 21:52:37 +0000 (0:00:00.215) 0:00:02.169 ********** 2025-03-23 21:52:38.007905 | orchestrator | skipping: [testbed-node-3] 2025-03-23 21:52:38.008473 | orchestrator | 2025-03-23 21:52:38.009251 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-03-23 21:52:38.010068 | orchestrator | Sunday 23 March 2025 21:52:38 +0000 (0:00:00.228) 0:00:02.398 ********** 2025-03-23 21:52:38.230324 | orchestrator | skipping: [testbed-node-3] 2025-03-23 21:52:38.230450 | orchestrator | 2025-03-23 21:52:38.231387 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-03-23 21:52:38.231837 | orchestrator | Sunday 23 March 2025 21:52:38 +0000 (0:00:00.220) 0:00:02.618 ********** 2025-03-23 21:52:38.446395 | orchestrator | skipping: [testbed-node-3] 2025-03-23 21:52:38.448300 | orchestrator | 2025-03-23 21:52:38.449862 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-03-23 21:52:38.450626 | orchestrator | Sunday 23 March 2025 21:52:38 +0000 (0:00:00.217) 0:00:02.836 ********** 2025-03-23 21:52:38.657363 | orchestrator | skipping: [testbed-node-3] 2025-03-23 21:52:38.657838 | orchestrator | 2025-03-23 21:52:38.658615 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-03-23 21:52:38.659840 | orchestrator | Sunday 23 March 2025 21:52:38 +0000 (0:00:00.209) 0:00:03.046 ********** 2025-03-23 21:52:38.881818 | orchestrator | skipping: [testbed-node-3] 2025-03-23 21:52:38.883603 | orchestrator | 2025-03-23 21:52:38.884464 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-03-23 21:52:38.885356 | orchestrator | Sunday 23 March 2025 21:52:38 +0000 (0:00:00.225) 0:00:03.271 ********** 2025-03-23 21:52:39.105283 | orchestrator | skipping: [testbed-node-3] 2025-03-23 21:52:39.105925 | orchestrator | 2025-03-23 21:52:39.106971 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-03-23 21:52:39.107907 | orchestrator | Sunday 23 March 2025 21:52:39 +0000 (0:00:00.224) 0:00:03.495 ********** 2025-03-23 21:52:39.754961 | orchestrator | ok: [testbed-node-3] => (item=scsi-0QEMU_QEMU_HARDDISK_dcc46cc2-9048-4c81-bc2f-465e491970df) 2025-03-23 21:52:39.755677 | orchestrator | ok: [testbed-node-3] => (item=scsi-SQEMU_QEMU_HARDDISK_dcc46cc2-9048-4c81-bc2f-465e491970df) 2025-03-23 21:52:39.757410 | orchestrator | 2025-03-23 21:52:39.760249 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-03-23 21:52:39.760882 | orchestrator | Sunday 23 March 2025 21:52:39 +0000 (0:00:00.648) 0:00:04.144 ********** 2025-03-23 21:52:40.613068 | orchestrator | ok: [testbed-node-3] => 
(item=scsi-0QEMU_QEMU_HARDDISK_ed5da3c7-1374-4bfc-b341-605cae6b6ed5) 2025-03-23 21:52:40.613351 | orchestrator | ok: [testbed-node-3] => (item=scsi-SQEMU_QEMU_HARDDISK_ed5da3c7-1374-4bfc-b341-605cae6b6ed5) 2025-03-23 21:52:40.613386 | orchestrator | 2025-03-23 21:52:40.613405 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-03-23 21:52:40.613427 | orchestrator | Sunday 23 March 2025 21:52:40 +0000 (0:00:00.858) 0:00:05.003 ********** 2025-03-23 21:52:41.093995 | orchestrator | ok: [testbed-node-3] => (item=scsi-0QEMU_QEMU_HARDDISK_ce372efe-ce53-40a7-9ab4-8764278391af) 2025-03-23 21:52:41.096194 | orchestrator | ok: [testbed-node-3] => (item=scsi-SQEMU_QEMU_HARDDISK_ce372efe-ce53-40a7-9ab4-8764278391af) 2025-03-23 21:52:41.096308 | orchestrator | 2025-03-23 21:52:41.096380 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-03-23 21:52:41.096710 | orchestrator | Sunday 23 March 2025 21:52:41 +0000 (0:00:00.475) 0:00:05.478 ********** 2025-03-23 21:52:41.536757 | orchestrator | ok: [testbed-node-3] => (item=scsi-0QEMU_QEMU_HARDDISK_456a620e-4d8a-4da0-8fad-68a9dae98a07) 2025-03-23 21:52:41.537450 | orchestrator | ok: [testbed-node-3] => (item=scsi-SQEMU_QEMU_HARDDISK_456a620e-4d8a-4da0-8fad-68a9dae98a07) 2025-03-23 21:52:41.538540 | orchestrator | 2025-03-23 21:52:41.540125 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-03-23 21:52:41.542758 | orchestrator | Sunday 23 March 2025 21:52:41 +0000 (0:00:00.447) 0:00:05.926 ********** 2025-03-23 21:52:41.901713 | orchestrator | ok: [testbed-node-3] => (item=ata-QEMU_DVD-ROM_QM00001) 2025-03-23 21:52:41.904192 | orchestrator | 2025-03-23 21:52:41.904270 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-03-23 21:52:41.904293 | orchestrator | Sunday 23 March 2025 21:52:41 +0000 (0:00:00.363) 0:00:06.289 ********** 2025-03-23 21:52:42.429309 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-3 => (item=loop0) 2025-03-23 21:52:42.429846 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-3 => (item=loop1) 2025-03-23 21:52:42.429886 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-3 => (item=loop2) 2025-03-23 21:52:42.430191 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-3 => (item=loop3) 2025-03-23 21:52:42.430725 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-3 => (item=loop4) 2025-03-23 21:52:42.430830 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-3 => (item=loop5) 2025-03-23 21:52:42.431664 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-3 => (item=loop6) 2025-03-23 21:52:42.432046 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-3 => (item=loop7) 2025-03-23 21:52:42.432657 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-3 => (item=sda) 2025-03-23 21:52:42.433130 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-3 => (item=sdb) 2025-03-23 21:52:42.433411 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-3 => (item=sdc) 2025-03-23 21:52:42.436462 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-3 => 
(item=sdd) 2025-03-23 21:52:42.436698 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-3 => (item=sr0) 2025-03-23 21:52:42.437588 | orchestrator | 2025-03-23 21:52:42.437833 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-03-23 21:52:42.438583 | orchestrator | Sunday 23 March 2025 21:52:42 +0000 (0:00:00.530) 0:00:06.820 ********** 2025-03-23 21:52:42.642773 | orchestrator | skipping: [testbed-node-3] 2025-03-23 21:52:42.644604 | orchestrator | 2025-03-23 21:52:42.645649 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-03-23 21:52:42.646712 | orchestrator | Sunday 23 March 2025 21:52:42 +0000 (0:00:00.212) 0:00:07.033 ********** 2025-03-23 21:52:42.865551 | orchestrator | skipping: [testbed-node-3] 2025-03-23 21:52:42.865802 | orchestrator | 2025-03-23 21:52:42.866673 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-03-23 21:52:42.867390 | orchestrator | Sunday 23 March 2025 21:52:42 +0000 (0:00:00.221) 0:00:07.254 ********** 2025-03-23 21:52:43.086738 | orchestrator | skipping: [testbed-node-3] 2025-03-23 21:52:43.087884 | orchestrator | 2025-03-23 21:52:43.090344 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-03-23 21:52:43.098656 | orchestrator | Sunday 23 March 2025 21:52:43 +0000 (0:00:00.222) 0:00:07.476 ********** 2025-03-23 21:52:43.305971 | orchestrator | skipping: [testbed-node-3] 2025-03-23 21:52:43.306701 | orchestrator | 2025-03-23 21:52:43.308538 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-03-23 21:52:43.310740 | orchestrator | Sunday 23 March 2025 21:52:43 +0000 (0:00:00.218) 0:00:07.695 ********** 2025-03-23 21:52:43.951423 | orchestrator | skipping: [testbed-node-3] 2025-03-23 21:52:43.952038 | orchestrator | 2025-03-23 21:52:43.952740 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-03-23 21:52:43.954869 | orchestrator | Sunday 23 March 2025 21:52:43 +0000 (0:00:00.643) 0:00:08.340 ********** 2025-03-23 21:52:44.163522 | orchestrator | skipping: [testbed-node-3] 2025-03-23 21:52:44.164354 | orchestrator | 2025-03-23 21:52:44.164397 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-03-23 21:52:44.165505 | orchestrator | Sunday 23 March 2025 21:52:44 +0000 (0:00:00.212) 0:00:08.552 ********** 2025-03-23 21:52:44.396051 | orchestrator | skipping: [testbed-node-3] 2025-03-23 21:52:44.397294 | orchestrator | 2025-03-23 21:52:44.400039 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-03-23 21:52:44.611999 | orchestrator | Sunday 23 March 2025 21:52:44 +0000 (0:00:00.233) 0:00:08.786 ********** 2025-03-23 21:52:44.612085 | orchestrator | skipping: [testbed-node-3] 2025-03-23 21:52:44.612951 | orchestrator | 2025-03-23 21:52:44.615777 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-03-23 21:52:45.320860 | orchestrator | Sunday 23 March 2025 21:52:44 +0000 (0:00:00.215) 0:00:09.002 ********** 2025-03-23 21:52:45.320978 | orchestrator | ok: [testbed-node-3] => (item=sda1) 2025-03-23 21:52:45.322962 | orchestrator | ok: [testbed-node-3] => (item=sda14) 2025-03-23 21:52:45.323150 | orchestrator | ok: [testbed-node-3] => (item=sda15) 2025-03-23 21:52:45.323173 | 
orchestrator | ok: [testbed-node-3] => (item=sda16) 2025-03-23 21:52:45.323189 | orchestrator | 2025-03-23 21:52:45.323210 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-03-23 21:52:45.323609 | orchestrator | Sunday 23 March 2025 21:52:45 +0000 (0:00:00.706) 0:00:09.708 ********** 2025-03-23 21:52:45.564250 | orchestrator | skipping: [testbed-node-3] 2025-03-23 21:52:45.567088 | orchestrator | 2025-03-23 21:52:45.568204 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-03-23 21:52:45.569364 | orchestrator | Sunday 23 March 2025 21:52:45 +0000 (0:00:00.246) 0:00:09.954 ********** 2025-03-23 21:52:45.763073 | orchestrator | skipping: [testbed-node-3] 2025-03-23 21:52:45.763905 | orchestrator | 2025-03-23 21:52:45.765110 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-03-23 21:52:45.766691 | orchestrator | Sunday 23 March 2025 21:52:45 +0000 (0:00:00.197) 0:00:10.152 ********** 2025-03-23 21:52:46.014420 | orchestrator | skipping: [testbed-node-3] 2025-03-23 21:52:46.015756 | orchestrator | 2025-03-23 21:52:46.016915 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-03-23 21:52:46.017940 | orchestrator | Sunday 23 March 2025 21:52:46 +0000 (0:00:00.252) 0:00:10.404 ********** 2025-03-23 21:52:46.212450 | orchestrator | skipping: [testbed-node-3] 2025-03-23 21:52:46.212885 | orchestrator | 2025-03-23 21:52:46.215736 | orchestrator | TASK [Check whether ceph_db_wal_devices is used exclusively] ******************* 2025-03-23 21:52:46.216741 | orchestrator | Sunday 23 March 2025 21:52:46 +0000 (0:00:00.198) 0:00:10.603 ********** 2025-03-23 21:52:46.359621 | orchestrator | skipping: [testbed-node-3] 2025-03-23 21:52:46.361415 | orchestrator | 2025-03-23 21:52:46.361789 | orchestrator | TASK [Create dict of block VGs -> PVs from ceph_osd_devices] ******************* 2025-03-23 21:52:46.365443 | orchestrator | Sunday 23 March 2025 21:52:46 +0000 (0:00:00.146) 0:00:10.750 ********** 2025-03-23 21:52:46.577720 | orchestrator | ok: [testbed-node-3] => (item={'key': 'sdb', 'value': {'osd_lvm_uuid': 'a8bbe70e-ef9b-5e78-a477-05274116adef'}}) 2025-03-23 21:52:46.578189 | orchestrator | ok: [testbed-node-3] => (item={'key': 'sdc', 'value': {'osd_lvm_uuid': '7ba69cd0-48cc-55e3-9649-a43d5cfb8428'}}) 2025-03-23 21:52:46.579838 | orchestrator | 2025-03-23 21:52:46.582159 | orchestrator | TASK [Create block VGs] ******************************************************** 2025-03-23 21:52:49.187859 | orchestrator | Sunday 23 March 2025 21:52:46 +0000 (0:00:00.217) 0:00:10.967 ********** 2025-03-23 21:52:49.187986 | orchestrator | changed: [testbed-node-3] => (item={'data': 'osd-block-a8bbe70e-ef9b-5e78-a477-05274116adef', 'data_vg': 'ceph-a8bbe70e-ef9b-5e78-a477-05274116adef'}) 2025-03-23 21:52:49.189831 | orchestrator | changed: [testbed-node-3] => (item={'data': 'osd-block-7ba69cd0-48cc-55e3-9649-a43d5cfb8428', 'data_vg': 'ceph-7ba69cd0-48cc-55e3-9649-a43d5cfb8428'}) 2025-03-23 21:52:49.189890 | orchestrator | 2025-03-23 21:52:49.192486 | orchestrator | TASK [Print 'Create block VGs'] ************************************************ 2025-03-23 21:52:49.193669 | orchestrator | Sunday 23 March 2025 21:52:49 +0000 (0:00:02.606) 0:00:13.574 ********** 2025-03-23 21:52:49.359655 | orchestrator | skipping: [testbed-node-3] => (item={'data': 'osd-block-a8bbe70e-ef9b-5e78-a477-05274116adef', 'data_vg': 
'ceph-a8bbe70e-ef9b-5e78-a477-05274116adef'})  2025-03-23 21:52:49.363172 | orchestrator | skipping: [testbed-node-3] => (item={'data': 'osd-block-7ba69cd0-48cc-55e3-9649-a43d5cfb8428', 'data_vg': 'ceph-7ba69cd0-48cc-55e3-9649-a43d5cfb8428'})  2025-03-23 21:52:49.364377 | orchestrator | skipping: [testbed-node-3] 2025-03-23 21:52:49.365315 | orchestrator | 2025-03-23 21:52:49.366480 | orchestrator | TASK [Create block LVs] ******************************************************** 2025-03-23 21:52:49.368119 | orchestrator | Sunday 23 March 2025 21:52:49 +0000 (0:00:00.175) 0:00:13.749 ********** 2025-03-23 21:52:50.984909 | orchestrator | changed: [testbed-node-3] => (item={'data': 'osd-block-a8bbe70e-ef9b-5e78-a477-05274116adef', 'data_vg': 'ceph-a8bbe70e-ef9b-5e78-a477-05274116adef'}) 2025-03-23 21:52:50.985196 | orchestrator | changed: [testbed-node-3] => (item={'data': 'osd-block-7ba69cd0-48cc-55e3-9649-a43d5cfb8428', 'data_vg': 'ceph-7ba69cd0-48cc-55e3-9649-a43d5cfb8428'}) 2025-03-23 21:52:50.985231 | orchestrator | 2025-03-23 21:52:50.986123 | orchestrator | TASK [Print 'Create block LVs'] ************************************************ 2025-03-23 21:52:50.986290 | orchestrator | Sunday 23 March 2025 21:52:50 +0000 (0:00:01.623) 0:00:15.373 ********** 2025-03-23 21:52:51.144033 | orchestrator | skipping: [testbed-node-3] => (item={'data': 'osd-block-a8bbe70e-ef9b-5e78-a477-05274116adef', 'data_vg': 'ceph-a8bbe70e-ef9b-5e78-a477-05274116adef'})  2025-03-23 21:52:51.144291 | orchestrator | skipping: [testbed-node-3] => (item={'data': 'osd-block-7ba69cd0-48cc-55e3-9649-a43d5cfb8428', 'data_vg': 'ceph-7ba69cd0-48cc-55e3-9649-a43d5cfb8428'})  2025-03-23 21:52:51.145449 | orchestrator | skipping: [testbed-node-3] 2025-03-23 21:52:51.146718 | orchestrator | 2025-03-23 21:52:51.146791 | orchestrator | TASK [Create DB VGs] *********************************************************** 2025-03-23 21:52:51.148845 | orchestrator | Sunday 23 March 2025 21:52:51 +0000 (0:00:00.160) 0:00:15.534 ********** 2025-03-23 21:52:51.292795 | orchestrator | skipping: [testbed-node-3] 2025-03-23 21:52:51.295258 | orchestrator | 2025-03-23 21:52:51.295418 | orchestrator | TASK [Print 'Create DB VGs'] *************************************************** 2025-03-23 21:52:51.295478 | orchestrator | Sunday 23 March 2025 21:52:51 +0000 (0:00:00.146) 0:00:15.680 ********** 2025-03-23 21:52:51.465938 | orchestrator | skipping: [testbed-node-3] => (item={'data': 'osd-block-a8bbe70e-ef9b-5e78-a477-05274116adef', 'data_vg': 'ceph-a8bbe70e-ef9b-5e78-a477-05274116adef'})  2025-03-23 21:52:51.469254 | orchestrator | skipping: [testbed-node-3] => (item={'data': 'osd-block-7ba69cd0-48cc-55e3-9649-a43d5cfb8428', 'data_vg': 'ceph-7ba69cd0-48cc-55e3-9649-a43d5cfb8428'})  2025-03-23 21:52:51.470715 | orchestrator | skipping: [testbed-node-3] 2025-03-23 21:52:51.471619 | orchestrator | 2025-03-23 21:52:51.472786 | orchestrator | TASK [Create WAL VGs] ********************************************************** 2025-03-23 21:52:51.473518 | orchestrator | Sunday 23 March 2025 21:52:51 +0000 (0:00:00.175) 0:00:15.856 ********** 2025-03-23 21:52:51.624890 | orchestrator | skipping: [testbed-node-3] 2025-03-23 21:52:51.625186 | orchestrator | 2025-03-23 21:52:51.626193 | orchestrator | TASK [Print 'Create WAL VGs'] ************************************************** 2025-03-23 21:52:51.627086 | orchestrator | Sunday 23 March 2025 21:52:51 +0000 (0:00:00.158) 0:00:16.014 ********** 2025-03-23 21:52:51.810605 | orchestrator | skipping: 
[testbed-node-3] => (item={'data': 'osd-block-a8bbe70e-ef9b-5e78-a477-05274116adef', 'data_vg': 'ceph-a8bbe70e-ef9b-5e78-a477-05274116adef'})  2025-03-23 21:52:51.811728 | orchestrator | skipping: [testbed-node-3] => (item={'data': 'osd-block-7ba69cd0-48cc-55e3-9649-a43d5cfb8428', 'data_vg': 'ceph-7ba69cd0-48cc-55e3-9649-a43d5cfb8428'})  2025-03-23 21:52:51.812802 | orchestrator | skipping: [testbed-node-3] 2025-03-23 21:52:51.814067 | orchestrator | 2025-03-23 21:52:51.815187 | orchestrator | TASK [Create DB+WAL VGs] ******************************************************* 2025-03-23 21:52:51.816524 | orchestrator | Sunday 23 March 2025 21:52:51 +0000 (0:00:00.183) 0:00:16.197 ********** 2025-03-23 21:52:52.139170 | orchestrator | skipping: [testbed-node-3] 2025-03-23 21:52:52.139930 | orchestrator | 2025-03-23 21:52:52.140796 | orchestrator | TASK [Print 'Create DB+WAL VGs'] *********************************************** 2025-03-23 21:52:52.141679 | orchestrator | Sunday 23 March 2025 21:52:52 +0000 (0:00:00.332) 0:00:16.530 ********** 2025-03-23 21:52:52.340375 | orchestrator | skipping: [testbed-node-3] => (item={'data': 'osd-block-a8bbe70e-ef9b-5e78-a477-05274116adef', 'data_vg': 'ceph-a8bbe70e-ef9b-5e78-a477-05274116adef'})  2025-03-23 21:52:52.340636 | orchestrator | skipping: [testbed-node-3] => (item={'data': 'osd-block-7ba69cd0-48cc-55e3-9649-a43d5cfb8428', 'data_vg': 'ceph-7ba69cd0-48cc-55e3-9649-a43d5cfb8428'})  2025-03-23 21:52:52.343023 | orchestrator | skipping: [testbed-node-3] 2025-03-23 21:52:52.499506 | orchestrator | 2025-03-23 21:52:52.499610 | orchestrator | TASK [Prepare variables for OSD count check] *********************************** 2025-03-23 21:52:52.499668 | orchestrator | Sunday 23 March 2025 21:52:52 +0000 (0:00:00.198) 0:00:16.729 ********** 2025-03-23 21:52:52.499695 | orchestrator | ok: [testbed-node-3] 2025-03-23 21:52:52.499764 | orchestrator | 2025-03-23 21:52:52.500649 | orchestrator | TASK [Count OSDs put on ceph_db_devices defined in lvm_volumes] **************** 2025-03-23 21:52:52.501472 | orchestrator | Sunday 23 March 2025 21:52:52 +0000 (0:00:00.160) 0:00:16.889 ********** 2025-03-23 21:52:52.682864 | orchestrator | skipping: [testbed-node-3] => (item={'data': 'osd-block-a8bbe70e-ef9b-5e78-a477-05274116adef', 'data_vg': 'ceph-a8bbe70e-ef9b-5e78-a477-05274116adef'})  2025-03-23 21:52:52.684050 | orchestrator | skipping: [testbed-node-3] => (item={'data': 'osd-block-7ba69cd0-48cc-55e3-9649-a43d5cfb8428', 'data_vg': 'ceph-7ba69cd0-48cc-55e3-9649-a43d5cfb8428'})  2025-03-23 21:52:52.684846 | orchestrator | skipping: [testbed-node-3] 2025-03-23 21:52:52.688021 | orchestrator | 2025-03-23 21:52:52.690279 | orchestrator | TASK [Count OSDs put on ceph_wal_devices defined in lvm_volumes] *************** 2025-03-23 21:52:52.694758 | orchestrator | Sunday 23 March 2025 21:52:52 +0000 (0:00:00.183) 0:00:17.073 ********** 2025-03-23 21:52:52.853142 | orchestrator | skipping: [testbed-node-3] => (item={'data': 'osd-block-a8bbe70e-ef9b-5e78-a477-05274116adef', 'data_vg': 'ceph-a8bbe70e-ef9b-5e78-a477-05274116adef'})  2025-03-23 21:52:52.853770 | orchestrator | skipping: [testbed-node-3] => (item={'data': 'osd-block-7ba69cd0-48cc-55e3-9649-a43d5cfb8428', 'data_vg': 'ceph-7ba69cd0-48cc-55e3-9649-a43d5cfb8428'})  2025-03-23 21:52:52.853838 | orchestrator | skipping: [testbed-node-3] 2025-03-23 21:52:52.853890 | orchestrator | 2025-03-23 21:52:52.854403 | orchestrator | TASK [Count OSDs put on ceph_db_wal_devices defined in lvm_volumes] ************ 2025-03-23 
21:52:52.855391 | orchestrator | Sunday 23 March 2025 21:52:52 +0000 (0:00:00.169) 0:00:17.242 ********** 2025-03-23 21:52:53.017490 | orchestrator | skipping: [testbed-node-3] => (item={'data': 'osd-block-a8bbe70e-ef9b-5e78-a477-05274116adef', 'data_vg': 'ceph-a8bbe70e-ef9b-5e78-a477-05274116adef'})  2025-03-23 21:52:53.017750 | orchestrator | skipping: [testbed-node-3] => (item={'data': 'osd-block-7ba69cd0-48cc-55e3-9649-a43d5cfb8428', 'data_vg': 'ceph-7ba69cd0-48cc-55e3-9649-a43d5cfb8428'})  2025-03-23 21:52:53.018224 | orchestrator | skipping: [testbed-node-3] 2025-03-23 21:52:53.018830 | orchestrator | 2025-03-23 21:52:53.020988 | orchestrator | TASK [Fail if number of OSDs exceeds num_osds for a DB VG] ********************* 2025-03-23 21:52:53.023211 | orchestrator | Sunday 23 March 2025 21:52:53 +0000 (0:00:00.165) 0:00:17.408 ********** 2025-03-23 21:52:53.159224 | orchestrator | skipping: [testbed-node-3] 2025-03-23 21:52:53.160521 | orchestrator | 2025-03-23 21:52:53.161433 | orchestrator | TASK [Fail if number of OSDs exceeds num_osds for a WAL VG] ******************** 2025-03-23 21:52:53.162134 | orchestrator | Sunday 23 March 2025 21:52:53 +0000 (0:00:00.141) 0:00:17.550 ********** 2025-03-23 21:52:53.301211 | orchestrator | skipping: [testbed-node-3] 2025-03-23 21:52:53.301381 | orchestrator | 2025-03-23 21:52:53.304306 | orchestrator | TASK [Fail if number of OSDs exceeds num_osds for a DB+WAL VG] ***************** 2025-03-23 21:52:53.453683 | orchestrator | Sunday 23 March 2025 21:52:53 +0000 (0:00:00.139) 0:00:17.689 ********** 2025-03-23 21:52:53.453774 | orchestrator | skipping: [testbed-node-3] 2025-03-23 21:52:53.453861 | orchestrator | 2025-03-23 21:52:53.455172 | orchestrator | TASK [Print number of OSDs wanted per DB VG] *********************************** 2025-03-23 21:52:53.605543 | orchestrator | Sunday 23 March 2025 21:52:53 +0000 (0:00:00.153) 0:00:17.842 ********** 2025-03-23 21:52:53.605659 | orchestrator | ok: [testbed-node-3] => { 2025-03-23 21:52:53.607176 | orchestrator |  "_num_osds_wanted_per_db_vg": {} 2025-03-23 21:52:53.609444 | orchestrator | } 2025-03-23 21:52:53.767938 | orchestrator | 2025-03-23 21:52:53.767984 | orchestrator | TASK [Print number of OSDs wanted per WAL VG] ********************************** 2025-03-23 21:52:53.768018 | orchestrator | Sunday 23 March 2025 21:52:53 +0000 (0:00:00.152) 0:00:17.995 ********** 2025-03-23 21:52:53.768040 | orchestrator | ok: [testbed-node-3] => { 2025-03-23 21:52:53.769167 | orchestrator |  "_num_osds_wanted_per_wal_vg": {} 2025-03-23 21:52:53.769702 | orchestrator | } 2025-03-23 21:52:53.770726 | orchestrator | 2025-03-23 21:52:53.771212 | orchestrator | TASK [Print number of OSDs wanted per DB+WAL VG] ******************************* 2025-03-23 21:52:53.772184 | orchestrator | Sunday 23 March 2025 21:52:53 +0000 (0:00:00.162) 0:00:18.157 ********** 2025-03-23 21:52:53.926862 | orchestrator | ok: [testbed-node-3] => { 2025-03-23 21:52:53.928085 | orchestrator |  "_num_osds_wanted_per_db_wal_vg": {} 2025-03-23 21:52:53.930726 | orchestrator | } 2025-03-23 21:52:53.931858 | orchestrator | 2025-03-23 21:52:53.931887 | orchestrator | TASK [Gather DB VGs with total and available size in bytes] ******************** 2025-03-23 21:52:53.933119 | orchestrator | Sunday 23 March 2025 21:52:53 +0000 (0:00:00.158) 0:00:18.316 ********** 2025-03-23 21:52:54.884911 | orchestrator | ok: [testbed-node-3] 2025-03-23 21:52:54.885143 | orchestrator | 2025-03-23 21:52:54.885794 | orchestrator | TASK [Gather WAL VGs with 
total and available size in bytes] ******************* 2025-03-23 21:52:54.886282 | orchestrator | Sunday 23 March 2025 21:52:54 +0000 (0:00:00.959) 0:00:19.275 ********** 2025-03-23 21:52:55.462399 | orchestrator | ok: [testbed-node-3] 2025-03-23 21:52:55.464643 | orchestrator | 2025-03-23 21:52:55.466112 | orchestrator | TASK [Gather DB+WAL VGs with total and available size in bytes] **************** 2025-03-23 21:52:55.469136 | orchestrator | Sunday 23 March 2025 21:52:55 +0000 (0:00:00.575) 0:00:19.851 ********** 2025-03-23 21:52:56.048819 | orchestrator | ok: [testbed-node-3] 2025-03-23 21:52:56.050322 | orchestrator | 2025-03-23 21:52:56.051292 | orchestrator | TASK [Combine JSON from _db/wal/db_wal_vgs_cmd_output] ************************* 2025-03-23 21:52:56.054172 | orchestrator | Sunday 23 March 2025 21:52:56 +0000 (0:00:00.586) 0:00:20.438 ********** 2025-03-23 21:52:56.219855 | orchestrator | ok: [testbed-node-3] 2025-03-23 21:52:56.220267 | orchestrator | 2025-03-23 21:52:56.220682 | orchestrator | TASK [Calculate VG sizes (without buffer)] ************************************* 2025-03-23 21:52:56.221485 | orchestrator | Sunday 23 March 2025 21:52:56 +0000 (0:00:00.171) 0:00:20.609 ********** 2025-03-23 21:52:56.354486 | orchestrator | skipping: [testbed-node-3] 2025-03-23 21:52:56.356418 | orchestrator | 2025-03-23 21:52:56.357390 | orchestrator | TASK [Calculate VG sizes (with buffer)] **************************************** 2025-03-23 21:52:56.358294 | orchestrator | Sunday 23 March 2025 21:52:56 +0000 (0:00:00.132) 0:00:20.742 ********** 2025-03-23 21:52:56.475409 | orchestrator | skipping: [testbed-node-3] 2025-03-23 21:52:56.476392 | orchestrator | 2025-03-23 21:52:56.477642 | orchestrator | TASK [Print LVM VGs report data] *********************************************** 2025-03-23 21:52:56.478470 | orchestrator | Sunday 23 March 2025 21:52:56 +0000 (0:00:00.120) 0:00:20.863 ********** 2025-03-23 21:52:56.639864 | orchestrator | ok: [testbed-node-3] => { 2025-03-23 21:52:56.640544 | orchestrator |  "vgs_report": { 2025-03-23 21:52:56.640650 | orchestrator |  "vg": [] 2025-03-23 21:52:56.641746 | orchestrator |  } 2025-03-23 21:52:56.642193 | orchestrator | } 2025-03-23 21:52:56.642655 | orchestrator | 2025-03-23 21:52:56.643179 | orchestrator | TASK [Print LVM VG sizes] ****************************************************** 2025-03-23 21:52:56.789537 | orchestrator | Sunday 23 March 2025 21:52:56 +0000 (0:00:00.167) 0:00:21.031 ********** 2025-03-23 21:52:56.789660 | orchestrator | skipping: [testbed-node-3] 2025-03-23 21:52:56.789742 | orchestrator | 2025-03-23 21:52:56.789765 | orchestrator | TASK [Calculate size needed for LVs on ceph_db_devices] ************************ 2025-03-23 21:52:56.789992 | orchestrator | Sunday 23 March 2025 21:52:56 +0000 (0:00:00.145) 0:00:21.177 ********** 2025-03-23 21:52:56.921941 | orchestrator | skipping: [testbed-node-3] 2025-03-23 21:52:56.922376 | orchestrator | 2025-03-23 21:52:56.922743 | orchestrator | TASK [Print size needed for LVs on ceph_db_devices] **************************** 2025-03-23 21:52:56.923198 | orchestrator | Sunday 23 March 2025 21:52:56 +0000 (0:00:00.136) 0:00:21.313 ********** 2025-03-23 21:52:57.080005 | orchestrator | skipping: [testbed-node-3] 2025-03-23 21:52:57.080128 | orchestrator | 2025-03-23 21:52:57.080145 | orchestrator | TASK [Fail if size of DB LVs on ceph_db_devices > available] ******************* 2025-03-23 21:52:57.080328 | orchestrator | Sunday 23 March 2025 21:52:57 +0000 (0:00:00.157) 
0:00:21.470 ********** 2025-03-23 21:52:57.454167 | orchestrator | skipping: [testbed-node-3] 2025-03-23 21:52:57.454350 | orchestrator | 2025-03-23 21:52:57.454888 | orchestrator | TASK [Calculate size needed for LVs on ceph_wal_devices] *********************** 2025-03-23 21:52:57.454919 | orchestrator | Sunday 23 March 2025 21:52:57 +0000 (0:00:00.373) 0:00:21.844 ********** 2025-03-23 21:52:57.587381 | orchestrator | skipping: [testbed-node-3] 2025-03-23 21:52:57.588831 | orchestrator | 2025-03-23 21:52:57.588854 | orchestrator | TASK [Print size needed for LVs on ceph_wal_devices] *************************** 2025-03-23 21:52:57.588872 | orchestrator | Sunday 23 March 2025 21:52:57 +0000 (0:00:00.133) 0:00:21.978 ********** 2025-03-23 21:52:57.777301 | orchestrator | skipping: [testbed-node-3] 2025-03-23 21:52:57.777732 | orchestrator | 2025-03-23 21:52:57.778531 | orchestrator | TASK [Fail if size of WAL LVs on ceph_wal_devices > available] ***************** 2025-03-23 21:52:57.778841 | orchestrator | Sunday 23 March 2025 21:52:57 +0000 (0:00:00.190) 0:00:22.168 ********** 2025-03-23 21:52:57.926353 | orchestrator | skipping: [testbed-node-3] 2025-03-23 21:52:57.927525 | orchestrator | 2025-03-23 21:52:57.928641 | orchestrator | TASK [Calculate size needed for WAL LVs on ceph_db_wal_devices] **************** 2025-03-23 21:52:57.929117 | orchestrator | Sunday 23 March 2025 21:52:57 +0000 (0:00:00.147) 0:00:22.315 ********** 2025-03-23 21:52:58.124163 | orchestrator | skipping: [testbed-node-3] 2025-03-23 21:52:58.124609 | orchestrator | 2025-03-23 21:52:58.125046 | orchestrator | TASK [Print size needed for WAL LVs on ceph_db_wal_devices] ******************** 2025-03-23 21:52:58.125756 | orchestrator | Sunday 23 March 2025 21:52:58 +0000 (0:00:00.196) 0:00:22.512 ********** 2025-03-23 21:52:58.284523 | orchestrator | skipping: [testbed-node-3] 2025-03-23 21:52:58.285603 | orchestrator | 2025-03-23 21:52:58.286845 | orchestrator | TASK [Calculate size needed for DB LVs on ceph_db_wal_devices] ***************** 2025-03-23 21:52:58.287327 | orchestrator | Sunday 23 March 2025 21:52:58 +0000 (0:00:00.161) 0:00:22.673 ********** 2025-03-23 21:52:58.424413 | orchestrator | skipping: [testbed-node-3] 2025-03-23 21:52:58.424552 | orchestrator | 2025-03-23 21:52:58.427402 | orchestrator | TASK [Print size needed for DB LVs on ceph_db_wal_devices] ********************* 2025-03-23 21:52:58.572818 | orchestrator | Sunday 23 March 2025 21:52:58 +0000 (0:00:00.139) 0:00:22.812 ********** 2025-03-23 21:52:58.572888 | orchestrator | skipping: [testbed-node-3] 2025-03-23 21:52:58.573030 | orchestrator | 2025-03-23 21:52:58.573057 | orchestrator | TASK [Fail if size of DB+WAL LVs on ceph_db_wal_devices > available] *********** 2025-03-23 21:52:58.574512 | orchestrator | Sunday 23 March 2025 21:52:58 +0000 (0:00:00.149) 0:00:22.962 ********** 2025-03-23 21:52:58.725333 | orchestrator | skipping: [testbed-node-3] 2025-03-23 21:52:58.725770 | orchestrator | 2025-03-23 21:52:58.725808 | orchestrator | TASK [Fail if DB LV size < 30 GiB for ceph_db_devices] ************************* 2025-03-23 21:52:58.727244 | orchestrator | Sunday 23 March 2025 21:52:58 +0000 (0:00:00.151) 0:00:23.114 ********** 2025-03-23 21:52:58.901835 | orchestrator | skipping: [testbed-node-3] 2025-03-23 21:52:58.902151 | orchestrator | 2025-03-23 21:52:58.903038 | orchestrator | TASK [Fail if DB LV size < 30 GiB for ceph_db_wal_devices] ********************* 2025-03-23 21:52:58.903385 | orchestrator | Sunday 23 March 2025 21:52:58 
+0000 (0:00:00.178) 0:00:23.293 ********** 2025-03-23 21:52:59.049215 | orchestrator | skipping: [testbed-node-3] 2025-03-23 21:52:59.049381 | orchestrator | 2025-03-23 21:52:59.049625 | orchestrator | TASK [Create DB LVs for ceph_db_devices] *************************************** 2025-03-23 21:52:59.050095 | orchestrator | Sunday 23 March 2025 21:52:59 +0000 (0:00:00.147) 0:00:23.440 ********** 2025-03-23 21:52:59.244944 | orchestrator | skipping: [testbed-node-3] => (item={'data': 'osd-block-a8bbe70e-ef9b-5e78-a477-05274116adef', 'data_vg': 'ceph-a8bbe70e-ef9b-5e78-a477-05274116adef'})  2025-03-23 21:52:59.245175 | orchestrator | skipping: [testbed-node-3] => (item={'data': 'osd-block-7ba69cd0-48cc-55e3-9649-a43d5cfb8428', 'data_vg': 'ceph-7ba69cd0-48cc-55e3-9649-a43d5cfb8428'})  2025-03-23 21:52:59.248655 | orchestrator | skipping: [testbed-node-3] 2025-03-23 21:52:59.249065 | orchestrator | 2025-03-23 21:52:59.249098 | orchestrator | TASK [Print 'Create DB LVs for ceph_db_devices'] ******************************* 2025-03-23 21:52:59.249198 | orchestrator | Sunday 23 March 2025 21:52:59 +0000 (0:00:00.193) 0:00:23.634 ********** 2025-03-23 21:52:59.697617 | orchestrator | skipping: [testbed-node-3] => (item={'data': 'osd-block-a8bbe70e-ef9b-5e78-a477-05274116adef', 'data_vg': 'ceph-a8bbe70e-ef9b-5e78-a477-05274116adef'})  2025-03-23 21:52:59.697926 | orchestrator | skipping: [testbed-node-3] => (item={'data': 'osd-block-7ba69cd0-48cc-55e3-9649-a43d5cfb8428', 'data_vg': 'ceph-7ba69cd0-48cc-55e3-9649-a43d5cfb8428'})  2025-03-23 21:52:59.698090 | orchestrator | skipping: [testbed-node-3] 2025-03-23 21:52:59.698482 | orchestrator | 2025-03-23 21:52:59.699478 | orchestrator | TASK [Create WAL LVs for ceph_wal_devices] ************************************* 2025-03-23 21:52:59.702232 | orchestrator | Sunday 23 March 2025 21:52:59 +0000 (0:00:00.453) 0:00:24.087 ********** 2025-03-23 21:52:59.882971 | orchestrator | skipping: [testbed-node-3] => (item={'data': 'osd-block-a8bbe70e-ef9b-5e78-a477-05274116adef', 'data_vg': 'ceph-a8bbe70e-ef9b-5e78-a477-05274116adef'})  2025-03-23 21:52:59.883390 | orchestrator | skipping: [testbed-node-3] => (item={'data': 'osd-block-7ba69cd0-48cc-55e3-9649-a43d5cfb8428', 'data_vg': 'ceph-7ba69cd0-48cc-55e3-9649-a43d5cfb8428'})  2025-03-23 21:52:59.884793 | orchestrator | skipping: [testbed-node-3] 2025-03-23 21:52:59.887928 | orchestrator | 2025-03-23 21:52:59.887963 | orchestrator | TASK [Print 'Create WAL LVs for ceph_wal_devices'] ***************************** 2025-03-23 21:53:00.096518 | orchestrator | Sunday 23 March 2025 21:52:59 +0000 (0:00:00.184) 0:00:24.272 ********** 2025-03-23 21:53:00.096730 | orchestrator | skipping: [testbed-node-3] => (item={'data': 'osd-block-a8bbe70e-ef9b-5e78-a477-05274116adef', 'data_vg': 'ceph-a8bbe70e-ef9b-5e78-a477-05274116adef'})  2025-03-23 21:53:00.097199 | orchestrator | skipping: [testbed-node-3] => (item={'data': 'osd-block-7ba69cd0-48cc-55e3-9649-a43d5cfb8428', 'data_vg': 'ceph-7ba69cd0-48cc-55e3-9649-a43d5cfb8428'})  2025-03-23 21:53:00.097969 | orchestrator | skipping: [testbed-node-3] 2025-03-23 21:53:00.101595 | orchestrator | 2025-03-23 21:53:00.102723 | orchestrator | TASK [Create WAL LVs for ceph_db_wal_devices] ********************************** 2025-03-23 21:53:00.102907 | orchestrator | Sunday 23 March 2025 21:53:00 +0000 (0:00:00.213) 0:00:24.485 ********** 2025-03-23 21:53:00.272073 | orchestrator | skipping: [testbed-node-3] => (item={'data': 'osd-block-a8bbe70e-ef9b-5e78-a477-05274116adef', 'data_vg': 
'ceph-a8bbe70e-ef9b-5e78-a477-05274116adef'})  2025-03-23 21:53:00.272241 | orchestrator | skipping: [testbed-node-3] => (item={'data': 'osd-block-7ba69cd0-48cc-55e3-9649-a43d5cfb8428', 'data_vg': 'ceph-7ba69cd0-48cc-55e3-9649-a43d5cfb8428'})  2025-03-23 21:53:00.272268 | orchestrator | skipping: [testbed-node-3] 2025-03-23 21:53:00.272443 | orchestrator | 2025-03-23 21:53:00.272472 | orchestrator | TASK [Print 'Create WAL LVs for ceph_db_wal_devices'] ************************** 2025-03-23 21:53:00.273068 | orchestrator | Sunday 23 March 2025 21:53:00 +0000 (0:00:00.176) 0:00:24.662 ********** 2025-03-23 21:53:00.455849 | orchestrator | skipping: [testbed-node-3] => (item={'data': 'osd-block-a8bbe70e-ef9b-5e78-a477-05274116adef', 'data_vg': 'ceph-a8bbe70e-ef9b-5e78-a477-05274116adef'})  2025-03-23 21:53:00.457004 | orchestrator | skipping: [testbed-node-3] => (item={'data': 'osd-block-7ba69cd0-48cc-55e3-9649-a43d5cfb8428', 'data_vg': 'ceph-7ba69cd0-48cc-55e3-9649-a43d5cfb8428'})  2025-03-23 21:53:00.457190 | orchestrator | skipping: [testbed-node-3] 2025-03-23 21:53:00.457642 | orchestrator | 2025-03-23 21:53:00.460928 | orchestrator | TASK [Create DB LVs for ceph_db_wal_devices] *********************************** 2025-03-23 21:53:00.647944 | orchestrator | Sunday 23 March 2025 21:53:00 +0000 (0:00:00.183) 0:00:24.845 ********** 2025-03-23 21:53:00.648050 | orchestrator | skipping: [testbed-node-3] => (item={'data': 'osd-block-a8bbe70e-ef9b-5e78-a477-05274116adef', 'data_vg': 'ceph-a8bbe70e-ef9b-5e78-a477-05274116adef'})  2025-03-23 21:53:00.648554 | orchestrator | skipping: [testbed-node-3] => (item={'data': 'osd-block-7ba69cd0-48cc-55e3-9649-a43d5cfb8428', 'data_vg': 'ceph-7ba69cd0-48cc-55e3-9649-a43d5cfb8428'})  2025-03-23 21:53:00.648659 | orchestrator | skipping: [testbed-node-3] 2025-03-23 21:53:00.648882 | orchestrator | 2025-03-23 21:53:00.649347 | orchestrator | TASK [Print 'Create DB LVs for ceph_db_wal_devices'] *************************** 2025-03-23 21:53:00.649618 | orchestrator | Sunday 23 March 2025 21:53:00 +0000 (0:00:00.192) 0:00:25.038 ********** 2025-03-23 21:53:00.842955 | orchestrator | skipping: [testbed-node-3] => (item={'data': 'osd-block-a8bbe70e-ef9b-5e78-a477-05274116adef', 'data_vg': 'ceph-a8bbe70e-ef9b-5e78-a477-05274116adef'})  2025-03-23 21:53:00.843386 | orchestrator | skipping: [testbed-node-3] => (item={'data': 'osd-block-7ba69cd0-48cc-55e3-9649-a43d5cfb8428', 'data_vg': 'ceph-7ba69cd0-48cc-55e3-9649-a43d5cfb8428'})  2025-03-23 21:53:00.844097 | orchestrator | skipping: [testbed-node-3] 2025-03-23 21:53:00.844824 | orchestrator | 2025-03-23 21:53:00.845688 | orchestrator | TASK [Get list of Ceph LVs with associated VGs] ******************************** 2025-03-23 21:53:00.846145 | orchestrator | Sunday 23 March 2025 21:53:00 +0000 (0:00:00.195) 0:00:25.234 ********** 2025-03-23 21:53:01.406075 | orchestrator | ok: [testbed-node-3] 2025-03-23 21:53:01.406327 | orchestrator | 2025-03-23 21:53:01.406844 | orchestrator | TASK [Get list of Ceph PVs with associated VGs] ******************************** 2025-03-23 21:53:01.407143 | orchestrator | Sunday 23 March 2025 21:53:01 +0000 (0:00:00.561) 0:00:25.796 ********** 2025-03-23 21:53:01.943323 | orchestrator | ok: [testbed-node-3] 2025-03-23 21:53:01.944063 | orchestrator | 2025-03-23 21:53:01.944100 | orchestrator | TASK [Combine JSON from _lvs_cmd_output/_pvs_cmd_output] *********************** 2025-03-23 21:53:01.944459 | orchestrator | Sunday 23 March 2025 21:53:01 +0000 (0:00:00.537) 0:00:26.333 
********** 2025-03-23 21:53:02.112035 | orchestrator | ok: [testbed-node-3]
2025-03-23 21:53:02.112284 | orchestrator |
2025-03-23 21:53:02.113102 | orchestrator | TASK [Create list of VG/LV names] **********************************************
2025-03-23 21:53:02.114086 | orchestrator | Sunday 23 March 2025 21:53:02 +0000 (0:00:00.168) 0:00:26.502 **********
2025-03-23 21:53:02.337551 | orchestrator | ok: [testbed-node-3] => (item={'lv_name': 'osd-block-7ba69cd0-48cc-55e3-9649-a43d5cfb8428', 'vg_name': 'ceph-7ba69cd0-48cc-55e3-9649-a43d5cfb8428'})
2025-03-23 21:53:02.338105 | orchestrator | ok: [testbed-node-3] => (item={'lv_name': 'osd-block-a8bbe70e-ef9b-5e78-a477-05274116adef', 'vg_name': 'ceph-a8bbe70e-ef9b-5e78-a477-05274116adef'})
2025-03-23 21:53:02.340780 | orchestrator |
2025-03-23 21:53:02.341032 | orchestrator | TASK [Fail if block LV defined in lvm_volumes is missing] **********************
2025-03-23 21:53:02.341062 | orchestrator | Sunday 23 March 2025 21:53:02 +0000 (0:00:00.224) 0:00:26.727 **********
2025-03-23 21:53:02.817166 | orchestrator | skipping: [testbed-node-3] => (item={'data': 'osd-block-a8bbe70e-ef9b-5e78-a477-05274116adef', 'data_vg': 'ceph-a8bbe70e-ef9b-5e78-a477-05274116adef'})
2025-03-23 21:53:02.819926 | orchestrator | skipping: [testbed-node-3] => (item={'data': 'osd-block-7ba69cd0-48cc-55e3-9649-a43d5cfb8428', 'data_vg': 'ceph-7ba69cd0-48cc-55e3-9649-a43d5cfb8428'})
2025-03-23 21:53:02.821652 | orchestrator | skipping: [testbed-node-3]
2025-03-23 21:53:02.824452 | orchestrator |
2025-03-23 21:53:02.825260 | orchestrator | TASK [Fail if DB LV defined in lvm_volumes is missing] *************************
2025-03-23 21:53:02.826103 | orchestrator | Sunday 23 March 2025 21:53:02 +0000 (0:00:00.478) 0:00:27.205 **********
2025-03-23 21:53:03.014518 | orchestrator | skipping: [testbed-node-3] => (item={'data': 'osd-block-a8bbe70e-ef9b-5e78-a477-05274116adef', 'data_vg': 'ceph-a8bbe70e-ef9b-5e78-a477-05274116adef'})
2025-03-23 21:53:03.015059 | orchestrator | skipping: [testbed-node-3] => (item={'data': 'osd-block-7ba69cd0-48cc-55e3-9649-a43d5cfb8428', 'data_vg': 'ceph-7ba69cd0-48cc-55e3-9649-a43d5cfb8428'})
2025-03-23 21:53:03.016994 | orchestrator | skipping: [testbed-node-3]
2025-03-23 21:53:03.017911 | orchestrator |
2025-03-23 21:53:03.018953 | orchestrator | TASK [Fail if WAL LV defined in lvm_volumes is missing] ************************
2025-03-23 21:53:03.019736 | orchestrator | Sunday 23 March 2025 21:53:03 +0000 (0:00:00.194) 0:00:27.400 **********
2025-03-23 21:53:03.202216 | orchestrator | skipping: [testbed-node-3] => (item={'data': 'osd-block-a8bbe70e-ef9b-5e78-a477-05274116adef', 'data_vg': 'ceph-a8bbe70e-ef9b-5e78-a477-05274116adef'})
2025-03-23 21:53:03.203758 | orchestrator | skipping: [testbed-node-3] => (item={'data': 'osd-block-7ba69cd0-48cc-55e3-9649-a43d5cfb8428', 'data_vg': 'ceph-7ba69cd0-48cc-55e3-9649-a43d5cfb8428'})
2025-03-23 21:53:03.206710 | orchestrator | skipping: [testbed-node-3]
2025-03-23 21:53:03.207196 | orchestrator |
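The tasks above assemble the per-host LVM inventory that is printed as lvm_report just below: the play queries the existing LVs and PVs, merges the two JSON reports, derives the list of VG/LV names, and only then verifies that every block, DB and WAL LV referenced in lvm_volumes actually exists. The OSISM task files themselves are not reproduced in this log, so the following is only a minimal sketch of how such a report can be gathered, assuming the LVM2 JSON reporting interface and Ansible's built-in filters; the register names _lvs_cmd_output and _pvs_cmd_output are taken from the task title above, everything else is illustrative.

    # Illustrative sketch -- not the playbook's actual implementation.
    - name: Get list of Ceph LVs with associated VGs
      ansible.builtin.command: lvs --reportformat json -o lv_name,vg_name
      register: _lvs_cmd_output
      changed_when: false

    - name: Get list of Ceph PVs with associated VGs
      ansible.builtin.command: pvs --reportformat json -o pv_name,vg_name
      register: _pvs_cmd_output
      changed_when: false

    - name: Combine JSON from _lvs_cmd_output/_pvs_cmd_output
      ansible.builtin.set_fact:
        # LVM prints {"report": [{"lv": [...]}]} and {"report": [{"pv": [...]}]};
        # merging the first report objects gives the {"lv": [...], "pv": [...]}
        # structure that the "Print LVM report data" task displays below.
        lvm_report: "{{ (_lvs_cmd_output.stdout | from_json).report[0]
                        | combine((_pvs_cmd_output.stdout | from_json).report[0]) }}"

A real implementation would additionally restrict the result to the ceph-* volume groups; the sketch omits that filtering for brevity.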
2025-03-23 21:53:03.207243 | orchestrator | TASK [Print LVM report data] ***************************************************
2025-03-23 21:53:03.207928 | orchestrator | Sunday 23 March 2025 21:53:03 +0000 (0:00:00.191) 0:00:27.592 **********
2025-03-23 21:53:04.123614 | orchestrator | ok: [testbed-node-3] => {
2025-03-23 21:53:04.125418 | orchestrator |  "lvm_report": {
2025-03-23 21:53:04.125862 | orchestrator |  "lv": [
2025-03-23 21:53:04.126286 | orchestrator |  {
2025-03-23 21:53:04.127830 | orchestrator |  "lv_name": "osd-block-7ba69cd0-48cc-55e3-9649-a43d5cfb8428",
2025-03-23 21:53:04.129373 | orchestrator |  "vg_name": "ceph-7ba69cd0-48cc-55e3-9649-a43d5cfb8428"
2025-03-23 21:53:04.129404 | orchestrator |  },
2025-03-23 21:53:04.129659 | orchestrator |  {
2025-03-23 21:53:04.129688 | orchestrator |  "lv_name": "osd-block-a8bbe70e-ef9b-5e78-a477-05274116adef",
2025-03-23 21:53:04.130734 | orchestrator |  "vg_name": "ceph-a8bbe70e-ef9b-5e78-a477-05274116adef"
2025-03-23 21:53:04.131355 | orchestrator |  }
2025-03-23 21:53:04.131383 | orchestrator |  ],
2025-03-23 21:53:04.132138 | orchestrator |  "pv": [
2025-03-23 21:53:04.132421 | orchestrator |  {
2025-03-23 21:53:04.134349 | orchestrator |  "pv_name": "/dev/sdb",
2025-03-23 21:53:04.134729 | orchestrator |  "vg_name": "ceph-a8bbe70e-ef9b-5e78-a477-05274116adef"
2025-03-23 21:53:04.135215 | orchestrator |  },
2025-03-23 21:53:04.135712 | orchestrator |  {
2025-03-23 21:53:04.136379 | orchestrator |  "pv_name": "/dev/sdc",
2025-03-23 21:53:04.136706 | orchestrator |  "vg_name": "ceph-7ba69cd0-48cc-55e3-9649-a43d5cfb8428"
2025-03-23 21:53:04.137461 | orchestrator |  }
2025-03-23 21:53:04.137862 | orchestrator |  ]
2025-03-23 21:53:04.138701 | orchestrator |  }
2025-03-23 21:53:04.139135 | orchestrator | }
2025-03-23 21:53:04.139467 | orchestrator |
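This completes the LVM preparation for testbed-node-3: each device listed in ceph_osd_devices (sdb and sdc) now carries one volume group named ceph-<osd_lvm_uuid> with a single logical volume osd-block-<osd_lvm_uuid>, as the report above shows, and the same play is repeated next for testbed-node-4. The DB and WAL related tasks are skipped throughout, evidently because no ceph_db_devices, ceph_wal_devices or ceph_db_wal_devices are defined for this testbed, so both OSDs are plain block-only OSDs. The playbook source is not part of this log; as a rough sketch of what the "Create block VGs" and "Create block LVs" steps amount to per item (assuming the community.general.lvg/lvol modules, the lvm_volumes entries visible in the skipped items, and a hypothetical helper mapping _block_vgs_to_pvs built by "Create dict of block VGs -> PVs from ceph_osd_devices"):

    # Rough equivalent only -- module choice and the _block_vgs_to_pvs helper are assumptions.
    - name: Create block VGs
      community.general.lvg:
        vg: "{{ item.data_vg }}"                      # e.g. ceph-a8bbe70e-...
        pvs: "{{ _block_vgs_to_pvs[item.data_vg] }}"  # e.g. /dev/sdb
      loop: "{{ lvm_volumes }}"

    - name: Create block LVs
      community.general.lvol:
        vg: "{{ item.data_vg }}"
        lv: "{{ item.data }}"                         # e.g. osd-block-a8bbe70e-...
        size: 100%FREE                                # one OSD LV spanning the whole device
      loop: "{{ lvm_volumes }}"

ceph-volume can later consume such pre-created VG/LV pairs directly instead of partitioning the raw disks itself, which is why this play only has to guarantee that the expected VGs and LVs exist.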
2025-03-23 21:53:04.139672 | orchestrator | PLAY [Ceph create LVM devices] *************************************************
2025-03-23 21:53:04.140054 | orchestrator |
2025-03-23 21:53:04.140377 | orchestrator | TASK [Get extra vars for Ceph configuration] ***********************************
2025-03-23 21:53:04.140673 | orchestrator | Sunday 23 March 2025 21:53:04 +0000 (0:00:00.920) 0:00:28.512 **********
2025-03-23 21:53:04.398764 | orchestrator | ok: [testbed-node-4 -> testbed-manager(192.168.16.5)]
2025-03-23 21:53:04.400317 | orchestrator |
2025-03-23 21:53:04.401067 | orchestrator | TASK [Get initial list of available block devices] *****************************
2025-03-23 21:53:04.401882 | orchestrator | Sunday 23 March 2025 21:53:04 +0000 (0:00:00.276) 0:00:28.789 **********
2025-03-23 21:53:04.635850 | orchestrator | ok: [testbed-node-4]
2025-03-23 21:53:04.636290 | orchestrator |
2025-03-23 21:53:04.636321 | orchestrator | TASK [Add known links to the list of available block devices] ******************
2025-03-23 21:53:04.637310 | orchestrator | Sunday 23 March 2025 21:53:04 +0000 (0:00:00.236) 0:00:29.025 **********
2025-03-23 21:53:05.120796 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-4 => (item=loop0)
2025-03-23 21:53:05.121678 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-4 => (item=loop1)
2025-03-23 21:53:05.122435 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-4 => (item=loop2)
2025-03-23 21:53:05.122735 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-4 => (item=loop3)
2025-03-23 21:53:05.123713 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-4 => (item=loop4)
2025-03-23 21:53:05.124704 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-4 => (item=loop5)
2025-03-23 21:53:05.125316 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-4 => (item=loop6)
2025-03-23 21:53:05.126476 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-4 => (item=loop7)
2025-03-23 21:53:05.127417 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-4 => (item=sda)
2025-03-23 21:53:05.128190 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-4 => (item=sdb)
2025-03-23 21:53:05.128634 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-4 => (item=sdc)
2025-03-23 21:53:05.129194 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-4 => (item=sdd)
2025-03-23 21:53:05.129518 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-4 => (item=sr0)
2025-03-23 21:53:05.129896 | orchestrator |
2025-03-23 21:53:05.130360 | orchestrator | TASK [Add known links to the list of 
available block devices] ****************** 2025-03-23 21:53:07.273763 | orchestrator | Sunday 23 March 2025 21:53:07 +0000 (0:00:00.238) 0:00:31.659 ********** 2025-03-23 21:53:07.741526 | orchestrator | ok: [testbed-node-4] => (item=scsi-0QEMU_QEMU_HARDDISK_4f3230c7-420c-46b1-a455-5c4b6f068dba) 2025-03-23 21:53:07.741822 | orchestrator | ok: [testbed-node-4] => (item=scsi-SQEMU_QEMU_HARDDISK_4f3230c7-420c-46b1-a455-5c4b6f068dba) 2025-03-23 21:53:07.742204 | orchestrator | 2025-03-23 21:53:07.742651 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-03-23 21:53:07.743024 | orchestrator | Sunday 23 March 2025 21:53:07 +0000 (0:00:00.472) 0:00:32.131 ********** 2025-03-23 21:53:08.233431 | orchestrator | ok: [testbed-node-4] => (item=scsi-0QEMU_QEMU_HARDDISK_ed33c331-9d6a-419f-a2ff-c44c346902af) 2025-03-23 21:53:08.233828 | orchestrator | ok: [testbed-node-4] => (item=scsi-SQEMU_QEMU_HARDDISK_ed33c331-9d6a-419f-a2ff-c44c346902af) 2025-03-23 21:53:08.235114 | orchestrator | 2025-03-23 21:53:08.237792 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-03-23 21:53:08.911685 | orchestrator | Sunday 23 March 2025 21:53:08 +0000 (0:00:00.491) 0:00:32.623 ********** 2025-03-23 21:53:08.911805 | orchestrator | ok: [testbed-node-4] => (item=scsi-0QEMU_QEMU_HARDDISK_50c8fc43-f5ea-4ff6-9b1c-3a82fc0b92db) 2025-03-23 21:53:08.912757 | orchestrator | ok: [testbed-node-4] => (item=scsi-SQEMU_QEMU_HARDDISK_50c8fc43-f5ea-4ff6-9b1c-3a82fc0b92db) 2025-03-23 21:53:08.913847 | orchestrator | 2025-03-23 21:53:08.914913 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-03-23 21:53:08.915718 | orchestrator | Sunday 23 March 2025 21:53:08 +0000 (0:00:00.676) 0:00:33.300 ********** 2025-03-23 21:53:09.372625 | orchestrator | ok: [testbed-node-4] => (item=scsi-0QEMU_QEMU_HARDDISK_f00ea6e7-d866-4241-9ba8-a6f4135a6384) 2025-03-23 21:53:09.374158 | orchestrator | ok: [testbed-node-4] => (item=scsi-SQEMU_QEMU_HARDDISK_f00ea6e7-d866-4241-9ba8-a6f4135a6384) 2025-03-23 21:53:09.375056 | orchestrator | 2025-03-23 21:53:09.375829 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-03-23 21:53:09.376126 | orchestrator | Sunday 23 March 2025 21:53:09 +0000 (0:00:00.462) 0:00:33.762 ********** 2025-03-23 21:53:09.761618 | orchestrator | ok: [testbed-node-4] => (item=ata-QEMU_DVD-ROM_QM00001) 2025-03-23 21:53:09.762239 | orchestrator | 2025-03-23 21:53:09.762653 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-03-23 21:53:09.763461 | orchestrator | Sunday 23 March 2025 21:53:09 +0000 (0:00:00.384) 0:00:34.147 ********** 2025-03-23 21:53:10.276431 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-4 => (item=loop0) 2025-03-23 21:53:10.276928 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-4 => (item=loop1) 2025-03-23 21:53:10.278649 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-4 => (item=loop2) 2025-03-23 21:53:10.279472 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-4 => (item=loop3) 2025-03-23 21:53:10.280287 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-4 => (item=loop4) 2025-03-23 21:53:10.280974 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for 
testbed-node-4 => (item=loop5) 2025-03-23 21:53:10.281451 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-4 => (item=loop6) 2025-03-23 21:53:10.282736 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-4 => (item=loop7) 2025-03-23 21:53:10.283457 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-4 => (item=sda) 2025-03-23 21:53:10.283985 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-4 => (item=sdb) 2025-03-23 21:53:10.285221 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-4 => (item=sdc) 2025-03-23 21:53:10.285660 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-4 => (item=sdd) 2025-03-23 21:53:10.286352 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-4 => (item=sr0) 2025-03-23 21:53:10.286805 | orchestrator | 2025-03-23 21:53:10.287219 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-03-23 21:53:10.287960 | orchestrator | Sunday 23 March 2025 21:53:10 +0000 (0:00:00.519) 0:00:34.666 ********** 2025-03-23 21:53:10.498812 | orchestrator | skipping: [testbed-node-4] 2025-03-23 21:53:10.498994 | orchestrator | 2025-03-23 21:53:10.499760 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-03-23 21:53:10.500498 | orchestrator | Sunday 23 March 2025 21:53:10 +0000 (0:00:00.222) 0:00:34.889 ********** 2025-03-23 21:53:11.145808 | orchestrator | skipping: [testbed-node-4] 2025-03-23 21:53:11.147417 | orchestrator | 2025-03-23 21:53:11.147458 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-03-23 21:53:11.355331 | orchestrator | Sunday 23 March 2025 21:53:11 +0000 (0:00:00.646) 0:00:35.535 ********** 2025-03-23 21:53:11.355470 | orchestrator | skipping: [testbed-node-4] 2025-03-23 21:53:11.355898 | orchestrator | 2025-03-23 21:53:11.357027 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-03-23 21:53:11.357807 | orchestrator | Sunday 23 March 2025 21:53:11 +0000 (0:00:00.208) 0:00:35.744 ********** 2025-03-23 21:53:11.585925 | orchestrator | skipping: [testbed-node-4] 2025-03-23 21:53:11.586503 | orchestrator | 2025-03-23 21:53:11.588339 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-03-23 21:53:11.589858 | orchestrator | Sunday 23 March 2025 21:53:11 +0000 (0:00:00.229) 0:00:35.973 ********** 2025-03-23 21:53:11.807918 | orchestrator | skipping: [testbed-node-4] 2025-03-23 21:53:11.808226 | orchestrator | 2025-03-23 21:53:11.809504 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-03-23 21:53:11.810200 | orchestrator | Sunday 23 March 2025 21:53:11 +0000 (0:00:00.223) 0:00:36.197 ********** 2025-03-23 21:53:12.077466 | orchestrator | skipping: [testbed-node-4] 2025-03-23 21:53:12.078951 | orchestrator | 2025-03-23 21:53:12.079028 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-03-23 21:53:12.079832 | orchestrator | Sunday 23 March 2025 21:53:12 +0000 (0:00:00.268) 0:00:36.466 ********** 2025-03-23 21:53:12.322591 | orchestrator | skipping: [testbed-node-4] 2025-03-23 21:53:12.323498 | orchestrator | 2025-03-23 21:53:12.324231 | orchestrator | TASK [Add known partitions to 
the list of available block devices] ************* 2025-03-23 21:53:12.325750 | orchestrator | Sunday 23 March 2025 21:53:12 +0000 (0:00:00.246) 0:00:36.712 ********** 2025-03-23 21:53:12.546492 | orchestrator | skipping: [testbed-node-4] 2025-03-23 21:53:12.547765 | orchestrator | 2025-03-23 21:53:12.548144 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-03-23 21:53:12.549737 | orchestrator | Sunday 23 March 2025 21:53:12 +0000 (0:00:00.221) 0:00:36.934 ********** 2025-03-23 21:53:13.455302 | orchestrator | ok: [testbed-node-4] => (item=sda1) 2025-03-23 21:53:13.458126 | orchestrator | ok: [testbed-node-4] => (item=sda14) 2025-03-23 21:53:13.459162 | orchestrator | ok: [testbed-node-4] => (item=sda15) 2025-03-23 21:53:13.460165 | orchestrator | ok: [testbed-node-4] => (item=sda16) 2025-03-23 21:53:13.460862 | orchestrator | 2025-03-23 21:53:13.461734 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-03-23 21:53:13.681526 | orchestrator | Sunday 23 March 2025 21:53:13 +0000 (0:00:00.908) 0:00:37.843 ********** 2025-03-23 21:53:13.681683 | orchestrator | skipping: [testbed-node-4] 2025-03-23 21:53:13.681759 | orchestrator | 2025-03-23 21:53:13.682762 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-03-23 21:53:13.683435 | orchestrator | Sunday 23 March 2025 21:53:13 +0000 (0:00:00.227) 0:00:38.071 ********** 2025-03-23 21:53:13.889266 | orchestrator | skipping: [testbed-node-4] 2025-03-23 21:53:13.890317 | orchestrator | 2025-03-23 21:53:13.891628 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-03-23 21:53:13.893863 | orchestrator | Sunday 23 March 2025 21:53:13 +0000 (0:00:00.208) 0:00:38.279 ********** 2025-03-23 21:53:14.589199 | orchestrator | skipping: [testbed-node-4] 2025-03-23 21:53:14.590534 | orchestrator | 2025-03-23 21:53:14.592895 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-03-23 21:53:14.593250 | orchestrator | Sunday 23 March 2025 21:53:14 +0000 (0:00:00.699) 0:00:38.979 ********** 2025-03-23 21:53:14.812886 | orchestrator | skipping: [testbed-node-4] 2025-03-23 21:53:14.813647 | orchestrator | 2025-03-23 21:53:14.813678 | orchestrator | TASK [Check whether ceph_db_wal_devices is used exclusively] ******************* 2025-03-23 21:53:14.814676 | orchestrator | Sunday 23 March 2025 21:53:14 +0000 (0:00:00.224) 0:00:39.203 ********** 2025-03-23 21:53:14.953465 | orchestrator | skipping: [testbed-node-4] 2025-03-23 21:53:14.954968 | orchestrator | 2025-03-23 21:53:14.954994 | orchestrator | TASK [Create dict of block VGs -> PVs from ceph_osd_devices] ******************* 2025-03-23 21:53:14.957193 | orchestrator | Sunday 23 March 2025 21:53:14 +0000 (0:00:00.140) 0:00:39.343 ********** 2025-03-23 21:53:15.172485 | orchestrator | ok: [testbed-node-4] => (item={'key': 'sdb', 'value': {'osd_lvm_uuid': 'e8e09cee-3c3a-56a6-a7d5-937e6639e1b3'}}) 2025-03-23 21:53:15.173078 | orchestrator | ok: [testbed-node-4] => (item={'key': 'sdc', 'value': {'osd_lvm_uuid': '62e3fac0-87ec-50cc-8f44-41551697f65b'}}) 2025-03-23 21:53:15.173106 | orchestrator | 2025-03-23 21:53:15.173856 | orchestrator | TASK [Create block VGs] ******************************************************** 2025-03-23 21:53:15.174100 | orchestrator | Sunday 23 March 2025 21:53:15 +0000 (0:00:00.220) 0:00:39.563 ********** 2025-03-23 21:53:17.314183 | 
orchestrator | changed: [testbed-node-4] => (item={'data': 'osd-block-e8e09cee-3c3a-56a6-a7d5-937e6639e1b3', 'data_vg': 'ceph-e8e09cee-3c3a-56a6-a7d5-937e6639e1b3'}) 2025-03-23 21:53:17.314301 | orchestrator | changed: [testbed-node-4] => (item={'data': 'osd-block-62e3fac0-87ec-50cc-8f44-41551697f65b', 'data_vg': 'ceph-62e3fac0-87ec-50cc-8f44-41551697f65b'}) 2025-03-23 21:53:17.314703 | orchestrator | 2025-03-23 21:53:17.317266 | orchestrator | TASK [Print 'Create block VGs'] ************************************************ 2025-03-23 21:53:17.317452 | orchestrator | Sunday 23 March 2025 21:53:17 +0000 (0:00:02.139) 0:00:41.702 ********** 2025-03-23 21:53:17.483746 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-e8e09cee-3c3a-56a6-a7d5-937e6639e1b3', 'data_vg': 'ceph-e8e09cee-3c3a-56a6-a7d5-937e6639e1b3'})  2025-03-23 21:53:17.485423 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-62e3fac0-87ec-50cc-8f44-41551697f65b', 'data_vg': 'ceph-62e3fac0-87ec-50cc-8f44-41551697f65b'})  2025-03-23 21:53:17.486663 | orchestrator | skipping: [testbed-node-4] 2025-03-23 21:53:17.488912 | orchestrator | 2025-03-23 21:53:17.489143 | orchestrator | TASK [Create block LVs] ******************************************************** 2025-03-23 21:53:17.489163 | orchestrator | Sunday 23 March 2025 21:53:17 +0000 (0:00:00.171) 0:00:41.874 ********** 2025-03-23 21:53:18.925038 | orchestrator | changed: [testbed-node-4] => (item={'data': 'osd-block-e8e09cee-3c3a-56a6-a7d5-937e6639e1b3', 'data_vg': 'ceph-e8e09cee-3c3a-56a6-a7d5-937e6639e1b3'}) 2025-03-23 21:53:18.925762 | orchestrator | changed: [testbed-node-4] => (item={'data': 'osd-block-62e3fac0-87ec-50cc-8f44-41551697f65b', 'data_vg': 'ceph-62e3fac0-87ec-50cc-8f44-41551697f65b'}) 2025-03-23 21:53:18.925805 | orchestrator | 2025-03-23 21:53:18.925877 | orchestrator | TASK [Print 'Create block LVs'] ************************************************ 2025-03-23 21:53:18.926458 | orchestrator | Sunday 23 March 2025 21:53:18 +0000 (0:00:01.439) 0:00:43.314 ********** 2025-03-23 21:53:19.086548 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-e8e09cee-3c3a-56a6-a7d5-937e6639e1b3', 'data_vg': 'ceph-e8e09cee-3c3a-56a6-a7d5-937e6639e1b3'})  2025-03-23 21:53:19.087152 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-62e3fac0-87ec-50cc-8f44-41551697f65b', 'data_vg': 'ceph-62e3fac0-87ec-50cc-8f44-41551697f65b'})  2025-03-23 21:53:19.088181 | orchestrator | skipping: [testbed-node-4] 2025-03-23 21:53:19.089458 | orchestrator | 2025-03-23 21:53:19.090081 | orchestrator | TASK [Create DB VGs] *********************************************************** 2025-03-23 21:53:19.091141 | orchestrator | Sunday 23 March 2025 21:53:19 +0000 (0:00:00.162) 0:00:43.477 ********** 2025-03-23 21:53:19.239191 | orchestrator | skipping: [testbed-node-4] 2025-03-23 21:53:19.240361 | orchestrator | 2025-03-23 21:53:19.241132 | orchestrator | TASK [Print 'Create DB VGs'] *************************************************** 2025-03-23 21:53:19.241751 | orchestrator | Sunday 23 March 2025 21:53:19 +0000 (0:00:00.152) 0:00:43.629 ********** 2025-03-23 21:53:19.662349 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-e8e09cee-3c3a-56a6-a7d5-937e6639e1b3', 'data_vg': 'ceph-e8e09cee-3c3a-56a6-a7d5-937e6639e1b3'})  2025-03-23 21:53:19.663650 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-62e3fac0-87ec-50cc-8f44-41551697f65b', 'data_vg': 
'ceph-62e3fac0-87ec-50cc-8f44-41551697f65b'})  2025-03-23 21:53:19.664279 | orchestrator | skipping: [testbed-node-4] 2025-03-23 21:53:19.665435 | orchestrator | 2025-03-23 21:53:19.667246 | orchestrator | TASK [Create WAL VGs] ********************************************************** 2025-03-23 21:53:19.668239 | orchestrator | Sunday 23 March 2025 21:53:19 +0000 (0:00:00.421) 0:00:44.050 ********** 2025-03-23 21:53:19.818768 | orchestrator | skipping: [testbed-node-4] 2025-03-23 21:53:19.820229 | orchestrator | 2025-03-23 21:53:19.820544 | orchestrator | TASK [Print 'Create WAL VGs'] ************************************************** 2025-03-23 21:53:19.820606 | orchestrator | Sunday 23 March 2025 21:53:19 +0000 (0:00:00.159) 0:00:44.210 ********** 2025-03-23 21:53:20.031273 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-e8e09cee-3c3a-56a6-a7d5-937e6639e1b3', 'data_vg': 'ceph-e8e09cee-3c3a-56a6-a7d5-937e6639e1b3'})  2025-03-23 21:53:20.031788 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-62e3fac0-87ec-50cc-8f44-41551697f65b', 'data_vg': 'ceph-62e3fac0-87ec-50cc-8f44-41551697f65b'})  2025-03-23 21:53:20.032381 | orchestrator | skipping: [testbed-node-4] 2025-03-23 21:53:20.033395 | orchestrator | 2025-03-23 21:53:20.034090 | orchestrator | TASK [Create DB+WAL VGs] ******************************************************* 2025-03-23 21:53:20.034919 | orchestrator | Sunday 23 March 2025 21:53:20 +0000 (0:00:00.210) 0:00:44.421 ********** 2025-03-23 21:53:20.182878 | orchestrator | skipping: [testbed-node-4] 2025-03-23 21:53:20.184441 | orchestrator | 2025-03-23 21:53:20.185473 | orchestrator | TASK [Print 'Create DB+WAL VGs'] *********************************************** 2025-03-23 21:53:20.186610 | orchestrator | Sunday 23 March 2025 21:53:20 +0000 (0:00:00.151) 0:00:44.572 ********** 2025-03-23 21:53:20.397624 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-e8e09cee-3c3a-56a6-a7d5-937e6639e1b3', 'data_vg': 'ceph-e8e09cee-3c3a-56a6-a7d5-937e6639e1b3'})  2025-03-23 21:53:20.398318 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-62e3fac0-87ec-50cc-8f44-41551697f65b', 'data_vg': 'ceph-62e3fac0-87ec-50cc-8f44-41551697f65b'})  2025-03-23 21:53:20.399314 | orchestrator | skipping: [testbed-node-4] 2025-03-23 21:53:20.400021 | orchestrator | 2025-03-23 21:53:20.401273 | orchestrator | TASK [Prepare variables for OSD count check] *********************************** 2025-03-23 21:53:20.401736 | orchestrator | Sunday 23 March 2025 21:53:20 +0000 (0:00:00.212) 0:00:44.785 ********** 2025-03-23 21:53:20.563280 | orchestrator | ok: [testbed-node-4] 2025-03-23 21:53:20.563384 | orchestrator | 2025-03-23 21:53:20.563409 | orchestrator | TASK [Count OSDs put on ceph_db_devices defined in lvm_volumes] **************** 2025-03-23 21:53:20.563890 | orchestrator | Sunday 23 March 2025 21:53:20 +0000 (0:00:00.167) 0:00:44.952 ********** 2025-03-23 21:53:20.752221 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-e8e09cee-3c3a-56a6-a7d5-937e6639e1b3', 'data_vg': 'ceph-e8e09cee-3c3a-56a6-a7d5-937e6639e1b3'})  2025-03-23 21:53:20.753509 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-62e3fac0-87ec-50cc-8f44-41551697f65b', 'data_vg': 'ceph-62e3fac0-87ec-50cc-8f44-41551697f65b'})  2025-03-23 21:53:20.756615 | orchestrator | skipping: [testbed-node-4] 2025-03-23 21:53:20.958012 | orchestrator | 2025-03-23 21:53:20.958129 | orchestrator | TASK [Count OSDs put on 
ceph_wal_devices defined in lvm_volumes] *************** 2025-03-23 21:53:20.958146 | orchestrator | Sunday 23 March 2025 21:53:20 +0000 (0:00:00.189) 0:00:45.141 ********** 2025-03-23 21:53:20.958174 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-e8e09cee-3c3a-56a6-a7d5-937e6639e1b3', 'data_vg': 'ceph-e8e09cee-3c3a-56a6-a7d5-937e6639e1b3'})  2025-03-23 21:53:20.958656 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-62e3fac0-87ec-50cc-8f44-41551697f65b', 'data_vg': 'ceph-62e3fac0-87ec-50cc-8f44-41551697f65b'})  2025-03-23 21:53:20.958691 | orchestrator | skipping: [testbed-node-4] 2025-03-23 21:53:20.959197 | orchestrator | 2025-03-23 21:53:20.959665 | orchestrator | TASK [Count OSDs put on ceph_db_wal_devices defined in lvm_volumes] ************ 2025-03-23 21:53:20.960174 | orchestrator | Sunday 23 March 2025 21:53:20 +0000 (0:00:00.205) 0:00:45.347 ********** 2025-03-23 21:53:21.135723 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-e8e09cee-3c3a-56a6-a7d5-937e6639e1b3', 'data_vg': 'ceph-e8e09cee-3c3a-56a6-a7d5-937e6639e1b3'})  2025-03-23 21:53:21.136240 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-62e3fac0-87ec-50cc-8f44-41551697f65b', 'data_vg': 'ceph-62e3fac0-87ec-50cc-8f44-41551697f65b'})  2025-03-23 21:53:21.137432 | orchestrator | skipping: [testbed-node-4] 2025-03-23 21:53:21.138420 | orchestrator | 2025-03-23 21:53:21.138457 | orchestrator | TASK [Fail if number of OSDs exceeds num_osds for a DB VG] ********************* 2025-03-23 21:53:21.139342 | orchestrator | Sunday 23 March 2025 21:53:21 +0000 (0:00:00.178) 0:00:45.526 ********** 2025-03-23 21:53:21.288596 | orchestrator | skipping: [testbed-node-4] 2025-03-23 21:53:21.291154 | orchestrator | 2025-03-23 21:53:21.292173 | orchestrator | TASK [Fail if number of OSDs exceeds num_osds for a WAL VG] ******************** 2025-03-23 21:53:21.292816 | orchestrator | Sunday 23 March 2025 21:53:21 +0000 (0:00:00.152) 0:00:45.678 ********** 2025-03-23 21:53:21.478307 | orchestrator | skipping: [testbed-node-4] 2025-03-23 21:53:21.478654 | orchestrator | 2025-03-23 21:53:21.479178 | orchestrator | TASK [Fail if number of OSDs exceeds num_osds for a DB+WAL VG] ***************** 2025-03-23 21:53:21.479384 | orchestrator | Sunday 23 March 2025 21:53:21 +0000 (0:00:00.188) 0:00:45.867 ********** 2025-03-23 21:53:21.868521 | orchestrator | skipping: [testbed-node-4] 2025-03-23 21:53:21.869304 | orchestrator | 2025-03-23 21:53:21.869933 | orchestrator | TASK [Print number of OSDs wanted per DB VG] *********************************** 2025-03-23 21:53:21.869964 | orchestrator | Sunday 23 March 2025 21:53:21 +0000 (0:00:00.392) 0:00:46.259 ********** 2025-03-23 21:53:22.035071 | orchestrator | ok: [testbed-node-4] => { 2025-03-23 21:53:22.035779 | orchestrator |  "_num_osds_wanted_per_db_vg": {} 2025-03-23 21:53:22.037043 | orchestrator | } 2025-03-23 21:53:22.037323 | orchestrator | 2025-03-23 21:53:22.039634 | orchestrator | TASK [Print number of OSDs wanted per WAL VG] ********************************** 2025-03-23 21:53:22.182996 | orchestrator | Sunday 23 March 2025 21:53:22 +0000 (0:00:00.165) 0:00:46.425 ********** 2025-03-23 21:53:22.183096 | orchestrator | ok: [testbed-node-4] => { 2025-03-23 21:53:22.184428 | orchestrator |  "_num_osds_wanted_per_wal_vg": {} 2025-03-23 21:53:22.186227 | orchestrator | } 2025-03-23 21:53:22.186658 | orchestrator | 2025-03-23 21:53:22.187735 | orchestrator | TASK [Print number of OSDs wanted per DB+WAL 
VG] ******************************* 2025-03-23 21:53:22.188130 | orchestrator | Sunday 23 March 2025 21:53:22 +0000 (0:00:00.148) 0:00:46.573 ********** 2025-03-23 21:53:22.356729 | orchestrator | ok: [testbed-node-4] => { 2025-03-23 21:53:22.357251 | orchestrator |  "_num_osds_wanted_per_db_wal_vg": {} 2025-03-23 21:53:22.358104 | orchestrator | } 2025-03-23 21:53:22.358675 | orchestrator | 2025-03-23 21:53:22.360733 | orchestrator | TASK [Gather DB VGs with total and available size in bytes] ******************** 2025-03-23 21:53:22.360828 | orchestrator | Sunday 23 March 2025 21:53:22 +0000 (0:00:00.172) 0:00:46.746 ********** 2025-03-23 21:53:22.892353 | orchestrator | ok: [testbed-node-4] 2025-03-23 21:53:22.892534 | orchestrator | 2025-03-23 21:53:22.893112 | orchestrator | TASK [Gather WAL VGs with total and available size in bytes] ******************* 2025-03-23 21:53:22.894394 | orchestrator | Sunday 23 March 2025 21:53:22 +0000 (0:00:00.534) 0:00:47.281 ********** 2025-03-23 21:53:23.436072 | orchestrator | ok: [testbed-node-4] 2025-03-23 21:53:23.436710 | orchestrator | 2025-03-23 21:53:23.438058 | orchestrator | TASK [Gather DB+WAL VGs with total and available size in bytes] **************** 2025-03-23 21:53:23.955284 | orchestrator | Sunday 23 March 2025 21:53:23 +0000 (0:00:00.545) 0:00:47.826 ********** 2025-03-23 21:53:23.955397 | orchestrator | ok: [testbed-node-4] 2025-03-23 21:53:23.955513 | orchestrator | 2025-03-23 21:53:23.956382 | orchestrator | TASK [Combine JSON from _db/wal/db_wal_vgs_cmd_output] ************************* 2025-03-23 21:53:23.957377 | orchestrator | Sunday 23 March 2025 21:53:23 +0000 (0:00:00.518) 0:00:48.345 ********** 2025-03-23 21:53:24.107983 | orchestrator | ok: [testbed-node-4] 2025-03-23 21:53:24.108907 | orchestrator | 2025-03-23 21:53:24.109707 | orchestrator | TASK [Calculate VG sizes (without buffer)] ************************************* 2025-03-23 21:53:24.110386 | orchestrator | Sunday 23 March 2025 21:53:24 +0000 (0:00:00.152) 0:00:48.497 ********** 2025-03-23 21:53:24.225142 | orchestrator | skipping: [testbed-node-4] 2025-03-23 21:53:24.226724 | orchestrator | 2025-03-23 21:53:24.228119 | orchestrator | TASK [Calculate VG sizes (with buffer)] **************************************** 2025-03-23 21:53:24.229722 | orchestrator | Sunday 23 March 2025 21:53:24 +0000 (0:00:00.117) 0:00:48.615 ********** 2025-03-23 21:53:24.373342 | orchestrator | skipping: [testbed-node-4] 2025-03-23 21:53:24.373444 | orchestrator | 2025-03-23 21:53:24.373466 | orchestrator | TASK [Print LVM VGs report data] *********************************************** 2025-03-23 21:53:24.373787 | orchestrator | Sunday 23 March 2025 21:53:24 +0000 (0:00:00.148) 0:00:48.763 ********** 2025-03-23 21:53:24.519709 | orchestrator | ok: [testbed-node-4] => { 2025-03-23 21:53:24.520606 | orchestrator |  "vgs_report": { 2025-03-23 21:53:24.523284 | orchestrator |  "vg": [] 2025-03-23 21:53:24.523990 | orchestrator |  } 2025-03-23 21:53:24.524017 | orchestrator | } 2025-03-23 21:53:24.524039 | orchestrator | 2025-03-23 21:53:24.524975 | orchestrator | TASK [Print LVM VG sizes] ****************************************************** 2025-03-23 21:53:24.525893 | orchestrator | Sunday 23 March 2025 21:53:24 +0000 (0:00:00.144) 0:00:48.908 ********** 2025-03-23 21:53:24.885120 | orchestrator | skipping: [testbed-node-4] 2025-03-23 21:53:24.886082 | orchestrator | 2025-03-23 21:53:24.887115 | orchestrator | TASK [Calculate size needed for LVs on ceph_db_devices] 
************************ 2025-03-23 21:53:24.887767 | orchestrator | Sunday 23 March 2025 21:53:24 +0000 (0:00:00.367) 0:00:49.275 ********** 2025-03-23 21:53:25.044483 | orchestrator | skipping: [testbed-node-4] 2025-03-23 21:53:25.044965 | orchestrator | 2025-03-23 21:53:25.045826 | orchestrator | TASK [Print size needed for LVs on ceph_db_devices] **************************** 2025-03-23 21:53:25.046702 | orchestrator | Sunday 23 March 2025 21:53:25 +0000 (0:00:00.158) 0:00:49.434 ********** 2025-03-23 21:53:25.215555 | orchestrator | skipping: [testbed-node-4] 2025-03-23 21:53:25.215823 | orchestrator | 2025-03-23 21:53:25.217734 | orchestrator | TASK [Fail if size of DB LVs on ceph_db_devices > available] ******************* 2025-03-23 21:53:25.218543 | orchestrator | Sunday 23 March 2025 21:53:25 +0000 (0:00:00.171) 0:00:49.606 ********** 2025-03-23 21:53:25.383779 | orchestrator | skipping: [testbed-node-4] 2025-03-23 21:53:25.383892 | orchestrator | 2025-03-23 21:53:25.384223 | orchestrator | TASK [Calculate size needed for LVs on ceph_wal_devices] *********************** 2025-03-23 21:53:25.384467 | orchestrator | Sunday 23 March 2025 21:53:25 +0000 (0:00:00.169) 0:00:49.775 ********** 2025-03-23 21:53:25.525138 | orchestrator | skipping: [testbed-node-4] 2025-03-23 21:53:25.525756 | orchestrator | 2025-03-23 21:53:25.526658 | orchestrator | TASK [Print size needed for LVs on ceph_wal_devices] *************************** 2025-03-23 21:53:25.528815 | orchestrator | Sunday 23 March 2025 21:53:25 +0000 (0:00:00.139) 0:00:49.914 ********** 2025-03-23 21:53:25.705922 | orchestrator | skipping: [testbed-node-4] 2025-03-23 21:53:25.707473 | orchestrator | 2025-03-23 21:53:25.708429 | orchestrator | TASK [Fail if size of WAL LVs on ceph_wal_devices > available] ***************** 2025-03-23 21:53:25.709459 | orchestrator | Sunday 23 March 2025 21:53:25 +0000 (0:00:00.179) 0:00:50.094 ********** 2025-03-23 21:53:25.853629 | orchestrator | skipping: [testbed-node-4] 2025-03-23 21:53:25.854267 | orchestrator | 2025-03-23 21:53:25.856968 | orchestrator | TASK [Calculate size needed for WAL LVs on ceph_db_wal_devices] **************** 2025-03-23 21:53:25.858093 | orchestrator | Sunday 23 March 2025 21:53:25 +0000 (0:00:00.149) 0:00:50.243 ********** 2025-03-23 21:53:26.005917 | orchestrator | skipping: [testbed-node-4] 2025-03-23 21:53:26.006684 | orchestrator | 2025-03-23 21:53:26.007514 | orchestrator | TASK [Print size needed for WAL LVs on ceph_db_wal_devices] ******************** 2025-03-23 21:53:26.008129 | orchestrator | Sunday 23 March 2025 21:53:25 +0000 (0:00:00.151) 0:00:50.395 ********** 2025-03-23 21:53:26.164594 | orchestrator | skipping: [testbed-node-4] 2025-03-23 21:53:26.165097 | orchestrator | 2025-03-23 21:53:26.168890 | orchestrator | TASK [Calculate size needed for DB LVs on ceph_db_wal_devices] ***************** 2025-03-23 21:53:26.320695 | orchestrator | Sunday 23 March 2025 21:53:26 +0000 (0:00:00.158) 0:00:50.553 ********** 2025-03-23 21:53:26.320790 | orchestrator | skipping: [testbed-node-4] 2025-03-23 21:53:26.320850 | orchestrator | 2025-03-23 21:53:26.321440 | orchestrator | TASK [Print size needed for DB LVs on ceph_db_wal_devices] ********************* 2025-03-23 21:53:26.322210 | orchestrator | Sunday 23 March 2025 21:53:26 +0000 (0:00:00.155) 0:00:50.709 ********** 2025-03-23 21:53:26.471999 | orchestrator | skipping: [testbed-node-4] 2025-03-23 21:53:26.472407 | orchestrator | 2025-03-23 21:53:26.473457 | orchestrator | TASK [Fail if size of DB+WAL LVs on 
ceph_db_wal_devices > available] *********** 2025-03-23 21:53:26.473946 | orchestrator | Sunday 23 March 2025 21:53:26 +0000 (0:00:00.153) 0:00:50.862 ********** 2025-03-23 21:53:26.640768 | orchestrator | skipping: [testbed-node-4] 2025-03-23 21:53:26.641737 | orchestrator | 2025-03-23 21:53:26.642909 | orchestrator | TASK [Fail if DB LV size < 30 GiB for ceph_db_devices] ************************* 2025-03-23 21:53:26.644082 | orchestrator | Sunday 23 March 2025 21:53:26 +0000 (0:00:00.165) 0:00:51.027 ********** 2025-03-23 21:53:27.021361 | orchestrator | skipping: [testbed-node-4] 2025-03-23 21:53:27.023032 | orchestrator | 2025-03-23 21:53:27.023936 | orchestrator | TASK [Fail if DB LV size < 30 GiB for ceph_db_wal_devices] ********************* 2025-03-23 21:53:27.025240 | orchestrator | Sunday 23 March 2025 21:53:27 +0000 (0:00:00.383) 0:00:51.411 ********** 2025-03-23 21:53:27.185116 | orchestrator | skipping: [testbed-node-4] 2025-03-23 21:53:27.186519 | orchestrator | 2025-03-23 21:53:27.189199 | orchestrator | TASK [Create DB LVs for ceph_db_devices] *************************************** 2025-03-23 21:53:27.367698 | orchestrator | Sunday 23 March 2025 21:53:27 +0000 (0:00:00.161) 0:00:51.573 ********** 2025-03-23 21:53:27.367791 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-e8e09cee-3c3a-56a6-a7d5-937e6639e1b3', 'data_vg': 'ceph-e8e09cee-3c3a-56a6-a7d5-937e6639e1b3'})  2025-03-23 21:53:27.368459 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-62e3fac0-87ec-50cc-8f44-41551697f65b', 'data_vg': 'ceph-62e3fac0-87ec-50cc-8f44-41551697f65b'})  2025-03-23 21:53:27.369449 | orchestrator | skipping: [testbed-node-4] 2025-03-23 21:53:27.372094 | orchestrator | 2025-03-23 21:53:27.372426 | orchestrator | TASK [Print 'Create DB LVs for ceph_db_devices'] ******************************* 2025-03-23 21:53:27.372467 | orchestrator | Sunday 23 March 2025 21:53:27 +0000 (0:00:00.184) 0:00:51.757 ********** 2025-03-23 21:53:27.561689 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-e8e09cee-3c3a-56a6-a7d5-937e6639e1b3', 'data_vg': 'ceph-e8e09cee-3c3a-56a6-a7d5-937e6639e1b3'})  2025-03-23 21:53:27.562409 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-62e3fac0-87ec-50cc-8f44-41551697f65b', 'data_vg': 'ceph-62e3fac0-87ec-50cc-8f44-41551697f65b'})  2025-03-23 21:53:27.563130 | orchestrator | skipping: [testbed-node-4] 2025-03-23 21:53:27.563284 | orchestrator | 2025-03-23 21:53:27.564209 | orchestrator | TASK [Create WAL LVs for ceph_wal_devices] ************************************* 2025-03-23 21:53:27.567641 | orchestrator | Sunday 23 March 2025 21:53:27 +0000 (0:00:00.194) 0:00:51.951 ********** 2025-03-23 21:53:27.748333 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-e8e09cee-3c3a-56a6-a7d5-937e6639e1b3', 'data_vg': 'ceph-e8e09cee-3c3a-56a6-a7d5-937e6639e1b3'})  2025-03-23 21:53:27.748758 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-62e3fac0-87ec-50cc-8f44-41551697f65b', 'data_vg': 'ceph-62e3fac0-87ec-50cc-8f44-41551697f65b'})  2025-03-23 21:53:27.749921 | orchestrator | skipping: [testbed-node-4] 2025-03-23 21:53:27.750138 | orchestrator | 2025-03-23 21:53:27.750733 | orchestrator | TASK [Print 'Create WAL LVs for ceph_wal_devices'] ***************************** 2025-03-23 21:53:27.753455 | orchestrator | Sunday 23 March 2025 21:53:27 +0000 (0:00:00.186) 0:00:52.138 ********** 2025-03-23 21:53:27.936228 | orchestrator | skipping: 
[testbed-node-4] => (item={'data': 'osd-block-e8e09cee-3c3a-56a6-a7d5-937e6639e1b3', 'data_vg': 'ceph-e8e09cee-3c3a-56a6-a7d5-937e6639e1b3'})  2025-03-23 21:53:27.936839 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-62e3fac0-87ec-50cc-8f44-41551697f65b', 'data_vg': 'ceph-62e3fac0-87ec-50cc-8f44-41551697f65b'})  2025-03-23 21:53:27.940480 | orchestrator | skipping: [testbed-node-4] 2025-03-23 21:53:27.940947 | orchestrator | 2025-03-23 21:53:27.940978 | orchestrator | TASK [Create WAL LVs for ceph_db_wal_devices] ********************************** 2025-03-23 21:53:27.940999 | orchestrator | Sunday 23 March 2025 21:53:27 +0000 (0:00:00.186) 0:00:52.325 ********** 2025-03-23 21:53:28.126452 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-e8e09cee-3c3a-56a6-a7d5-937e6639e1b3', 'data_vg': 'ceph-e8e09cee-3c3a-56a6-a7d5-937e6639e1b3'})  2025-03-23 21:53:28.127012 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-62e3fac0-87ec-50cc-8f44-41551697f65b', 'data_vg': 'ceph-62e3fac0-87ec-50cc-8f44-41551697f65b'})  2025-03-23 21:53:28.128139 | orchestrator | skipping: [testbed-node-4] 2025-03-23 21:53:28.129436 | orchestrator | 2025-03-23 21:53:28.129935 | orchestrator | TASK [Print 'Create WAL LVs for ceph_db_wal_devices'] ************************** 2025-03-23 21:53:28.130763 | orchestrator | Sunday 23 March 2025 21:53:28 +0000 (0:00:00.191) 0:00:52.516 ********** 2025-03-23 21:53:28.302008 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-e8e09cee-3c3a-56a6-a7d5-937e6639e1b3', 'data_vg': 'ceph-e8e09cee-3c3a-56a6-a7d5-937e6639e1b3'})  2025-03-23 21:53:28.303426 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-62e3fac0-87ec-50cc-8f44-41551697f65b', 'data_vg': 'ceph-62e3fac0-87ec-50cc-8f44-41551697f65b'})  2025-03-23 21:53:28.303521 | orchestrator | skipping: [testbed-node-4] 2025-03-23 21:53:28.303809 | orchestrator | 2025-03-23 21:53:28.305249 | orchestrator | TASK [Create DB LVs for ceph_db_wal_devices] *********************************** 2025-03-23 21:53:28.306756 | orchestrator | Sunday 23 March 2025 21:53:28 +0000 (0:00:00.175) 0:00:52.692 ********** 2025-03-23 21:53:28.488646 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-e8e09cee-3c3a-56a6-a7d5-937e6639e1b3', 'data_vg': 'ceph-e8e09cee-3c3a-56a6-a7d5-937e6639e1b3'})  2025-03-23 21:53:28.489628 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-62e3fac0-87ec-50cc-8f44-41551697f65b', 'data_vg': 'ceph-62e3fac0-87ec-50cc-8f44-41551697f65b'})  2025-03-23 21:53:28.490118 | orchestrator | skipping: [testbed-node-4] 2025-03-23 21:53:28.491585 | orchestrator | 2025-03-23 21:53:28.493170 | orchestrator | TASK [Print 'Create DB LVs for ceph_db_wal_devices'] *************************** 2025-03-23 21:53:28.494115 | orchestrator | Sunday 23 March 2025 21:53:28 +0000 (0:00:00.186) 0:00:52.878 ********** 2025-03-23 21:53:28.657973 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-e8e09cee-3c3a-56a6-a7d5-937e6639e1b3', 'data_vg': 'ceph-e8e09cee-3c3a-56a6-a7d5-937e6639e1b3'})  2025-03-23 21:53:28.659111 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-62e3fac0-87ec-50cc-8f44-41551697f65b', 'data_vg': 'ceph-62e3fac0-87ec-50cc-8f44-41551697f65b'})  2025-03-23 21:53:28.659594 | orchestrator | skipping: [testbed-node-4] 2025-03-23 21:53:28.662114 | orchestrator | 2025-03-23 21:53:29.205623 | orchestrator | TASK [Get list of Ceph LVs with associated VGs] 
******************************** 2025-03-23 21:53:29.205736 | orchestrator | Sunday 23 March 2025 21:53:28 +0000 (0:00:00.169) 0:00:53.047 ********** 2025-03-23 21:53:29.205772 | orchestrator | ok: [testbed-node-4] 2025-03-23 21:53:29.205854 | orchestrator | 2025-03-23 21:53:29.207774 | orchestrator | TASK [Get list of Ceph PVs with associated VGs] ******************************** 2025-03-23 21:53:29.208039 | orchestrator | Sunday 23 March 2025 21:53:29 +0000 (0:00:00.538) 0:00:53.586 ********** 2025-03-23 21:53:29.866770 | orchestrator | ok: [testbed-node-4] 2025-03-23 21:53:29.869140 | orchestrator | 2025-03-23 21:53:30.012667 | orchestrator | TASK [Combine JSON from _lvs_cmd_output/_pvs_cmd_output] *********************** 2025-03-23 21:53:30.012789 | orchestrator | Sunday 23 March 2025 21:53:29 +0000 (0:00:00.668) 0:00:54.254 ********** 2025-03-23 21:53:30.012824 | orchestrator | ok: [testbed-node-4] 2025-03-23 21:53:30.013713 | orchestrator | 2025-03-23 21:53:30.015329 | orchestrator | TASK [Create list of VG/LV names] ********************************************** 2025-03-23 21:53:30.016232 | orchestrator | Sunday 23 March 2025 21:53:30 +0000 (0:00:00.146) 0:00:54.401 ********** 2025-03-23 21:53:30.247372 | orchestrator | ok: [testbed-node-4] => (item={'lv_name': 'osd-block-62e3fac0-87ec-50cc-8f44-41551697f65b', 'vg_name': 'ceph-62e3fac0-87ec-50cc-8f44-41551697f65b'}) 2025-03-23 21:53:30.248695 | orchestrator | ok: [testbed-node-4] => (item={'lv_name': 'osd-block-e8e09cee-3c3a-56a6-a7d5-937e6639e1b3', 'vg_name': 'ceph-e8e09cee-3c3a-56a6-a7d5-937e6639e1b3'}) 2025-03-23 21:53:30.249194 | orchestrator | 2025-03-23 21:53:30.250205 | orchestrator | TASK [Fail if block LV defined in lvm_volumes is missing] ********************** 2025-03-23 21:53:30.253003 | orchestrator | Sunday 23 March 2025 21:53:30 +0000 (0:00:00.237) 0:00:54.638 ********** 2025-03-23 21:53:30.443491 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-e8e09cee-3c3a-56a6-a7d5-937e6639e1b3', 'data_vg': 'ceph-e8e09cee-3c3a-56a6-a7d5-937e6639e1b3'})  2025-03-23 21:53:30.444479 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-62e3fac0-87ec-50cc-8f44-41551697f65b', 'data_vg': 'ceph-62e3fac0-87ec-50cc-8f44-41551697f65b'})  2025-03-23 21:53:30.444522 | orchestrator | skipping: [testbed-node-4] 2025-03-23 21:53:30.445162 | orchestrator | 2025-03-23 21:53:30.445677 | orchestrator | TASK [Fail if DB LV defined in lvm_volumes is missing] ************************* 2025-03-23 21:53:30.447640 | orchestrator | Sunday 23 March 2025 21:53:30 +0000 (0:00:00.194) 0:00:54.833 ********** 2025-03-23 21:53:30.643811 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-e8e09cee-3c3a-56a6-a7d5-937e6639e1b3', 'data_vg': 'ceph-e8e09cee-3c3a-56a6-a7d5-937e6639e1b3'})  2025-03-23 21:53:30.644548 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-62e3fac0-87ec-50cc-8f44-41551697f65b', 'data_vg': 'ceph-62e3fac0-87ec-50cc-8f44-41551697f65b'})  2025-03-23 21:53:30.644880 | orchestrator | skipping: [testbed-node-4] 2025-03-23 21:53:30.647977 | orchestrator | 2025-03-23 21:53:30.649339 | orchestrator | TASK [Fail if WAL LV defined in lvm_volumes is missing] ************************ 2025-03-23 21:53:30.650262 | orchestrator | Sunday 23 March 2025 21:53:30 +0000 (0:00:00.199) 0:00:55.032 ********** 2025-03-23 21:53:30.828381 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-e8e09cee-3c3a-56a6-a7d5-937e6639e1b3', 'data_vg': 
'ceph-e8e09cee-3c3a-56a6-a7d5-937e6639e1b3'})  2025-03-23 21:53:30.829695 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-62e3fac0-87ec-50cc-8f44-41551697f65b', 'data_vg': 'ceph-62e3fac0-87ec-50cc-8f44-41551697f65b'})  2025-03-23 21:53:30.830764 | orchestrator | skipping: [testbed-node-4] 2025-03-23 21:53:30.833644 | orchestrator | 2025-03-23 21:53:30.834579 | orchestrator | TASK [Print LVM report data] *************************************************** 2025-03-23 21:53:30.835357 | orchestrator | Sunday 23 March 2025 21:53:30 +0000 (0:00:00.185) 0:00:55.218 ********** 2025-03-23 21:53:31.783334 | orchestrator | ok: [testbed-node-4] => { 2025-03-23 21:53:31.783703 | orchestrator |  "lvm_report": { 2025-03-23 21:53:31.789740 | orchestrator |  "lv": [ 2025-03-23 21:53:31.790167 | orchestrator |  { 2025-03-23 21:53:31.790655 | orchestrator |  "lv_name": "osd-block-62e3fac0-87ec-50cc-8f44-41551697f65b", 2025-03-23 21:53:31.790682 | orchestrator |  "vg_name": "ceph-62e3fac0-87ec-50cc-8f44-41551697f65b" 2025-03-23 21:53:31.790697 | orchestrator |  }, 2025-03-23 21:53:31.790712 | orchestrator |  { 2025-03-23 21:53:31.790726 | orchestrator |  "lv_name": "osd-block-e8e09cee-3c3a-56a6-a7d5-937e6639e1b3", 2025-03-23 21:53:31.790765 | orchestrator |  "vg_name": "ceph-e8e09cee-3c3a-56a6-a7d5-937e6639e1b3" 2025-03-23 21:53:31.790780 | orchestrator |  } 2025-03-23 21:53:31.790794 | orchestrator |  ], 2025-03-23 21:53:31.790808 | orchestrator |  "pv": [ 2025-03-23 21:53:31.790859 | orchestrator |  { 2025-03-23 21:53:31.790921 | orchestrator |  "pv_name": "/dev/sdb", 2025-03-23 21:53:31.791190 | orchestrator |  "vg_name": "ceph-e8e09cee-3c3a-56a6-a7d5-937e6639e1b3" 2025-03-23 21:53:31.792211 | orchestrator |  }, 2025-03-23 21:53:31.792276 | orchestrator |  { 2025-03-23 21:53:31.792699 | orchestrator |  "pv_name": "/dev/sdc", 2025-03-23 21:53:31.792989 | orchestrator |  "vg_name": "ceph-62e3fac0-87ec-50cc-8f44-41551697f65b" 2025-03-23 21:53:31.793344 | orchestrator |  } 2025-03-23 21:53:31.793813 | orchestrator |  ] 2025-03-23 21:53:31.794098 | orchestrator |  } 2025-03-23 21:53:31.794473 | orchestrator | } 2025-03-23 21:53:31.794794 | orchestrator | 2025-03-23 21:53:31.795447 | orchestrator | PLAY [Ceph create LVM devices] ************************************************* 2025-03-23 21:53:31.795689 | orchestrator | 2025-03-23 21:53:31.796335 | orchestrator | TASK [Get extra vars for Ceph configuration] *********************************** 2025-03-23 21:53:31.796754 | orchestrator | Sunday 23 March 2025 21:53:31 +0000 (0:00:00.953) 0:00:56.171 ********** 2025-03-23 21:53:32.079988 | orchestrator | ok: [testbed-node-5 -> testbed-manager(192.168.16.5)] 2025-03-23 21:53:32.080094 | orchestrator | 2025-03-23 21:53:32.080473 | orchestrator | TASK [Get initial list of available block devices] ***************************** 2025-03-23 21:53:32.080995 | orchestrator | Sunday 23 March 2025 21:53:32 +0000 (0:00:00.299) 0:00:56.471 ********** 2025-03-23 21:53:32.372255 | orchestrator | ok: [testbed-node-5] 2025-03-23 21:53:32.373702 | orchestrator | 2025-03-23 21:53:32.374336 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-03-23 21:53:32.375419 | orchestrator | Sunday 23 March 2025 21:53:32 +0000 (0:00:00.292) 0:00:56.763 ********** 2025-03-23 21:53:32.917160 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-5 => (item=loop0) 2025-03-23 21:53:32.919086 | orchestrator | included: 
/ansible/tasks/_add-device-links.yml for testbed-node-5 => (item=loop1) 2025-03-23 21:53:32.920861 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-5 => (item=loop2) 2025-03-23 21:53:32.921530 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-5 => (item=loop3) 2025-03-23 21:53:32.922598 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-5 => (item=loop4) 2025-03-23 21:53:32.924396 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-5 => (item=loop5) 2025-03-23 21:53:32.925276 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-5 => (item=loop6) 2025-03-23 21:53:32.925837 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-5 => (item=loop7) 2025-03-23 21:53:32.926286 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-5 => (item=sda) 2025-03-23 21:53:32.927051 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-5 => (item=sdb) 2025-03-23 21:53:32.928019 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-5 => (item=sdc) 2025-03-23 21:53:32.928341 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-5 => (item=sdd) 2025-03-23 21:53:32.928765 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-5 => (item=sr0) 2025-03-23 21:53:32.929544 | orchestrator | 2025-03-23 21:53:32.930068 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-03-23 21:53:32.930138 | orchestrator | Sunday 23 March 2025 21:53:32 +0000 (0:00:00.542) 0:00:57.305 ********** 2025-03-23 21:53:33.162977 | orchestrator | skipping: [testbed-node-5] 2025-03-23 21:53:33.163487 | orchestrator | 2025-03-23 21:53:33.164099 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-03-23 21:53:33.164374 | orchestrator | Sunday 23 March 2025 21:53:33 +0000 (0:00:00.248) 0:00:57.553 ********** 2025-03-23 21:53:33.393200 | orchestrator | skipping: [testbed-node-5] 2025-03-23 21:53:33.394819 | orchestrator | 2025-03-23 21:53:33.398900 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-03-23 21:53:33.620523 | orchestrator | Sunday 23 March 2025 21:53:33 +0000 (0:00:00.229) 0:00:57.783 ********** 2025-03-23 21:53:33.620630 | orchestrator | skipping: [testbed-node-5] 2025-03-23 21:53:33.620738 | orchestrator | 2025-03-23 21:53:33.622780 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-03-23 21:53:33.623023 | orchestrator | Sunday 23 March 2025 21:53:33 +0000 (0:00:00.228) 0:00:58.011 ********** 2025-03-23 21:53:34.256460 | orchestrator | skipping: [testbed-node-5] 2025-03-23 21:53:34.257276 | orchestrator | 2025-03-23 21:53:34.257329 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-03-23 21:53:34.257717 | orchestrator | Sunday 23 March 2025 21:53:34 +0000 (0:00:00.635) 0:00:58.647 ********** 2025-03-23 21:53:34.485508 | orchestrator | skipping: [testbed-node-5] 2025-03-23 21:53:34.486192 | orchestrator | 2025-03-23 21:53:34.488231 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-03-23 21:53:34.490549 | orchestrator | Sunday 23 March 2025 21:53:34 +0000 (0:00:00.228) 0:00:58.876 ********** 2025-03-23 21:53:34.738784 | 
orchestrator | skipping: [testbed-node-5] 2025-03-23 21:53:34.738889 | orchestrator | 2025-03-23 21:53:34.738903 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-03-23 21:53:34.739613 | orchestrator | Sunday 23 March 2025 21:53:34 +0000 (0:00:00.252) 0:00:59.128 ********** 2025-03-23 21:53:34.958242 | orchestrator | skipping: [testbed-node-5] 2025-03-23 21:53:34.958432 | orchestrator | 2025-03-23 21:53:34.958933 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-03-23 21:53:34.962172 | orchestrator | Sunday 23 March 2025 21:53:34 +0000 (0:00:00.220) 0:00:59.349 ********** 2025-03-23 21:53:35.188786 | orchestrator | skipping: [testbed-node-5] 2025-03-23 21:53:35.189584 | orchestrator | 2025-03-23 21:53:35.190458 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-03-23 21:53:35.191666 | orchestrator | Sunday 23 March 2025 21:53:35 +0000 (0:00:00.230) 0:00:59.579 ********** 2025-03-23 21:53:35.661876 | orchestrator | ok: [testbed-node-5] => (item=scsi-0QEMU_QEMU_HARDDISK_2c3245a0-bb3b-48a6-962d-1b5b9b49262d) 2025-03-23 21:53:35.662066 | orchestrator | ok: [testbed-node-5] => (item=scsi-SQEMU_QEMU_HARDDISK_2c3245a0-bb3b-48a6-962d-1b5b9b49262d) 2025-03-23 21:53:35.662093 | orchestrator | 2025-03-23 21:53:35.662955 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-03-23 21:53:35.663140 | orchestrator | Sunday 23 March 2025 21:53:35 +0000 (0:00:00.472) 0:01:00.052 ********** 2025-03-23 21:53:36.186945 | orchestrator | ok: [testbed-node-5] => (item=scsi-0QEMU_QEMU_HARDDISK_1aae855b-0878-455b-beee-c51e17b854da) 2025-03-23 21:53:36.187782 | orchestrator | ok: [testbed-node-5] => (item=scsi-SQEMU_QEMU_HARDDISK_1aae855b-0878-455b-beee-c51e17b854da) 2025-03-23 21:53:36.189964 | orchestrator | 2025-03-23 21:53:36.689156 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-03-23 21:53:36.689257 | orchestrator | Sunday 23 March 2025 21:53:36 +0000 (0:00:00.522) 0:01:00.574 ********** 2025-03-23 21:53:36.689286 | orchestrator | ok: [testbed-node-5] => (item=scsi-0QEMU_QEMU_HARDDISK_4ae64758-af7a-4ee3-835e-8ab2b9979c52) 2025-03-23 21:53:36.692190 | orchestrator | ok: [testbed-node-5] => (item=scsi-SQEMU_QEMU_HARDDISK_4ae64758-af7a-4ee3-835e-8ab2b9979c52) 2025-03-23 21:53:36.694370 | orchestrator | 2025-03-23 21:53:36.694502 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-03-23 21:53:36.695295 | orchestrator | Sunday 23 March 2025 21:53:36 +0000 (0:00:00.505) 0:01:01.079 ********** 2025-03-23 21:53:37.406287 | orchestrator | ok: [testbed-node-5] => (item=scsi-0QEMU_QEMU_HARDDISK_558eae64-1e92-4eba-9b7a-c9f2592aca3c) 2025-03-23 21:53:37.406764 | orchestrator | ok: [testbed-node-5] => (item=scsi-SQEMU_QEMU_HARDDISK_558eae64-1e92-4eba-9b7a-c9f2592aca3c) 2025-03-23 21:53:37.407877 | orchestrator | 2025-03-23 21:53:37.409285 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-03-23 21:53:37.411778 | orchestrator | Sunday 23 March 2025 21:53:37 +0000 (0:00:00.715) 0:01:01.795 ********** 2025-03-23 21:53:38.297278 | orchestrator | ok: [testbed-node-5] => (item=ata-QEMU_DVD-ROM_QM00001) 2025-03-23 21:53:38.297455 | orchestrator | 2025-03-23 21:53:38.297864 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 
2025-03-23 21:53:38.298140 | orchestrator | Sunday 23 March 2025 21:53:38 +0000 (0:00:00.885) 0:01:02.681 ********** 2025-03-23 21:53:38.862066 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-5 => (item=loop0) 2025-03-23 21:53:38.862715 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-5 => (item=loop1) 2025-03-23 21:53:38.863257 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-5 => (item=loop2) 2025-03-23 21:53:38.863743 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-5 => (item=loop3) 2025-03-23 21:53:38.864981 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-5 => (item=loop4) 2025-03-23 21:53:38.865196 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-5 => (item=loop5) 2025-03-23 21:53:38.865686 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-5 => (item=loop6) 2025-03-23 21:53:38.867842 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-5 => (item=loop7) 2025-03-23 21:53:38.868241 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-5 => (item=sda) 2025-03-23 21:53:38.868268 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-5 => (item=sdb) 2025-03-23 21:53:38.868284 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-5 => (item=sdc) 2025-03-23 21:53:38.868304 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-5 => (item=sdd) 2025-03-23 21:53:38.869374 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-5 => (item=sr0) 2025-03-23 21:53:38.869895 | orchestrator | 2025-03-23 21:53:38.870248 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-03-23 21:53:38.870615 | orchestrator | Sunday 23 March 2025 21:53:38 +0000 (0:00:00.568) 0:01:03.250 ********** 2025-03-23 21:53:39.106109 | orchestrator | skipping: [testbed-node-5] 2025-03-23 21:53:39.106314 | orchestrator | 2025-03-23 21:53:39.106406 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-03-23 21:53:39.106744 | orchestrator | Sunday 23 March 2025 21:53:39 +0000 (0:00:00.246) 0:01:03.497 ********** 2025-03-23 21:53:39.311116 | orchestrator | skipping: [testbed-node-5] 2025-03-23 21:53:39.311720 | orchestrator | 2025-03-23 21:53:39.311796 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-03-23 21:53:39.313303 | orchestrator | Sunday 23 March 2025 21:53:39 +0000 (0:00:00.204) 0:01:03.702 ********** 2025-03-23 21:53:39.522952 | orchestrator | skipping: [testbed-node-5] 2025-03-23 21:53:39.523606 | orchestrator | 2025-03-23 21:53:39.524267 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-03-23 21:53:39.525005 | orchestrator | Sunday 23 March 2025 21:53:39 +0000 (0:00:00.211) 0:01:03.913 ********** 2025-03-23 21:53:39.760178 | orchestrator | skipping: [testbed-node-5] 2025-03-23 21:53:39.760816 | orchestrator | 2025-03-23 21:53:39.761699 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-03-23 21:53:39.762161 | orchestrator | Sunday 23 March 2025 21:53:39 +0000 (0:00:00.235) 0:01:04.149 ********** 2025-03-23 21:53:39.966302 | 
orchestrator | skipping: [testbed-node-5] 2025-03-23 21:53:39.966410 | orchestrator | 2025-03-23 21:53:39.968659 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-03-23 21:53:39.969758 | orchestrator | Sunday 23 March 2025 21:53:39 +0000 (0:00:00.207) 0:01:04.357 ********** 2025-03-23 21:53:40.219687 | orchestrator | skipping: [testbed-node-5] 2025-03-23 21:53:40.219915 | orchestrator | 2025-03-23 21:53:40.220855 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-03-23 21:53:40.220891 | orchestrator | Sunday 23 March 2025 21:53:40 +0000 (0:00:00.252) 0:01:04.610 ********** 2025-03-23 21:53:40.441729 | orchestrator | skipping: [testbed-node-5] 2025-03-23 21:53:40.442394 | orchestrator | 2025-03-23 21:53:40.442436 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-03-23 21:53:40.442537 | orchestrator | Sunday 23 March 2025 21:53:40 +0000 (0:00:00.221) 0:01:04.832 ********** 2025-03-23 21:53:40.826529 | orchestrator | skipping: [testbed-node-5] 2025-03-23 21:53:40.826786 | orchestrator | 2025-03-23 21:53:40.828667 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-03-23 21:53:40.829252 | orchestrator | Sunday 23 March 2025 21:53:40 +0000 (0:00:00.381) 0:01:05.214 ********** 2025-03-23 21:53:42.117124 | orchestrator | ok: [testbed-node-5] => (item=sda1) 2025-03-23 21:53:42.117272 | orchestrator | ok: [testbed-node-5] => (item=sda14) 2025-03-23 21:53:42.119666 | orchestrator | ok: [testbed-node-5] => (item=sda15) 2025-03-23 21:53:42.119969 | orchestrator | ok: [testbed-node-5] => (item=sda16) 2025-03-23 21:53:42.120251 | orchestrator | 2025-03-23 21:53:42.120682 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-03-23 21:53:42.121110 | orchestrator | Sunday 23 March 2025 21:53:42 +0000 (0:00:01.290) 0:01:06.505 ********** 2025-03-23 21:53:42.330210 | orchestrator | skipping: [testbed-node-5] 2025-03-23 21:53:42.330481 | orchestrator | 2025-03-23 21:53:42.331636 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-03-23 21:53:42.331879 | orchestrator | Sunday 23 March 2025 21:53:42 +0000 (0:00:00.216) 0:01:06.721 ********** 2025-03-23 21:53:42.551370 | orchestrator | skipping: [testbed-node-5] 2025-03-23 21:53:42.551724 | orchestrator | 2025-03-23 21:53:42.553083 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-03-23 21:53:42.558406 | orchestrator | Sunday 23 March 2025 21:53:42 +0000 (0:00:00.220) 0:01:06.941 ********** 2025-03-23 21:53:42.779058 | orchestrator | skipping: [testbed-node-5] 2025-03-23 21:53:42.779609 | orchestrator | 2025-03-23 21:53:42.779966 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-03-23 21:53:42.780705 | orchestrator | Sunday 23 March 2025 21:53:42 +0000 (0:00:00.228) 0:01:07.169 ********** 2025-03-23 21:53:43.004476 | orchestrator | skipping: [testbed-node-5] 2025-03-23 21:53:43.007764 | orchestrator | 2025-03-23 21:53:43.008678 | orchestrator | TASK [Check whether ceph_db_wal_devices is used exclusively] ******************* 2025-03-23 21:53:43.009670 | orchestrator | Sunday 23 March 2025 21:53:42 +0000 (0:00:00.223) 0:01:07.393 ********** 2025-03-23 21:53:43.149909 | orchestrator | skipping: [testbed-node-5] 2025-03-23 21:53:43.150309 | orchestrator | 
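The run of "Add known links/partitions to the list of available block devices" tasks above augments the initial device list with the persistent /dev/disk/by-id aliases (scsi-0QEMU_QEMU_HARDDISK_<uuid>, ata-QEMU_DVD-ROM_QM00001) and the partitions found on the system disk (sda1, sda14, sda15, sda16), so that devices referenced by ceph_osd_devices can be matched under either name. A minimal sketch of that kind of alias resolution follows; the actual logic is included from /ansible/tasks/_add-device-links.yml and /ansible/tasks/_add-device-partitions.yml, and every task and variable name below is an assumption for illustration, not the OSISM implementation.

# Illustrative sketch only; the real tasks live in
# /ansible/tasks/_add-device-links.yml and may differ.
- name: Collect the persistent by-id symlinks present on the node (sketch)
  ansible.builtin.find:
    paths: /dev/disk/by-id
    file_type: link
  register: _by_id_links

- name: Resolve each link to its kernel device, e.g. /dev/sdb (sketch)
  ansible.builtin.stat:
    path: "{{ item.path }}"
  loop: "{{ _by_id_links.files }}"
  register: _by_id_stat

# Each result exposes stat.lnk_source, which can then be compared against the
# configured ceph_osd_devices keys (sdb, sdc) regardless of which alias an
# operator put into the configuration.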
2025-03-23 21:53:43.150395 | orchestrator | TASK [Create dict of block VGs -> PVs from ceph_osd_devices] ******************* 2025-03-23 21:53:43.151144 | orchestrator | Sunday 23 March 2025 21:53:43 +0000 (0:00:00.146) 0:01:07.540 ********** 2025-03-23 21:53:43.379083 | orchestrator | ok: [testbed-node-5] => (item={'key': 'sdb', 'value': {'osd_lvm_uuid': '1b64d1ac-aa1a-58dd-947f-80f4fb53d79b'}}) 2025-03-23 21:53:43.380371 | orchestrator | ok: [testbed-node-5] => (item={'key': 'sdc', 'value': {'osd_lvm_uuid': 'a4d93e20-7e7b-5457-96cd-ba2e435e9438'}}) 2025-03-23 21:53:43.381157 | orchestrator | 2025-03-23 21:53:43.381207 | orchestrator | TASK [Create block VGs] ******************************************************** 2025-03-23 21:53:43.381422 | orchestrator | Sunday 23 March 2025 21:53:43 +0000 (0:00:00.228) 0:01:07.769 ********** 2025-03-23 21:53:45.597777 | orchestrator | changed: [testbed-node-5] => (item={'data': 'osd-block-1b64d1ac-aa1a-58dd-947f-80f4fb53d79b', 'data_vg': 'ceph-1b64d1ac-aa1a-58dd-947f-80f4fb53d79b'}) 2025-03-23 21:53:45.599687 | orchestrator | changed: [testbed-node-5] => (item={'data': 'osd-block-a4d93e20-7e7b-5457-96cd-ba2e435e9438', 'data_vg': 'ceph-a4d93e20-7e7b-5457-96cd-ba2e435e9438'}) 2025-03-23 21:53:45.600806 | orchestrator | 2025-03-23 21:53:45.601113 | orchestrator | TASK [Print 'Create block VGs'] ************************************************ 2025-03-23 21:53:45.603086 | orchestrator | Sunday 23 March 2025 21:53:45 +0000 (0:00:02.216) 0:01:09.986 ********** 2025-03-23 21:53:45.784398 | orchestrator | skipping: [testbed-node-5] => (item={'data': 'osd-block-1b64d1ac-aa1a-58dd-947f-80f4fb53d79b', 'data_vg': 'ceph-1b64d1ac-aa1a-58dd-947f-80f4fb53d79b'})  2025-03-23 21:53:45.785818 | orchestrator | skipping: [testbed-node-5] => (item={'data': 'osd-block-a4d93e20-7e7b-5457-96cd-ba2e435e9438', 'data_vg': 'ceph-a4d93e20-7e7b-5457-96cd-ba2e435e9438'})  2025-03-23 21:53:45.785863 | orchestrator | skipping: [testbed-node-5] 2025-03-23 21:53:45.787468 | orchestrator | 2025-03-23 21:53:45.788168 | orchestrator | TASK [Create block LVs] ******************************************************** 2025-03-23 21:53:45.789653 | orchestrator | Sunday 23 March 2025 21:53:45 +0000 (0:00:00.186) 0:01:10.173 ********** 2025-03-23 21:53:47.254292 | orchestrator | changed: [testbed-node-5] => (item={'data': 'osd-block-1b64d1ac-aa1a-58dd-947f-80f4fb53d79b', 'data_vg': 'ceph-1b64d1ac-aa1a-58dd-947f-80f4fb53d79b'}) 2025-03-23 21:53:47.254458 | orchestrator | changed: [testbed-node-5] => (item={'data': 'osd-block-a4d93e20-7e7b-5457-96cd-ba2e435e9438', 'data_vg': 'ceph-a4d93e20-7e7b-5457-96cd-ba2e435e9438'}) 2025-03-23 21:53:47.254881 | orchestrator | 2025-03-23 21:53:47.255838 | orchestrator | TASK [Print 'Create block LVs'] ************************************************ 2025-03-23 21:53:47.256647 | orchestrator | Sunday 23 March 2025 21:53:47 +0000 (0:00:01.470) 0:01:11.643 ********** 2025-03-23 21:53:47.441890 | orchestrator | skipping: [testbed-node-5] => (item={'data': 'osd-block-1b64d1ac-aa1a-58dd-947f-80f4fb53d79b', 'data_vg': 'ceph-1b64d1ac-aa1a-58dd-947f-80f4fb53d79b'})  2025-03-23 21:53:47.442607 | orchestrator | skipping: [testbed-node-5] => (item={'data': 'osd-block-a4d93e20-7e7b-5457-96cd-ba2e435e9438', 'data_vg': 'ceph-a4d93e20-7e7b-5457-96cd-ba2e435e9438'})  2025-03-23 21:53:47.442648 | orchestrator | skipping: [testbed-node-5] 2025-03-23 21:53:47.443423 | orchestrator | 2025-03-23 21:53:47.444249 | orchestrator | TASK [Create DB VGs] 
*********************************************************** 2025-03-23 21:53:47.444700 | orchestrator | Sunday 23 March 2025 21:53:47 +0000 (0:00:00.186) 0:01:11.830 ********** 2025-03-23 21:53:47.606923 | orchestrator | skipping: [testbed-node-5] 2025-03-23 21:53:47.608234 | orchestrator | 2025-03-23 21:53:47.609213 | orchestrator | TASK [Print 'Create DB VGs'] *************************************************** 2025-03-23 21:53:47.610120 | orchestrator | Sunday 23 March 2025 21:53:47 +0000 (0:00:00.166) 0:01:11.996 ********** 2025-03-23 21:53:47.805204 | orchestrator | skipping: [testbed-node-5] => (item={'data': 'osd-block-1b64d1ac-aa1a-58dd-947f-80f4fb53d79b', 'data_vg': 'ceph-1b64d1ac-aa1a-58dd-947f-80f4fb53d79b'})  2025-03-23 21:53:47.808894 | orchestrator | skipping: [testbed-node-5] => (item={'data': 'osd-block-a4d93e20-7e7b-5457-96cd-ba2e435e9438', 'data_vg': 'ceph-a4d93e20-7e7b-5457-96cd-ba2e435e9438'})  2025-03-23 21:53:47.808966 | orchestrator | skipping: [testbed-node-5] 2025-03-23 21:53:47.810734 | orchestrator | 2025-03-23 21:53:47.811206 | orchestrator | TASK [Create WAL VGs] ********************************************************** 2025-03-23 21:53:47.811239 | orchestrator | Sunday 23 March 2025 21:53:47 +0000 (0:00:00.193) 0:01:12.190 ********** 2025-03-23 21:53:47.961155 | orchestrator | skipping: [testbed-node-5] 2025-03-23 21:53:47.962709 | orchestrator | 2025-03-23 21:53:47.964511 | orchestrator | TASK [Print 'Create WAL VGs'] ************************************************** 2025-03-23 21:53:47.967195 | orchestrator | Sunday 23 March 2025 21:53:47 +0000 (0:00:00.160) 0:01:12.350 ********** 2025-03-23 21:53:48.156172 | orchestrator | skipping: [testbed-node-5] => (item={'data': 'osd-block-1b64d1ac-aa1a-58dd-947f-80f4fb53d79b', 'data_vg': 'ceph-1b64d1ac-aa1a-58dd-947f-80f4fb53d79b'})  2025-03-23 21:53:48.157977 | orchestrator | skipping: [testbed-node-5] => (item={'data': 'osd-block-a4d93e20-7e7b-5457-96cd-ba2e435e9438', 'data_vg': 'ceph-a4d93e20-7e7b-5457-96cd-ba2e435e9438'})  2025-03-23 21:53:48.158866 | orchestrator | skipping: [testbed-node-5] 2025-03-23 21:53:48.159719 | orchestrator | 2025-03-23 21:53:48.160701 | orchestrator | TASK [Create DB+WAL VGs] ******************************************************* 2025-03-23 21:53:48.161469 | orchestrator | Sunday 23 March 2025 21:53:48 +0000 (0:00:00.194) 0:01:12.545 ********** 2025-03-23 21:53:48.348729 | orchestrator | skipping: [testbed-node-5] 2025-03-23 21:53:48.349421 | orchestrator | 2025-03-23 21:53:48.350548 | orchestrator | TASK [Print 'Create DB+WAL VGs'] *********************************************** 2025-03-23 21:53:48.351701 | orchestrator | Sunday 23 March 2025 21:53:48 +0000 (0:00:00.193) 0:01:12.739 ********** 2025-03-23 21:53:48.528740 | orchestrator | skipping: [testbed-node-5] => (item={'data': 'osd-block-1b64d1ac-aa1a-58dd-947f-80f4fb53d79b', 'data_vg': 'ceph-1b64d1ac-aa1a-58dd-947f-80f4fb53d79b'})  2025-03-23 21:53:48.529874 | orchestrator | skipping: [testbed-node-5] => (item={'data': 'osd-block-a4d93e20-7e7b-5457-96cd-ba2e435e9438', 'data_vg': 'ceph-a4d93e20-7e7b-5457-96cd-ba2e435e9438'})  2025-03-23 21:53:48.531299 | orchestrator | skipping: [testbed-node-5] 2025-03-23 21:53:48.532405 | orchestrator | 2025-03-23 21:53:48.533150 | orchestrator | TASK [Prepare variables for OSD count check] *********************************** 2025-03-23 21:53:48.534111 | orchestrator | Sunday 23 March 2025 21:53:48 +0000 (0:00:00.179) 0:01:12.919 ********** 2025-03-23 21:53:48.675972 | orchestrator | ok: 
[testbed-node-5] 2025-03-23 21:53:48.676116 | orchestrator | 2025-03-23 21:53:48.676776 | orchestrator | TASK [Count OSDs put on ceph_db_devices defined in lvm_volumes] **************** 2025-03-23 21:53:48.677224 | orchestrator | Sunday 23 March 2025 21:53:48 +0000 (0:00:00.147) 0:01:13.066 ********** 2025-03-23 21:53:48.861815 | orchestrator | skipping: [testbed-node-5] => (item={'data': 'osd-block-1b64d1ac-aa1a-58dd-947f-80f4fb53d79b', 'data_vg': 'ceph-1b64d1ac-aa1a-58dd-947f-80f4fb53d79b'})  2025-03-23 21:53:48.862717 | orchestrator | skipping: [testbed-node-5] => (item={'data': 'osd-block-a4d93e20-7e7b-5457-96cd-ba2e435e9438', 'data_vg': 'ceph-a4d93e20-7e7b-5457-96cd-ba2e435e9438'})  2025-03-23 21:53:48.865356 | orchestrator | skipping: [testbed-node-5] 2025-03-23 21:53:48.865445 | orchestrator | 2025-03-23 21:53:48.865466 | orchestrator | TASK [Count OSDs put on ceph_wal_devices defined in lvm_volumes] *************** 2025-03-23 21:53:48.865510 | orchestrator | Sunday 23 March 2025 21:53:48 +0000 (0:00:00.184) 0:01:13.251 ********** 2025-03-23 21:53:49.049557 | orchestrator | skipping: [testbed-node-5] => (item={'data': 'osd-block-1b64d1ac-aa1a-58dd-947f-80f4fb53d79b', 'data_vg': 'ceph-1b64d1ac-aa1a-58dd-947f-80f4fb53d79b'})  2025-03-23 21:53:49.050860 | orchestrator | skipping: [testbed-node-5] => (item={'data': 'osd-block-a4d93e20-7e7b-5457-96cd-ba2e435e9438', 'data_vg': 'ceph-a4d93e20-7e7b-5457-96cd-ba2e435e9438'})  2025-03-23 21:53:49.050890 | orchestrator | skipping: [testbed-node-5] 2025-03-23 21:53:49.050912 | orchestrator | 2025-03-23 21:53:49.051007 | orchestrator | TASK [Count OSDs put on ceph_db_wal_devices defined in lvm_volumes] ************ 2025-03-23 21:53:49.051093 | orchestrator | Sunday 23 March 2025 21:53:49 +0000 (0:00:00.188) 0:01:13.439 ********** 2025-03-23 21:53:49.443463 | orchestrator | skipping: [testbed-node-5] => (item={'data': 'osd-block-1b64d1ac-aa1a-58dd-947f-80f4fb53d79b', 'data_vg': 'ceph-1b64d1ac-aa1a-58dd-947f-80f4fb53d79b'})  2025-03-23 21:53:49.443653 | orchestrator | skipping: [testbed-node-5] => (item={'data': 'osd-block-a4d93e20-7e7b-5457-96cd-ba2e435e9438', 'data_vg': 'ceph-a4d93e20-7e7b-5457-96cd-ba2e435e9438'})  2025-03-23 21:53:49.444359 | orchestrator | skipping: [testbed-node-5] 2025-03-23 21:53:49.445181 | orchestrator | 2025-03-23 21:53:49.445626 | orchestrator | TASK [Fail if number of OSDs exceeds num_osds for a DB VG] ********************* 2025-03-23 21:53:49.446091 | orchestrator | Sunday 23 March 2025 21:53:49 +0000 (0:00:00.391) 0:01:13.830 ********** 2025-03-23 21:53:49.597086 | orchestrator | skipping: [testbed-node-5] 2025-03-23 21:53:49.597350 | orchestrator | 2025-03-23 21:53:49.598156 | orchestrator | TASK [Fail if number of OSDs exceeds num_osds for a WAL VG] ******************** 2025-03-23 21:53:49.598979 | orchestrator | Sunday 23 March 2025 21:53:49 +0000 (0:00:00.157) 0:01:13.987 ********** 2025-03-23 21:53:49.767518 | orchestrator | skipping: [testbed-node-5] 2025-03-23 21:53:49.768525 | orchestrator | 2025-03-23 21:53:49.769297 | orchestrator | TASK [Fail if number of OSDs exceeds num_osds for a DB+WAL VG] ***************** 2025-03-23 21:53:49.769717 | orchestrator | Sunday 23 March 2025 21:53:49 +0000 (0:00:00.168) 0:01:14.156 ********** 2025-03-23 21:53:49.913976 | orchestrator | skipping: [testbed-node-5] 2025-03-23 21:53:49.914316 | orchestrator | 2025-03-23 21:53:49.914351 | orchestrator | TASK [Print number of OSDs wanted per DB VG] *********************************** 2025-03-23 21:53:49.914716 | orchestrator | 
Sunday 23 March 2025 21:53:49 +0000 (0:00:00.147) 0:01:14.304 ********** 2025-03-23 21:53:50.075917 | orchestrator | ok: [testbed-node-5] => { 2025-03-23 21:53:50.077166 | orchestrator |  "_num_osds_wanted_per_db_vg": {} 2025-03-23 21:53:50.077256 | orchestrator | } 2025-03-23 21:53:50.077885 | orchestrator | 2025-03-23 21:53:50.080656 | orchestrator | TASK [Print number of OSDs wanted per WAL VG] ********************************** 2025-03-23 21:53:50.251013 | orchestrator | Sunday 23 March 2025 21:53:50 +0000 (0:00:00.162) 0:01:14.466 ********** 2025-03-23 21:53:50.251094 | orchestrator | ok: [testbed-node-5] => { 2025-03-23 21:53:50.251355 | orchestrator |  "_num_osds_wanted_per_wal_vg": {} 2025-03-23 21:53:50.253860 | orchestrator | } 2025-03-23 21:53:50.254988 | orchestrator | 2025-03-23 21:53:50.255326 | orchestrator | TASK [Print number of OSDs wanted per DB+WAL VG] ******************************* 2025-03-23 21:53:50.255923 | orchestrator | Sunday 23 March 2025 21:53:50 +0000 (0:00:00.172) 0:01:14.639 ********** 2025-03-23 21:53:50.399642 | orchestrator | ok: [testbed-node-5] => { 2025-03-23 21:53:50.400978 | orchestrator |  "_num_osds_wanted_per_db_wal_vg": {} 2025-03-23 21:53:50.403334 | orchestrator | } 2025-03-23 21:53:50.404763 | orchestrator | 2025-03-23 21:53:50.407439 | orchestrator | TASK [Gather DB VGs with total and available size in bytes] ******************** 2025-03-23 21:53:50.409287 | orchestrator | Sunday 23 March 2025 21:53:50 +0000 (0:00:00.150) 0:01:14.789 ********** 2025-03-23 21:53:50.959441 | orchestrator | ok: [testbed-node-5] 2025-03-23 21:53:50.959695 | orchestrator | 2025-03-23 21:53:50.960792 | orchestrator | TASK [Gather WAL VGs with total and available size in bytes] ******************* 2025-03-23 21:53:50.962371 | orchestrator | Sunday 23 March 2025 21:53:50 +0000 (0:00:00.557) 0:01:15.346 ********** 2025-03-23 21:53:51.498709 | orchestrator | ok: [testbed-node-5] 2025-03-23 21:53:51.499673 | orchestrator | 2025-03-23 21:53:51.500376 | orchestrator | TASK [Gather DB+WAL VGs with total and available size in bytes] **************** 2025-03-23 21:53:51.503247 | orchestrator | Sunday 23 March 2025 21:53:51 +0000 (0:00:00.541) 0:01:15.888 ********** 2025-03-23 21:53:52.053361 | orchestrator | ok: [testbed-node-5] 2025-03-23 21:53:52.053958 | orchestrator | 2025-03-23 21:53:52.054002 | orchestrator | TASK [Combine JSON from _db/wal/db_wal_vgs_cmd_output] ************************* 2025-03-23 21:53:52.054979 | orchestrator | Sunday 23 March 2025 21:53:52 +0000 (0:00:00.553) 0:01:16.441 ********** 2025-03-23 21:53:52.192424 | orchestrator | ok: [testbed-node-5] 2025-03-23 21:53:52.193698 | orchestrator | 2025-03-23 21:53:52.194369 | orchestrator | TASK [Calculate VG sizes (without buffer)] ************************************* 2025-03-23 21:53:52.198093 | orchestrator | Sunday 23 March 2025 21:53:52 +0000 (0:00:00.140) 0:01:16.582 ********** 2025-03-23 21:53:52.534819 | orchestrator | skipping: [testbed-node-5] 2025-03-23 21:53:52.535007 | orchestrator | 2025-03-23 21:53:52.535252 | orchestrator | TASK [Calculate VG sizes (with buffer)] **************************************** 2025-03-23 21:53:52.536168 | orchestrator | Sunday 23 March 2025 21:53:52 +0000 (0:00:00.343) 0:01:16.926 ********** 2025-03-23 21:53:52.655744 | orchestrator | skipping: [testbed-node-5] 2025-03-23 21:53:52.655888 | orchestrator | 2025-03-23 21:53:52.656601 | orchestrator | TASK [Print LVM VGs report data] *********************************************** 2025-03-23 21:53:52.657059 | 
orchestrator | Sunday 23 March 2025 21:53:52 +0000 (0:00:00.121) 0:01:17.047 ********** 2025-03-23 21:53:52.802730 | orchestrator | ok: [testbed-node-5] => { 2025-03-23 21:53:52.806658 | orchestrator |  "vgs_report": { 2025-03-23 21:53:52.808279 | orchestrator |  "vg": [] 2025-03-23 21:53:52.808461 | orchestrator |  } 2025-03-23 21:53:52.809283 | orchestrator | } 2025-03-23 21:53:52.810118 | orchestrator | 2025-03-23 21:53:52.810699 | orchestrator | TASK [Print LVM VG sizes] ****************************************************** 2025-03-23 21:53:52.812629 | orchestrator | Sunday 23 March 2025 21:53:52 +0000 (0:00:00.145) 0:01:17.192 ********** 2025-03-23 21:53:52.946322 | orchestrator | skipping: [testbed-node-5] 2025-03-23 21:53:52.946994 | orchestrator | 2025-03-23 21:53:52.947338 | orchestrator | TASK [Calculate size needed for LVs on ceph_db_devices] ************************ 2025-03-23 21:53:52.948314 | orchestrator | Sunday 23 March 2025 21:53:52 +0000 (0:00:00.145) 0:01:17.337 ********** 2025-03-23 21:53:53.087108 | orchestrator | skipping: [testbed-node-5] 2025-03-23 21:53:53.087976 | orchestrator | 2025-03-23 21:53:53.089427 | orchestrator | TASK [Print size needed for LVs on ceph_db_devices] **************************** 2025-03-23 21:53:53.090432 | orchestrator | Sunday 23 March 2025 21:53:53 +0000 (0:00:00.139) 0:01:17.477 ********** 2025-03-23 21:53:53.246172 | orchestrator | skipping: [testbed-node-5] 2025-03-23 21:53:53.246860 | orchestrator | 2025-03-23 21:53:53.248692 | orchestrator | TASK [Fail if size of DB LVs on ceph_db_devices > available] ******************* 2025-03-23 21:53:53.251203 | orchestrator | Sunday 23 March 2025 21:53:53 +0000 (0:00:00.151) 0:01:17.628 ********** 2025-03-23 21:53:53.402744 | orchestrator | skipping: [testbed-node-5] 2025-03-23 21:53:53.402929 | orchestrator | 2025-03-23 21:53:53.404011 | orchestrator | TASK [Calculate size needed for LVs on ceph_wal_devices] *********************** 2025-03-23 21:53:53.404478 | orchestrator | Sunday 23 March 2025 21:53:53 +0000 (0:00:00.164) 0:01:17.793 ********** 2025-03-23 21:53:53.545260 | orchestrator | skipping: [testbed-node-5] 2025-03-23 21:53:53.546058 | orchestrator | 2025-03-23 21:53:53.547301 | orchestrator | TASK [Print size needed for LVs on ceph_wal_devices] *************************** 2025-03-23 21:53:53.548209 | orchestrator | Sunday 23 March 2025 21:53:53 +0000 (0:00:00.141) 0:01:17.935 ********** 2025-03-23 21:53:53.687436 | orchestrator | skipping: [testbed-node-5] 2025-03-23 21:53:53.687683 | orchestrator | 2025-03-23 21:53:53.687721 | orchestrator | TASK [Fail if size of WAL LVs on ceph_wal_devices > available] ***************** 2025-03-23 21:53:53.688338 | orchestrator | Sunday 23 March 2025 21:53:53 +0000 (0:00:00.141) 0:01:18.076 ********** 2025-03-23 21:53:53.821134 | orchestrator | skipping: [testbed-node-5] 2025-03-23 21:53:53.823722 | orchestrator | 2025-03-23 21:53:54.017926 | orchestrator | TASK [Calculate size needed for WAL LVs on ceph_db_wal_devices] **************** 2025-03-23 21:53:54.018088 | orchestrator | Sunday 23 March 2025 21:53:53 +0000 (0:00:00.135) 0:01:18.212 ********** 2025-03-23 21:53:54.018120 | orchestrator | skipping: [testbed-node-5] 2025-03-23 21:53:54.019219 | orchestrator | 2025-03-23 21:53:54.020268 | orchestrator | TASK [Print size needed for WAL LVs on ceph_db_wal_devices] ******************** 2025-03-23 21:53:54.028801 | orchestrator | Sunday 23 March 2025 21:53:54 +0000 (0:00:00.192) 0:01:18.404 ********** 2025-03-23 21:53:54.167547 | orchestrator | 
skipping: [testbed-node-5] 2025-03-23 21:53:54.167890 | orchestrator | 2025-03-23 21:53:54.168676 | orchestrator | TASK [Calculate size needed for DB LVs on ceph_db_wal_devices] ***************** 2025-03-23 21:53:54.169103 | orchestrator | Sunday 23 March 2025 21:53:54 +0000 (0:00:00.153) 0:01:18.557 ********** 2025-03-23 21:53:54.580991 | orchestrator | skipping: [testbed-node-5] 2025-03-23 21:53:54.581342 | orchestrator | 2025-03-23 21:53:54.582436 | orchestrator | TASK [Print size needed for DB LVs on ceph_db_wal_devices] ********************* 2025-03-23 21:53:54.583466 | orchestrator | Sunday 23 March 2025 21:53:54 +0000 (0:00:00.414) 0:01:18.972 ********** 2025-03-23 21:53:54.735087 | orchestrator | skipping: [testbed-node-5] 2025-03-23 21:53:54.736586 | orchestrator | 2025-03-23 21:53:54.737344 | orchestrator | TASK [Fail if size of DB+WAL LVs on ceph_db_wal_devices > available] *********** 2025-03-23 21:53:54.738172 | orchestrator | Sunday 23 March 2025 21:53:54 +0000 (0:00:00.153) 0:01:19.125 ********** 2025-03-23 21:53:54.901050 | orchestrator | skipping: [testbed-node-5] 2025-03-23 21:53:54.901539 | orchestrator | 2025-03-23 21:53:54.904914 | orchestrator | TASK [Fail if DB LV size < 30 GiB for ceph_db_devices] ************************* 2025-03-23 21:53:55.067273 | orchestrator | Sunday 23 March 2025 21:53:54 +0000 (0:00:00.164) 0:01:19.290 ********** 2025-03-23 21:53:55.067346 | orchestrator | skipping: [testbed-node-5] 2025-03-23 21:53:55.067520 | orchestrator | 2025-03-23 21:53:55.067545 | orchestrator | TASK [Fail if DB LV size < 30 GiB for ceph_db_wal_devices] ********************* 2025-03-23 21:53:55.067605 | orchestrator | Sunday 23 March 2025 21:53:55 +0000 (0:00:00.167) 0:01:19.457 ********** 2025-03-23 21:53:55.224330 | orchestrator | skipping: [testbed-node-5] 2025-03-23 21:53:55.225108 | orchestrator | 2025-03-23 21:53:55.226586 | orchestrator | TASK [Create DB LVs for ceph_db_devices] *************************************** 2025-03-23 21:53:55.227425 | orchestrator | Sunday 23 March 2025 21:53:55 +0000 (0:00:00.155) 0:01:19.613 ********** 2025-03-23 21:53:55.504002 | orchestrator | skipping: [testbed-node-5] => (item={'data': 'osd-block-1b64d1ac-aa1a-58dd-947f-80f4fb53d79b', 'data_vg': 'ceph-1b64d1ac-aa1a-58dd-947f-80f4fb53d79b'})  2025-03-23 21:53:55.504457 | orchestrator | skipping: [testbed-node-5] => (item={'data': 'osd-block-a4d93e20-7e7b-5457-96cd-ba2e435e9438', 'data_vg': 'ceph-a4d93e20-7e7b-5457-96cd-ba2e435e9438'})  2025-03-23 21:53:55.504489 | orchestrator | skipping: [testbed-node-5] 2025-03-23 21:53:55.505119 | orchestrator | 2025-03-23 21:53:55.505386 | orchestrator | TASK [Print 'Create DB LVs for ceph_db_devices'] ******************************* 2025-03-23 21:53:55.505414 | orchestrator | Sunday 23 March 2025 21:53:55 +0000 (0:00:00.281) 0:01:19.895 ********** 2025-03-23 21:53:55.690734 | orchestrator | skipping: [testbed-node-5] => (item={'data': 'osd-block-1b64d1ac-aa1a-58dd-947f-80f4fb53d79b', 'data_vg': 'ceph-1b64d1ac-aa1a-58dd-947f-80f4fb53d79b'})  2025-03-23 21:53:55.691276 | orchestrator | skipping: [testbed-node-5] => (item={'data': 'osd-block-a4d93e20-7e7b-5457-96cd-ba2e435e9438', 'data_vg': 'ceph-a4d93e20-7e7b-5457-96cd-ba2e435e9438'})  2025-03-23 21:53:55.692513 | orchestrator | skipping: [testbed-node-5] 2025-03-23 21:53:55.693291 | orchestrator | 2025-03-23 21:53:55.695146 | orchestrator | TASK [Create WAL LVs for ceph_wal_devices] ************************************* 2025-03-23 21:53:55.696763 | orchestrator | Sunday 23 March 2025 
21:53:55 +0000 (0:00:00.186) 0:01:20.081 ********** 2025-03-23 21:53:55.871814 | orchestrator | skipping: [testbed-node-5] => (item={'data': 'osd-block-1b64d1ac-aa1a-58dd-947f-80f4fb53d79b', 'data_vg': 'ceph-1b64d1ac-aa1a-58dd-947f-80f4fb53d79b'})  2025-03-23 21:53:55.874088 | orchestrator | skipping: [testbed-node-5] => (item={'data': 'osd-block-a4d93e20-7e7b-5457-96cd-ba2e435e9438', 'data_vg': 'ceph-a4d93e20-7e7b-5457-96cd-ba2e435e9438'})  2025-03-23 21:53:55.875167 | orchestrator | skipping: [testbed-node-5] 2025-03-23 21:53:55.875743 | orchestrator | 2025-03-23 21:53:55.877076 | orchestrator | TASK [Print 'Create WAL LVs for ceph_wal_devices'] ***************************** 2025-03-23 21:53:55.877715 | orchestrator | Sunday 23 March 2025 21:53:55 +0000 (0:00:00.180) 0:01:20.262 ********** 2025-03-23 21:53:56.086113 | orchestrator | skipping: [testbed-node-5] => (item={'data': 'osd-block-1b64d1ac-aa1a-58dd-947f-80f4fb53d79b', 'data_vg': 'ceph-1b64d1ac-aa1a-58dd-947f-80f4fb53d79b'})  2025-03-23 21:53:56.086973 | orchestrator | skipping: [testbed-node-5] => (item={'data': 'osd-block-a4d93e20-7e7b-5457-96cd-ba2e435e9438', 'data_vg': 'ceph-a4d93e20-7e7b-5457-96cd-ba2e435e9438'})  2025-03-23 21:53:56.087728 | orchestrator | skipping: [testbed-node-5] 2025-03-23 21:53:56.088628 | orchestrator | 2025-03-23 21:53:56.089171 | orchestrator | TASK [Create WAL LVs for ceph_db_wal_devices] ********************************** 2025-03-23 21:53:56.090130 | orchestrator | Sunday 23 March 2025 21:53:56 +0000 (0:00:00.213) 0:01:20.475 ********** 2025-03-23 21:53:56.268212 | orchestrator | skipping: [testbed-node-5] => (item={'data': 'osd-block-1b64d1ac-aa1a-58dd-947f-80f4fb53d79b', 'data_vg': 'ceph-1b64d1ac-aa1a-58dd-947f-80f4fb53d79b'})  2025-03-23 21:53:56.268772 | orchestrator | skipping: [testbed-node-5] => (item={'data': 'osd-block-a4d93e20-7e7b-5457-96cd-ba2e435e9438', 'data_vg': 'ceph-a4d93e20-7e7b-5457-96cd-ba2e435e9438'})  2025-03-23 21:53:56.269411 | orchestrator | skipping: [testbed-node-5] 2025-03-23 21:53:56.270007 | orchestrator | 2025-03-23 21:53:56.270586 | orchestrator | TASK [Print 'Create WAL LVs for ceph_db_wal_devices'] ************************** 2025-03-23 21:53:56.271622 | orchestrator | Sunday 23 March 2025 21:53:56 +0000 (0:00:00.184) 0:01:20.659 ********** 2025-03-23 21:53:56.439923 | orchestrator | skipping: [testbed-node-5] => (item={'data': 'osd-block-1b64d1ac-aa1a-58dd-947f-80f4fb53d79b', 'data_vg': 'ceph-1b64d1ac-aa1a-58dd-947f-80f4fb53d79b'})  2025-03-23 21:53:56.441504 | orchestrator | skipping: [testbed-node-5] => (item={'data': 'osd-block-a4d93e20-7e7b-5457-96cd-ba2e435e9438', 'data_vg': 'ceph-a4d93e20-7e7b-5457-96cd-ba2e435e9438'})  2025-03-23 21:53:56.442401 | orchestrator | skipping: [testbed-node-5] 2025-03-23 21:53:56.443367 | orchestrator | 2025-03-23 21:53:56.444173 | orchestrator | TASK [Create DB LVs for ceph_db_wal_devices] *********************************** 2025-03-23 21:53:56.446349 | orchestrator | Sunday 23 March 2025 21:53:56 +0000 (0:00:00.170) 0:01:20.830 ********** 2025-03-23 21:53:56.867222 | orchestrator | skipping: [testbed-node-5] => (item={'data': 'osd-block-1b64d1ac-aa1a-58dd-947f-80f4fb53d79b', 'data_vg': 'ceph-1b64d1ac-aa1a-58dd-947f-80f4fb53d79b'})  2025-03-23 21:53:56.868660 | orchestrator | skipping: [testbed-node-5] => (item={'data': 'osd-block-a4d93e20-7e7b-5457-96cd-ba2e435e9438', 'data_vg': 'ceph-a4d93e20-7e7b-5457-96cd-ba2e435e9438'})  2025-03-23 21:53:56.872698 | orchestrator | skipping: [testbed-node-5] 2025-03-23 21:53:56.875209 | 
orchestrator | 2025-03-23 21:53:56.875240 | orchestrator | TASK [Print 'Create DB LVs for ceph_db_wal_devices'] *************************** 2025-03-23 21:53:56.876267 | orchestrator | Sunday 23 March 2025 21:53:56 +0000 (0:00:00.427) 0:01:21.257 ********** 2025-03-23 21:53:57.065737 | orchestrator | skipping: [testbed-node-5] => (item={'data': 'osd-block-1b64d1ac-aa1a-58dd-947f-80f4fb53d79b', 'data_vg': 'ceph-1b64d1ac-aa1a-58dd-947f-80f4fb53d79b'})  2025-03-23 21:53:57.066867 | orchestrator | skipping: [testbed-node-5] => (item={'data': 'osd-block-a4d93e20-7e7b-5457-96cd-ba2e435e9438', 'data_vg': 'ceph-a4d93e20-7e7b-5457-96cd-ba2e435e9438'})  2025-03-23 21:53:57.067837 | orchestrator | skipping: [testbed-node-5] 2025-03-23 21:53:57.070721 | orchestrator | 2025-03-23 21:53:57.632777 | orchestrator | TASK [Get list of Ceph LVs with associated VGs] ******************************** 2025-03-23 21:53:57.632882 | orchestrator | Sunday 23 March 2025 21:53:57 +0000 (0:00:00.196) 0:01:21.454 ********** 2025-03-23 21:53:57.632912 | orchestrator | ok: [testbed-node-5] 2025-03-23 21:53:57.632977 | orchestrator | 2025-03-23 21:53:57.633366 | orchestrator | TASK [Get list of Ceph PVs with associated VGs] ******************************** 2025-03-23 21:53:57.633424 | orchestrator | Sunday 23 March 2025 21:53:57 +0000 (0:00:00.569) 0:01:22.024 ********** 2025-03-23 21:53:58.165103 | orchestrator | ok: [testbed-node-5] 2025-03-23 21:53:58.165689 | orchestrator | 2025-03-23 21:53:58.167052 | orchestrator | TASK [Combine JSON from _lvs_cmd_output/_pvs_cmd_output] *********************** 2025-03-23 21:53:58.170495 | orchestrator | Sunday 23 March 2025 21:53:58 +0000 (0:00:00.529) 0:01:22.553 ********** 2025-03-23 21:53:58.338408 | orchestrator | ok: [testbed-node-5] 2025-03-23 21:53:58.338651 | orchestrator | 2025-03-23 21:53:58.342894 | orchestrator | TASK [Create list of VG/LV names] ********************************************** 2025-03-23 21:53:58.342954 | orchestrator | Sunday 23 March 2025 21:53:58 +0000 (0:00:00.175) 0:01:22.728 ********** 2025-03-23 21:53:58.533692 | orchestrator | ok: [testbed-node-5] => (item={'lv_name': 'osd-block-1b64d1ac-aa1a-58dd-947f-80f4fb53d79b', 'vg_name': 'ceph-1b64d1ac-aa1a-58dd-947f-80f4fb53d79b'}) 2025-03-23 21:53:58.534168 | orchestrator | ok: [testbed-node-5] => (item={'lv_name': 'osd-block-a4d93e20-7e7b-5457-96cd-ba2e435e9438', 'vg_name': 'ceph-a4d93e20-7e7b-5457-96cd-ba2e435e9438'}) 2025-03-23 21:53:58.535100 | orchestrator | 2025-03-23 21:53:58.535979 | orchestrator | TASK [Fail if block LV defined in lvm_volumes is missing] ********************** 2025-03-23 21:53:58.536382 | orchestrator | Sunday 23 March 2025 21:53:58 +0000 (0:00:00.195) 0:01:22.924 ********** 2025-03-23 21:53:58.726270 | orchestrator | skipping: [testbed-node-5] => (item={'data': 'osd-block-1b64d1ac-aa1a-58dd-947f-80f4fb53d79b', 'data_vg': 'ceph-1b64d1ac-aa1a-58dd-947f-80f4fb53d79b'})  2025-03-23 21:53:58.727204 | orchestrator | skipping: [testbed-node-5] => (item={'data': 'osd-block-a4d93e20-7e7b-5457-96cd-ba2e435e9438', 'data_vg': 'ceph-a4d93e20-7e7b-5457-96cd-ba2e435e9438'})  2025-03-23 21:53:58.728772 | orchestrator | skipping: [testbed-node-5] 2025-03-23 21:53:58.731245 | orchestrator | 2025-03-23 21:53:58.732238 | orchestrator | TASK [Fail if DB LV defined in lvm_volumes is missing] ************************* 2025-03-23 21:53:58.732338 | orchestrator | Sunday 23 March 2025 21:53:58 +0000 (0:00:00.192) 0:01:23.117 ********** 2025-03-23 21:53:58.924448 | orchestrator | skipping: [testbed-node-5] 
=> (item={'data': 'osd-block-1b64d1ac-aa1a-58dd-947f-80f4fb53d79b', 'data_vg': 'ceph-1b64d1ac-aa1a-58dd-947f-80f4fb53d79b'})  2025-03-23 21:53:58.924654 | orchestrator | skipping: [testbed-node-5] => (item={'data': 'osd-block-a4d93e20-7e7b-5457-96cd-ba2e435e9438', 'data_vg': 'ceph-a4d93e20-7e7b-5457-96cd-ba2e435e9438'})  2025-03-23 21:53:58.925686 | orchestrator | skipping: [testbed-node-5] 2025-03-23 21:53:58.926178 | orchestrator | 2025-03-23 21:53:58.926421 | orchestrator | TASK [Fail if WAL LV defined in lvm_volumes is missing] ************************ 2025-03-23 21:53:58.927910 | orchestrator | Sunday 23 March 2025 21:53:58 +0000 (0:00:00.197) 0:01:23.314 ********** 2025-03-23 21:53:59.115510 | orchestrator | skipping: [testbed-node-5] => (item={'data': 'osd-block-1b64d1ac-aa1a-58dd-947f-80f4fb53d79b', 'data_vg': 'ceph-1b64d1ac-aa1a-58dd-947f-80f4fb53d79b'})  2025-03-23 21:53:59.119978 | orchestrator | skipping: [testbed-node-5] => (item={'data': 'osd-block-a4d93e20-7e7b-5457-96cd-ba2e435e9438', 'data_vg': 'ceph-a4d93e20-7e7b-5457-96cd-ba2e435e9438'})  2025-03-23 21:53:59.122873 | orchestrator | skipping: [testbed-node-5] 2025-03-23 21:53:59.125222 | orchestrator | 2025-03-23 21:53:59.125398 | orchestrator | TASK [Print LVM report data] *************************************************** 2025-03-23 21:53:59.126175 | orchestrator | Sunday 23 March 2025 21:53:59 +0000 (0:00:00.189) 0:01:23.504 ********** 2025-03-23 21:53:59.757209 | orchestrator | ok: [testbed-node-5] => { 2025-03-23 21:53:59.759518 | orchestrator |  "lvm_report": { 2025-03-23 21:53:59.760435 | orchestrator |  "lv": [ 2025-03-23 21:53:59.761388 | orchestrator |  { 2025-03-23 21:53:59.762494 | orchestrator |  "lv_name": "osd-block-1b64d1ac-aa1a-58dd-947f-80f4fb53d79b", 2025-03-23 21:53:59.763496 | orchestrator |  "vg_name": "ceph-1b64d1ac-aa1a-58dd-947f-80f4fb53d79b" 2025-03-23 21:53:59.764338 | orchestrator |  }, 2025-03-23 21:53:59.765826 | orchestrator |  { 2025-03-23 21:53:59.766442 | orchestrator |  "lv_name": "osd-block-a4d93e20-7e7b-5457-96cd-ba2e435e9438", 2025-03-23 21:53:59.767213 | orchestrator |  "vg_name": "ceph-a4d93e20-7e7b-5457-96cd-ba2e435e9438" 2025-03-23 21:53:59.767886 | orchestrator |  } 2025-03-23 21:53:59.769259 | orchestrator |  ], 2025-03-23 21:53:59.769807 | orchestrator |  "pv": [ 2025-03-23 21:53:59.770831 | orchestrator |  { 2025-03-23 21:53:59.771629 | orchestrator |  "pv_name": "/dev/sdb", 2025-03-23 21:53:59.772068 | orchestrator |  "vg_name": "ceph-1b64d1ac-aa1a-58dd-947f-80f4fb53d79b" 2025-03-23 21:53:59.772917 | orchestrator |  }, 2025-03-23 21:53:59.773683 | orchestrator |  { 2025-03-23 21:53:59.774199 | orchestrator |  "pv_name": "/dev/sdc", 2025-03-23 21:53:59.775167 | orchestrator |  "vg_name": "ceph-a4d93e20-7e7b-5457-96cd-ba2e435e9438" 2025-03-23 21:53:59.777066 | orchestrator |  } 2025-03-23 21:53:59.778130 | orchestrator |  ] 2025-03-23 21:53:59.778162 | orchestrator |  } 2025-03-23 21:53:59.778395 | orchestrator | } 2025-03-23 21:53:59.779506 | orchestrator | 2025-03-23 21:53:59.780099 | orchestrator | PLAY RECAP ********************************************************************* 2025-03-23 21:53:59.780690 | orchestrator | 2025-03-23 21:53:59 | INFO  | Play has been completed. There may now be a delay until all logs have been written. 2025-03-23 21:53:59.781165 | orchestrator | 2025-03-23 21:53:59 | INFO  | Please wait and do not abort execution. 
2025-03-23 21:53:59.781830 | orchestrator | testbed-node-3 : ok=51  changed=2  unreachable=0 failed=0 skipped=62  rescued=0 ignored=0 2025-03-23 21:53:59.782376 | orchestrator | testbed-node-4 : ok=51  changed=2  unreachable=0 failed=0 skipped=62  rescued=0 ignored=0 2025-03-23 21:53:59.782797 | orchestrator | testbed-node-5 : ok=51  changed=2  unreachable=0 failed=0 skipped=62  rescued=0 ignored=0 2025-03-23 21:53:59.783482 | orchestrator | 2025-03-23 21:53:59.783710 | orchestrator | 2025-03-23 21:53:59.784296 | orchestrator | 2025-03-23 21:53:59.784989 | orchestrator | TASKS RECAP ******************************************************************** 2025-03-23 21:53:59.785380 | orchestrator | Sunday 23 March 2025 21:53:59 +0000 (0:00:00.639) 0:01:24.144 ********** 2025-03-23 21:53:59.786420 | orchestrator | =============================================================================== 2025-03-23 21:53:59.786773 | orchestrator | Create block VGs -------------------------------------------------------- 6.96s 2025-03-23 21:53:59.787420 | orchestrator | Create block LVs -------------------------------------------------------- 4.53s 2025-03-23 21:53:59.788242 | orchestrator | Print LVM report data --------------------------------------------------- 2.51s 2025-03-23 21:53:59.788740 | orchestrator | Gather DB VGs with total and available size in bytes -------------------- 2.05s 2025-03-23 21:53:59.789143 | orchestrator | Add known links to the list of available block devices ------------------ 1.80s 2025-03-23 21:53:59.789646 | orchestrator | Get list of Ceph PVs with associated VGs -------------------------------- 1.74s 2025-03-23 21:53:59.790183 | orchestrator | Get list of Ceph LVs with associated VGs -------------------------------- 1.67s 2025-03-23 21:53:59.790549 | orchestrator | Gather WAL VGs with total and available size in bytes ------------------- 1.66s 2025-03-23 21:53:59.791463 | orchestrator | Gather DB+WAL VGs with total and available size in bytes ---------------- 1.66s 2025-03-23 21:53:59.791657 | orchestrator | Add known partitions to the list of available block devices ------------- 1.62s 2025-03-23 21:53:59.792108 | orchestrator | Add known partitions to the list of available block devices ------------- 1.29s 2025-03-23 21:53:59.792945 | orchestrator | Add known partitions to the list of available block devices ------------- 0.91s 2025-03-23 21:53:59.793048 | orchestrator | Add known links to the list of available block devices ------------------ 0.89s 2025-03-23 21:53:59.793823 | orchestrator | Fail if block LV defined in lvm_volumes is missing ---------------------- 0.86s 2025-03-23 21:53:59.794348 | orchestrator | Add known links to the list of available block devices ------------------ 0.86s 2025-03-23 21:53:59.794632 | orchestrator | Get extra vars for Ceph configuration ----------------------------------- 0.85s 2025-03-23 21:53:59.795227 | orchestrator | Print 'Create DB LVs for ceph_db_devices' ------------------------------- 0.83s 2025-03-23 21:53:59.795525 | orchestrator | Create DB LVs for ceph_db_wal_devices ----------------------------------- 0.81s 2025-03-23 21:53:59.796117 | orchestrator | Print 'Create DB VGs' --------------------------------------------------- 0.79s 2025-03-23 21:53:59.796274 | orchestrator | Get initial list of available block devices ----------------------------- 0.79s 2025-03-23 21:54:01.911253 | orchestrator | 2025-03-23 21:54:01 | INFO  | Task acc1a37f-3abd-4cdd-8ecc-c1437dd45751 (facts) was prepared for execution. 
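The play above gathers the Ceph LVM layout by listing LVs and PVs with their volume groups and combining both into the lvm_report structure printed by the "Print LVM report data" task. A minimal sketch of how the same report can be collected by hand, assuming the standard lvs/pvs JSON report layout (report[0] holding the "lv"/"pv" lists); this script is illustrative only and is not taken from the playbook:

    #!/usr/bin/env python3
    # Illustrative sketch: build an LVM report similar to the one printed by the
    # "Print LVM report data" task. Assumes LVM2 provides lvs/pvs with
    # --reportformat json and that the script runs with sufficient privileges.
    import json
    import subprocess


    def lvm_report(cmd, key, fields):
        # lvs/pvs emit {"report": [{"lv": [...]}]} / {"report": [{"pv": [...]}]}
        out = subprocess.check_output(
            [cmd, "--reportformat", "json", "-o", fields], text=True
        )
        return json.loads(out)["report"][0][key]


    lvm_data = {
        "lv": lvm_report("lvs", "lv", "lv_name,vg_name"),
        "pv": lvm_report("pvs", "pv", "pv_name,vg_name"),
    }
    print(json.dumps({"lvm_report": lvm_data}, indent=2))

Run on a node such as testbed-node-5, this would print essentially the same lv/pv to VG mapping shown above.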
2025-03-23 21:54:05.262449 | orchestrator | 2025-03-23 21:54:01 | INFO  | It takes a moment until task acc1a37f-3abd-4cdd-8ecc-c1437dd45751 (facts) has been started and output is visible here. 2025-03-23 21:54:05.262643 | orchestrator | 2025-03-23 21:54:05.268290 | orchestrator | PLAY [Apply role facts] ******************************************************** 2025-03-23 21:54:05.272400 | orchestrator | 2025-03-23 21:54:05.272762 | orchestrator | TASK [osism.commons.facts : Create custom facts directory] ********************* 2025-03-23 21:54:05.273742 | orchestrator | Sunday 23 March 2025 21:54:05 +0000 (0:00:00.212) 0:00:00.212 ********** 2025-03-23 21:54:06.762490 | orchestrator | ok: [testbed-manager] 2025-03-23 21:54:06.763483 | orchestrator | ok: [testbed-node-0] 2025-03-23 21:54:06.763842 | orchestrator | ok: [testbed-node-3] 2025-03-23 21:54:06.765362 | orchestrator | ok: [testbed-node-1] 2025-03-23 21:54:06.766231 | orchestrator | ok: [testbed-node-4] 2025-03-23 21:54:06.767908 | orchestrator | ok: [testbed-node-5] 2025-03-23 21:54:06.768640 | orchestrator | ok: [testbed-node-2] 2025-03-23 21:54:06.769149 | orchestrator | 2025-03-23 21:54:06.770741 | orchestrator | TASK [osism.commons.facts : Copy fact files] *********************************** 2025-03-23 21:54:06.771536 | orchestrator | Sunday 23 March 2025 21:54:06 +0000 (0:00:01.499) 0:00:01.711 ********** 2025-03-23 21:54:06.934551 | orchestrator | skipping: [testbed-manager] 2025-03-23 21:54:07.016978 | orchestrator | skipping: [testbed-node-0] 2025-03-23 21:54:07.098653 | orchestrator | skipping: [testbed-node-1] 2025-03-23 21:54:07.187665 | orchestrator | skipping: [testbed-node-2] 2025-03-23 21:54:07.291627 | orchestrator | skipping: [testbed-node-3] 2025-03-23 21:54:08.127351 | orchestrator | skipping: [testbed-node-4] 2025-03-23 21:54:08.127556 | orchestrator | skipping: [testbed-node-5] 2025-03-23 21:54:08.127637 | orchestrator | 2025-03-23 21:54:08.130993 | orchestrator | PLAY [Gather facts for all hosts] ********************************************** 2025-03-23 21:54:08.132047 | orchestrator | 2025-03-23 21:54:08.132073 | orchestrator | TASK [Gathers facts about hosts] *********************************************** 2025-03-23 21:54:08.132094 | orchestrator | Sunday 23 March 2025 21:54:08 +0000 (0:00:01.368) 0:00:03.080 ********** 2025-03-23 21:54:12.918768 | orchestrator | ok: [testbed-node-1] 2025-03-23 21:54:12.919717 | orchestrator | ok: [testbed-node-2] 2025-03-23 21:54:12.920987 | orchestrator | ok: [testbed-node-0] 2025-03-23 21:54:12.921404 | orchestrator | ok: [testbed-manager] 2025-03-23 21:54:12.922245 | orchestrator | ok: [testbed-node-3] 2025-03-23 21:54:12.923040 | orchestrator | ok: [testbed-node-5] 2025-03-23 21:54:12.924596 | orchestrator | ok: [testbed-node-4] 2025-03-23 21:54:12.926273 | orchestrator | 2025-03-23 21:54:12.927056 | orchestrator | PLAY [Gather facts for all hosts if using --limit] ***************************** 2025-03-23 21:54:12.927860 | orchestrator | 2025-03-23 21:54:12.928414 | orchestrator | TASK [Gather facts for all hosts] ********************************************** 2025-03-23 21:54:12.928819 | orchestrator | Sunday 23 March 2025 21:54:12 +0000 (0:00:04.793) 0:00:07.873 ********** 2025-03-23 21:54:13.327878 | orchestrator | skipping: [testbed-manager] 2025-03-23 21:54:13.443063 | orchestrator | skipping: [testbed-node-0] 2025-03-23 21:54:13.534863 | orchestrator | skipping: [testbed-node-1] 2025-03-23 21:54:13.623145 | orchestrator | skipping: [testbed-node-2] 2025-03-23 
21:54:13.710545 | orchestrator | skipping: [testbed-node-3] 2025-03-23 21:54:13.748038 | orchestrator | skipping: [testbed-node-4] 2025-03-23 21:54:13.748213 | orchestrator | skipping: [testbed-node-5] 2025-03-23 21:54:13.748509 | orchestrator | 2025-03-23 21:54:13.748540 | orchestrator | PLAY RECAP ********************************************************************* 2025-03-23 21:54:13.748906 | orchestrator | 2025-03-23 21:54:13 | INFO  | Play has been completed. There may now be a delay until all logs have been written. 2025-03-23 21:54:13.749726 | orchestrator | 2025-03-23 21:54:13 | INFO  | Please wait and do not abort execution. 2025-03-23 21:54:13.749758 | orchestrator | testbed-manager : ok=2  changed=0 unreachable=0 failed=0 skipped=2  rescued=0 ignored=0 2025-03-23 21:54:13.750913 | orchestrator | testbed-node-0 : ok=2  changed=0 unreachable=0 failed=0 skipped=2  rescued=0 ignored=0 2025-03-23 21:54:13.751380 | orchestrator | testbed-node-1 : ok=2  changed=0 unreachable=0 failed=0 skipped=2  rescued=0 ignored=0 2025-03-23 21:54:13.751767 | orchestrator | testbed-node-2 : ok=2  changed=0 unreachable=0 failed=0 skipped=2  rescued=0 ignored=0 2025-03-23 21:54:13.752295 | orchestrator | testbed-node-3 : ok=2  changed=0 unreachable=0 failed=0 skipped=2  rescued=0 ignored=0 2025-03-23 21:54:13.752806 | orchestrator | testbed-node-4 : ok=2  changed=0 unreachable=0 failed=0 skipped=2  rescued=0 ignored=0 2025-03-23 21:54:13.753713 | orchestrator | testbed-node-5 : ok=2  changed=0 unreachable=0 failed=0 skipped=2  rescued=0 ignored=0 2025-03-23 21:54:13.754113 | orchestrator | 2025-03-23 21:54:13.754865 | orchestrator | Sunday 23 March 2025 21:54:13 +0000 (0:00:00.831) 0:00:08.704 ********** 2025-03-23 21:54:13.755916 | orchestrator | =============================================================================== 2025-03-23 21:54:13.756897 | orchestrator | Gathers facts about hosts ----------------------------------------------- 4.79s 2025-03-23 21:54:13.757352 | orchestrator | osism.commons.facts : Create custom facts directory --------------------- 1.50s 2025-03-23 21:54:13.757968 | orchestrator | osism.commons.facts : Copy fact files ----------------------------------- 1.37s 2025-03-23 21:54:13.758767 | orchestrator | Gather facts for all hosts ---------------------------------------------- 0.83s 2025-03-23 21:54:14.444330 | orchestrator | 2025-03-23 21:54:14.446097 | orchestrator | --> DEPLOY IN A NUTSHELL -- START -- Sun Mar 23 21:54:14 UTC 2025 2025-03-23 21:54:16.111833 | orchestrator | 2025-03-23 21:54:16.111954 | orchestrator | 2025-03-23 21:54:16 | INFO  | Collection nutshell is prepared for execution 2025-03-23 21:54:16.116979 | orchestrator | 2025-03-23 21:54:16 | INFO  | D [0] - dotfiles 2025-03-23 21:54:16.117032 | orchestrator | 2025-03-23 21:54:16 | INFO  | D [0] - homer 2025-03-23 21:54:16.118352 | orchestrator | 2025-03-23 21:54:16 | INFO  | D [0] - netdata 2025-03-23 21:54:16.118413 | orchestrator | 2025-03-23 21:54:16 | INFO  | D [0] - openstackclient 2025-03-23 21:54:16.118430 | orchestrator | 2025-03-23 21:54:16 | INFO  | D [0] - phpmyadmin 2025-03-23 21:54:16.118458 | orchestrator | 2025-03-23 21:54:16 | INFO  | A [0] - common 2025-03-23 21:54:16.118481 | orchestrator | 2025-03-23 21:54:16 | INFO  | A [1] -- loadbalancer 2025-03-23 21:54:16.118583 | orchestrator | 2025-03-23 21:54:16 | INFO  | D [2] --- opensearch 2025-03-23 21:54:16.118605 | orchestrator | 2025-03-23 21:54:16 | INFO  | A [2] --- mariadb-ng 2025-03-23 21:54:16.118619 | orchestrator | 2025-03-23 
21:54:16 | INFO  | D [3] ---- horizon 2025-03-23 21:54:16.118634 | orchestrator | 2025-03-23 21:54:16 | INFO  | A [3] ---- keystone 2025-03-23 21:54:16.118652 | orchestrator | 2025-03-23 21:54:16 | INFO  | A [4] ----- neutron 2025-03-23 21:54:16.118871 | orchestrator | 2025-03-23 21:54:16 | INFO  | D [5] ------ wait-for-nova 2025-03-23 21:54:16.118926 | orchestrator | 2025-03-23 21:54:16 | INFO  | A [5] ------ octavia 2025-03-23 21:54:16.118948 | orchestrator | 2025-03-23 21:54:16 | INFO  | D [4] ----- barbican 2025-03-23 21:54:16.119096 | orchestrator | 2025-03-23 21:54:16 | INFO  | D [4] ----- designate 2025-03-23 21:54:16.119121 | orchestrator | 2025-03-23 21:54:16 | INFO  | D [4] ----- ironic 2025-03-23 21:54:16.119378 | orchestrator | 2025-03-23 21:54:16 | INFO  | D [4] ----- placement 2025-03-23 21:54:16.119404 | orchestrator | 2025-03-23 21:54:16 | INFO  | D [4] ----- magnum 2025-03-23 21:54:16.119458 | orchestrator | 2025-03-23 21:54:16 | INFO  | A [1] -- openvswitch 2025-03-23 21:54:16.119524 | orchestrator | 2025-03-23 21:54:16 | INFO  | D [2] --- ovn 2025-03-23 21:54:16.119544 | orchestrator | 2025-03-23 21:54:16 | INFO  | D [1] -- memcached 2025-03-23 21:54:16.119752 | orchestrator | 2025-03-23 21:54:16 | INFO  | D [1] -- redis 2025-03-23 21:54:16.119777 | orchestrator | 2025-03-23 21:54:16 | INFO  | D [1] -- rabbitmq-ng 2025-03-23 21:54:16.119797 | orchestrator | 2025-03-23 21:54:16 | INFO  | A [0] - kubernetes 2025-03-23 21:54:16.119883 | orchestrator | 2025-03-23 21:54:16 | INFO  | D [1] -- kubeconfig 2025-03-23 21:54:16.119905 | orchestrator | 2025-03-23 21:54:16 | INFO  | A [1] -- copy-kubeconfig 2025-03-23 21:54:16.121341 | orchestrator | 2025-03-23 21:54:16 | INFO  | A [0] - ceph 2025-03-23 21:54:16.121372 | orchestrator | 2025-03-23 21:54:16 | INFO  | A [1] -- ceph-pools 2025-03-23 21:54:16.121541 | orchestrator | 2025-03-23 21:54:16 | INFO  | A [2] --- copy-ceph-keys 2025-03-23 21:54:16.121599 | orchestrator | 2025-03-23 21:54:16 | INFO  | A [3] ---- cephclient 2025-03-23 21:54:16.121615 | orchestrator | 2025-03-23 21:54:16 | INFO  | D [4] ----- ceph-bootstrap-dashboard 2025-03-23 21:54:16.121644 | orchestrator | 2025-03-23 21:54:16 | INFO  | A [4] ----- wait-for-keystone 2025-03-23 21:54:16.121731 | orchestrator | 2025-03-23 21:54:16 | INFO  | D [5] ------ kolla-ceph-rgw 2025-03-23 21:54:16.121749 | orchestrator | 2025-03-23 21:54:16 | INFO  | D [5] ------ glance 2025-03-23 21:54:16.121763 | orchestrator | 2025-03-23 21:54:16 | INFO  | D [5] ------ cinder 2025-03-23 21:54:16.121778 | orchestrator | 2025-03-23 21:54:16 | INFO  | D [5] ------ nova 2025-03-23 21:54:16.121795 | orchestrator | 2025-03-23 21:54:16 | INFO  | A [4] ----- prometheus 2025-03-23 21:54:16.268250 | orchestrator | 2025-03-23 21:54:16 | INFO  | D [5] ------ grafana 2025-03-23 21:54:16.268340 | orchestrator | 2025-03-23 21:54:16 | INFO  | All tasks of the collection nutshell are prepared for execution 2025-03-23 21:54:18.316193 | orchestrator | 2025-03-23 21:54:16 | INFO  | Tasks are running in the background 2025-03-23 21:54:18.316333 | orchestrator | 2025-03-23 21:54:18 | INFO  | No task IDs specified, wait for all currently running tasks 2025-03-23 21:54:20.435053 | orchestrator | 2025-03-23 21:54:20 | INFO  | Task f712b796-38d0-4a95-8f80-410bcda57c10 is in state STARTED 2025-03-23 21:54:20.435799 | orchestrator | 2025-03-23 21:54:20 | INFO  | Task d4cd9e9e-91d0-4bcc-97d0-f8752e49094d is in state STARTED 2025-03-23 21:54:20.435840 | orchestrator | 2025-03-23 21:54:20 | INFO  | Task 
c24e0eb3-9a64-45b3-84a0-18b4dd58fba4 is in state STARTED 2025-03-23 21:54:20.437269 | orchestrator | 2025-03-23 21:54:20 | INFO  | Task 9ebfbe2e-f9b9-4380-a7d3-7cea905d1f6f is in state STARTED 2025-03-23 21:54:20.439002 | orchestrator | 2025-03-23 21:54:20 | INFO  | Task 40db9a70-46cd-49a0-a591-73f355d6d9e4 is in state STARTED 2025-03-23 21:54:20.441085 | orchestrator | 2025-03-23 21:54:20 | INFO  | Task 366632e9-2a23-447a-9f4d-986716d415f3 is in state STARTED 2025-03-23 21:54:20.441263 | orchestrator | 2025-03-23 21:54:20 | INFO  | Wait 1 second(s) until the next check 2025-03-23 21:54:23.493028 | orchestrator | 2025-03-23 21:54:23 | INFO  | Task f712b796-38d0-4a95-8f80-410bcda57c10 is in state STARTED 2025-03-23 21:54:23.495229 | orchestrator | 2025-03-23 21:54:23 | INFO  | Task d4cd9e9e-91d0-4bcc-97d0-f8752e49094d is in state STARTED 2025-03-23 21:54:23.498944 | orchestrator | 2025-03-23 21:54:23 | INFO  | Task c24e0eb3-9a64-45b3-84a0-18b4dd58fba4 is in state STARTED 2025-03-23 21:54:23.500018 | orchestrator | 2025-03-23 21:54:23 | INFO  | Task 9ebfbe2e-f9b9-4380-a7d3-7cea905d1f6f is in state STARTED 2025-03-23 21:54:23.500839 | orchestrator | 2025-03-23 21:54:23 | INFO  | Task 40db9a70-46cd-49a0-a591-73f355d6d9e4 is in state STARTED 2025-03-23 21:54:23.501509 | orchestrator | 2025-03-23 21:54:23 | INFO  | Task 366632e9-2a23-447a-9f4d-986716d415f3 is in state STARTED 2025-03-23 21:54:26.571842 | orchestrator | 2025-03-23 21:54:23 | INFO  | Wait 1 second(s) until the next check 2025-03-23 21:54:26.571971 | orchestrator | 2025-03-23 21:54:26 | INFO  | Task f712b796-38d0-4a95-8f80-410bcda57c10 is in state STARTED 2025-03-23 21:54:26.576765 | orchestrator | 2025-03-23 21:54:26 | INFO  | Task d4cd9e9e-91d0-4bcc-97d0-f8752e49094d is in state STARTED 2025-03-23 21:54:26.578369 | orchestrator | 2025-03-23 21:54:26 | INFO  | Task c24e0eb3-9a64-45b3-84a0-18b4dd58fba4 is in state STARTED 2025-03-23 21:54:29.662832 | orchestrator | 2025-03-23 21:54:26 | INFO  | Task 9ebfbe2e-f9b9-4380-a7d3-7cea905d1f6f is in state STARTED 2025-03-23 21:54:29.662968 | orchestrator | 2025-03-23 21:54:26 | INFO  | Task 40db9a70-46cd-49a0-a591-73f355d6d9e4 is in state STARTED 2025-03-23 21:54:29.662988 | orchestrator | 2025-03-23 21:54:26 | INFO  | Task 366632e9-2a23-447a-9f4d-986716d415f3 is in state STARTED 2025-03-23 21:54:29.663004 | orchestrator | 2025-03-23 21:54:26 | INFO  | Wait 1 second(s) until the next check 2025-03-23 21:54:29.663038 | orchestrator | 2025-03-23 21:54:29 | INFO  | Task f712b796-38d0-4a95-8f80-410bcda57c10 is in state STARTED 2025-03-23 21:54:29.663503 | orchestrator | 2025-03-23 21:54:29 | INFO  | Task d4cd9e9e-91d0-4bcc-97d0-f8752e49094d is in state STARTED 2025-03-23 21:54:29.663933 | orchestrator | 2025-03-23 21:54:29 | INFO  | Task c24e0eb3-9a64-45b3-84a0-18b4dd58fba4 is in state STARTED 2025-03-23 21:54:29.665553 | orchestrator | 2025-03-23 21:54:29 | INFO  | Task 9ebfbe2e-f9b9-4380-a7d3-7cea905d1f6f is in state STARTED 2025-03-23 21:54:29.666249 | orchestrator | 2025-03-23 21:54:29 | INFO  | Task 40db9a70-46cd-49a0-a591-73f355d6d9e4 is in state STARTED 2025-03-23 21:54:29.667199 | orchestrator | 2025-03-23 21:54:29 | INFO  | Task 366632e9-2a23-447a-9f4d-986716d415f3 is in state STARTED 2025-03-23 21:54:29.667259 | orchestrator | 2025-03-23 21:54:29 | INFO  | Wait 1 second(s) until the next check 2025-03-23 21:54:32.751158 | orchestrator | 2025-03-23 21:54:32 | INFO  | Task f712b796-38d0-4a95-8f80-410bcda57c10 is in state STARTED 2025-03-23 21:54:32.751916 | orchestrator | 2025-03-23 
21:54:32 | INFO  | Task d4cd9e9e-91d0-4bcc-97d0-f8752e49094d is in state STARTED 2025-03-23 21:54:32.751953 | orchestrator | 2025-03-23 21:54:32 | INFO  | Task c24e0eb3-9a64-45b3-84a0-18b4dd58fba4 is in state STARTED 2025-03-23 21:54:32.751976 | orchestrator | 2025-03-23 21:54:32 | INFO  | Task 9ebfbe2e-f9b9-4380-a7d3-7cea905d1f6f is in state STARTED 2025-03-23 21:54:32.753933 | orchestrator | 2025-03-23 21:54:32 | INFO  | Task 40db9a70-46cd-49a0-a591-73f355d6d9e4 is in state STARTED 2025-03-23 21:54:32.757321 | orchestrator | 2025-03-23 21:54:32 | INFO  | Task 366632e9-2a23-447a-9f4d-986716d415f3 is in state STARTED 2025-03-23 21:54:35.826246 | orchestrator | 2025-03-23 21:54:32 | INFO  | Wait 1 second(s) until the next check 2025-03-23 21:54:35.826388 | orchestrator | 2025-03-23 21:54:35 | INFO  | Task f712b796-38d0-4a95-8f80-410bcda57c10 is in state STARTED 2025-03-23 21:54:35.833341 | orchestrator | 2025-03-23 21:54:35 | INFO  | Task d4cd9e9e-91d0-4bcc-97d0-f8752e49094d is in state STARTED 2025-03-23 21:54:35.838322 | orchestrator | 2025-03-23 21:54:35 | INFO  | Task c24e0eb3-9a64-45b3-84a0-18b4dd58fba4 is in state STARTED 2025-03-23 21:54:35.838961 | orchestrator | 2025-03-23 21:54:35 | INFO  | Task 9ebfbe2e-f9b9-4380-a7d3-7cea905d1f6f is in state STARTED 2025-03-23 21:54:35.841288 | orchestrator | 2025-03-23 21:54:35 | INFO  | Task 40db9a70-46cd-49a0-a591-73f355d6d9e4 is in state STARTED 2025-03-23 21:54:35.846192 | orchestrator | 2025-03-23 21:54:35 | INFO  | Task 366632e9-2a23-447a-9f4d-986716d415f3 is in state STARTED 2025-03-23 21:54:39.004489 | orchestrator | 2025-03-23 21:54:35 | INFO  | Wait 1 second(s) until the next check 2025-03-23 21:54:39.004645 | orchestrator | 2025-03-23 21:54:39 | INFO  | Task f712b796-38d0-4a95-8f80-410bcda57c10 is in state STARTED 2025-03-23 21:54:39.005169 | orchestrator | 2025-03-23 21:54:39 | INFO  | Task d4cd9e9e-91d0-4bcc-97d0-f8752e49094d is in state STARTED 2025-03-23 21:54:39.014868 | orchestrator | 2025-03-23 21:54:39 | INFO  | Task c24e0eb3-9a64-45b3-84a0-18b4dd58fba4 is in state STARTED 2025-03-23 21:54:39.018219 | orchestrator | 2025-03-23 21:54:39 | INFO  | Task 9ebfbe2e-f9b9-4380-a7d3-7cea905d1f6f is in state STARTED 2025-03-23 21:54:39.023337 | orchestrator | 2025-03-23 21:54:39 | INFO  | Task 40db9a70-46cd-49a0-a591-73f355d6d9e4 is in state STARTED 2025-03-23 21:54:39.023408 | orchestrator | 2025-03-23 21:54:39 | INFO  | Task 366632e9-2a23-447a-9f4d-986716d415f3 is in state STARTED 2025-03-23 21:54:42.104495 | orchestrator | 2025-03-23 21:54:39 | INFO  | Wait 1 second(s) until the next check 2025-03-23 21:54:42.104693 | orchestrator | 2025-03-23 21:54:42 | INFO  | Task f712b796-38d0-4a95-8f80-410bcda57c10 is in state STARTED 2025-03-23 21:54:42.112877 | orchestrator | 2025-03-23 21:54:42 | INFO  | Task d4cd9e9e-91d0-4bcc-97d0-f8752e49094d is in state STARTED 2025-03-23 21:54:42.114773 | orchestrator | 2025-03-23 21:54:42 | INFO  | Task c24e0eb3-9a64-45b3-84a0-18b4dd58fba4 is in state STARTED 2025-03-23 21:54:42.114867 | orchestrator | 2025-03-23 21:54:42 | INFO  | Task 9ebfbe2e-f9b9-4380-a7d3-7cea905d1f6f is in state STARTED 2025-03-23 21:54:42.114937 | orchestrator | 2025-03-23 21:54:42 | INFO  | Task 40db9a70-46cd-49a0-a591-73f355d6d9e4 is in state STARTED 2025-03-23 21:54:42.118489 | orchestrator | 2025-03-23 21:54:42 | INFO  | Task 366632e9-2a23-447a-9f4d-986716d415f3 is in state STARTED 2025-03-23 21:54:45.184881 | orchestrator | 2025-03-23 21:54:42 | INFO  | Wait 1 second(s) until the next check 2025-03-23 21:54:45.184993 |
orchestrator | 2025-03-23 21:54:45 | INFO  | Task f712b796-38d0-4a95-8f80-410bcda57c10 is in state STARTED 2025-03-23 21:54:45.185673 | orchestrator | 2025-03-23 21:54:45 | INFO  | Task d4cd9e9e-91d0-4bcc-97d0-f8752e49094d is in state STARTED 2025-03-23 21:54:45.185708 | orchestrator | 2025-03-23 21:54:45 | INFO  | Task c24e0eb3-9a64-45b3-84a0-18b4dd58fba4 is in state STARTED 2025-03-23 21:54:45.186957 | orchestrator | 2025-03-23 21:54:45 | INFO  | Task b951ac8d-90d0-4951-a13e-8711145c5d50 is in state STARTED 2025-03-23 21:54:45.186984 | orchestrator | 2025-03-23 21:54:45.186994 | orchestrator | PLAY [Apply role geerlingguy.dotfiles] ***************************************** 2025-03-23 21:54:45.187003 | orchestrator | 2025-03-23 21:54:45.187012 | orchestrator | TASK [geerlingguy.dotfiles : Ensure dotfiles repository is cloned locally.] **** 2025-03-23 21:54:45.187039 | orchestrator | Sunday 23 March 2025 21:54:25 +0000 (0:00:00.861) 0:00:00.861 ********** 2025-03-23 21:54:45.187048 | orchestrator | changed: [testbed-node-0] 2025-03-23 21:54:45.187058 | orchestrator | changed: [testbed-manager] 2025-03-23 21:54:45.187066 | orchestrator | changed: [testbed-node-1] 2025-03-23 21:54:45.187075 | orchestrator | changed: [testbed-node-2] 2025-03-23 21:54:45.187084 | orchestrator | changed: [testbed-node-3] 2025-03-23 21:54:45.187092 | orchestrator | changed: [testbed-node-4] 2025-03-23 21:54:45.187101 | orchestrator | changed: [testbed-node-5] 2025-03-23 21:54:45.187110 | orchestrator | 2025-03-23 21:54:45.187118 | orchestrator | TASK [geerlingguy.dotfiles : Ensure all configured dotfiles are links.] ******** 2025-03-23 21:54:45.187127 | orchestrator | Sunday 23 March 2025 21:54:29 +0000 (0:00:04.016) 0:00:04.877 ********** 2025-03-23 21:54:45.187136 | orchestrator | ok: [testbed-node-0] => (item=.tmux.conf) 2025-03-23 21:54:45.187146 | orchestrator | ok: [testbed-manager] => (item=.tmux.conf) 2025-03-23 21:54:45.187160 | orchestrator | ok: [testbed-node-1] => (item=.tmux.conf) 2025-03-23 21:54:45.187169 | orchestrator | ok: [testbed-node-2] => (item=.tmux.conf) 2025-03-23 21:54:45.187177 | orchestrator | ok: [testbed-node-3] => (item=.tmux.conf) 2025-03-23 21:54:45.187186 | orchestrator | ok: [testbed-node-4] => (item=.tmux.conf) 2025-03-23 21:54:45.187194 | orchestrator | ok: [testbed-node-5] => (item=.tmux.conf) 2025-03-23 21:54:45.187203 | orchestrator | 2025-03-23 21:54:45.187212 | orchestrator | TASK [geerlingguy.dotfiles : Remove existing dotfiles file if a replacement is being linked.] 
*** 2025-03-23 21:54:45.187221 | orchestrator | Sunday 23 March 2025 21:54:31 +0000 (0:00:02.638) 0:00:07.516 ********** 2025-03-23 21:54:45.187232 | orchestrator | ok: [testbed-node-1] => (item=[0, {'changed': False, 'stdout': '', 'stderr': "ls: cannot access '/home/dragon/.tmux.conf': No such file or directory", 'rc': 2, 'cmd': ['ls', '-F', '~/.tmux.conf'], 'start': '2025-03-23 21:54:30.238168', 'end': '2025-03-23 21:54:30.244727', 'delta': '0:00:00.006559', 'failed': False, 'msg': 'non-zero return code', 'invocation': {'module_args': {'_raw_params': 'ls -F ~/.tmux.conf', '_uses_shell': False, 'expand_argument_vars': True, 'stdin_add_newline': True, 'strip_empty_ends': True, 'argv': None, 'chdir': None, 'executable': None, 'creates': None, 'removes': None, 'stdin': None}}, 'stdout_lines': [], 'stderr_lines': ["ls: cannot access '/home/dragon/.tmux.conf': No such file or directory"], 'failed_when_result': False, 'item': '.tmux.conf', 'ansible_loop_var': 'item'}]) 2025-03-23 21:54:45.187247 | orchestrator | ok: [testbed-node-0] => (item=[0, {'changed': False, 'stdout': '', 'stderr': "ls: cannot access '/home/dragon/.tmux.conf': No such file or directory", 'rc': 2, 'cmd': ['ls', '-F', '~/.tmux.conf'], 'start': '2025-03-23 21:54:30.032986', 'end': '2025-03-23 21:54:30.040199', 'delta': '0:00:00.007213', 'failed': False, 'msg': 'non-zero return code', 'invocation': {'module_args': {'_raw_params': 'ls -F ~/.tmux.conf', '_uses_shell': False, 'expand_argument_vars': True, 'stdin_add_newline': True, 'strip_empty_ends': True, 'argv': None, 'chdir': None, 'executable': None, 'creates': None, 'removes': None, 'stdin': None}}, 'stdout_lines': [], 'stderr_lines': ["ls: cannot access '/home/dragon/.tmux.conf': No such file or directory"], 'failed_when_result': False, 'item': '.tmux.conf', 'ansible_loop_var': 'item'}]) 2025-03-23 21:54:45.187257 | orchestrator | ok: [testbed-manager] => (item=[0, {'changed': False, 'stdout': '', 'stderr': "ls: cannot access '/home/dragon/.tmux.conf': No such file or directory", 'rc': 2, 'cmd': ['ls', '-F', '~/.tmux.conf'], 'start': '2025-03-23 21:54:30.054030', 'end': '2025-03-23 21:54:30.061956', 'delta': '0:00:00.007926', 'failed': False, 'msg': 'non-zero return code', 'invocation': {'module_args': {'_raw_params': 'ls -F ~/.tmux.conf', '_uses_shell': False, 'expand_argument_vars': True, 'stdin_add_newline': True, 'strip_empty_ends': True, 'argv': None, 'chdir': None, 'executable': None, 'creates': None, 'removes': None, 'stdin': None}}, 'stdout_lines': [], 'stderr_lines': ["ls: cannot access '/home/dragon/.tmux.conf': No such file or directory"], 'failed_when_result': False, 'item': '.tmux.conf', 'ansible_loop_var': 'item'}]) 2025-03-23 21:54:45.187290 | orchestrator | ok: [testbed-node-2] => (item=[0, {'changed': False, 'stdout': '', 'stderr': "ls: cannot access '/home/dragon/.tmux.conf': No such file or directory", 'rc': 2, 'cmd': ['ls', '-F', '~/.tmux.conf'], 'start': '2025-03-23 21:54:30.690488', 'end': '2025-03-23 21:54:30.699701', 'delta': '0:00:00.009213', 'failed': False, 'msg': 'non-zero return code', 'invocation': {'module_args': {'_raw_params': 'ls -F ~/.tmux.conf', '_uses_shell': False, 'expand_argument_vars': True, 'stdin_add_newline': True, 'strip_empty_ends': True, 'argv': None, 'chdir': None, 'executable': None, 'creates': None, 'removes': None, 'stdin': None}}, 'stdout_lines': [], 'stderr_lines': ["ls: cannot access '/home/dragon/.tmux.conf': No such file or directory"], 'failed_when_result': False, 'item': '.tmux.conf', 'ansible_loop_var': 'item'}]) 
2025-03-23 21:54:45.187300 | orchestrator | ok: [testbed-node-3] => (item=[0, {'changed': False, 'stdout': '', 'stderr': "ls: cannot access '/home/dragon/.tmux.conf': No such file or directory", 'rc': 2, 'cmd': ['ls', '-F', '~/.tmux.conf'], 'start': '2025-03-23 21:54:31.127832', 'end': '2025-03-23 21:54:31.138266', 'delta': '0:00:00.010434', 'failed': False, 'msg': 'non-zero return code', 'invocation': {'module_args': {'_raw_params': 'ls -F ~/.tmux.conf', '_uses_shell': False, 'expand_argument_vars': True, 'stdin_add_newline': True, 'strip_empty_ends': True, 'argv': None, 'chdir': None, 'executable': None, 'creates': None, 'removes': None, 'stdin': None}}, 'stdout_lines': [], 'stderr_lines': ["ls: cannot access '/home/dragon/.tmux.conf': No such file or directory"], 'failed_when_result': False, 'item': '.tmux.conf', 'ansible_loop_var': 'item'}]) 2025-03-23 21:54:45.187309 | orchestrator | ok: [testbed-node-4] => (item=[0, {'changed': False, 'stdout': '', 'stderr': "ls: cannot access '/home/dragon/.tmux.conf': No such file or directory", 'rc': 2, 'cmd': ['ls', '-F', '~/.tmux.conf'], 'start': '2025-03-23 21:54:31.332527', 'end': '2025-03-23 21:54:31.341994', 'delta': '0:00:00.009467', 'failed': False, 'msg': 'non-zero return code', 'invocation': {'module_args': {'_raw_params': 'ls -F ~/.tmux.conf', '_uses_shell': False, 'expand_argument_vars': True, 'stdin_add_newline': True, 'strip_empty_ends': True, 'argv': None, 'chdir': None, 'executable': None, 'creates': None, 'removes': None, 'stdin': None}}, 'stdout_lines': [], 'stderr_lines': ["ls: cannot access '/home/dragon/.tmux.conf': No such file or directory"], 'failed_when_result': False, 'item': '.tmux.conf', 'ansible_loop_var': 'item'}]) 2025-03-23 21:54:45.187322 | orchestrator | ok: [testbed-node-5] => (item=[0, {'changed': False, 'stdout': '', 'stderr': "ls: cannot access '/home/dragon/.tmux.conf': No such file or directory", 'rc': 2, 'cmd': ['ls', '-F', '~/.tmux.conf'], 'start': '2025-03-23 21:54:31.690538', 'end': '2025-03-23 21:54:31.699097', 'delta': '0:00:00.008559', 'failed': False, 'msg': 'non-zero return code', 'invocation': {'module_args': {'_raw_params': 'ls -F ~/.tmux.conf', '_uses_shell': False, 'expand_argument_vars': True, 'stdin_add_newline': True, 'strip_empty_ends': True, 'argv': None, 'chdir': None, 'executable': None, 'creates': None, 'removes': None, 'stdin': None}}, 'stdout_lines': [], 'stderr_lines': ["ls: cannot access '/home/dragon/.tmux.conf': No such file or directory"], 'failed_when_result': False, 'item': '.tmux.conf', 'ansible_loop_var': 'item'}]) 2025-03-23 21:54:45.187331 | orchestrator | 2025-03-23 21:54:45.187340 | orchestrator | TASK [geerlingguy.dotfiles : Link dotfiles into home folder.] 
****************** 2025-03-23 21:54:45.187349 | orchestrator | Sunday 23 March 2025 21:54:35 +0000 (0:00:04.019) 0:00:11.536 ********** 2025-03-23 21:54:45.187357 | orchestrator | changed: [testbed-manager] => (item=.tmux.conf) 2025-03-23 21:54:45.187366 | orchestrator | changed: [testbed-node-0] => (item=.tmux.conf) 2025-03-23 21:54:45.187375 | orchestrator | changed: [testbed-node-1] => (item=.tmux.conf) 2025-03-23 21:54:45.187388 | orchestrator | changed: [testbed-node-2] => (item=.tmux.conf) 2025-03-23 21:54:45.187397 | orchestrator | changed: [testbed-node-3] => (item=.tmux.conf) 2025-03-23 21:54:45.187405 | orchestrator | changed: [testbed-node-4] => (item=.tmux.conf) 2025-03-23 21:54:45.187414 | orchestrator | changed: [testbed-node-5] => (item=.tmux.conf) 2025-03-23 21:54:45.187422 | orchestrator | 2025-03-23 21:54:45.187431 | orchestrator | PLAY RECAP ********************************************************************* 2025-03-23 21:54:45.187440 | orchestrator | testbed-manager : ok=4  changed=2  unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 2025-03-23 21:54:45.187450 | orchestrator | testbed-node-0 : ok=4  changed=2  unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 2025-03-23 21:54:45.187459 | orchestrator | testbed-node-1 : ok=4  changed=2  unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 2025-03-23 21:54:45.187472 | orchestrator | testbed-node-2 : ok=4  changed=2  unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 2025-03-23 21:54:45.187492 | orchestrator | testbed-node-3 : ok=4  changed=2  unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 2025-03-23 21:54:45.187502 | orchestrator | testbed-node-4 : ok=4  changed=2  unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 2025-03-23 21:54:45.187511 | orchestrator | testbed-node-5 : ok=4  changed=2  unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 2025-03-23 21:54:45.187519 | orchestrator | 2025-03-23 21:54:45.187528 | orchestrator | Sunday 23 March 2025 21:54:41 +0000 (0:00:05.675) 0:00:17.212 ********** 2025-03-23 21:54:45.187537 | orchestrator | =============================================================================== 2025-03-23 21:54:45.187545 | orchestrator | geerlingguy.dotfiles : Link dotfiles into home folder. ------------------ 5.68s 2025-03-23 21:54:45.187554 | orchestrator | geerlingguy.dotfiles : Remove existing dotfiles file if a replacement is being linked. --- 4.02s 2025-03-23 21:54:45.187586 | orchestrator | geerlingguy.dotfiles : Ensure dotfiles repository is cloned locally. ---- 4.02s 2025-03-23 21:54:45.187596 | orchestrator | geerlingguy.dotfiles : Ensure all configured dotfiles are links. 
-------- 2.64s 2025-03-23 21:54:45.187608 | orchestrator | 2025-03-23 21:54:45 | INFO  | Task 9ebfbe2e-f9b9-4380-a7d3-7cea905d1f6f is in state STARTED 2025-03-23 21:54:45.187808 | orchestrator | 2025-03-23 21:54:45 | INFO  | Task 40db9a70-46cd-49a0-a591-73f355d6d9e4 is in state SUCCESS 2025-03-23 21:54:45.187827 | orchestrator | 2025-03-23 21:54:45 | INFO  | Task 366632e9-2a23-447a-9f4d-986716d415f3 is in state STARTED 2025-03-23 21:54:48.259718 | orchestrator | 2025-03-23 21:54:45 | INFO  | Wait 1 second(s) until the next check 2025-03-23 21:54:48.259871 | orchestrator | 2025-03-23 21:54:48 | INFO  | Task f712b796-38d0-4a95-8f80-410bcda57c10 is in state STARTED 2025-03-23 21:54:48.263912 | orchestrator | 2025-03-23 21:54:48 | INFO  | Task d4cd9e9e-91d0-4bcc-97d0-f8752e49094d is in state STARTED 2025-03-23 21:54:48.266901 | orchestrator | 2025-03-23 21:54:48 | INFO  | Task c24e0eb3-9a64-45b3-84a0-18b4dd58fba4 is in state STARTED 2025-03-23 21:54:48.267520 | orchestrator | 2025-03-23 21:54:48 | INFO  | Task b951ac8d-90d0-4951-a13e-8711145c5d50 is in state STARTED 2025-03-23 21:54:48.267550 | orchestrator | 2025-03-23 21:54:48 | INFO  | Task 9ebfbe2e-f9b9-4380-a7d3-7cea905d1f6f is in state STARTED 2025-03-23 21:54:48.268318 | orchestrator | 2025-03-23 21:54:48 | INFO  | Task 366632e9-2a23-447a-9f4d-986716d415f3 is in state STARTED 2025-03-23 21:54:51.417273 | orchestrator | 2025-03-23 21:54:48 | INFO  | Wait 1 second(s) until the next check 2025-03-23 21:54:51.417407 | orchestrator | 2025-03-23 21:54:51 | INFO  | Task f712b796-38d0-4a95-8f80-410bcda57c10 is in state STARTED 2025-03-23 21:54:51.442848 | orchestrator | 2025-03-23 21:54:51 | INFO  | Task d4cd9e9e-91d0-4bcc-97d0-f8752e49094d is in state STARTED 2025-03-23 21:54:51.470006 | orchestrator | 2025-03-23 21:54:51 | INFO  | Task c24e0eb3-9a64-45b3-84a0-18b4dd58fba4 is in state STARTED 2025-03-23 21:54:51.501890 | orchestrator | 2025-03-23 21:54:51 | INFO  | Task b951ac8d-90d0-4951-a13e-8711145c5d50 is in state STARTED 2025-03-23 21:54:51.511170 | orchestrator | 2025-03-23 21:54:51 | INFO  | Task 9ebfbe2e-f9b9-4380-a7d3-7cea905d1f6f is in state STARTED 2025-03-23 21:54:51.526999 | orchestrator | 2025-03-23 21:54:51 | INFO  | Task 366632e9-2a23-447a-9f4d-986716d415f3 is in state STARTED 2025-03-23 21:54:54.689314 | orchestrator | 2025-03-23 21:54:51 | INFO  | Wait 1 second(s) until the next check 2025-03-23 21:54:54.689448 | orchestrator | 2025-03-23 21:54:54 | INFO  | Task f712b796-38d0-4a95-8f80-410bcda57c10 is in state STARTED 2025-03-23 21:54:54.699893 | orchestrator | 2025-03-23 21:54:54 | INFO  | Task d4cd9e9e-91d0-4bcc-97d0-f8752e49094d is in state STARTED 2025-03-23 21:54:54.707317 | orchestrator | 2025-03-23 21:54:54 | INFO  | Task c24e0eb3-9a64-45b3-84a0-18b4dd58fba4 is in state STARTED 2025-03-23 21:54:54.723757 | orchestrator | 2025-03-23 21:54:54 | INFO  | Task b951ac8d-90d0-4951-a13e-8711145c5d50 is in state STARTED 2025-03-23 21:54:54.726394 | orchestrator | 2025-03-23 21:54:54 | INFO  | Task 9ebfbe2e-f9b9-4380-a7d3-7cea905d1f6f is in state STARTED 2025-03-23 21:54:54.726443 | orchestrator | 2025-03-23 21:54:54 | INFO  | Task 366632e9-2a23-447a-9f4d-986716d415f3 is in state STARTED 2025-03-23 21:54:54.726468 | orchestrator | 2025-03-23 21:54:54 | INFO  | Wait 1 second(s) until the next check 2025-03-23 21:54:57.857516 | orchestrator | 2025-03-23 21:54:57 | INFO  | Task f712b796-38d0-4a95-8f80-410bcda57c10 is in state STARTED 2025-03-23 21:54:57.865301 | orchestrator | 2025-03-23 21:54:57 | INFO  | Task 
d4cd9e9e-91d0-4bcc-97d0-f8752e49094d is in state STARTED 2025-03-23 21:54:57.866391 | orchestrator | 2025-03-23 21:54:57 | INFO  | Task c24e0eb3-9a64-45b3-84a0-18b4dd58fba4 is in state STARTED 2025-03-23 21:54:57.866431 | orchestrator | 2025-03-23 21:54:57 | INFO  | Task b951ac8d-90d0-4951-a13e-8711145c5d50 is in state STARTED 2025-03-23 21:54:57.869228 | orchestrator | 2025-03-23 21:54:57 | INFO  | Task 9ebfbe2e-f9b9-4380-a7d3-7cea905d1f6f is in state STARTED 2025-03-23 21:54:57.871538 | orchestrator | 2025-03-23 21:54:57 | INFO  | Task 366632e9-2a23-447a-9f4d-986716d415f3 is in state STARTED 2025-03-23 21:55:00.940244 | orchestrator | 2025-03-23 21:54:57 | INFO  | Wait 1 second(s) until the next check 2025-03-23 21:55:00.940345 | orchestrator | 2025-03-23 21:55:00 | INFO  | Task f712b796-38d0-4a95-8f80-410bcda57c10 is in state STARTED 2025-03-23 21:55:00.940946 | orchestrator | 2025-03-23 21:55:00 | INFO  | Task d4cd9e9e-91d0-4bcc-97d0-f8752e49094d is in state STARTED 2025-03-23 21:55:00.940978 | orchestrator | 2025-03-23 21:55:00 | INFO  | Task c24e0eb3-9a64-45b3-84a0-18b4dd58fba4 is in state STARTED 2025-03-23 21:55:00.944139 | orchestrator | 2025-03-23 21:55:00 | INFO  | Task b951ac8d-90d0-4951-a13e-8711145c5d50 is in state STARTED 2025-03-23 21:55:00.946740 | orchestrator | 2025-03-23 21:55:00 | INFO  | Task 9ebfbe2e-f9b9-4380-a7d3-7cea905d1f6f is in state STARTED 2025-03-23 21:55:00.953871 | orchestrator | 2025-03-23 21:55:00 | INFO  | Task 366632e9-2a23-447a-9f4d-986716d415f3 is in state STARTED 2025-03-23 21:55:04.025947 | orchestrator | 2025-03-23 21:55:00 | INFO  | Wait 1 second(s) until the next check 2025-03-23 21:55:04.026115 | orchestrator | 2025-03-23 21:55:04 | INFO  | Task f712b796-38d0-4a95-8f80-410bcda57c10 is in state STARTED 2025-03-23 21:55:07.102907 | orchestrator | 2025-03-23 21:55:04 | INFO  | Task d4cd9e9e-91d0-4bcc-97d0-f8752e49094d is in state STARTED 2025-03-23 21:55:07.103029 | orchestrator | 2025-03-23 21:55:04 | INFO  | Task c24e0eb3-9a64-45b3-84a0-18b4dd58fba4 is in state STARTED 2025-03-23 21:55:07.103049 | orchestrator | 2025-03-23 21:55:04 | INFO  | Task b951ac8d-90d0-4951-a13e-8711145c5d50 is in state STARTED 2025-03-23 21:55:07.103064 | orchestrator | 2025-03-23 21:55:04 | INFO  | Task 9ebfbe2e-f9b9-4380-a7d3-7cea905d1f6f is in state STARTED 2025-03-23 21:55:07.103079 | orchestrator | 2025-03-23 21:55:04 | INFO  | Task 366632e9-2a23-447a-9f4d-986716d415f3 is in state STARTED 2025-03-23 21:55:07.103094 | orchestrator | 2025-03-23 21:55:04 | INFO  | Wait 1 second(s) until the next check 2025-03-23 21:55:07.103126 | orchestrator | 2025-03-23 21:55:07 | INFO  | Task f712b796-38d0-4a95-8f80-410bcda57c10 is in state STARTED 2025-03-23 21:55:07.105766 | orchestrator | 2025-03-23 21:55:07 | INFO  | Task f3cd4eb5-5489-4d5d-9d13-5fb3ddbc8b2b is in state STARTED 2025-03-23 21:55:07.110118 | orchestrator | 2025-03-23 21:55:07 | INFO  | Task d4cd9e9e-91d0-4bcc-97d0-f8752e49094d is in state STARTED 2025-03-23 21:55:10.173236 | orchestrator | 2025-03-23 21:55:07 | INFO  | Task c24e0eb3-9a64-45b3-84a0-18b4dd58fba4 is in state SUCCESS 2025-03-23 21:55:10.173362 | orchestrator | 2025-03-23 21:55:07 | INFO  | Task b951ac8d-90d0-4951-a13e-8711145c5d50 is in state STARTED 2025-03-23 21:55:10.173390 | orchestrator | 2025-03-23 21:55:07 | INFO  | Task 9ebfbe2e-f9b9-4380-a7d3-7cea905d1f6f is in state STARTED 2025-03-23 21:55:10.173421 | orchestrator | 2025-03-23 21:55:07 | INFO  | Task 366632e9-2a23-447a-9f4d-986716d415f3 is in state STARTED 2025-03-23 21:55:10.173436 | 
orchestrator | 2025-03-23 21:55:07 | INFO  | Wait 1 second(s) until the next check 2025-03-23 21:55:10.173467 | orchestrator | 2025-03-23 21:55:10 | INFO  | Task f712b796-38d0-4a95-8f80-410bcda57c10 is in state STARTED 2025-03-23 21:55:10.175017 | orchestrator | 2025-03-23 21:55:10 | INFO  | Task f3cd4eb5-5489-4d5d-9d13-5fb3ddbc8b2b is in state STARTED 2025-03-23 21:55:10.175705 | orchestrator | 2025-03-23 21:55:10 | INFO  | Task d4cd9e9e-91d0-4bcc-97d0-f8752e49094d is in state STARTED 2025-03-23 21:55:10.180696 | orchestrator | 2025-03-23 21:55:10 | INFO  | Task b951ac8d-90d0-4951-a13e-8711145c5d50 is in state STARTED 2025-03-23 21:55:10.181533 | orchestrator | 2025-03-23 21:55:10 | INFO  | Task 9ebfbe2e-f9b9-4380-a7d3-7cea905d1f6f is in state STARTED 2025-03-23 21:55:10.181589 | orchestrator | 2025-03-23 21:55:10 | INFO  | Task 366632e9-2a23-447a-9f4d-986716d415f3 is in state STARTED 2025-03-23 21:55:10.181649 | orchestrator | 2025-03-23 21:55:10 | INFO  | Wait 1 second(s) until the next check 2025-03-23 21:55:13.245694 | orchestrator | 2025-03-23 21:55:13 | INFO  | Task f712b796-38d0-4a95-8f80-410bcda57c10 is in state STARTED 2025-03-23 21:55:13.251757 | orchestrator | 2025-03-23 21:55:13 | INFO  | Task f3cd4eb5-5489-4d5d-9d13-5fb3ddbc8b2b is in state STARTED 2025-03-23 21:55:13.253401 | orchestrator | 2025-03-23 21:55:13 | INFO  | Task d4cd9e9e-91d0-4bcc-97d0-f8752e49094d is in state STARTED 2025-03-23 21:55:13.253435 | orchestrator | 2025-03-23 21:55:13 | INFO  | Task b951ac8d-90d0-4951-a13e-8711145c5d50 is in state STARTED 2025-03-23 21:55:13.256394 | orchestrator | 2025-03-23 21:55:13 | INFO  | Task 9ebfbe2e-f9b9-4380-a7d3-7cea905d1f6f is in state STARTED 2025-03-23 21:55:13.259808 | orchestrator | 2025-03-23 21:55:13 | INFO  | Task 366632e9-2a23-447a-9f4d-986716d415f3 is in state STARTED 2025-03-23 21:55:16.368028 | orchestrator | 2025-03-23 21:55:13 | INFO  | Wait 1 second(s) until the next check 2025-03-23 21:55:16.368201 | orchestrator | 2025-03-23 21:55:16 | INFO  | Task f712b796-38d0-4a95-8f80-410bcda57c10 is in state STARTED 2025-03-23 21:55:16.377856 | orchestrator | 2025-03-23 21:55:16 | INFO  | Task f3cd4eb5-5489-4d5d-9d13-5fb3ddbc8b2b is in state STARTED 2025-03-23 21:55:16.377885 | orchestrator | 2025-03-23 21:55:16 | INFO  | Task d4cd9e9e-91d0-4bcc-97d0-f8752e49094d is in state STARTED 2025-03-23 21:55:16.377905 | orchestrator | 2025-03-23 21:55:16 | INFO  | Task b951ac8d-90d0-4951-a13e-8711145c5d50 is in state STARTED 2025-03-23 21:55:19.452033 | orchestrator | 2025-03-23 21:55:16 | INFO  | Task 9ebfbe2e-f9b9-4380-a7d3-7cea905d1f6f is in state STARTED 2025-03-23 21:55:19.452140 | orchestrator | 2025-03-23 21:55:16 | INFO  | Task 366632e9-2a23-447a-9f4d-986716d415f3 is in state STARTED 2025-03-23 21:55:19.452157 | orchestrator | 2025-03-23 21:55:16 | INFO  | Wait 1 second(s) until the next check 2025-03-23 21:55:19.452187 | orchestrator | 2025-03-23 21:55:19 | INFO  | Task f712b796-38d0-4a95-8f80-410bcda57c10 is in state STARTED 2025-03-23 21:55:19.461424 | orchestrator | 2025-03-23 21:55:19 | INFO  | Task f3cd4eb5-5489-4d5d-9d13-5fb3ddbc8b2b is in state STARTED 2025-03-23 21:55:19.465073 | orchestrator | 2025-03-23 21:55:19 | INFO  | Task d4cd9e9e-91d0-4bcc-97d0-f8752e49094d is in state STARTED 2025-03-23 21:55:19.467712 | orchestrator | 2025-03-23 21:55:19 | INFO  | Task b951ac8d-90d0-4951-a13e-8711145c5d50 is in state STARTED 2025-03-23 21:55:19.470756 | orchestrator | 2025-03-23 21:55:19 | INFO  | Task 9ebfbe2e-f9b9-4380-a7d3-7cea905d1f6f is in state STARTED 
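The repeated state messages come from the wait step of the nutshell deployment: the manager polls every background task, reports its state, and sleeps one second between rounds until each task reaches SUCCESS. A minimal sketch of that polling pattern; get_task_state() is a hypothetical placeholder supplied by the caller, not an actual osism API:

    #!/usr/bin/env python3
    # Illustrative sketch of the polling loop visible in the log: check each
    # background task, print its state, and sleep between rounds until every
    # task has finished. get_task_state() is a hypothetical callable and is not
    # part of the osism tooling.
    import time


    def wait_for_tasks(task_ids, get_task_state, interval=1):
        pending = set(task_ids)
        while pending:
            for task_id in sorted(pending):
                state = get_task_state(task_id)
                print(f"Task {task_id} is in state {state}")
                if state == "SUCCESS":
                    pending.discard(task_id)
            if pending:
                print(f"Wait {interval} second(s) until the next check")
                time.sleep(interval)

With a stub such as get_task_state = lambda task_id: "SUCCESS" the loop ends after one round; in the job it keeps cycling until the long-running service deployments report SUCCESS, which is why the same task IDs reappear for many minutes.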
2025-03-23 21:55:19.472651 | orchestrator | 2025-03-23 21:55:19 | INFO  | Task 366632e9-2a23-447a-9f4d-986716d415f3 is in state STARTED 2025-03-23 21:55:19.472902 | orchestrator | 2025-03-23 21:55:19 | INFO  | Wait 1 second(s) until the next check 2025-03-23 21:55:22.588007 | orchestrator | 2025-03-23 21:55:22 | INFO  | Task f712b796-38d0-4a95-8f80-410bcda57c10 is in state STARTED 2025-03-23 21:55:22.590165 | orchestrator | 2025-03-23 21:55:22 | INFO  | Task f3cd4eb5-5489-4d5d-9d13-5fb3ddbc8b2b is in state STARTED 2025-03-23 21:55:22.590210 | orchestrator | 2025-03-23 21:55:22 | INFO  | Task d4cd9e9e-91d0-4bcc-97d0-f8752e49094d is in state STARTED 2025-03-23 21:55:22.590751 | orchestrator | 2025-03-23 21:55:22 | INFO  | Task b951ac8d-90d0-4951-a13e-8711145c5d50 is in state STARTED 2025-03-23 21:55:22.595845 | orchestrator | 2025-03-23 21:55:22 | INFO  | Task 9ebfbe2e-f9b9-4380-a7d3-7cea905d1f6f is in state STARTED 2025-03-23 21:55:25.736998 | orchestrator | 2025-03-23 21:55:22 | INFO  | Task 366632e9-2a23-447a-9f4d-986716d415f3 is in state STARTED 2025-03-23 21:55:25.737119 | orchestrator | 2025-03-23 21:55:22 | INFO  | Wait 1 second(s) until the next check 2025-03-23 21:55:25.737156 | orchestrator | 2025-03-23 21:55:25 | INFO  | Task f712b796-38d0-4a95-8f80-410bcda57c10 is in state STARTED 2025-03-23 21:55:25.742995 | orchestrator | 2025-03-23 21:55:25 | INFO  | Task f3cd4eb5-5489-4d5d-9d13-5fb3ddbc8b2b is in state STARTED 2025-03-23 21:55:25.749044 | orchestrator | 2025-03-23 21:55:25 | INFO  | Task d4cd9e9e-91d0-4bcc-97d0-f8752e49094d is in state STARTED 2025-03-23 21:55:25.749090 | orchestrator | 2025-03-23 21:55:25 | INFO  | Task b951ac8d-90d0-4951-a13e-8711145c5d50 is in state STARTED 2025-03-23 21:55:28.856548 | orchestrator | 2025-03-23 21:55:25 | INFO  | Task 9ebfbe2e-f9b9-4380-a7d3-7cea905d1f6f is in state STARTED 2025-03-23 21:55:28.856718 | orchestrator | 2025-03-23 21:55:25 | INFO  | Task 366632e9-2a23-447a-9f4d-986716d415f3 is in state STARTED 2025-03-23 21:55:28.856738 | orchestrator | 2025-03-23 21:55:25 | INFO  | Wait 1 second(s) until the next check 2025-03-23 21:55:28.856847 | orchestrator | 2025-03-23 21:55:28 | INFO  | Task f712b796-38d0-4a95-8f80-410bcda57c10 is in state STARTED 2025-03-23 21:55:28.856937 | orchestrator | 2025-03-23 21:55:28 | INFO  | Task f3cd4eb5-5489-4d5d-9d13-5fb3ddbc8b2b is in state STARTED 2025-03-23 21:55:28.865872 | orchestrator | 2025-03-23 21:55:28 | INFO  | Task d4cd9e9e-91d0-4bcc-97d0-f8752e49094d is in state STARTED 2025-03-23 21:55:28.869648 | orchestrator | 2025-03-23 21:55:28 | INFO  | Task b951ac8d-90d0-4951-a13e-8711145c5d50 is in state STARTED 2025-03-23 21:55:28.874624 | orchestrator | 2025-03-23 21:55:28 | INFO  | Task 9ebfbe2e-f9b9-4380-a7d3-7cea905d1f6f is in state STARTED 2025-03-23 21:55:28.875325 | orchestrator | 2025-03-23 21:55:28 | INFO  | Task 366632e9-2a23-447a-9f4d-986716d415f3 is in state STARTED 2025-03-23 21:55:28.875449 | orchestrator | 2025-03-23 21:55:28 | INFO  | Wait 1 second(s) until the next check 2025-03-23 21:55:31.973135 | orchestrator | 2025-03-23 21:55:31 | INFO  | Task f712b796-38d0-4a95-8f80-410bcda57c10 is in state STARTED 2025-03-23 21:55:31.980403 | orchestrator | 2025-03-23 21:55:31 | INFO  | Task f3cd4eb5-5489-4d5d-9d13-5fb3ddbc8b2b is in state STARTED 2025-03-23 21:55:31.997305 | orchestrator | 2025-03-23 21:55:31 | INFO  | Task d4cd9e9e-91d0-4bcc-97d0-f8752e49094d is in state STARTED 2025-03-23 21:55:32.007064 | orchestrator | 2025-03-23 21:55:32 | INFO  | Task 
b951ac8d-90d0-4951-a13e-8711145c5d50 is in state STARTED 2025-03-23 21:55:32.010580 | orchestrator | 2025-03-23 21:55:32 | INFO  | Task 9ebfbe2e-f9b9-4380-a7d3-7cea905d1f6f is in state SUCCESS 2025-03-23 21:55:32.019984 | orchestrator | 2025-03-23 21:55:32 | INFO  | Task 366632e9-2a23-447a-9f4d-986716d415f3 is in state STARTED 2025-03-23 21:55:35.127040 | orchestrator | 2025-03-23 21:55:32 | INFO  | Wait 1 second(s) until the next check 2025-03-23 21:55:35.127180 | orchestrator | 2025-03-23 21:55:35 | INFO  | Task f712b796-38d0-4a95-8f80-410bcda57c10 is in state STARTED 2025-03-23 21:55:35.127457 | orchestrator | 2025-03-23 21:55:35 | INFO  | Task f3cd4eb5-5489-4d5d-9d13-5fb3ddbc8b2b is in state STARTED 2025-03-23 21:55:35.127496 | orchestrator | 2025-03-23 21:55:35 | INFO  | Task d4cd9e9e-91d0-4bcc-97d0-f8752e49094d is in state STARTED 2025-03-23 21:55:35.129440 | orchestrator | 2025-03-23 21:55:35 | INFO  | Task b951ac8d-90d0-4951-a13e-8711145c5d50 is in state STARTED 2025-03-23 21:55:35.132666 | orchestrator | 2025-03-23 21:55:35 | INFO  | Task 366632e9-2a23-447a-9f4d-986716d415f3 is in state STARTED 2025-03-23 21:55:38.211374 | orchestrator | 2025-03-23 21:55:35 | INFO  | Wait 1 second(s) until the next check 2025-03-23 21:55:38.211523 | orchestrator | 2025-03-23 21:55:38 | INFO  | Task f712b796-38d0-4a95-8f80-410bcda57c10 is in state STARTED 2025-03-23 21:55:38.218354 | orchestrator | 2025-03-23 21:55:38 | INFO  | Task f3cd4eb5-5489-4d5d-9d13-5fb3ddbc8b2b is in state STARTED 2025-03-23 21:55:38.225459 | orchestrator | 2025-03-23 21:55:38 | INFO  | Task d4cd9e9e-91d0-4bcc-97d0-f8752e49094d is in state STARTED 2025-03-23 21:55:38.225510 | orchestrator | 2025-03-23 21:55:38 | INFO  | Task b951ac8d-90d0-4951-a13e-8711145c5d50 is in state STARTED 2025-03-23 21:55:38.228238 | orchestrator | 2025-03-23 21:55:38 | INFO  | Task 366632e9-2a23-447a-9f4d-986716d415f3 is in state STARTED 2025-03-23 21:55:41.301120 | orchestrator | 2025-03-23 21:55:38 | INFO  | Wait 1 second(s) until the next check 2025-03-23 21:55:41.301252 | orchestrator | 2025-03-23 21:55:41 | INFO  | Task f712b796-38d0-4a95-8f80-410bcda57c10 is in state STARTED 2025-03-23 21:55:41.306388 | orchestrator | 2025-03-23 21:55:41 | INFO  | Task f3cd4eb5-5489-4d5d-9d13-5fb3ddbc8b2b is in state STARTED 2025-03-23 21:55:41.311165 | orchestrator | 2025-03-23 21:55:41 | INFO  | Task d4cd9e9e-91d0-4bcc-97d0-f8752e49094d is in state STARTED 2025-03-23 21:55:41.316243 | orchestrator | 2025-03-23 21:55:41 | INFO  | Task b951ac8d-90d0-4951-a13e-8711145c5d50 is in state STARTED 2025-03-23 21:55:41.319052 | orchestrator | 2025-03-23 21:55:41 | INFO  | Task 366632e9-2a23-447a-9f4d-986716d415f3 is in state STARTED 2025-03-23 21:55:41.319200 | orchestrator | 2025-03-23 21:55:41 | INFO  | Wait 1 second(s) until the next check 2025-03-23 21:55:44.393532 | orchestrator | 2025-03-23 21:55:44 | INFO  | Task f712b796-38d0-4a95-8f80-410bcda57c10 is in state STARTED 2025-03-23 21:55:44.403940 | orchestrator | 2025-03-23 21:55:44 | INFO  | Task f3cd4eb5-5489-4d5d-9d13-5fb3ddbc8b2b is in state STARTED 2025-03-23 21:55:44.404261 | orchestrator | 2025-03-23 21:55:44 | INFO  | Task d4cd9e9e-91d0-4bcc-97d0-f8752e49094d is in state STARTED 2025-03-23 21:55:44.404287 | orchestrator | 2025-03-23 21:55:44 | INFO  | Task b951ac8d-90d0-4951-a13e-8711145c5d50 is in state STARTED 2025-03-23 21:55:44.404306 | orchestrator | 2025-03-23 21:55:44 | INFO  | Task 366632e9-2a23-447a-9f4d-986716d415f3 is in state STARTED 2025-03-23 21:55:44.407090 | orchestrator | 2025-03-23 
21:55:44 | INFO  | Wait 1 second(s) until the next check 2025-03-23 21:55:47.487224 | orchestrator | 2025-03-23 21:55:47 | INFO  | Task f712b796-38d0-4a95-8f80-410bcda57c10 is in state STARTED 2025-03-23 21:55:47.495234 | orchestrator | 2025-03-23 21:55:47 | INFO  | Task f3cd4eb5-5489-4d5d-9d13-5fb3ddbc8b2b is in state STARTED 2025-03-23 21:55:47.498403 | orchestrator | 2025-03-23 21:55:47 | INFO  | Task d4cd9e9e-91d0-4bcc-97d0-f8752e49094d is in state STARTED 2025-03-23 21:55:47.506164 | orchestrator | 2025-03-23 21:55:47 | INFO  | Task b951ac8d-90d0-4951-a13e-8711145c5d50 is in state STARTED 2025-03-23 21:55:47.509223 | orchestrator | 2025-03-23 21:55:47 | INFO  | Task 366632e9-2a23-447a-9f4d-986716d415f3 is in state STARTED 2025-03-23 21:55:50.567747 | orchestrator | 2025-03-23 21:55:47 | INFO  | Wait 1 second(s) until the next check 2025-03-23 21:55:50.567882 | orchestrator | 2025-03-23 21:55:50 | INFO  | Task f712b796-38d0-4a95-8f80-410bcda57c10 is in state STARTED 2025-03-23 21:55:50.568002 | orchestrator | 2025-03-23 21:55:50 | INFO  | Task f3cd4eb5-5489-4d5d-9d13-5fb3ddbc8b2b is in state STARTED 2025-03-23 21:55:50.568042 | orchestrator | 2025-03-23 21:55:50 | INFO  | Task d4cd9e9e-91d0-4bcc-97d0-f8752e49094d is in state STARTED 2025-03-23 21:55:50.568062 | orchestrator | 2025-03-23 21:55:50 | INFO  | Task b951ac8d-90d0-4951-a13e-8711145c5d50 is in state STARTED 2025-03-23 21:55:50.570390 | orchestrator | 2025-03-23 21:55:50 | INFO  | Task 366632e9-2a23-447a-9f4d-986716d415f3 is in state STARTED 2025-03-23 21:55:53.637139 | orchestrator | 2025-03-23 21:55:50 | INFO  | Wait 1 second(s) until the next check 2025-03-23 21:55:53.637283 | orchestrator | 2025-03-23 21:55:53 | INFO  | Task f712b796-38d0-4a95-8f80-410bcda57c10 is in state STARTED 2025-03-23 21:55:53.642135 | orchestrator | 2025-03-23 21:55:53 | INFO  | Task f3cd4eb5-5489-4d5d-9d13-5fb3ddbc8b2b is in state STARTED 2025-03-23 21:55:53.642465 | orchestrator | 2025-03-23 21:55:53 | INFO  | Task d4cd9e9e-91d0-4bcc-97d0-f8752e49094d is in state STARTED 2025-03-23 21:55:53.644505 | orchestrator | 2025-03-23 21:55:53.644533 | orchestrator | 2025-03-23 21:55:53.644548 | orchestrator | PLAY [Apply role homer] ******************************************************** 2025-03-23 21:55:53.644625 | orchestrator | 2025-03-23 21:55:53.644641 | orchestrator | TASK [osism.services.homer : Inform about new parameter homer_url_opensearch_dashboards] *** 2025-03-23 21:55:53.644656 | orchestrator | Sunday 23 March 2025 21:54:25 +0000 (0:00:00.454) 0:00:00.454 ********** 2025-03-23 21:55:53.644730 | orchestrator | ok: [testbed-manager] => { 2025-03-23 21:55:53.644748 | orchestrator |  "msg": "The support for the homer_url_kibana has been removed. Please use the homer_url_opensearch_dashboards parameter." 
2025-03-23 21:55:53.644765 | orchestrator | } 2025-03-23 21:55:53.644780 | orchestrator | 2025-03-23 21:55:53.644794 | orchestrator | TASK [osism.services.homer : Create traefik external network] ****************** 2025-03-23 21:55:53.644808 | orchestrator | Sunday 23 March 2025 21:54:26 +0000 (0:00:00.500) 0:00:00.954 ********** 2025-03-23 21:55:53.644822 | orchestrator | ok: [testbed-manager] 2025-03-23 21:55:53.644838 | orchestrator | 2025-03-23 21:55:53.644852 | orchestrator | TASK [osism.services.homer : Create required directories] ********************** 2025-03-23 21:55:53.644866 | orchestrator | Sunday 23 March 2025 21:54:28 +0000 (0:00:01.928) 0:00:02.883 ********** 2025-03-23 21:55:53.644881 | orchestrator | changed: [testbed-manager] => (item=/opt/homer/configuration) 2025-03-23 21:55:53.644895 | orchestrator | ok: [testbed-manager] => (item=/opt/homer) 2025-03-23 21:55:53.644909 | orchestrator | 2025-03-23 21:55:53.644924 | orchestrator | TASK [osism.services.homer : Copy config.yml configuration file] *************** 2025-03-23 21:55:53.644938 | orchestrator | Sunday 23 March 2025 21:54:29 +0000 (0:00:01.391) 0:00:04.275 ********** 2025-03-23 21:55:53.644952 | orchestrator | changed: [testbed-manager] 2025-03-23 21:55:53.644967 | orchestrator | 2025-03-23 21:55:53.644981 | orchestrator | TASK [osism.services.homer : Copy docker-compose.yml file] ********************* 2025-03-23 21:55:53.644995 | orchestrator | Sunday 23 March 2025 21:54:33 +0000 (0:00:03.793) 0:00:08.069 ********** 2025-03-23 21:55:53.645010 | orchestrator | changed: [testbed-manager] 2025-03-23 21:55:53.645024 | orchestrator | 2025-03-23 21:55:53.645038 | orchestrator | TASK [osism.services.homer : Manage homer service] ***************************** 2025-03-23 21:55:53.645052 | orchestrator | Sunday 23 March 2025 21:54:35 +0000 (0:00:02.071) 0:00:10.140 ********** 2025-03-23 21:55:53.645066 | orchestrator | FAILED - RETRYING: [testbed-manager]: Manage homer service (10 retries left). 
2025-03-23 21:55:53.645081 | orchestrator | ok: [testbed-manager] 2025-03-23 21:55:53.645095 | orchestrator | 2025-03-23 21:55:53.645110 | orchestrator | RUNNING HANDLER [osism.services.homer : Restart homer service] ***************** 2025-03-23 21:55:53.645125 | orchestrator | Sunday 23 March 2025 21:55:01 +0000 (0:00:26.578) 0:00:36.719 ********** 2025-03-23 21:55:53.645141 | orchestrator | changed: [testbed-manager] 2025-03-23 21:55:53.645156 | orchestrator | 2025-03-23 21:55:53.645171 | orchestrator | PLAY RECAP ********************************************************************* 2025-03-23 21:55:53.645187 | orchestrator | testbed-manager : ok=7  changed=4  unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 2025-03-23 21:55:53.645204 | orchestrator | 2025-03-23 21:55:53.645219 | orchestrator | Sunday 23 March 2025 21:55:04 +0000 (0:00:02.880) 0:00:39.599 ********** 2025-03-23 21:55:53.645234 | orchestrator | =============================================================================== 2025-03-23 21:55:53.645250 | orchestrator | osism.services.homer : Manage homer service ---------------------------- 26.58s 2025-03-23 21:55:53.645265 | orchestrator | osism.services.homer : Copy config.yml configuration file --------------- 3.79s 2025-03-23 21:55:53.645280 | orchestrator | osism.services.homer : Restart homer service ---------------------------- 2.88s 2025-03-23 21:55:53.645295 | orchestrator | osism.services.homer : Copy docker-compose.yml file --------------------- 2.07s 2025-03-23 21:55:53.645310 | orchestrator | osism.services.homer : Create traefik external network ------------------ 1.93s 2025-03-23 21:55:53.645331 | orchestrator | osism.services.homer : Create required directories ---------------------- 1.39s 2025-03-23 21:55:53.645347 | orchestrator | osism.services.homer : Inform about new parameter homer_url_opensearch_dashboards --- 0.50s 2025-03-23 21:55:53.645362 | orchestrator | 2025-03-23 21:55:53.645378 | orchestrator | 2025-03-23 21:55:53.645393 | orchestrator | PLAY [Apply role openstackclient] ********************************************** 2025-03-23 21:55:53.645408 | orchestrator | 2025-03-23 21:55:53.645423 | orchestrator | TASK [osism.services.openstackclient : Include tasks] ************************** 2025-03-23 21:55:53.645438 | orchestrator | Sunday 23 March 2025 21:54:26 +0000 (0:00:00.285) 0:00:00.285 ********** 2025-03-23 21:55:53.645461 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/services/roles/openstackclient/tasks/container-Debian-family.yml for testbed-manager 2025-03-23 21:55:53.645478 | orchestrator | 2025-03-23 21:55:53.645493 | orchestrator | TASK [osism.services.openstackclient : Create required directories] ************ 2025-03-23 21:55:53.645508 | orchestrator | Sunday 23 March 2025 21:54:27 +0000 (0:00:00.488) 0:00:00.774 ********** 2025-03-23 21:55:53.645522 | orchestrator | changed: [testbed-manager] => (item=/opt/configuration/environments/openstack) 2025-03-23 21:55:53.645536 | orchestrator | changed: [testbed-manager] => (item=/opt/openstackclient/data) 2025-03-23 21:55:53.645550 | orchestrator | ok: [testbed-manager] => (item=/opt/openstackclient) 2025-03-23 21:55:53.645588 | orchestrator | 2025-03-23 21:55:53.645603 | orchestrator | TASK [osism.services.openstackclient : Copy docker-compose.yml file] *********** 2025-03-23 21:55:53.645617 | orchestrator | Sunday 23 March 2025 21:54:29 +0000 (0:00:02.320) 0:00:03.094 ********** 2025-03-23 21:55:53.645632 | orchestrator | changed: [testbed-manager] 
2025-03-23 21:55:53.645646 | orchestrator | 2025-03-23 21:55:53.645660 | orchestrator | TASK [osism.services.openstackclient : Manage openstackclient service] ********* 2025-03-23 21:55:53.645675 | orchestrator | Sunday 23 March 2025 21:54:32 +0000 (0:00:02.694) 0:00:05.789 ********** 2025-03-23 21:55:53.645689 | orchestrator | FAILED - RETRYING: [testbed-manager]: Manage openstackclient service (10 retries left). 2025-03-23 21:55:53.645704 | orchestrator | ok: [testbed-manager] 2025-03-23 21:55:53.645718 | orchestrator | 2025-03-23 21:55:53.645744 | orchestrator | TASK [osism.services.openstackclient : Copy openstack wrapper script] ********** 2025-03-23 21:55:53.645779 | orchestrator | Sunday 23 March 2025 21:55:11 +0000 (0:00:39.324) 0:00:45.113 ********** 2025-03-23 21:55:53.645795 | orchestrator | changed: [testbed-manager] 2025-03-23 21:55:53.645809 | orchestrator | 2025-03-23 21:55:53.645823 | orchestrator | TASK [osism.services.openstackclient : Remove ospurge wrapper script] ********** 2025-03-23 21:55:53.645837 | orchestrator | Sunday 23 March 2025 21:55:13 +0000 (0:00:02.062) 0:00:47.175 ********** 2025-03-23 21:55:53.645851 | orchestrator | ok: [testbed-manager] 2025-03-23 21:55:53.645865 | orchestrator | 2025-03-23 21:55:53.645879 | orchestrator | RUNNING HANDLER [osism.services.openstackclient : Restart openstackclient service] *** 2025-03-23 21:55:53.645892 | orchestrator | Sunday 23 March 2025 21:55:15 +0000 (0:00:01.375) 0:00:48.551 ********** 2025-03-23 21:55:53.645906 | orchestrator | changed: [testbed-manager] 2025-03-23 21:55:53.645920 | orchestrator | 2025-03-23 21:55:53.645934 | orchestrator | RUNNING HANDLER [osism.services.openstackclient : Ensure that all containers are up] *** 2025-03-23 21:55:53.645948 | orchestrator | Sunday 23 March 2025 21:55:18 +0000 (0:00:03.530) 0:00:52.082 ********** 2025-03-23 21:55:53.645962 | orchestrator | changed: [testbed-manager] 2025-03-23 21:55:53.645976 | orchestrator | 2025-03-23 21:55:53.645990 | orchestrator | RUNNING HANDLER [osism.services.openstackclient : Wait for an healthy service] *** 2025-03-23 21:55:53.646004 | orchestrator | Sunday 23 March 2025 21:55:21 +0000 (0:00:02.450) 0:00:54.532 ********** 2025-03-23 21:55:53.646066 | orchestrator | changed: [testbed-manager] 2025-03-23 21:55:53.646085 | orchestrator | 2025-03-23 21:55:53.646100 | orchestrator | RUNNING HANDLER [osism.services.openstackclient : Copy bash completion script] *** 2025-03-23 21:55:53.646114 | orchestrator | Sunday 23 March 2025 21:55:22 +0000 (0:00:01.776) 0:00:56.312 ********** 2025-03-23 21:55:53.646128 | orchestrator | ok: [testbed-manager] 2025-03-23 21:55:53.646142 | orchestrator | 2025-03-23 21:55:53.646157 | orchestrator | PLAY RECAP ********************************************************************* 2025-03-23 21:55:53.646170 | orchestrator | testbed-manager : ok=10  changed=6  unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 2025-03-23 21:55:53.646185 | orchestrator | 2025-03-23 21:55:53.646199 | orchestrator | Sunday 23 March 2025 21:55:23 +0000 (0:00:00.823) 0:00:57.136 ********** 2025-03-23 21:55:53.646213 | orchestrator | =============================================================================== 2025-03-23 21:55:53.646235 | orchestrator | osism.services.openstackclient : Manage openstackclient service -------- 39.32s 2025-03-23 21:55:53.646250 | orchestrator | osism.services.openstackclient : Restart openstackclient service -------- 3.53s 2025-03-23 21:55:53.646264 | orchestrator | osism.services.openstackclient : Copy 
docker-compose.yml file ----------- 2.70s 2025-03-23 21:55:53.646278 | orchestrator | osism.services.openstackclient : Ensure that all containers are up ------ 2.45s 2025-03-23 21:55:53.646292 | orchestrator | osism.services.openstackclient : Create required directories ------------ 2.32s 2025-03-23 21:55:53.646313 | orchestrator | osism.services.openstackclient : Copy openstack wrapper script ---------- 2.06s 2025-03-23 21:55:53.646327 | orchestrator | osism.services.openstackclient : Wait for an healthy service ------------ 1.78s 2025-03-23 21:55:53.646342 | orchestrator | osism.services.openstackclient : Remove ospurge wrapper script ---------- 1.38s 2025-03-23 21:55:53.646356 | orchestrator | osism.services.openstackclient : Copy bash completion script ------------ 0.82s 2025-03-23 21:55:53.646370 | orchestrator | osism.services.openstackclient : Include tasks -------------------------- 0.49s 2025-03-23 21:55:53.646384 | orchestrator | 2025-03-23 21:55:53.646403 | orchestrator | 2025-03-23 21:55:53 | INFO  | Task b951ac8d-90d0-4951-a13e-8711145c5d50 is in state SUCCESS 2025-03-23 21:55:53.646486 | orchestrator | 2025-03-23 21:55:53 | INFO  | Task 366632e9-2a23-447a-9f4d-986716d415f3 is in state STARTED 2025-03-23 21:55:56.728531 | orchestrator | 2025-03-23 21:55:53 | INFO  | Wait 1 second(s) until the next check 2025-03-23 21:55:56.728723 | orchestrator | 2025-03-23 21:55:56 | INFO  | Task f712b796-38d0-4a95-8f80-410bcda57c10 is in state STARTED 2025-03-23 21:55:56.728824 | orchestrator | 2025-03-23 21:55:56 | INFO  | Task f3cd4eb5-5489-4d5d-9d13-5fb3ddbc8b2b is in state STARTED 2025-03-23 21:55:56.731347 | orchestrator | 2025-03-23 21:55:56 | INFO  | Task d4cd9e9e-91d0-4bcc-97d0-f8752e49094d is in state STARTED 2025-03-23 21:55:56.732359 | orchestrator | 2025-03-23 21:55:56 | INFO  | Task 366632e9-2a23-447a-9f4d-986716d415f3 is in state STARTED 2025-03-23 21:55:59.813506 | orchestrator | 2025-03-23 21:55:56 | INFO  | Wait 1 second(s) until the next check 2025-03-23 21:55:59.813677 | orchestrator | 2025-03-23 21:55:59 | INFO  | Task f712b796-38d0-4a95-8f80-410bcda57c10 is in state STARTED 2025-03-23 21:55:59.821267 | orchestrator | 2025-03-23 21:55:59.821307 | orchestrator | PLAY [Apply role phpmyadmin] *************************************************** 2025-03-23 21:55:59.821324 | orchestrator | 2025-03-23 21:55:59.821340 | orchestrator | TASK [osism.services.phpmyadmin : Create traefik external network] ************* 2025-03-23 21:55:59.821356 | orchestrator | Sunday 23 March 2025 21:54:49 +0000 (0:00:00.515) 0:00:00.515 ********** 2025-03-23 21:55:59.821372 | orchestrator | ok: [testbed-manager] 2025-03-23 21:55:59.821389 | orchestrator | 2025-03-23 21:55:59.821405 | orchestrator | TASK [osism.services.phpmyadmin : Create required directories] ***************** 2025-03-23 21:55:59.821420 | orchestrator | Sunday 23 March 2025 21:54:51 +0000 (0:00:02.155) 0:00:02.671 ********** 2025-03-23 21:55:59.821436 | orchestrator | changed: [testbed-manager] => (item=/opt/phpmyadmin) 2025-03-23 21:55:59.821452 | orchestrator | 2025-03-23 21:55:59.821467 | orchestrator | TASK [osism.services.phpmyadmin : Copy docker-compose.yml file] **************** 2025-03-23 21:55:59.821482 | orchestrator | Sunday 23 March 2025 21:54:53 +0000 (0:00:02.012) 0:00:04.683 ********** 2025-03-23 21:55:59.821497 | orchestrator | changed: [testbed-manager] 2025-03-23 21:55:59.821512 | orchestrator | 2025-03-23 21:55:59.821527 | orchestrator | TASK [osism.services.phpmyadmin : Manage phpmyadmin service] 
******************* 2025-03-23 21:55:59.821542 | orchestrator | Sunday 23 March 2025 21:54:57 +0000 (0:00:03.876) 0:00:08.559 ********** 2025-03-23 21:55:59.821587 | orchestrator | FAILED - RETRYING: [testbed-manager]: Manage phpmyadmin service (10 retries left). 2025-03-23 21:55:59.821603 | orchestrator | ok: [testbed-manager] 2025-03-23 21:55:59.821617 | orchestrator | 2025-03-23 21:55:59.821631 | orchestrator | RUNNING HANDLER [osism.services.phpmyadmin : Restart phpmyadmin service] ******* 2025-03-23 21:55:59.821670 | orchestrator | Sunday 23 March 2025 21:55:46 +0000 (0:00:48.990) 0:00:57.550 ********** 2025-03-23 21:55:59.821684 | orchestrator | changed: [testbed-manager] 2025-03-23 21:55:59.821698 | orchestrator | 2025-03-23 21:55:59.821712 | orchestrator | PLAY RECAP ********************************************************************* 2025-03-23 21:55:59.821727 | orchestrator | testbed-manager : ok=5  changed=3  unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 2025-03-23 21:55:59.821742 | orchestrator | 2025-03-23 21:55:59.821756 | orchestrator | Sunday 23 March 2025 21:55:49 +0000 (0:00:03.672) 0:01:01.223 ********** 2025-03-23 21:55:59.821770 | orchestrator | =============================================================================== 2025-03-23 21:55:59.821784 | orchestrator | osism.services.phpmyadmin : Manage phpmyadmin service ------------------ 48.99s 2025-03-23 21:55:59.821798 | orchestrator | osism.services.phpmyadmin : Copy docker-compose.yml file ---------------- 3.87s 2025-03-23 21:55:59.821812 | orchestrator | osism.services.phpmyadmin : Restart phpmyadmin service ------------------ 3.67s 2025-03-23 21:55:59.821826 | orchestrator | osism.services.phpmyadmin : Create traefik external network ------------- 2.16s 2025-03-23 21:55:59.821840 | orchestrator | osism.services.phpmyadmin : Create required directories ----------------- 2.01s 2025-03-23 21:55:59.821854 | orchestrator | 2025-03-23 21:55:59.821868 | orchestrator | 2025-03-23 21:55:59.821884 | orchestrator | PLAY [Group hosts based on configuration] ************************************** 2025-03-23 21:55:59.821899 | orchestrator | 2025-03-23 21:55:59.821913 | orchestrator | TASK [Group hosts based on enabled services] *********************************** 2025-03-23 21:55:59.821928 | orchestrator | Sunday 23 March 2025 21:54:26 +0000 (0:00:00.605) 0:00:00.605 ********** 2025-03-23 21:55:59.821943 | orchestrator | changed: [testbed-manager] => (item=enable_netdata_True) 2025-03-23 21:55:59.821958 | orchestrator | changed: [testbed-node-0] => (item=enable_netdata_True) 2025-03-23 21:55:59.821973 | orchestrator | changed: [testbed-node-1] => (item=enable_netdata_True) 2025-03-23 21:55:59.821989 | orchestrator | changed: [testbed-node-2] => (item=enable_netdata_True) 2025-03-23 21:55:59.822004 | orchestrator | changed: [testbed-node-3] => (item=enable_netdata_True) 2025-03-23 21:55:59.822066 | orchestrator | changed: [testbed-node-4] => (item=enable_netdata_True) 2025-03-23 21:55:59.822085 | orchestrator | changed: [testbed-node-5] => (item=enable_netdata_True) 2025-03-23 21:55:59.822101 | orchestrator | 2025-03-23 21:55:59.822117 | orchestrator | PLAY [Apply role netdata] ****************************************************** 2025-03-23 21:55:59.822132 | orchestrator | 2025-03-23 21:55:59.822147 | orchestrator | TASK [osism.services.netdata : Include distribution specific install tasks] **** 2025-03-23 21:55:59.822162 | orchestrator | Sunday 23 March 2025 21:54:28 +0000 (0:00:01.861) 0:00:02.466 ********** 
2025-03-23 21:55:59.822193 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/services/roles/netdata/tasks/install-Debian-family.yml for testbed-manager, testbed-node-0, testbed-node-1, testbed-node-2, testbed-node-3, testbed-node-4, testbed-node-5 2025-03-23 21:55:59.822212 | orchestrator | 2025-03-23 21:55:59.822233 | orchestrator | TASK [osism.services.netdata : Remove old architecture-dependent repository] *** 2025-03-23 21:55:59.822249 | orchestrator | Sunday 23 March 2025 21:54:32 +0000 (0:00:03.680) 0:00:06.147 ********** 2025-03-23 21:55:59.822265 | orchestrator | ok: [testbed-node-0] 2025-03-23 21:55:59.822280 | orchestrator | ok: [testbed-node-1] 2025-03-23 21:55:59.822294 | orchestrator | ok: [testbed-manager] 2025-03-23 21:55:59.822308 | orchestrator | ok: [testbed-node-2] 2025-03-23 21:55:59.822322 | orchestrator | ok: [testbed-node-3] 2025-03-23 21:55:59.822336 | orchestrator | ok: [testbed-node-4] 2025-03-23 21:55:59.822350 | orchestrator | ok: [testbed-node-5] 2025-03-23 21:55:59.822364 | orchestrator | 2025-03-23 21:55:59.822378 | orchestrator | TASK [osism.services.netdata : Install apt-transport-https package] ************ 2025-03-23 21:55:59.822391 | orchestrator | Sunday 23 March 2025 21:54:36 +0000 (0:00:03.696) 0:00:09.844 ********** 2025-03-23 21:55:59.822405 | orchestrator | ok: [testbed-node-0] 2025-03-23 21:55:59.822419 | orchestrator | ok: [testbed-node-1] 2025-03-23 21:55:59.822442 | orchestrator | ok: [testbed-manager] 2025-03-23 21:55:59.822456 | orchestrator | ok: [testbed-node-2] 2025-03-23 21:55:59.822470 | orchestrator | ok: [testbed-node-3] 2025-03-23 21:55:59.822484 | orchestrator | ok: [testbed-node-4] 2025-03-23 21:55:59.822510 | orchestrator | ok: [testbed-node-5] 2025-03-23 21:55:59.822524 | orchestrator | 2025-03-23 21:55:59.822539 | orchestrator | TASK [osism.services.netdata : Add repository gpg key] ************************* 2025-03-23 21:55:59.822586 | orchestrator | Sunday 23 March 2025 21:54:41 +0000 (0:00:05.809) 0:00:15.654 ********** 2025-03-23 21:55:59.822604 | orchestrator | changed: [testbed-manager] 2025-03-23 21:55:59.822618 | orchestrator | changed: [testbed-node-1] 2025-03-23 21:55:59.822632 | orchestrator | changed: [testbed-node-2] 2025-03-23 21:55:59.822646 | orchestrator | changed: [testbed-node-0] 2025-03-23 21:55:59.822660 | orchestrator | changed: [testbed-node-3] 2025-03-23 21:55:59.822674 | orchestrator | changed: [testbed-node-4] 2025-03-23 21:55:59.822688 | orchestrator | changed: [testbed-node-5] 2025-03-23 21:55:59.822701 | orchestrator | 2025-03-23 21:55:59.822715 | orchestrator | TASK [osism.services.netdata : Add repository] ********************************* 2025-03-23 21:55:59.822729 | orchestrator | Sunday 23 March 2025 21:54:44 +0000 (0:00:02.710) 0:00:18.365 ********** 2025-03-23 21:55:59.822743 | orchestrator | changed: [testbed-node-0] 2025-03-23 21:55:59.822757 | orchestrator | changed: [testbed-node-1] 2025-03-23 21:55:59.822771 | orchestrator | changed: [testbed-node-2] 2025-03-23 21:55:59.822785 | orchestrator | changed: [testbed-node-4] 2025-03-23 21:55:59.822799 | orchestrator | changed: [testbed-node-3] 2025-03-23 21:55:59.822813 | orchestrator | changed: [testbed-node-5] 2025-03-23 21:55:59.822827 | orchestrator | changed: [testbed-manager] 2025-03-23 21:55:59.822841 | orchestrator | 2025-03-23 21:55:59.822855 | orchestrator | TASK [osism.services.netdata : Install package netdata] ************************ 2025-03-23 21:55:59.822869 | orchestrator | Sunday 23 March 2025 
21:54:55 +0000 (0:00:10.535) 0:00:28.900 ********** 2025-03-23 21:55:59.822883 | orchestrator | changed: [testbed-node-1] 2025-03-23 21:55:59.822896 | orchestrator | changed: [testbed-node-0] 2025-03-23 21:55:59.822910 | orchestrator | changed: [testbed-node-2] 2025-03-23 21:55:59.822924 | orchestrator | changed: [testbed-node-3] 2025-03-23 21:55:59.822938 | orchestrator | changed: [testbed-node-4] 2025-03-23 21:55:59.822952 | orchestrator | changed: [testbed-node-5] 2025-03-23 21:55:59.822966 | orchestrator | changed: [testbed-manager] 2025-03-23 21:55:59.822980 | orchestrator | 2025-03-23 21:55:59.822994 | orchestrator | TASK [osism.services.netdata : Include config tasks] *************************** 2025-03-23 21:55:59.823008 | orchestrator | Sunday 23 March 2025 21:55:14 +0000 (0:00:19.600) 0:00:48.501 ********** 2025-03-23 21:55:59.823023 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/services/roles/netdata/tasks/config.yml for testbed-manager, testbed-node-0, testbed-node-1, testbed-node-2, testbed-node-3, testbed-node-4, testbed-node-5 2025-03-23 21:55:59.823042 | orchestrator | 2025-03-23 21:55:59.823056 | orchestrator | TASK [osism.services.netdata : Copy configuration files] *********************** 2025-03-23 21:55:59.823070 | orchestrator | Sunday 23 March 2025 21:55:18 +0000 (0:00:03.592) 0:00:52.094 ********** 2025-03-23 21:55:59.823084 | orchestrator | changed: [testbed-node-0] => (item=netdata.conf) 2025-03-23 21:55:59.823098 | orchestrator | changed: [testbed-manager] => (item=netdata.conf) 2025-03-23 21:55:59.823112 | orchestrator | changed: [testbed-node-2] => (item=netdata.conf) 2025-03-23 21:55:59.823126 | orchestrator | changed: [testbed-node-1] => (item=netdata.conf) 2025-03-23 21:55:59.823140 | orchestrator | changed: [testbed-node-4] => (item=netdata.conf) 2025-03-23 21:55:59.823154 | orchestrator | changed: [testbed-node-3] => (item=netdata.conf) 2025-03-23 21:55:59.823169 | orchestrator | changed: [testbed-node-5] => (item=netdata.conf) 2025-03-23 21:55:59.823183 | orchestrator | changed: [testbed-node-0] => (item=stream.conf) 2025-03-23 21:55:59.823197 | orchestrator | changed: [testbed-node-5] => (item=stream.conf) 2025-03-23 21:55:59.823211 | orchestrator | changed: [testbed-node-4] => (item=stream.conf) 2025-03-23 21:55:59.823232 | orchestrator | changed: [testbed-manager] => (item=stream.conf) 2025-03-23 21:55:59.823246 | orchestrator | changed: [testbed-node-3] => (item=stream.conf) 2025-03-23 21:55:59.823260 | orchestrator | changed: [testbed-node-2] => (item=stream.conf) 2025-03-23 21:55:59.823274 | orchestrator | changed: [testbed-node-1] => (item=stream.conf) 2025-03-23 21:55:59.823287 | orchestrator | 2025-03-23 21:55:59.823301 | orchestrator | TASK [osism.services.netdata : Retrieve /etc/netdata/.opt-out-from-anonymous-statistics status] *** 2025-03-23 21:55:59.823317 | orchestrator | Sunday 23 March 2025 21:55:31 +0000 (0:00:13.234) 0:01:05.328 ********** 2025-03-23 21:55:59.823332 | orchestrator | ok: [testbed-manager] 2025-03-23 21:55:59.823346 | orchestrator | ok: [testbed-node-0] 2025-03-23 21:55:59.823360 | orchestrator | ok: [testbed-node-1] 2025-03-23 21:55:59.823374 | orchestrator | ok: [testbed-node-2] 2025-03-23 21:55:59.823388 | orchestrator | ok: [testbed-node-3] 2025-03-23 21:55:59.823402 | orchestrator | ok: [testbed-node-4] 2025-03-23 21:55:59.823416 | orchestrator | ok: [testbed-node-5] 2025-03-23 21:55:59.823429 | orchestrator | 2025-03-23 21:55:59.823444 | orchestrator | TASK 
[osism.services.netdata : Opt out from anonymous statistics] ************** 2025-03-23 21:55:59.823458 | orchestrator | Sunday 23 March 2025 21:55:34 +0000 (0:00:03.051) 0:01:08.380 ********** 2025-03-23 21:55:59.823472 | orchestrator | changed: [testbed-node-0] 2025-03-23 21:55:59.823486 | orchestrator | changed: [testbed-manager] 2025-03-23 21:55:59.823500 | orchestrator | changed: [testbed-node-1] 2025-03-23 21:55:59.823514 | orchestrator | changed: [testbed-node-2] 2025-03-23 21:55:59.823528 | orchestrator | changed: [testbed-node-3] 2025-03-23 21:55:59.823542 | orchestrator | changed: [testbed-node-4] 2025-03-23 21:55:59.823572 | orchestrator | changed: [testbed-node-5] 2025-03-23 21:55:59.823587 | orchestrator | 2025-03-23 21:55:59.823601 | orchestrator | TASK [osism.services.netdata : Add netdata user to docker group] *************** 2025-03-23 21:55:59.823615 | orchestrator | Sunday 23 March 2025 21:55:37 +0000 (0:00:02.808) 0:01:11.189 ********** 2025-03-23 21:55:59.823629 | orchestrator | ok: [testbed-node-2] 2025-03-23 21:55:59.823643 | orchestrator | ok: [testbed-manager] 2025-03-23 21:55:59.823657 | orchestrator | ok: [testbed-node-1] 2025-03-23 21:55:59.823671 | orchestrator | ok: [testbed-node-0] 2025-03-23 21:55:59.823685 | orchestrator | ok: [testbed-node-3] 2025-03-23 21:55:59.823699 | orchestrator | ok: [testbed-node-4] 2025-03-23 21:55:59.823712 | orchestrator | ok: [testbed-node-5] 2025-03-23 21:55:59.823726 | orchestrator | 2025-03-23 21:55:59.823740 | orchestrator | TASK [osism.services.netdata : Manage service netdata] ************************* 2025-03-23 21:55:59.823754 | orchestrator | Sunday 23 March 2025 21:55:40 +0000 (0:00:02.860) 0:01:14.050 ********** 2025-03-23 21:55:59.823768 | orchestrator | ok: [testbed-node-1] 2025-03-23 21:55:59.823782 | orchestrator | ok: [testbed-node-0] 2025-03-23 21:55:59.823796 | orchestrator | ok: [testbed-node-3] 2025-03-23 21:55:59.823810 | orchestrator | ok: [testbed-manager] 2025-03-23 21:55:59.823823 | orchestrator | ok: [testbed-node-2] 2025-03-23 21:55:59.823844 | orchestrator | ok: [testbed-node-4] 2025-03-23 21:55:59.835511 | orchestrator | ok: [testbed-node-5] 2025-03-23 21:55:59.835708 | orchestrator | 2025-03-23 21:55:59.835732 | orchestrator | TASK [osism.services.netdata : Include host type specific tasks] *************** 2025-03-23 21:55:59.835748 | orchestrator | Sunday 23 March 2025 21:55:44 +0000 (0:00:04.386) 0:01:18.436 ********** 2025-03-23 21:55:59.835763 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/services/roles/netdata/tasks/server.yml for testbed-manager 2025-03-23 21:55:59.835779 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/services/roles/netdata/tasks/client.yml for testbed-node-0, testbed-node-1, testbed-node-2, testbed-node-3, testbed-node-4, testbed-node-5 2025-03-23 21:55:59.835794 | orchestrator | 2025-03-23 21:55:59.835809 | orchestrator | TASK [osism.services.netdata : Set sysctl vm.max_map_count parameter] ********** 2025-03-23 21:55:59.835823 | orchestrator | Sunday 23 March 2025 21:55:47 +0000 (0:00:02.730) 0:01:21.166 ********** 2025-03-23 21:55:59.835838 | orchestrator | changed: [testbed-manager] 2025-03-23 21:55:59.835880 | orchestrator | 2025-03-23 21:55:59.835895 | orchestrator | RUNNING HANDLER [osism.services.netdata : Restart service netdata] ************* 2025-03-23 21:55:59.835909 | orchestrator | Sunday 23 March 2025 21:55:51 +0000 (0:00:04.295) 0:01:25.462 ********** 2025-03-23 21:55:59.835923 | 
orchestrator | changed: [testbed-node-1] 2025-03-23 21:55:59.835954 | orchestrator | changed: [testbed-node-0] 2025-03-23 21:55:59.835970 | orchestrator | changed: [testbed-node-3] 2025-03-23 21:55:59.835984 | orchestrator | changed: [testbed-manager] 2025-03-23 21:55:59.835998 | orchestrator | changed: [testbed-node-2] 2025-03-23 21:55:59.836013 | orchestrator | changed: [testbed-node-5] 2025-03-23 21:55:59.836027 | orchestrator | changed: [testbed-node-4] 2025-03-23 21:55:59.836040 | orchestrator | 2025-03-23 21:55:59.836055 | orchestrator | PLAY RECAP ********************************************************************* 2025-03-23 21:55:59.836069 | orchestrator | testbed-manager : ok=16  changed=8  unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 2025-03-23 21:55:59.836085 | orchestrator | testbed-node-0 : ok=15  changed=7  unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 2025-03-23 21:55:59.836106 | orchestrator | testbed-node-1 : ok=15  changed=7  unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 2025-03-23 21:55:59.836120 | orchestrator | testbed-node-2 : ok=15  changed=7  unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 2025-03-23 21:55:59.836134 | orchestrator | testbed-node-3 : ok=15  changed=7  unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 2025-03-23 21:55:59.836148 | orchestrator | testbed-node-4 : ok=15  changed=7  unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 2025-03-23 21:55:59.836162 | orchestrator | testbed-node-5 : ok=15  changed=7  unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 2025-03-23 21:55:59.836176 | orchestrator | 2025-03-23 21:55:59.836190 | orchestrator | Sunday 23 March 2025 21:55:55 +0000 (0:00:03.938) 0:01:29.400 ********** 2025-03-23 21:55:59.836204 | orchestrator | =============================================================================== 2025-03-23 21:55:59.836219 | orchestrator | osism.services.netdata : Install package netdata ----------------------- 19.60s 2025-03-23 21:55:59.836233 | orchestrator | osism.services.netdata : Copy configuration files ---------------------- 13.23s 2025-03-23 21:55:59.836252 | orchestrator | osism.services.netdata : Add repository -------------------------------- 10.54s 2025-03-23 21:55:59.836266 | orchestrator | osism.services.netdata : Install apt-transport-https package ------------ 5.81s 2025-03-23 21:55:59.836280 | orchestrator | osism.services.netdata : Manage service netdata ------------------------- 4.39s 2025-03-23 21:55:59.836294 | orchestrator | osism.services.netdata : Set sysctl vm.max_map_count parameter ---------- 4.30s 2025-03-23 21:55:59.836308 | orchestrator | osism.services.netdata : Restart service netdata ------------------------ 3.94s 2025-03-23 21:55:59.836322 | orchestrator | osism.services.netdata : Remove old architecture-dependent repository --- 3.70s 2025-03-23 21:55:59.836336 | orchestrator | osism.services.netdata : Include distribution specific install tasks ---- 3.68s 2025-03-23 21:55:59.836350 | orchestrator | osism.services.netdata : Include config tasks --------------------------- 3.59s 2025-03-23 21:55:59.836364 | orchestrator | osism.services.netdata : Retrieve /etc/netdata/.opt-out-from-anonymous-statistics status --- 3.05s 2025-03-23 21:55:59.836378 | orchestrator | osism.services.netdata : Add netdata user to docker group --------------- 2.86s 2025-03-23 21:55:59.836392 | orchestrator | osism.services.netdata : Opt out from anonymous statistics -------------- 2.81s 2025-03-23 21:55:59.836406 | orchestrator | osism.services.netdata : Include host 
type specific tasks --------------- 2.73s 2025-03-23 21:55:59.836420 | orchestrator | osism.services.netdata : Add repository gpg key ------------------------- 2.71s 2025-03-23 21:55:59.836434 | orchestrator | Group hosts based on enabled services ----------------------------------- 1.86s 2025-03-23 21:55:59.836456 | orchestrator | 2025-03-23 21:55:59 | INFO  | Task f3cd4eb5-5489-4d5d-9d13-5fb3ddbc8b2b is in state STARTED 2025-03-23 21:55:59.836470 | orchestrator | 2025-03-23 21:55:59 | INFO  | Task d4cd9e9e-91d0-4bcc-97d0-f8752e49094d is in state SUCCESS 2025-03-23 21:55:59.836500 | orchestrator | 2025-03-23 21:55:59 | INFO  | Task 366632e9-2a23-447a-9f4d-986716d415f3 is in state STARTED 2025-03-23 21:56:02.889815 | orchestrator | 2025-03-23 21:55:59 | INFO  | Wait 1 second(s) until the next check 2025-03-23 21:56:02.889944 | orchestrator | 2025-03-23 21:56:02 | INFO  | Task f712b796-38d0-4a95-8f80-410bcda57c10 is in state STARTED 2025-03-23 21:56:02.895595 | orchestrator | 2025-03-23 21:56:02 | INFO  | Task f3cd4eb5-5489-4d5d-9d13-5fb3ddbc8b2b is in state STARTED 2025-03-23 21:56:02.898916 | orchestrator | 2025-03-23 21:56:02 | INFO  | Task 366632e9-2a23-447a-9f4d-986716d415f3 is in state STARTED 2025-03-23 21:56:05.934675 | orchestrator | 2025-03-23 21:56:02 | INFO  | Wait 1 second(s) until the next check 2025-03-23 21:56:05.934819 | orchestrator | 2025-03-23 21:56:05 | INFO  | Task f712b796-38d0-4a95-8f80-410bcda57c10 is in state STARTED 2025-03-23 21:56:05.935163 | orchestrator | 2025-03-23 21:56:05 | INFO  | Task f3cd4eb5-5489-4d5d-9d13-5fb3ddbc8b2b is in state STARTED 2025-03-23 21:56:05.937374 | orchestrator | 2025-03-23 21:56:05 | INFO  | Task 366632e9-2a23-447a-9f4d-986716d415f3 is in state STARTED 2025-03-23 21:56:08.980147 | orchestrator | 2025-03-23 21:56:05 | INFO  | Wait 1 second(s) until the next check 2025-03-23 21:56:08.980295 | orchestrator | 2025-03-23 21:56:08 | INFO  | Task f712b796-38d0-4a95-8f80-410bcda57c10 is in state STARTED 2025-03-23 21:56:12.044285 | orchestrator | 2025-03-23 21:56:08 | INFO  | Task f3cd4eb5-5489-4d5d-9d13-5fb3ddbc8b2b is in state STARTED 2025-03-23 21:56:12.044400 | orchestrator | 2025-03-23 21:56:08 | INFO  | Task 366632e9-2a23-447a-9f4d-986716d415f3 is in state STARTED 2025-03-23 21:56:12.044419 | orchestrator | 2025-03-23 21:56:08 | INFO  | Wait 1 second(s) until the next check 2025-03-23 21:56:12.044452 | orchestrator | 2025-03-23 21:56:12 | INFO  | Task f712b796-38d0-4a95-8f80-410bcda57c10 is in state STARTED 2025-03-23 21:56:12.045549 | orchestrator | 2025-03-23 21:56:12 | INFO  | Task f3cd4eb5-5489-4d5d-9d13-5fb3ddbc8b2b is in state STARTED 2025-03-23 21:56:12.046354 | orchestrator | 2025-03-23 21:56:12 | INFO  | Task 366632e9-2a23-447a-9f4d-986716d415f3 is in state STARTED 2025-03-23 21:56:15.110742 | orchestrator | 2025-03-23 21:56:12 | INFO  | Wait 1 second(s) until the next check 2025-03-23 21:56:15.110872 | orchestrator | 2025-03-23 21:56:15 | INFO  | Task f712b796-38d0-4a95-8f80-410bcda57c10 is in state STARTED 2025-03-23 21:56:15.112151 | orchestrator | 2025-03-23 21:56:15 | INFO  | Task f3cd4eb5-5489-4d5d-9d13-5fb3ddbc8b2b is in state STARTED 2025-03-23 21:56:15.114841 | orchestrator | 2025-03-23 21:56:15 | INFO  | Task 366632e9-2a23-447a-9f4d-986716d415f3 is in state STARTED 2025-03-23 21:56:18.157163 | orchestrator | 2025-03-23 21:56:15 | INFO  | Wait 1 second(s) until the next check 2025-03-23 21:56:18.157295 | orchestrator | 2025-03-23 21:56:18 | INFO  | Task f712b796-38d0-4a95-8f80-410bcda57c10 is in state STARTED 
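
The interleaved "Task <uuid> is in state STARTED" and "Wait 1 second(s) until the next check" lines come from the manager polling its background tasks until each one reaches SUCCESS. A rough sketch of such a polling loop is below, under the assumption (not confirmed by this log) that the IDs are Celery task IDs reachable through an already configured Celery app and result backend.

    import time
    from celery.result import AsyncResult

    def wait_for_tasks(task_ids, interval=1):
        # Sketch only: assumes a configured Celery app / result backend, so
        # AsyncResult(task_id).state resolves the states printed in the log.
        pending = set(task_ids)
        while pending:
            for task_id in sorted(pending):
                state = AsyncResult(task_id).state
                print(f"Task {task_id} is in state {state}")
                if state in ("SUCCESS", "FAILURE"):
                    pending.discard(task_id)
            if pending:
                print(f"Wait {interval} second(s) until the next check")
                time.sleep(interval)
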
2025-03-23 21:56:18.157373 | orchestrator | 2025-03-23 21:56:18 | INFO  | Task f3cd4eb5-5489-4d5d-9d13-5fb3ddbc8b2b is in state STARTED 2025-03-23 21:56:18.158514 | orchestrator | 2025-03-23 21:56:18 | INFO  | Task 366632e9-2a23-447a-9f4d-986716d415f3 is in state STARTED 2025-03-23 21:56:21.245544 | orchestrator | 2025-03-23 21:56:18 | INFO  | Wait 1 second(s) until the next check 2025-03-23 21:56:21.245722 | orchestrator | 2025-03-23 21:56:21 | INFO  | Task f712b796-38d0-4a95-8f80-410bcda57c10 is in state STARTED 2025-03-23 21:56:21.246422 | orchestrator | 2025-03-23 21:56:21 | INFO  | Task f3cd4eb5-5489-4d5d-9d13-5fb3ddbc8b2b is in state STARTED 2025-03-23 21:56:21.247265 | orchestrator | 2025-03-23 21:56:21 | INFO  | Task 366632e9-2a23-447a-9f4d-986716d415f3 is in state STARTED 2025-03-23 21:56:21.247705 | orchestrator | 2025-03-23 21:56:21 | INFO  | Wait 1 second(s) until the next check 2025-03-23 21:56:24.308046 | orchestrator | 2025-03-23 21:56:24 | INFO  | Task f712b796-38d0-4a95-8f80-410bcda57c10 is in state STARTED 2025-03-23 21:56:24.308788 | orchestrator | 2025-03-23 21:56:24 | INFO  | Task f3cd4eb5-5489-4d5d-9d13-5fb3ddbc8b2b is in state STARTED 2025-03-23 21:56:27.357862 | orchestrator | 2025-03-23 21:56:24 | INFO  | Task 366632e9-2a23-447a-9f4d-986716d415f3 is in state STARTED 2025-03-23 21:56:27.357978 | orchestrator | 2025-03-23 21:56:24 | INFO  | Wait 1 second(s) until the next check 2025-03-23 21:56:27.358014 | orchestrator | 2025-03-23 21:56:27 | INFO  | Task f712b796-38d0-4a95-8f80-410bcda57c10 is in state STARTED 2025-03-23 21:56:27.358868 | orchestrator | 2025-03-23 21:56:27 | INFO  | Task f3cd4eb5-5489-4d5d-9d13-5fb3ddbc8b2b is in state STARTED 2025-03-23 21:56:27.361838 | orchestrator | 2025-03-23 21:56:27 | INFO  | Task 366632e9-2a23-447a-9f4d-986716d415f3 is in state STARTED 2025-03-23 21:56:30.408848 | orchestrator | 2025-03-23 21:56:27 | INFO  | Wait 1 second(s) until the next check 2025-03-23 21:56:30.408980 | orchestrator | 2025-03-23 21:56:30 | INFO  | Task f712b796-38d0-4a95-8f80-410bcda57c10 is in state STARTED 2025-03-23 21:56:30.411609 | orchestrator | 2025-03-23 21:56:30 | INFO  | Task f3cd4eb5-5489-4d5d-9d13-5fb3ddbc8b2b is in state STARTED 2025-03-23 21:56:30.414409 | orchestrator | 2025-03-23 21:56:30 | INFO  | Task 366632e9-2a23-447a-9f4d-986716d415f3 is in state STARTED 2025-03-23 21:56:33.448364 | orchestrator | 2025-03-23 21:56:30 | INFO  | Wait 1 second(s) until the next check 2025-03-23 21:56:33.448492 | orchestrator | 2025-03-23 21:56:33 | INFO  | Task f712b796-38d0-4a95-8f80-410bcda57c10 is in state STARTED 2025-03-23 21:56:33.448623 | orchestrator | 2025-03-23 21:56:33 | INFO  | Task f3cd4eb5-5489-4d5d-9d13-5fb3ddbc8b2b is in state STARTED 2025-03-23 21:56:33.449349 | orchestrator | 2025-03-23 21:56:33 | INFO  | Task 366632e9-2a23-447a-9f4d-986716d415f3 is in state STARTED 2025-03-23 21:56:33.449799 | orchestrator | 2025-03-23 21:56:33 | INFO  | Wait 1 second(s) until the next check 2025-03-23 21:56:36.514810 | orchestrator | 2025-03-23 21:56:36 | INFO  | Task f712b796-38d0-4a95-8f80-410bcda57c10 is in state STARTED 2025-03-23 21:56:36.519077 | orchestrator | 2025-03-23 21:56:36 | INFO  | Task f3cd4eb5-5489-4d5d-9d13-5fb3ddbc8b2b is in state STARTED 2025-03-23 21:56:36.520801 | orchestrator | 2025-03-23 21:56:36 | INFO  | Task 366632e9-2a23-447a-9f4d-986716d415f3 is in state STARTED 2025-03-23 21:56:39.561936 | orchestrator | 2025-03-23 21:56:36 | INFO  | Wait 1 second(s) until the next check 2025-03-23 21:56:39.562138 | orchestrator | 
2025-03-23 21:56:39 | INFO  | Task f712b796-38d0-4a95-8f80-410bcda57c10 is in state STARTED 2025-03-23 21:56:39.562551 | orchestrator | 2025-03-23 21:56:39 | INFO  | Task f3cd4eb5-5489-4d5d-9d13-5fb3ddbc8b2b is in state STARTED 2025-03-23 21:56:39.569081 | orchestrator | 2025-03-23 21:56:39 | INFO  | Task 366632e9-2a23-447a-9f4d-986716d415f3 is in state STARTED 2025-03-23 21:56:42.617706 | orchestrator | 2025-03-23 21:56:39 | INFO  | Wait 1 second(s) until the next check 2025-03-23 21:56:42.617845 | orchestrator | 2025-03-23 21:56:42 | INFO  | Task f712b796-38d0-4a95-8f80-410bcda57c10 is in state STARTED 2025-03-23 21:56:42.619879 | orchestrator | 2025-03-23 21:56:42 | INFO  | Task f3cd4eb5-5489-4d5d-9d13-5fb3ddbc8b2b is in state STARTED 2025-03-23 21:56:42.622183 | orchestrator | 2025-03-23 21:56:42 | INFO  | Task 366632e9-2a23-447a-9f4d-986716d415f3 is in state STARTED 2025-03-23 21:56:45.686352 | orchestrator | 2025-03-23 21:56:42 | INFO  | Wait 1 second(s) until the next check 2025-03-23 21:56:45.686493 | orchestrator | 2025-03-23 21:56:45 | INFO  | Task f712b796-38d0-4a95-8f80-410bcda57c10 is in state STARTED 2025-03-23 21:56:45.686933 | orchestrator | 2025-03-23 21:56:45 | INFO  | Task f3cd4eb5-5489-4d5d-9d13-5fb3ddbc8b2b is in state STARTED 2025-03-23 21:56:45.689645 | orchestrator | 2025-03-23 21:56:45 | INFO  | Task 366632e9-2a23-447a-9f4d-986716d415f3 is in state STARTED 2025-03-23 21:56:45.691755 | orchestrator | 2025-03-23 21:56:45 | INFO  | Wait 1 second(s) until the next check 2025-03-23 21:56:48.733859 | orchestrator | 2025-03-23 21:56:48 | INFO  | Task f712b796-38d0-4a95-8f80-410bcda57c10 is in state STARTED 2025-03-23 21:56:48.735135 | orchestrator | 2025-03-23 21:56:48 | INFO  | Task f3cd4eb5-5489-4d5d-9d13-5fb3ddbc8b2b is in state STARTED 2025-03-23 21:56:48.735188 | orchestrator | 2025-03-23 21:56:48 | INFO  | Task 366632e9-2a23-447a-9f4d-986716d415f3 is in state STARTED 2025-03-23 21:56:51.782945 | orchestrator | 2025-03-23 21:56:48 | INFO  | Wait 1 second(s) until the next check 2025-03-23 21:56:51.783088 | orchestrator | 2025-03-23 21:56:51 | INFO  | Task f712b796-38d0-4a95-8f80-410bcda57c10 is in state STARTED 2025-03-23 21:56:51.785808 | orchestrator | 2025-03-23 21:56:51 | INFO  | Task f3cd4eb5-5489-4d5d-9d13-5fb3ddbc8b2b is in state STARTED 2025-03-23 21:56:51.787292 | orchestrator | 2025-03-23 21:56:51 | INFO  | Task 366632e9-2a23-447a-9f4d-986716d415f3 is in state STARTED 2025-03-23 21:56:51.787605 | orchestrator | 2025-03-23 21:56:51 | INFO  | Wait 1 second(s) until the next check 2025-03-23 21:56:54.832437 | orchestrator | 2025-03-23 21:56:54 | INFO  | Task f712b796-38d0-4a95-8f80-410bcda57c10 is in state STARTED 2025-03-23 21:56:54.832678 | orchestrator | 2025-03-23 21:56:54 | INFO  | Task f3cd4eb5-5489-4d5d-9d13-5fb3ddbc8b2b is in state STARTED 2025-03-23 21:56:54.833237 | orchestrator | 2025-03-23 21:56:54 | INFO  | Task 366632e9-2a23-447a-9f4d-986716d415f3 is in state STARTED 2025-03-23 21:56:54.833834 | orchestrator | 2025-03-23 21:56:54 | INFO  | Wait 1 second(s) until the next check 2025-03-23 21:56:57.895388 | orchestrator | 2025-03-23 21:56:57 | INFO  | Task f712b796-38d0-4a95-8f80-410bcda57c10 is in state STARTED 2025-03-23 21:56:57.898755 | orchestrator | 2025-03-23 21:56:57 | INFO  | Task f3cd4eb5-5489-4d5d-9d13-5fb3ddbc8b2b is in state STARTED 2025-03-23 21:56:57.898798 | orchestrator | 2025-03-23 21:56:57 | INFO  | Task 366632e9-2a23-447a-9f4d-986716d415f3 is in state STARTED 2025-03-23 21:57:00.970070 | orchestrator | 2025-03-23 21:56:57 | INFO 
 | Wait 1 second(s) until the next check 2025-03-23 21:57:00.970204 | orchestrator | 2025-03-23 21:57:00 | INFO  | Task f712b796-38d0-4a95-8f80-410bcda57c10 is in state STARTED 2025-03-23 21:57:00.972367 | orchestrator | 2025-03-23 21:57:00 | INFO  | Task f3cd4eb5-5489-4d5d-9d13-5fb3ddbc8b2b is in state STARTED 2025-03-23 21:57:00.973723 | orchestrator | 2025-03-23 21:57:00 | INFO  | Task 366632e9-2a23-447a-9f4d-986716d415f3 is in state STARTED 2025-03-23 21:57:04.034277 | orchestrator | 2025-03-23 21:57:00 | INFO  | Wait 1 second(s) until the next check 2025-03-23 21:57:04.034410 | orchestrator | 2025-03-23 21:57:04 | INFO  | Task f712b796-38d0-4a95-8f80-410bcda57c10 is in state STARTED 2025-03-23 21:57:04.034960 | orchestrator | 2025-03-23 21:57:04 | INFO  | Task f3cd4eb5-5489-4d5d-9d13-5fb3ddbc8b2b is in state STARTED 2025-03-23 21:57:04.044299 | orchestrator | 2025-03-23 21:57:04 | INFO  | Task 8ad7ed55-5dba-47ef-a4f0-b80146615b97 is in state STARTED 2025-03-23 21:57:04.045878 | orchestrator | 2025-03-23 21:57:04 | INFO  | Task 6f5ec00a-9f2e-40eb-a1d2-26a43473a853 is in state STARTED 2025-03-23 21:57:04.045896 | orchestrator | 2025-03-23 21:57:04 | INFO  | Task 371e1403-f3a7-4305-bd63-6e197a55c018 is in state STARTED 2025-03-23 21:57:04.045910 | orchestrator | 2025-03-23 21:57:04 | INFO  | Task 366632e9-2a23-447a-9f4d-986716d415f3 is in state SUCCESS 2025-03-23 21:57:04.048363 | orchestrator | 2025-03-23 21:57:04.048402 | orchestrator | 2025-03-23 21:57:04.048434 | orchestrator | PLAY [Apply role common] ******************************************************* 2025-03-23 21:57:04.048446 | orchestrator | 2025-03-23 21:57:04.048456 | orchestrator | TASK [common : include_tasks] ************************************************** 2025-03-23 21:57:04.048466 | orchestrator | Sunday 23 March 2025 21:54:19 +0000 (0:00:00.381) 0:00:00.381 ********** 2025-03-23 21:57:04.048476 | orchestrator | included: /ansible/roles/common/tasks/deploy.yml for testbed-manager, testbed-node-0, testbed-node-1, testbed-node-2, testbed-node-3, testbed-node-4, testbed-node-5 2025-03-23 21:57:04.048487 | orchestrator | 2025-03-23 21:57:04.048497 | orchestrator | TASK [common : Ensuring config directories exist] ****************************** 2025-03-23 21:57:04.048507 | orchestrator | Sunday 23 March 2025 21:54:21 +0000 (0:00:01.963) 0:00:02.344 ********** 2025-03-23 21:57:04.048516 | orchestrator | changed: [testbed-node-0] => (item=[{'service_name': 'cron'}, 'cron']) 2025-03-23 21:57:04.048526 | orchestrator | changed: [testbed-manager] => (item=[{'service_name': 'cron'}, 'cron']) 2025-03-23 21:57:04.048536 | orchestrator | changed: [testbed-node-1] => (item=[{'service_name': 'cron'}, 'cron']) 2025-03-23 21:57:04.048545 | orchestrator | changed: [testbed-node-0] => (item=[{'service_name': 'fluentd'}, 'fluentd']) 2025-03-23 21:57:04.048575 | orchestrator | changed: [testbed-node-2] => (item=[{'service_name': 'cron'}, 'cron']) 2025-03-23 21:57:04.048585 | orchestrator | changed: [testbed-node-1] => (item=[{'service_name': 'fluentd'}, 'fluentd']) 2025-03-23 21:57:04.048594 | orchestrator | changed: [testbed-node-3] => (item=[{'service_name': 'cron'}, 'cron']) 2025-03-23 21:57:04.048604 | orchestrator | changed: [testbed-node-0] => (item=[{'service_name': 'kolla-toolbox'}, 'kolla-toolbox']) 2025-03-23 21:57:04.048614 | orchestrator | changed: [testbed-manager] => (item=[{'service_name': 'fluentd'}, 'fluentd']) 2025-03-23 21:57:04.048644 | orchestrator | changed: [testbed-node-2] => (item=[{'service_name': 'fluentd'}, 
'fluentd']) 2025-03-23 21:57:04.048656 | orchestrator | changed: [testbed-node-1] => (item=[{'service_name': 'kolla-toolbox'}, 'kolla-toolbox']) 2025-03-23 21:57:04.048699 | orchestrator | changed: [testbed-node-3] => (item=[{'service_name': 'fluentd'}, 'fluentd']) 2025-03-23 21:57:04.048710 | orchestrator | changed: [testbed-node-4] => (item=[{'service_name': 'cron'}, 'cron']) 2025-03-23 21:57:04.048725 | orchestrator | changed: [testbed-node-5] => (item=[{'service_name': 'cron'}, 'cron']) 2025-03-23 21:57:04.048734 | orchestrator | changed: [testbed-node-2] => (item=[{'service_name': 'kolla-toolbox'}, 'kolla-toolbox']) 2025-03-23 21:57:04.048744 | orchestrator | changed: [testbed-manager] => (item=[{'service_name': 'kolla-toolbox'}, 'kolla-toolbox']) 2025-03-23 21:57:04.048753 | orchestrator | changed: [testbed-node-4] => (item=[{'service_name': 'fluentd'}, 'fluentd']) 2025-03-23 21:57:04.048763 | orchestrator | changed: [testbed-node-3] => (item=[{'service_name': 'kolla-toolbox'}, 'kolla-toolbox']) 2025-03-23 21:57:04.048772 | orchestrator | changed: [testbed-node-5] => (item=[{'service_name': 'fluentd'}, 'fluentd']) 2025-03-23 21:57:04.048782 | orchestrator | changed: [testbed-node-4] => (item=[{'service_name': 'kolla-toolbox'}, 'kolla-toolbox']) 2025-03-23 21:57:04.048791 | orchestrator | changed: [testbed-node-5] => (item=[{'service_name': 'kolla-toolbox'}, 'kolla-toolbox']) 2025-03-23 21:57:04.048801 | orchestrator | 2025-03-23 21:57:04.048813 | orchestrator | TASK [common : include_tasks] ************************************************** 2025-03-23 21:57:04.048834 | orchestrator | Sunday 23 March 2025 21:54:27 +0000 (0:00:05.672) 0:00:08.017 ********** 2025-03-23 21:57:04.048843 | orchestrator | included: /ansible/roles/common/tasks/copy-certs.yml for testbed-manager, testbed-node-0, testbed-node-1, testbed-node-2, testbed-node-3, testbed-node-4, testbed-node-5 2025-03-23 21:57:04.048858 | orchestrator | 2025-03-23 21:57:04.048868 | orchestrator | TASK [service-cert-copy : common | Copying over extra CA certificates] ********* 2025-03-23 21:57:04.048877 | orchestrator | Sunday 23 March 2025 21:54:29 +0000 (0:00:02.437) 0:00:10.455 ********** 2025-03-23 21:57:04.048890 | orchestrator | changed: [testbed-node-0] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/fluentd:5.0.5.20241206', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}}) 2025-03-23 21:57:04.048906 | orchestrator | changed: [testbed-node-1] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/fluentd:5.0.5.20241206', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}}) 2025-03-23 21:57:04.048925 | orchestrator | changed: [testbed-node-2] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 
'registry.osism.tech/kolla/release/fluentd:5.0.5.20241206', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}}) 2025-03-23 21:57:04.048937 | orchestrator | changed: [testbed-node-3] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/fluentd:5.0.5.20241206', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}}) 2025-03-23 21:57:04.048948 | orchestrator | changed: [testbed-manager] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/fluentd:5.0.5.20241206', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}}) 2025-03-23 21:57:04.048960 | orchestrator | changed: [testbed-node-4] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/fluentd:5.0.5.20241206', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}}) 2025-03-23 21:57:04.048971 | orchestrator | changed: [testbed-node-5] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/fluentd:5.0.5.20241206', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}}) 2025-03-23 21:57:04.048987 | orchestrator | changed: [testbed-node-1] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/kolla-toolbox:18.3.0.20241206', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-23 21:57:04.048999 | orchestrator | changed: [testbed-node-0] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 
'registry.osism.tech/kolla/release/kolla-toolbox:18.3.0.20241206', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-23 21:57:04.049016 | orchestrator | changed: [testbed-node-3] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/kolla-toolbox:18.3.0.20241206', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-23 21:57:04.049029 | orchestrator | changed: [testbed-manager] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/kolla-toolbox:18.3.0.20241206', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-23 21:57:04.049040 | orchestrator | changed: [testbed-node-2] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/kolla-toolbox:18.3.0.20241206', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-23 21:57:04.049051 | orchestrator | changed: [testbed-node-4] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/kolla-toolbox:18.3.0.20241206', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-23 21:57:04.049067 | orchestrator | changed: [testbed-node-5] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/kolla-toolbox:18.3.0.20241206', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': 
['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-23 21:57:04.049081 | orchestrator | changed: [testbed-node-1] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/cron:3.0.20241206', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-23 21:57:04.049093 | orchestrator | changed: [testbed-node-0] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/cron:3.0.20241206', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-23 21:57:04.049115 | orchestrator | changed: [testbed-node-3] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/cron:3.0.20241206', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-23 21:57:04.049126 | orchestrator | changed: [testbed-node-2] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/cron:3.0.20241206', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-23 21:57:04.049138 | orchestrator | changed: [testbed-manager] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/cron:3.0.20241206', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-23 21:57:04.049149 | orchestrator | changed: [testbed-node-4] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/cron:3.0.20241206', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-23 21:57:04.049160 | orchestrator | changed: [testbed-node-5] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/cron:3.0.20241206', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-23 21:57:04.049176 | orchestrator | 2025-03-23 
21:57:04.049187 | orchestrator | TASK [service-cert-copy : common | Copying over backend internal TLS certificate] *** 2025-03-23 21:57:04.049198 | orchestrator | Sunday 23 March 2025 21:54:35 +0000 (0:00:05.865) 0:00:16.321 ********** 2025-03-23 21:57:04.049209 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/fluentd:5.0.5.20241206', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}})  2025-03-23 21:57:04.049221 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/kolla-toolbox:18.3.0.20241206', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2025-03-23 21:57:04.049235 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/cron:3.0.20241206', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2025-03-23 21:57:04.049251 | orchestrator | skipping: [testbed-manager] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/fluentd:5.0.5.20241206', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}})  2025-03-23 21:57:04.049262 | orchestrator | skipping: [testbed-manager] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/kolla-toolbox:18.3.0.20241206', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2025-03-23 21:57:04.049272 | orchestrator | skipping: [testbed-manager] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/cron:3.0.20241206', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2025-03-23 21:57:04.049282 | orchestrator | skipping: [testbed-node-0] 2025-03-23 21:57:04.049293 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/fluentd:5.0.5.20241206', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}})  2025-03-23 21:57:04.049328 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/kolla-toolbox:18.3.0.20241206', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2025-03-23 21:57:04.049339 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/cron:3.0.20241206', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2025-03-23 21:57:04.049349 | orchestrator | skipping: [testbed-manager] 2025-03-23 21:57:04.049359 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/fluentd:5.0.5.20241206', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}})  2025-03-23 21:57:04.049369 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/kolla-toolbox:18.3.0.20241206', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2025-03-23 21:57:04.049392 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/cron:3.0.20241206', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2025-03-23 21:57:04.049403 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/fluentd:5.0.5.20241206', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}})  2025-03-23 21:57:04.049413 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/kolla-toolbox:18.3.0.20241206', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2025-03-23 21:57:04.049427 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/cron:3.0.20241206', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2025-03-23 21:57:04.049436 | orchestrator | skipping: [testbed-node-1] 2025-03-23 21:57:04.049447 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/fluentd:5.0.5.20241206', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}})  2025-03-23 21:57:04.049457 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/kolla-toolbox:18.3.0.20241206', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2025-03-23 21:57:04.049466 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/cron:3.0.20241206', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 
'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2025-03-23 21:57:04.049476 | orchestrator | skipping: [testbed-node-2]
2025-03-23 21:57:04.049486 | orchestrator | skipping: [testbed-node-3]
2025-03-23 21:57:04.049495 | orchestrator | skipping: [testbed-node-4]
2025-03-23 21:57:04.049514 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/fluentd:5.0.5.20241206', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}})
2025-03-23 21:57:04.049525 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/kolla-toolbox:18.3.0.20241206', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2025-03-23 21:57:04.049534 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/cron:3.0.20241206', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2025-03-23 21:57:04.049549 | orchestrator | skipping: [testbed-node-5]
2025-03-23 21:57:04.049581 | orchestrator |
2025-03-23 21:57:04.049592 | orchestrator | TASK [service-cert-copy : common | Copying over backend internal TLS key] ******
2025-03-23 21:57:04.049601 | orchestrator | Sunday 23 March 2025 21:54:39 +0000 (0:00:04.077) 0:00:20.398 **********
2025-03-23 21:57:04.049611 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/fluentd:5.0.5.20241206', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}})
2025-03-23 21:57:04.049621 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/kolla-toolbox:18.3.0.20241206', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
 2025-03-23 21:57:04.049631 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/cron:3.0.20241206', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2025-03-23 21:57:04.049641 | orchestrator | skipping: [testbed-manager] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/fluentd:5.0.5.20241206', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}})  2025-03-23 21:57:04.049655 | orchestrator | skipping: [testbed-manager] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/kolla-toolbox:18.3.0.20241206', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2025-03-23 21:57:04.050363 | orchestrator | skipping: [testbed-manager] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/cron:3.0.20241206', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2025-03-23 21:57:04.050393 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/fluentd:5.0.5.20241206', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}})  2025-03-23 21:57:04.050415 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/kolla-toolbox:18.3.0.20241206', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2025-03-23 21:57:04.050425 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 
'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/cron:3.0.20241206', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2025-03-23 21:57:04.050435 | orchestrator | skipping: [testbed-node-0] 2025-03-23 21:57:04.050445 | orchestrator | skipping: [testbed-manager] 2025-03-23 21:57:04.050455 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/fluentd:5.0.5.20241206', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}})  2025-03-23 21:57:04.050465 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/kolla-toolbox:18.3.0.20241206', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2025-03-23 21:57:04.050475 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/cron:3.0.20241206', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2025-03-23 21:57:04.050484 | orchestrator | skipping: [testbed-node-1] 2025-03-23 21:57:04.050502 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/fluentd:5.0.5.20241206', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}})  2025-03-23 21:57:04.050521 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/kolla-toolbox:18.3.0.20241206', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2025-03-23 21:57:04.050531 | orchestrator | skipping: [testbed-node-3] => 
(item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/cron:3.0.20241206', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2025-03-23 21:57:04.050541 | orchestrator | skipping: [testbed-node-2] 2025-03-23 21:57:04.050550 | orchestrator | skipping: [testbed-node-3] 2025-03-23 21:57:04.050611 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/fluentd:5.0.5.20241206', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}})  2025-03-23 21:57:04.050622 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/kolla-toolbox:18.3.0.20241206', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2025-03-23 21:57:04.050632 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/cron:3.0.20241206', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2025-03-23 21:57:04.050641 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/fluentd:5.0.5.20241206', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}})  2025-03-23 21:57:04.050651 | orchestrator | skipping: [testbed-node-5] 2025-03-23 21:57:04.050666 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/kolla-toolbox:18.3.0.20241206', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2025-03-23 
21:57:04.050681 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/cron:3.0.20241206', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2025-03-23 21:57:04.050691 | orchestrator | skipping: [testbed-node-4]
2025-03-23 21:57:04.050700 | orchestrator |
2025-03-23 21:57:04.050710 | orchestrator | TASK [common : Copying over /run subdirectories conf] **************************
2025-03-23 21:57:04.050721 | orchestrator | Sunday 23 March 2025 21:54:44 +0000 (0:00:00.865) 0:00:24.807 **********
2025-03-23 21:57:04.050730 | orchestrator | skipping: [testbed-manager]
2025-03-23 21:57:04.050739 | orchestrator | skipping: [testbed-node-0]
2025-03-23 21:57:04.050749 | orchestrator | skipping: [testbed-node-1]
2025-03-23 21:57:04.050758 | orchestrator | skipping: [testbed-node-2]
2025-03-23 21:57:04.050767 | orchestrator | skipping: [testbed-node-3]
2025-03-23 21:57:04.050777 | orchestrator | skipping: [testbed-node-4]
2025-03-23 21:57:04.050786 | orchestrator | skipping: [testbed-node-5]
2025-03-23 21:57:04.050795 | orchestrator |
2025-03-23 21:57:04.050805 | orchestrator | TASK [common : Restart systemd-tmpfiles] ***************************************
2025-03-23 21:57:04.050814 | orchestrator | Sunday 23 March 2025 21:54:45 +0000 (0:00:00.865) 0:00:25.672 **********
2025-03-23 21:57:04.050824 | orchestrator | skipping: [testbed-manager]
2025-03-23 21:57:04.050833 | orchestrator | skipping: [testbed-node-0]
2025-03-23 21:57:04.050842 | orchestrator | skipping: [testbed-node-1]
2025-03-23 21:57:04.050852 | orchestrator | skipping: [testbed-node-2]
2025-03-23 21:57:04.050861 | orchestrator | skipping: [testbed-node-3]
2025-03-23 21:57:04.050870 | orchestrator | skipping: [testbed-node-4]
2025-03-23 21:57:04.050880 | orchestrator | skipping: [testbed-node-5]
2025-03-23 21:57:04.050889 | orchestrator |
2025-03-23 21:57:04.050899 | orchestrator | TASK [common : Ensure fluentd image is present for label check] ****************
2025-03-23 21:57:04.050908 | orchestrator | Sunday 23 March 2025 21:54:46 +0000 (0:00:01.292) 0:00:26.964 **********
2025-03-23 21:57:04.050917 | orchestrator | ok: [testbed-node-0]
2025-03-23 21:57:04.050928 | orchestrator | changed: [testbed-node-2]
2025-03-23 21:57:04.050938 | orchestrator | changed: [testbed-node-1]
2025-03-23 21:57:04.050947 | orchestrator | changed: [testbed-node-3]
2025-03-23 21:57:04.050956 | orchestrator | changed: [testbed-node-4]
2025-03-23 21:57:04.050967 | orchestrator | changed: [testbed-node-5]
2025-03-23 21:57:04.050977 | orchestrator | changed: [testbed-manager]
2025-03-23 21:57:04.050987 | orchestrator |
2025-03-23 21:57:04.050998 | orchestrator | TASK [common : Fetch fluentd Docker image labels] ******************************
2025-03-23 21:57:04.051008 | orchestrator | Sunday 23 March 2025 21:55:17 +0000 (0:00:31.523) 0:00:58.488 **********
2025-03-23 21:57:04.051019 | orchestrator | ok: [testbed-node-1]
2025-03-23 21:57:04.051030 | orchestrator | ok: [testbed-node-2]
2025-03-23 21:57:04.051040 | orchestrator | ok: [testbed-node-0]
2025-03-23 21:57:04.051051 | orchestrator | ok: [testbed-manager]
2025-03-23 21:57:04.051062 | orchestrator | ok: [testbed-node-3]
2025-03-23 21:57:04.051072 | orchestrator | ok: [testbed-node-4]
2025-03-23 21:57:04.051082 | orchestrator | ok: [testbed-node-5]
2025-03-23 21:57:04.051093 | orchestrator |
2025-03-23 21:57:04.051103 | orchestrator | TASK [common : Set fluentd facts] **********************************************
2025-03-23 21:57:04.051114 | orchestrator | Sunday 23 March 2025 21:55:22 +0000 (0:00:04.698) 0:01:03.187 **********
2025-03-23 21:57:04.051125 | orchestrator | ok: [testbed-manager]
2025-03-23 21:57:04.051135 | orchestrator | ok: [testbed-node-0]
2025-03-23 21:57:04.051150 | orchestrator | ok: [testbed-node-1]
2025-03-23 21:57:04.051160 | orchestrator | ok: [testbed-node-2]
2025-03-23 21:57:04.051171 | orchestrator | ok: [testbed-node-3]
2025-03-23 21:57:04.051181 | orchestrator | ok: [testbed-node-4]
2025-03-23 21:57:04.051195 | orchestrator | ok: [testbed-node-5]
2025-03-23 21:57:04.051206 | orchestrator |
2025-03-23 21:57:04.051216 | orchestrator | TASK [common : Fetch fluentd Podman image labels] ******************************
2025-03-23 21:57:04.051227 | orchestrator | Sunday 23 March 2025 21:55:24 +0000 (0:00:02.424) 0:01:05.612 **********
2025-03-23 21:57:04.051237 | orchestrator | skipping: [testbed-manager]
2025-03-23 21:57:04.051248 | orchestrator | skipping: [testbed-node-0]
2025-03-23 21:57:04.051259 | orchestrator | skipping: [testbed-node-1]
2025-03-23 21:57:04.051270 | orchestrator | skipping: [testbed-node-2]
2025-03-23 21:57:04.051280 | orchestrator | skipping: [testbed-node-3]
2025-03-23 21:57:04.051291 | orchestrator | skipping: [testbed-node-4]
2025-03-23 21:57:04.051302 | orchestrator | skipping: [testbed-node-5]
2025-03-23 21:57:04.051312 | orchestrator |
2025-03-23 21:57:04.051323 | orchestrator | TASK [common : Set fluentd facts] **********************************************
2025-03-23 21:57:04.051332 | orchestrator | Sunday 23 March 2025 21:55:27 +0000 (0:00:02.611) 0:01:08.223 **********
2025-03-23 21:57:04.051342 | orchestrator | skipping: [testbed-manager]
2025-03-23 21:57:04.051351 | orchestrator | skipping: [testbed-node-0]
2025-03-23 21:57:04.051360 | orchestrator | skipping: [testbed-node-1]
2025-03-23 21:57:04.051369 | orchestrator | skipping: [testbed-node-2]
2025-03-23 21:57:04.051379 | orchestrator | skipping: [testbed-node-3]
2025-03-23 21:57:04.051388 | orchestrator | skipping: [testbed-node-4]
2025-03-23 21:57:04.051397 | orchestrator | skipping: [testbed-node-5]
2025-03-23 21:57:04.051407 | orchestrator |
2025-03-23 21:57:04.051416 | orchestrator | TASK [common : Copying over config.json files for services] ********************
2025-03-23 21:57:04.051426 | orchestrator | Sunday 23 March 2025 21:55:29 +0000 (0:00:01.921) 0:01:10.144 **********
2025-03-23 21:57:04.051439 | orchestrator | changed: [testbed-manager] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/fluentd:5.0.5.20241206', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}})
2025-03-23 21:57:04.051450 | orchestrator | changed: [testbed-node-0] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/fluentd:5.0.5.20241206', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes':
['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}}) 2025-03-23 21:57:04.051463 | orchestrator | changed: [testbed-node-1] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/fluentd:5.0.5.20241206', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}}) 2025-03-23 21:57:04.051473 | orchestrator | changed: [testbed-manager] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/kolla-toolbox:18.3.0.20241206', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-23 21:57:04.051487 | orchestrator | changed: [testbed-node-2] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/fluentd:5.0.5.20241206', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}}) 2025-03-23 21:57:04.051497 | orchestrator | changed: [testbed-node-0] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/kolla-toolbox:18.3.0.20241206', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-23 21:57:04.051507 | orchestrator | changed: [testbed-node-1] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/kolla-toolbox:18.3.0.20241206', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-23 21:57:04.051517 | orchestrator | changed: [testbed-manager] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 
'registry.osism.tech/kolla/release/cron:3.0.20241206', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-23 21:57:04.051530 | orchestrator | changed: [testbed-node-3] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/fluentd:5.0.5.20241206', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}}) 2025-03-23 21:57:04.051540 | orchestrator | changed: [testbed-node-2] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/kolla-toolbox:18.3.0.20241206', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-23 21:57:04.051550 | orchestrator | changed: [testbed-node-0] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/cron:3.0.20241206', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-23 21:57:04.051579 | orchestrator | changed: [testbed-node-4] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/fluentd:5.0.5.20241206', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}}) 2025-03-23 21:57:04.051595 | orchestrator | changed: [testbed-node-1] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/cron:3.0.20241206', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-23 21:57:04.051608 | orchestrator | changed: [testbed-node-2] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/cron:3.0.20241206', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-23 21:57:04.051618 | orchestrator | changed: 
[testbed-node-3] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/kolla-toolbox:18.3.0.20241206', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-23 21:57:04.051632 | orchestrator | changed: [testbed-node-5] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/fluentd:5.0.5.20241206', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}}) 2025-03-23 21:57:04.051642 | orchestrator | changed: [testbed-node-4] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/kolla-toolbox:18.3.0.20241206', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-23 21:57:04.051652 | orchestrator | changed: [testbed-node-5] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/kolla-toolbox:18.3.0.20241206', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-23 21:57:04.051662 | orchestrator | changed: [testbed-node-3] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/cron:3.0.20241206', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-23 21:57:04.051676 | orchestrator | changed: [testbed-node-4] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/cron:3.0.20241206', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-23 21:57:04.051686 | orchestrator | changed: [testbed-node-5] => (item={'key': 'cron', 'value': 
{'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/cron:3.0.20241206', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2025-03-23 21:57:04.051696 | orchestrator |
2025-03-23 21:57:04.051705 | orchestrator | TASK [common : Find custom fluentd input config files] *************************
2025-03-23 21:57:04.051715 | orchestrator | Sunday 23 March 2025 21:55:38 +0000 (0:00:08.675) 0:01:18.820 **********
2025-03-23 21:57:04.051724 | orchestrator | [WARNING]: Skipped
2025-03-23 21:57:04.051734 | orchestrator | '/opt/configuration/environments/kolla/files/overlays/fluentd/input' path due
2025-03-23 21:57:04.051743 | orchestrator | to this access issue:
2025-03-23 21:57:04.051753 | orchestrator | '/opt/configuration/environments/kolla/files/overlays/fluentd/input' is not a
2025-03-23 21:57:04.051762 | orchestrator | directory
2025-03-23 21:57:04.051771 | orchestrator | ok: [testbed-manager -> localhost]
2025-03-23 21:57:04.051781 | orchestrator |
2025-03-23 21:57:04.051790 | orchestrator | TASK [common : Find custom fluentd filter config files] ************************
2025-03-23 21:57:04.051800 | orchestrator | Sunday 23 March 2025 21:55:39 +0000 (0:00:01.718) 0:01:20.539 **********
2025-03-23 21:57:04.051809 | orchestrator | [WARNING]: Skipped
2025-03-23 21:57:04.051818 | orchestrator | '/opt/configuration/environments/kolla/files/overlays/fluentd/filter' path due
2025-03-23 21:57:04.051828 | orchestrator | to this access issue:
2025-03-23 21:57:04.051838 | orchestrator | '/opt/configuration/environments/kolla/files/overlays/fluentd/filter' is not a
2025-03-23 21:57:04.051847 | orchestrator | directory
2025-03-23 21:57:04.051856 | orchestrator | ok: [testbed-manager -> localhost]
2025-03-23 21:57:04.051866 | orchestrator |
2025-03-23 21:57:04.051879 | orchestrator | TASK [common : Find custom fluentd format config files] ************************
2025-03-23 21:57:04.051888 | orchestrator | Sunday 23 March 2025 21:55:40 +0000 (0:00:00.850) 0:01:21.389 **********
2025-03-23 21:57:04.051898 | orchestrator | [WARNING]: Skipped
2025-03-23 21:57:04.051908 | orchestrator | '/opt/configuration/environments/kolla/files/overlays/fluentd/format' path due
2025-03-23 21:57:04.051917 | orchestrator | to this access issue:
2025-03-23 21:57:04.051927 | orchestrator | '/opt/configuration/environments/kolla/files/overlays/fluentd/format' is not a
2025-03-23 21:57:04.051936 | orchestrator | directory
2025-03-23 21:57:04.051946 | orchestrator | ok: [testbed-manager -> localhost]
2025-03-23 21:57:04.051955 | orchestrator |
2025-03-23 21:57:04.051964 | orchestrator | TASK [common : Find custom fluentd output config files] ************************
2025-03-23 21:57:04.051977 | orchestrator | Sunday 23 March 2025 21:55:41 +0000 (0:00:01.046) 0:01:22.436 **********
2025-03-23 21:57:04.051987 | orchestrator | [WARNING]: Skipped
2025-03-23 21:57:04.051997 | orchestrator | '/opt/configuration/environments/kolla/files/overlays/fluentd/output' path due
2025-03-23 21:57:04.052006 | orchestrator | to this access issue:
2025-03-23 21:57:04.052016 | orchestrator | '/opt/configuration/environments/kolla/files/overlays/fluentd/output' is not a
2025-03-23 21:57:04.052032 | orchestrator | directory
2025-03-23 21:57:04.052042 | orchestrator | ok: [testbed-manager -> localhost]
2025-03-23 21:57:04.052051 | orchestrator |
2025-03-23 21:57:04.052060 | orchestrator | TASK [common : Copying over td-agent.conf] *************************************
2025-03-23 21:57:04.052070 | orchestrator | Sunday 23 March 2025 21:55:42 +0000 (0:00:00.895) 0:01:23.331 **********
2025-03-23 21:57:04.052079 | orchestrator | changed: [testbed-manager]
2025-03-23 21:57:04.052088 | orchestrator | changed: [testbed-node-0]
2025-03-23 21:57:04.052098 | orchestrator | changed: [testbed-node-1]
2025-03-23 21:57:04.052107 | orchestrator | changed: [testbed-node-2]
2025-03-23 21:57:04.052117 | orchestrator | changed: [testbed-node-3]
2025-03-23 21:57:04.052126 | orchestrator | changed: [testbed-node-4]
2025-03-23 21:57:04.052136 | orchestrator | changed: [testbed-node-5]
2025-03-23 21:57:04.052145 | orchestrator |
2025-03-23 21:57:04.052155 | orchestrator | TASK [common : Copying over cron logrotate config file] ************************
2025-03-23 21:57:04.052164 | orchestrator | Sunday 23 March 2025 21:55:49 +0000 (0:00:06.691) 0:01:30.023 **********
2025-03-23 21:57:04.052174 | orchestrator | changed: [testbed-manager] => (item=/ansible/roles/common/templates/cron-logrotate-global.conf.j2)
2025-03-23 21:57:04.052183 | orchestrator | changed: [testbed-node-0] => (item=/ansible/roles/common/templates/cron-logrotate-global.conf.j2)
2025-03-23 21:57:04.052193 | orchestrator | changed: [testbed-node-1] => (item=/ansible/roles/common/templates/cron-logrotate-global.conf.j2)
2025-03-23 21:57:04.052202 | orchestrator | changed: [testbed-node-3] => (item=/ansible/roles/common/templates/cron-logrotate-global.conf.j2)
2025-03-23 21:57:04.052212 | orchestrator | changed: [testbed-node-2] => (item=/ansible/roles/common/templates/cron-logrotate-global.conf.j2)
2025-03-23 21:57:04.052221 | orchestrator | changed: [testbed-node-4] => (item=/ansible/roles/common/templates/cron-logrotate-global.conf.j2)
2025-03-23 21:57:04.052230 | orchestrator | changed: [testbed-node-5] => (item=/ansible/roles/common/templates/cron-logrotate-global.conf.j2)
2025-03-23 21:57:04.052240 | orchestrator |
2025-03-23 21:57:04.052249 | orchestrator | TASK [common : Ensure RabbitMQ Erlang cookie exists] ***************************
2025-03-23 21:57:04.052259 | orchestrator | Sunday 23 March 2025 21:55:54 +0000 (0:00:05.360) 0:01:35.383 **********
2025-03-23 21:57:04.052268 | orchestrator | changed: [testbed-manager]
2025-03-23 21:57:04.052278 | orchestrator | changed: [testbed-node-0]
2025-03-23 21:57:04.052287 | orchestrator | changed: [testbed-node-1]
2025-03-23 21:57:04.052296 | orchestrator | changed: [testbed-node-2]
2025-03-23 21:57:04.052306 | orchestrator | changed: [testbed-node-3]
2025-03-23 21:57:04.052315 | orchestrator | changed: [testbed-node-4]
2025-03-23 21:57:04.052325 | orchestrator | changed: [testbed-node-5]
2025-03-23 21:57:04.052334 | orchestrator |
2025-03-23 21:57:04.052343 | orchestrator | TASK [common : Ensuring config directories have correct owner and permission] ***
2025-03-23 21:57:04.052353 | orchestrator | Sunday 23 March 2025 21:55:58 +0000 (0:00:03.963) 0:01:39.347 **********
2025-03-23 21:57:04.052365 | orchestrator | ok: [testbed-manager] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/fluentd:5.0.5.20241206', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro',
'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}}) 2025-03-23 21:57:04.052375 | orchestrator | skipping: [testbed-manager] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/kolla-toolbox:18.3.0.20241206', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2025-03-23 21:57:04.052389 | orchestrator | ok: [testbed-node-0] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/fluentd:5.0.5.20241206', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}}) 2025-03-23 21:57:04.052404 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/kolla-toolbox:18.3.0.20241206', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2025-03-23 21:57:04.052415 | orchestrator | ok: [testbed-manager] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/cron:3.0.20241206', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-23 21:57:04.052429 | orchestrator | ok: [testbed-node-0] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/cron:3.0.20241206', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-23 21:57:04.052439 | orchestrator | ok: [testbed-node-1] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/fluentd:5.0.5.20241206', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}}) 2025-03-23 21:57:04.052449 | orchestrator | 
skipping: [testbed-node-1] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/kolla-toolbox:18.3.0.20241206', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2025-03-23 21:57:04.052462 | orchestrator | ok: [testbed-node-2] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/fluentd:5.0.5.20241206', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}}) 2025-03-23 21:57:04.052472 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/kolla-toolbox:18.3.0.20241206', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2025-03-23 21:57:04.052492 | orchestrator | ok: [testbed-node-3] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/fluentd:5.0.5.20241206', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}}) 2025-03-23 21:57:04.052502 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/kolla-toolbox:18.3.0.20241206', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2025-03-23 21:57:04.052512 | orchestrator | ok: [testbed-node-1] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/cron:3.0.20241206', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-23 
21:57:04.052522 | orchestrator | ok: [testbed-node-4] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/fluentd:5.0.5.20241206', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}}) 2025-03-23 21:57:04.052532 | orchestrator | ok: [testbed-node-2] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/cron:3.0.20241206', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-23 21:57:04.052542 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/kolla-toolbox:18.3.0.20241206', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2025-03-23 21:57:04.052551 | orchestrator | ok: [testbed-node-3] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/cron:3.0.20241206', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-23 21:57:04.052582 | orchestrator | ok: [testbed-node-4] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/cron:3.0.20241206', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-23 21:57:04.052599 | orchestrator | ok: [testbed-node-5] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/fluentd:5.0.5.20241206', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}}) 2025-03-23 21:57:04.052610 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/kolla-toolbox:18.3.0.20241206', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': 
'/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2025-03-23 21:57:04.052620 | orchestrator | ok: [testbed-node-5] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/cron:3.0.20241206', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-23 21:57:04.052629 | orchestrator | 2025-03-23 21:57:04.052639 | orchestrator | TASK [common : Copy rabbitmq-env.conf to kolla toolbox] ************************ 2025-03-23 21:57:04.052649 | orchestrator | Sunday 23 March 2025 21:56:02 +0000 (0:00:03.528) 0:01:42.875 ********** 2025-03-23 21:57:04.052658 | orchestrator | changed: [testbed-manager] => (item=/ansible/roles/common/templates/rabbitmq-env.conf.j2) 2025-03-23 21:57:04.052668 | orchestrator | changed: [testbed-node-1] => (item=/ansible/roles/common/templates/rabbitmq-env.conf.j2) 2025-03-23 21:57:04.052677 | orchestrator | changed: [testbed-node-0] => (item=/ansible/roles/common/templates/rabbitmq-env.conf.j2) 2025-03-23 21:57:04.052687 | orchestrator | changed: [testbed-node-2] => (item=/ansible/roles/common/templates/rabbitmq-env.conf.j2) 2025-03-23 21:57:04.052696 | orchestrator | changed: [testbed-node-3] => (item=/ansible/roles/common/templates/rabbitmq-env.conf.j2) 2025-03-23 21:57:04.052706 | orchestrator | changed: [testbed-node-4] => (item=/ansible/roles/common/templates/rabbitmq-env.conf.j2) 2025-03-23 21:57:04.052715 | orchestrator | changed: [testbed-node-5] => (item=/ansible/roles/common/templates/rabbitmq-env.conf.j2) 2025-03-23 21:57:04.052725 | orchestrator | 2025-03-23 21:57:04.052734 | orchestrator | TASK [common : Copy rabbitmq erl_inetrc to kolla toolbox] ********************** 2025-03-23 21:57:04.052744 | orchestrator | Sunday 23 March 2025 21:56:05 +0000 (0:00:03.102) 0:01:45.978 ********** 2025-03-23 21:57:04.052753 | orchestrator | changed: [testbed-manager] => (item=/ansible/roles/common/templates/erl_inetrc.j2) 2025-03-23 21:57:04.052763 | orchestrator | changed: [testbed-node-1] => (item=/ansible/roles/common/templates/erl_inetrc.j2) 2025-03-23 21:57:04.052777 | orchestrator | changed: [testbed-node-0] => (item=/ansible/roles/common/templates/erl_inetrc.j2) 2025-03-23 21:57:04.052786 | orchestrator | changed: [testbed-node-2] => (item=/ansible/roles/common/templates/erl_inetrc.j2) 2025-03-23 21:57:04.052796 | orchestrator | changed: [testbed-node-4] => (item=/ansible/roles/common/templates/erl_inetrc.j2) 2025-03-23 21:57:04.052805 | orchestrator | changed: [testbed-node-3] => (item=/ansible/roles/common/templates/erl_inetrc.j2) 2025-03-23 21:57:04.052814 | orchestrator | changed: [testbed-node-5] => (item=/ansible/roles/common/templates/erl_inetrc.j2) 2025-03-23 21:57:04.052824 | orchestrator | 2025-03-23 21:57:04.052833 | orchestrator | TASK [common : Check common containers] **************************************** 2025-03-23 21:57:04.052843 | orchestrator | Sunday 23 March 2025 21:56:08 +0000 (0:00:03.339) 0:01:49.317 ********** 2025-03-23 21:57:04.052852 | orchestrator | changed: [testbed-manager] => (item={'key': 'fluentd', 'value': {'container_name': 
'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/fluentd:5.0.5.20241206', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}}) 2025-03-23 21:57:04.052862 | orchestrator | changed: [testbed-node-0] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/fluentd:5.0.5.20241206', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}}) 2025-03-23 21:57:04.052876 | orchestrator | changed: [testbed-node-2] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/fluentd:5.0.5.20241206', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}}) 2025-03-23 21:57:04.052886 | orchestrator | changed: [testbed-node-1] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/fluentd:5.0.5.20241206', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}}) 2025-03-23 21:57:04.052930 | orchestrator | changed: [testbed-manager] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/kolla-toolbox:18.3.0.20241206', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-23 21:57:04.052941 | orchestrator | changed: [testbed-node-3] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/fluentd:5.0.5.20241206', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}}) 2025-03-23 21:57:04.052955 | orchestrator | changed: [testbed-node-0] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 
'enabled': True, 'image': 'registry.osism.tech/kolla/release/kolla-toolbox:18.3.0.20241206', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-23 21:57:04.052965 | orchestrator | changed: [testbed-node-1] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/kolla-toolbox:18.3.0.20241206', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-23 21:57:04.052975 | orchestrator | changed: [testbed-node-2] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/kolla-toolbox:18.3.0.20241206', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-23 21:57:04.052989 | orchestrator | changed: [testbed-manager] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/cron:3.0.20241206', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-23 21:57:04.052999 | orchestrator | changed: [testbed-node-4] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/fluentd:5.0.5.20241206', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}}) 2025-03-23 21:57:04.053009 | orchestrator | changed: [testbed-node-3] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/kolla-toolbox:18.3.0.20241206', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-23 21:57:04.053019 | orchestrator | 
changed: [testbed-node-5] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/fluentd:5.0.5.20241206', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}}) 2025-03-23 21:57:04.053033 | orchestrator | changed: [testbed-node-2] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/cron:3.0.20241206', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-23 21:57:04.053042 | orchestrator | changed: [testbed-node-1] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/cron:3.0.20241206', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-23 21:57:04.053052 | orchestrator | changed: [testbed-node-0] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/cron:3.0.20241206', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-23 21:57:04.053062 | orchestrator | changed: [testbed-node-4] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/kolla-toolbox:18.3.0.20241206', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-23 21:57:04.053076 | orchestrator | changed: [testbed-node-5] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/kolla-toolbox:18.3.0.20241206', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-23 21:57:04.053087 | orchestrator | changed: [testbed-node-3] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/cron:3.0.20241206', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': 
['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-23 21:57:04.053097 | orchestrator | changed: [testbed-node-4] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/cron:3.0.20241206', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-23 21:57:04.053106 | orchestrator | changed: [testbed-node-5] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/cron:3.0.20241206', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-23 21:57:04.053120 | orchestrator | 2025-03-23 21:57:04.053129 | orchestrator | TASK [common : Creating log volume] ******************************************** 2025-03-23 21:57:04.053139 | orchestrator | Sunday 23 March 2025 21:56:13 +0000 (0:00:04.937) 0:01:54.255 ********** 2025-03-23 21:57:04.053148 | orchestrator | changed: [testbed-manager] 2025-03-23 21:57:04.053158 | orchestrator | changed: [testbed-node-0] 2025-03-23 21:57:04.053167 | orchestrator | changed: [testbed-node-1] 2025-03-23 21:57:04.053176 | orchestrator | changed: [testbed-node-2] 2025-03-23 21:57:04.053186 | orchestrator | changed: [testbed-node-3] 2025-03-23 21:57:04.053195 | orchestrator | changed: [testbed-node-4] 2025-03-23 21:57:04.053204 | orchestrator | changed: [testbed-node-5] 2025-03-23 21:57:04.053214 | orchestrator | 2025-03-23 21:57:04.053226 | orchestrator | TASK [common : Link kolla_logs volume to /var/log/kolla] *********************** 2025-03-23 21:57:04.053236 | orchestrator | Sunday 23 March 2025 21:56:15 +0000 (0:00:02.256) 0:01:56.511 ********** 2025-03-23 21:57:04.053245 | orchestrator | changed: [testbed-manager] 2025-03-23 21:57:04.053255 | orchestrator | changed: [testbed-node-0] 2025-03-23 21:57:04.053264 | orchestrator | changed: [testbed-node-1] 2025-03-23 21:57:04.053276 | orchestrator | changed: [testbed-node-2] 2025-03-23 21:57:04.053286 | orchestrator | changed: [testbed-node-3] 2025-03-23 21:57:04.053295 | orchestrator | changed: [testbed-node-4] 2025-03-23 21:57:04.053305 | orchestrator | changed: [testbed-node-5] 2025-03-23 21:57:04.053314 | orchestrator | 2025-03-23 21:57:04.053324 | orchestrator | TASK [common : Flush handlers] ************************************************* 2025-03-23 21:57:04.053333 | orchestrator | Sunday 23 March 2025 21:56:17 +0000 (0:00:01.581) 0:01:58.093 ********** 2025-03-23 21:57:04.053342 | orchestrator | 2025-03-23 21:57:04.053352 | orchestrator | TASK [common : Flush handlers] ************************************************* 2025-03-23 21:57:04.053361 | orchestrator | Sunday 23 March 2025 21:56:17 +0000 (0:00:00.059) 0:01:58.152 ********** 2025-03-23 21:57:04.053371 | orchestrator | 2025-03-23 21:57:04.053380 | orchestrator | TASK [common : Flush handlers] ************************************************* 2025-03-23 21:57:04.053390 | orchestrator | Sunday 23 March 2025 21:56:17 +0000 (0:00:00.054) 0:01:58.207 ********** 2025-03-23 21:57:04.053399 | 
orchestrator | 2025-03-23 21:57:04.053408 | orchestrator | TASK [common : Flush handlers] ************************************************* 2025-03-23 21:57:04.053418 | orchestrator | Sunday 23 March 2025 21:56:17 +0000 (0:00:00.093) 0:01:58.301 ********** 2025-03-23 21:57:04.053427 | orchestrator | 2025-03-23 21:57:04.053437 | orchestrator | TASK [common : Flush handlers] ************************************************* 2025-03-23 21:57:04.053446 | orchestrator | Sunday 23 March 2025 21:56:17 +0000 (0:00:00.262) 0:01:58.563 ********** 2025-03-23 21:57:04.053455 | orchestrator | 2025-03-23 21:57:04.053465 | orchestrator | TASK [common : Flush handlers] ************************************************* 2025-03-23 21:57:04.053474 | orchestrator | Sunday 23 March 2025 21:56:17 +0000 (0:00:00.056) 0:01:58.620 ********** 2025-03-23 21:57:04.053484 | orchestrator | 2025-03-23 21:57:04.053493 | orchestrator | TASK [common : Flush handlers] ************************************************* 2025-03-23 21:57:04.053502 | orchestrator | Sunday 23 March 2025 21:56:18 +0000 (0:00:00.054) 0:01:58.675 ********** 2025-03-23 21:57:04.053512 | orchestrator | 2025-03-23 21:57:04.053521 | orchestrator | RUNNING HANDLER [common : Restart fluentd container] *************************** 2025-03-23 21:57:04.053531 | orchestrator | Sunday 23 March 2025 21:56:18 +0000 (0:00:00.073) 0:01:58.748 ********** 2025-03-23 21:57:04.053540 | orchestrator | changed: [testbed-node-0] 2025-03-23 21:57:04.053565 | orchestrator | changed: [testbed-node-1] 2025-03-23 21:57:04.053576 | orchestrator | changed: [testbed-node-2] 2025-03-23 21:57:04.053585 | orchestrator | changed: [testbed-node-4] 2025-03-23 21:57:04.053595 | orchestrator | changed: [testbed-node-5] 2025-03-23 21:57:04.053609 | orchestrator | changed: [testbed-node-3] 2025-03-23 21:57:04.053618 | orchestrator | changed: [testbed-manager] 2025-03-23 21:57:04.053628 | orchestrator | 2025-03-23 21:57:04.053637 | orchestrator | RUNNING HANDLER [common : Restart kolla-toolbox container] ********************* 2025-03-23 21:57:04.053647 | orchestrator | Sunday 23 March 2025 21:56:27 +0000 (0:00:09.700) 0:02:08.448 ********** 2025-03-23 21:57:04.053656 | orchestrator | changed: [testbed-node-0] 2025-03-23 21:57:04.053665 | orchestrator | changed: [testbed-node-4] 2025-03-23 21:57:04.053675 | orchestrator | changed: [testbed-node-3] 2025-03-23 21:57:04.053684 | orchestrator | changed: [testbed-node-1] 2025-03-23 21:57:04.053694 | orchestrator | changed: [testbed-manager] 2025-03-23 21:57:04.053703 | orchestrator | changed: [testbed-node-5] 2025-03-23 21:57:04.053712 | orchestrator | changed: [testbed-node-2] 2025-03-23 21:57:04.053722 | orchestrator | 2025-03-23 21:57:04.053731 | orchestrator | RUNNING HANDLER [common : Initializing toolbox container using normal user] **** 2025-03-23 21:57:04.053740 | orchestrator | Sunday 23 March 2025 21:56:52 +0000 (0:00:24.885) 0:02:33.334 ********** 2025-03-23 21:57:04.053750 | orchestrator | ok: [testbed-node-1] 2025-03-23 21:57:04.053759 | orchestrator | ok: [testbed-node-3] 2025-03-23 21:57:04.053769 | orchestrator | ok: [testbed-node-2] 2025-03-23 21:57:04.053778 | orchestrator | ok: [testbed-manager] 2025-03-23 21:57:04.053788 | orchestrator | ok: [testbed-node-0] 2025-03-23 21:57:04.053797 | orchestrator | ok: [testbed-node-4] 2025-03-23 21:57:04.053806 | orchestrator | ok: [testbed-node-5] 2025-03-23 21:57:04.053816 | orchestrator | 2025-03-23 21:57:04.053825 | orchestrator | RUNNING HANDLER [common : Restart cron container] 
****************************** 2025-03-23 21:57:04.053835 | orchestrator | Sunday 23 March 2025 21:56:55 +0000 (0:00:02.965) 0:02:36.299 ********** 2025-03-23 21:57:04.053844 | orchestrator | changed: [testbed-node-0] 2025-03-23 21:57:04.053853 | orchestrator | changed: [testbed-node-1] 2025-03-23 21:57:04.053863 | orchestrator | changed: [testbed-node-2] 2025-03-23 21:57:04.053872 | orchestrator | changed: [testbed-node-3] 2025-03-23 21:57:04.053881 | orchestrator | changed: [testbed-manager] 2025-03-23 21:57:04.053890 | orchestrator | changed: [testbed-node-4] 2025-03-23 21:57:04.053900 | orchestrator | changed: [testbed-node-5] 2025-03-23 21:57:04.053909 | orchestrator | 2025-03-23 21:57:04.053919 | orchestrator | PLAY RECAP ********************************************************************* 2025-03-23 21:57:04.053929 | orchestrator | testbed-manager : ok=25  changed=15  unreachable=0 failed=0 skipped=6  rescued=0 ignored=0 2025-03-23 21:57:04.053939 | orchestrator | testbed-node-0 : ok=21  changed=14  unreachable=0 failed=0 skipped=6  rescued=0 ignored=0 2025-03-23 21:57:04.053949 | orchestrator | testbed-node-1 : ok=21  changed=15  unreachable=0 failed=0 skipped=6  rescued=0 ignored=0 2025-03-23 21:57:04.053959 | orchestrator | testbed-node-2 : ok=21  changed=15  unreachable=0 failed=0 skipped=6  rescued=0 ignored=0 2025-03-23 21:57:04.053968 | orchestrator | testbed-node-3 : ok=21  changed=15  unreachable=0 failed=0 skipped=6  rescued=0 ignored=0 2025-03-23 21:57:04.053977 | orchestrator | testbed-node-4 : ok=21  changed=15  unreachable=0 failed=0 skipped=6  rescued=0 ignored=0 2025-03-23 21:57:04.053987 | orchestrator | testbed-node-5 : ok=21  changed=15  unreachable=0 failed=0 skipped=6  rescued=0 ignored=0 2025-03-23 21:57:04.053996 | orchestrator | 2025-03-23 21:57:04.054006 | orchestrator | 2025-03-23 21:57:04.054049 | orchestrator | TASKS RECAP ******************************************************************** 2025-03-23 21:57:04.054061 | orchestrator | Sunday 23 March 2025 21:57:01 +0000 (0:00:06.288) 0:02:42.588 ********** 2025-03-23 21:57:04.054070 | orchestrator | =============================================================================== 2025-03-23 21:57:04.054085 | orchestrator | common : Ensure fluentd image is present for label check --------------- 31.52s 2025-03-23 21:57:04.054094 | orchestrator | common : Restart kolla-toolbox container ------------------------------- 24.89s 2025-03-23 21:57:04.054104 | orchestrator | common : Restart fluentd container -------------------------------------- 9.70s 2025-03-23 21:57:04.054113 | orchestrator | common : Copying over config.json files for services -------------------- 8.68s 2025-03-23 21:57:04.054126 | orchestrator | common : Copying over td-agent.conf ------------------------------------- 6.69s 2025-03-23 21:57:04.054136 | orchestrator | common : Restart cron container ----------------------------------------- 6.29s 2025-03-23 21:57:04.054145 | orchestrator | service-cert-copy : common | Copying over extra CA certificates --------- 5.87s 2025-03-23 21:57:04.054154 | orchestrator | common : Ensuring config directories exist ------------------------------ 5.67s 2025-03-23 21:57:04.054164 | orchestrator | common : Copying over cron logrotate config file ------------------------ 5.36s 2025-03-23 21:57:04.054174 | orchestrator | common : Check common containers ---------------------------------------- 4.94s 2025-03-23 21:57:04.054183 | orchestrator | common : Fetch fluentd Docker image labels 
------------------------------ 4.70s 2025-03-23 21:57:04.054193 | orchestrator | service-cert-copy : common | Copying over backend internal TLS key ------ 4.41s 2025-03-23 21:57:04.054202 | orchestrator | service-cert-copy : common | Copying over backend internal TLS certificate --- 4.08s 2025-03-23 21:57:04.054212 | orchestrator | common : Ensure RabbitMQ Erlang cookie exists --------------------------- 3.96s 2025-03-23 21:57:04.054226 | orchestrator | common : Ensuring config directories have correct owner and permission --- 3.53s 2025-03-23 21:57:07.092995 | orchestrator | common : Copy rabbitmq erl_inetrc to kolla toolbox ---------------------- 3.34s 2025-03-23 21:57:07.093122 | orchestrator | common : Copy rabbitmq-env.conf to kolla toolbox ------------------------ 3.10s 2025-03-23 21:57:07.093141 | orchestrator | common : Initializing toolbox container using normal user --------------- 2.97s 2025-03-23 21:57:07.093157 | orchestrator | common : Fetch fluentd Podman image labels ------------------------------ 2.61s 2025-03-23 21:57:07.093172 | orchestrator | common : include_tasks -------------------------------------------------- 2.44s 2025-03-23 21:57:07.093188 | orchestrator | 2025-03-23 21:57:04 | INFO  | Task 1340c20d-0f21-4fd8-8ddd-868f12a1e264 is in state STARTED 2025-03-23 21:57:07.093203 | orchestrator | 2025-03-23 21:57:04 | INFO  | Wait 1 second(s) until the next check 2025-03-23 21:57:07.093234 | orchestrator | 2025-03-23 21:57:07 | INFO  | Task f712b796-38d0-4a95-8f80-410bcda57c10 is in state STARTED 2025-03-23 21:57:07.093839 | orchestrator | 2025-03-23 21:57:07 | INFO  | Task f3cd4eb5-5489-4d5d-9d13-5fb3ddbc8b2b is in state STARTED 2025-03-23 21:57:07.094127 | orchestrator | 2025-03-23 21:57:07 | INFO  | Task 8ad7ed55-5dba-47ef-a4f0-b80146615b97 is in state STARTED 2025-03-23 21:57:07.094154 | orchestrator | 2025-03-23 21:57:07 | INFO  | Task 6f5ec00a-9f2e-40eb-a1d2-26a43473a853 is in state STARTED 2025-03-23 21:57:07.096804 | orchestrator | 2025-03-23 21:57:07 | INFO  | Task 371e1403-f3a7-4305-bd63-6e197a55c018 is in state STARTED 2025-03-23 21:57:10.147213 | orchestrator | 2025-03-23 21:57:07 | INFO  | Task 1340c20d-0f21-4fd8-8ddd-868f12a1e264 is in state STARTED 2025-03-23 21:57:10.147335 | orchestrator | 2025-03-23 21:57:07 | INFO  | Wait 1 second(s) until the next check 2025-03-23 21:57:10.147370 | orchestrator | 2025-03-23 21:57:10 | INFO  | Task f712b796-38d0-4a95-8f80-410bcda57c10 is in state STARTED 2025-03-23 21:57:10.147537 | orchestrator | 2025-03-23 21:57:10 | INFO  | Task f3cd4eb5-5489-4d5d-9d13-5fb3ddbc8b2b is in state STARTED 2025-03-23 21:57:10.147972 | orchestrator | 2025-03-23 21:57:10 | INFO  | Task 8ad7ed55-5dba-47ef-a4f0-b80146615b97 is in state STARTED 2025-03-23 21:57:10.148619 | orchestrator | 2025-03-23 21:57:10 | INFO  | Task 6f5ec00a-9f2e-40eb-a1d2-26a43473a853 is in state STARTED 2025-03-23 21:57:10.149206 | orchestrator | 2025-03-23 21:57:10 | INFO  | Task 371e1403-f3a7-4305-bd63-6e197a55c018 is in state STARTED 2025-03-23 21:57:10.149797 | orchestrator | 2025-03-23 21:57:10 | INFO  | Task 1340c20d-0f21-4fd8-8ddd-868f12a1e264 is in state STARTED 2025-03-23 21:57:13.191612 | orchestrator | 2025-03-23 21:57:10 | INFO  | Wait 1 second(s) until the next check 2025-03-23 21:57:13.191805 | orchestrator | 2025-03-23 21:57:13 | INFO  | Task f712b796-38d0-4a95-8f80-410bcda57c10 is in state STARTED 2025-03-23 21:57:13.191898 | orchestrator | 2025-03-23 21:57:13 | INFO  | Task f3cd4eb5-5489-4d5d-9d13-5fb3ddbc8b2b is in state STARTED 2025-03-23 
21:57:13.193677 | orchestrator | 2025-03-23 21:57:13 | INFO  | Task 8ad7ed55-5dba-47ef-a4f0-b80146615b97 is in state STARTED 2025-03-23 21:57:13.195924 | orchestrator | 2025-03-23 21:57:13 | INFO  | Task 6f5ec00a-9f2e-40eb-a1d2-26a43473a853 is in state STARTED 2025-03-23 21:57:13.196213 | orchestrator | 2025-03-23 21:57:13 | INFO  | Task 371e1403-f3a7-4305-bd63-6e197a55c018 is in state STARTED 2025-03-23 21:57:13.203134 | orchestrator | 2025-03-23 21:57:13 | INFO  | Task 1340c20d-0f21-4fd8-8ddd-868f12a1e264 is in state STARTED 2025-03-23 21:57:16.267650 | orchestrator | 2025-03-23 21:57:13 | INFO  | Wait 1 second(s) until the next check 2025-03-23 21:57:16.267786 | orchestrator | 2025-03-23 21:57:16 | INFO  | Task f712b796-38d0-4a95-8f80-410bcda57c10 is in state STARTED 2025-03-23 21:57:16.268471 | orchestrator | 2025-03-23 21:57:16 | INFO  | Task f3cd4eb5-5489-4d5d-9d13-5fb3ddbc8b2b is in state STARTED 2025-03-23 21:57:16.268510 | orchestrator | 2025-03-23 21:57:16 | INFO  | Task 8ad7ed55-5dba-47ef-a4f0-b80146615b97 is in state STARTED 2025-03-23 21:57:16.272715 | orchestrator | 2025-03-23 21:57:16 | INFO  | Task 6f5ec00a-9f2e-40eb-a1d2-26a43473a853 is in state STARTED 2025-03-23 21:57:16.273220 | orchestrator | 2025-03-23 21:57:16 | INFO  | Task 371e1403-f3a7-4305-bd63-6e197a55c018 is in state STARTED 2025-03-23 21:57:16.273875 | orchestrator | 2025-03-23 21:57:16 | INFO  | Task 1340c20d-0f21-4fd8-8ddd-868f12a1e264 is in state STARTED 2025-03-23 21:57:16.276998 | orchestrator | 2025-03-23 21:57:16 | INFO  | Wait 1 second(s) until the next check 2025-03-23 21:57:19.313784 | orchestrator | 2025-03-23 21:57:19 | INFO  | Task f712b796-38d0-4a95-8f80-410bcda57c10 is in state STARTED 2025-03-23 21:57:19.318962 | orchestrator | 2025-03-23 21:57:19 | INFO  | Task f3cd4eb5-5489-4d5d-9d13-5fb3ddbc8b2b is in state STARTED 2025-03-23 21:57:19.320036 | orchestrator | 2025-03-23 21:57:19 | INFO  | Task 8ad7ed55-5dba-47ef-a4f0-b80146615b97 is in state STARTED 2025-03-23 21:57:19.323864 | orchestrator | 2025-03-23 21:57:19 | INFO  | Task 6f5ec00a-9f2e-40eb-a1d2-26a43473a853 is in state STARTED 2025-03-23 21:57:19.327964 | orchestrator | 2025-03-23 21:57:19 | INFO  | Task 371e1403-f3a7-4305-bd63-6e197a55c018 is in state STARTED 2025-03-23 21:57:19.327996 | orchestrator | 2025-03-23 21:57:19 | INFO  | Task 1340c20d-0f21-4fd8-8ddd-868f12a1e264 is in state STARTED 2025-03-23 21:57:22.374726 | orchestrator | 2025-03-23 21:57:19 | INFO  | Wait 1 second(s) until the next check 2025-03-23 21:57:22.374853 | orchestrator | 2025-03-23 21:57:22 | INFO  | Task f712b796-38d0-4a95-8f80-410bcda57c10 is in state STARTED 2025-03-23 21:57:22.375398 | orchestrator | 2025-03-23 21:57:22 | INFO  | Task f3cd4eb5-5489-4d5d-9d13-5fb3ddbc8b2b is in state STARTED 2025-03-23 21:57:22.375431 | orchestrator | 2025-03-23 21:57:22 | INFO  | Task 8ad7ed55-5dba-47ef-a4f0-b80146615b97 is in state STARTED 2025-03-23 21:57:22.377456 | orchestrator | 2025-03-23 21:57:22 | INFO  | Task 6f5ec00a-9f2e-40eb-a1d2-26a43473a853 is in state STARTED 2025-03-23 21:57:25.426714 | orchestrator | 2025-03-23 21:57:22 | INFO  | Task 371e1403-f3a7-4305-bd63-6e197a55c018 is in state STARTED 2025-03-23 21:57:25.426828 | orchestrator | 2025-03-23 21:57:22 | INFO  | Task 1340c20d-0f21-4fd8-8ddd-868f12a1e264 is in state STARTED 2025-03-23 21:57:25.426847 | orchestrator | 2025-03-23 21:57:22 | INFO  | Wait 1 second(s) until the next check 2025-03-23 21:57:25.426879 | orchestrator | 2025-03-23 21:57:25 | INFO  | Task f712b796-38d0-4a95-8f80-410bcda57c10 is in 
state STARTED 2025-03-23 21:57:25.428395 | orchestrator | 2025-03-23 21:57:25 | INFO  | Task f3cd4eb5-5489-4d5d-9d13-5fb3ddbc8b2b is in state STARTED 2025-03-23 21:57:25.431593 | orchestrator | 2025-03-23 21:57:25 | INFO  | Task 8ad7ed55-5dba-47ef-a4f0-b80146615b97 is in state STARTED 2025-03-23 21:57:25.433993 | orchestrator | 2025-03-23 21:57:25 | INFO  | Task 6f5ec00a-9f2e-40eb-a1d2-26a43473a853 is in state STARTED 2025-03-23 21:57:25.437036 | orchestrator | 2025-03-23 21:57:25 | INFO  | Task 371e1403-f3a7-4305-bd63-6e197a55c018 is in state STARTED 2025-03-23 21:57:25.439259 | orchestrator | 2025-03-23 21:57:25 | INFO  | Task 1340c20d-0f21-4fd8-8ddd-868f12a1e264 is in state STARTED 2025-03-23 21:57:28.488501 | orchestrator | 2025-03-23 21:57:25 | INFO  | Wait 1 second(s) until the next check 2025-03-23 21:57:28.488685 | orchestrator | 2025-03-23 21:57:28 | INFO  | Task f712b796-38d0-4a95-8f80-410bcda57c10 is in state STARTED 2025-03-23 21:57:28.490735 | orchestrator | 2025-03-23 21:57:28 | INFO  | Task f3cd4eb5-5489-4d5d-9d13-5fb3ddbc8b2b is in state STARTED 2025-03-23 21:57:28.491945 | orchestrator | 2025-03-23 21:57:28 | INFO  | Task 8ad7ed55-5dba-47ef-a4f0-b80146615b97 is in state STARTED 2025-03-23 21:57:28.493646 | orchestrator | 2025-03-23 21:57:28 | INFO  | Task 6f5ec00a-9f2e-40eb-a1d2-26a43473a853 is in state STARTED 2025-03-23 21:57:28.494221 | orchestrator | 2025-03-23 21:57:28 | INFO  | Task 371e1403-f3a7-4305-bd63-6e197a55c018 is in state STARTED 2025-03-23 21:57:28.494691 | orchestrator | 2025-03-23 21:57:28 | INFO  | Task 1340c20d-0f21-4fd8-8ddd-868f12a1e264 is in state STARTED 2025-03-23 21:57:28.495076 | orchestrator | 2025-03-23 21:57:28 | INFO  | Wait 1 second(s) until the next check 2025-03-23 21:57:31.580511 | orchestrator | 2025-03-23 21:57:31 | INFO  | Task f712b796-38d0-4a95-8f80-410bcda57c10 is in state STARTED 2025-03-23 21:57:31.582179 | orchestrator | 2025-03-23 21:57:31 | INFO  | Task f3cd4eb5-5489-4d5d-9d13-5fb3ddbc8b2b is in state STARTED 2025-03-23 21:57:31.584674 | orchestrator | 2025-03-23 21:57:31 | INFO  | Task aa6bad5e-97be-40e3-98ae-1bbfade1dac3 is in state STARTED 2025-03-23 21:57:31.588435 | orchestrator | 2025-03-23 21:57:31 | INFO  | Task 8ad7ed55-5dba-47ef-a4f0-b80146615b97 is in state SUCCESS 2025-03-23 21:57:31.591870 | orchestrator | 2025-03-23 21:57:31 | INFO  | Task 6f5ec00a-9f2e-40eb-a1d2-26a43473a853 is in state STARTED 2025-03-23 21:57:31.594100 | orchestrator | 2025-03-23 21:57:31 | INFO  | Task 371e1403-f3a7-4305-bd63-6e197a55c018 is in state STARTED 2025-03-23 21:57:31.597363 | orchestrator | 2025-03-23 21:57:31 | INFO  | Task 1340c20d-0f21-4fd8-8ddd-868f12a1e264 is in state STARTED 2025-03-23 21:57:34.649955 | orchestrator | 2025-03-23 21:57:31 | INFO  | Wait 1 second(s) until the next check 2025-03-23 21:57:34.650147 | orchestrator | 2025-03-23 21:57:34 | INFO  | Task f712b796-38d0-4a95-8f80-410bcda57c10 is in state STARTED 2025-03-23 21:57:34.652805 | orchestrator | 2025-03-23 21:57:34 | INFO  | Task f3cd4eb5-5489-4d5d-9d13-5fb3ddbc8b2b is in state STARTED 2025-03-23 21:57:34.652838 | orchestrator | 2025-03-23 21:57:34 | INFO  | Task aa6bad5e-97be-40e3-98ae-1bbfade1dac3 is in state STARTED 2025-03-23 21:57:34.658680 | orchestrator | 2025-03-23 21:57:34 | INFO  | Task 6f5ec00a-9f2e-40eb-a1d2-26a43473a853 is in state STARTED 2025-03-23 21:57:34.659845 | orchestrator | 2025-03-23 21:57:34 | INFO  | Task 371e1403-f3a7-4305-bd63-6e197a55c018 is in state STARTED 2025-03-23 21:57:34.659875 | orchestrator | 2025-03-23 21:57:34 | INFO  | 
Task 1340c20d-0f21-4fd8-8ddd-868f12a1e264 is in state STARTED 2025-03-23 21:57:34.660546 | orchestrator | 2025-03-23 21:57:34 | INFO  | Wait 1 second(s) until the next check 2025-03-23 21:57:37.716290 | orchestrator | 2025-03-23 21:57:37 | INFO  | Task f712b796-38d0-4a95-8f80-410bcda57c10 is in state STARTED 2025-03-23 21:57:37.717645 | orchestrator | 2025-03-23 21:57:37 | INFO  | Task f3cd4eb5-5489-4d5d-9d13-5fb3ddbc8b2b is in state STARTED 2025-03-23 21:57:37.719828 | orchestrator | 2025-03-23 21:57:37 | INFO  | Task aa6bad5e-97be-40e3-98ae-1bbfade1dac3 is in state STARTED 2025-03-23 21:57:37.720732 | orchestrator | 2025-03-23 21:57:37 | INFO  | Task 6f5ec00a-9f2e-40eb-a1d2-26a43473a853 is in state STARTED 2025-03-23 21:57:37.722402 | orchestrator | 2025-03-23 21:57:37 | INFO  | Task 371e1403-f3a7-4305-bd63-6e197a55c018 is in state STARTED 2025-03-23 21:57:37.723581 | orchestrator | 2025-03-23 21:57:37 | INFO  | Task 1340c20d-0f21-4fd8-8ddd-868f12a1e264 is in state STARTED 2025-03-23 21:57:40.802799 | orchestrator | 2025-03-23 21:57:37 | INFO  | Wait 1 second(s) until the next check 2025-03-23 21:57:40.802946 | orchestrator | 2025-03-23 21:57:40 | INFO  | Task f712b796-38d0-4a95-8f80-410bcda57c10 is in state STARTED 2025-03-23 21:57:40.806299 | orchestrator | 2025-03-23 21:57:40 | INFO  | Task f3cd4eb5-5489-4d5d-9d13-5fb3ddbc8b2b is in state STARTED 2025-03-23 21:57:40.811917 | orchestrator | 2025-03-23 21:57:40 | INFO  | Task aa6bad5e-97be-40e3-98ae-1bbfade1dac3 is in state STARTED 2025-03-23 21:57:40.817230 | orchestrator | 2025-03-23 21:57:40 | INFO  | Task 6f5ec00a-9f2e-40eb-a1d2-26a43473a853 is in state STARTED 2025-03-23 21:57:40.823975 | orchestrator | 2025-03-23 21:57:40 | INFO  | Task 371e1403-f3a7-4305-bd63-6e197a55c018 is in state STARTED 2025-03-23 21:57:40.826396 | orchestrator | 2025-03-23 21:57:40 | INFO  | Task 1340c20d-0f21-4fd8-8ddd-868f12a1e264 is in state STARTED 2025-03-23 21:57:43.867026 | orchestrator | 2025-03-23 21:57:40 | INFO  | Wait 1 second(s) until the next check 2025-03-23 21:57:43.867164 | orchestrator | 2025-03-23 21:57:43 | INFO  | Task f712b796-38d0-4a95-8f80-410bcda57c10 is in state STARTED 2025-03-23 21:57:43.867249 | orchestrator | 2025-03-23 21:57:43 | INFO  | Task f3cd4eb5-5489-4d5d-9d13-5fb3ddbc8b2b is in state STARTED 2025-03-23 21:57:43.867700 | orchestrator | 2025-03-23 21:57:43 | INFO  | Task aa6bad5e-97be-40e3-98ae-1bbfade1dac3 is in state STARTED 2025-03-23 21:57:43.868186 | orchestrator | 2025-03-23 21:57:43 | INFO  | Task 6f5ec00a-9f2e-40eb-a1d2-26a43473a853 is in state STARTED 2025-03-23 21:57:43.868730 | orchestrator | 2025-03-23 21:57:43 | INFO  | Task 371e1403-f3a7-4305-bd63-6e197a55c018 is in state STARTED 2025-03-23 21:57:43.869422 | orchestrator | 2025-03-23 21:57:43 | INFO  | Task 1340c20d-0f21-4fd8-8ddd-868f12a1e264 is in state STARTED 2025-03-23 21:57:46.913988 | orchestrator | 2025-03-23 21:57:43 | INFO  | Wait 1 second(s) until the next check 2025-03-23 21:57:46.914182 | orchestrator | 2025-03-23 21:57:46 | INFO  | Task f712b796-38d0-4a95-8f80-410bcda57c10 is in state STARTED 2025-03-23 21:57:46.915116 | orchestrator | 2025-03-23 21:57:46 | INFO  | Task f3cd4eb5-5489-4d5d-9d13-5fb3ddbc8b2b is in state STARTED 2025-03-23 21:57:46.915149 | orchestrator | 2025-03-23 21:57:46 | INFO  | Task aa6bad5e-97be-40e3-98ae-1bbfade1dac3 is in state STARTED 2025-03-23 21:57:46.915984 | orchestrator | 2025-03-23 21:57:46 | INFO  | Task 6f5ec00a-9f2e-40eb-a1d2-26a43473a853 is in state STARTED 2025-03-23 21:57:46.916827 | orchestrator | 
2025-03-23 21:57:46 | INFO  | Task 371e1403-f3a7-4305-bd63-6e197a55c018 is in state STARTED 2025-03-23 21:57:46.917751 | orchestrator | 2025-03-23 21:57:46 | INFO  | Task 1340c20d-0f21-4fd8-8ddd-868f12a1e264 is in state STARTED 2025-03-23 21:57:49.966692 | orchestrator | 2025-03-23 21:57:46 | INFO  | Wait 1 second(s) until the next check 2025-03-23 21:57:49.966832 | orchestrator | 2025-03-23 21:57:49 | INFO  | Task f712b796-38d0-4a95-8f80-410bcda57c10 is in state STARTED 2025-03-23 21:57:49.967327 | orchestrator | 2025-03-23 21:57:49 | INFO  | Task f3cd4eb5-5489-4d5d-9d13-5fb3ddbc8b2b is in state STARTED 2025-03-23 21:57:49.968030 | orchestrator | 2025-03-23 21:57:49 | INFO  | Task aa6bad5e-97be-40e3-98ae-1bbfade1dac3 is in state STARTED 2025-03-23 21:57:49.971415 | orchestrator | 2025-03-23 21:57:49 | INFO  | Task 6f5ec00a-9f2e-40eb-a1d2-26a43473a853 is in state STARTED 2025-03-23 21:57:49.972646 | orchestrator | 2025-03-23 21:57:49 | INFO  | Task 371e1403-f3a7-4305-bd63-6e197a55c018 is in state STARTED 2025-03-23 21:57:49.972876 | orchestrator | 2025-03-23 21:57:49 | INFO  | Task 1340c20d-0f21-4fd8-8ddd-868f12a1e264 is in state SUCCESS 2025-03-23 21:57:49.973138 | orchestrator | 2025-03-23 21:57:49 | INFO  | Wait 1 second(s) until the next check 2025-03-23 21:57:49.974237 | orchestrator | 2025-03-23 21:57:49.974273 | orchestrator | 2025-03-23 21:57:49.974289 | orchestrator | PLAY [Group hosts based on configuration] ************************************** 2025-03-23 21:57:49.974305 | orchestrator | 2025-03-23 21:57:49.974320 | orchestrator | TASK [Group hosts based on Kolla action] *************************************** 2025-03-23 21:57:49.974335 | orchestrator | Sunday 23 March 2025 21:57:08 +0000 (0:00:00.525) 0:00:00.525 ********** 2025-03-23 21:57:49.974350 | orchestrator | ok: [testbed-node-0] 2025-03-23 21:57:49.974367 | orchestrator | ok: [testbed-node-1] 2025-03-23 21:57:49.974382 | orchestrator | ok: [testbed-node-2] 2025-03-23 21:57:49.974396 | orchestrator | 2025-03-23 21:57:49.974411 | orchestrator | TASK [Group hosts based on enabled services] *********************************** 2025-03-23 21:57:49.974426 | orchestrator | Sunday 23 March 2025 21:57:09 +0000 (0:00:00.780) 0:00:01.305 ********** 2025-03-23 21:57:49.974441 | orchestrator | ok: [testbed-node-0] => (item=enable_memcached_True) 2025-03-23 21:57:49.974456 | orchestrator | ok: [testbed-node-1] => (item=enable_memcached_True) 2025-03-23 21:57:49.974471 | orchestrator | ok: [testbed-node-2] => (item=enable_memcached_True) 2025-03-23 21:57:49.974485 | orchestrator | 2025-03-23 21:57:49.974499 | orchestrator | PLAY [Apply role memcached] **************************************************** 2025-03-23 21:57:49.974514 | orchestrator | 2025-03-23 21:57:49.974528 | orchestrator | TASK [memcached : include_tasks] *********************************************** 2025-03-23 21:57:49.974542 | orchestrator | Sunday 23 March 2025 21:57:09 +0000 (0:00:00.671) 0:00:01.976 ********** 2025-03-23 21:57:49.974585 | orchestrator | included: /ansible/roles/memcached/tasks/deploy.yml for testbed-node-0, testbed-node-1, testbed-node-2 2025-03-23 21:57:49.974601 | orchestrator | 2025-03-23 21:57:49.974615 | orchestrator | TASK [memcached : Ensuring config directories exist] *************************** 2025-03-23 21:57:49.974629 | orchestrator | Sunday 23 March 2025 21:57:11 +0000 (0:00:01.859) 0:00:03.836 ********** 2025-03-23 21:57:49.974644 | orchestrator | changed: [testbed-node-1] => (item=memcached) 2025-03-23 21:57:49.974658 | 
orchestrator | changed: [testbed-node-0] => (item=memcached) 2025-03-23 21:57:49.974671 | orchestrator | changed: [testbed-node-2] => (item=memcached) 2025-03-23 21:57:49.974685 | orchestrator | 2025-03-23 21:57:49.974699 | orchestrator | TASK [memcached : Copying over config.json files for services] ***************** 2025-03-23 21:57:49.974713 | orchestrator | Sunday 23 March 2025 21:57:12 +0000 (0:00:01.203) 0:00:05.040 ********** 2025-03-23 21:57:49.974727 | orchestrator | changed: [testbed-node-0] => (item=memcached) 2025-03-23 21:57:49.974741 | orchestrator | changed: [testbed-node-1] => (item=memcached) 2025-03-23 21:57:49.974781 | orchestrator | changed: [testbed-node-2] => (item=memcached) 2025-03-23 21:57:49.974795 | orchestrator | 2025-03-23 21:57:49.974809 | orchestrator | TASK [memcached : Check memcached container] *********************************** 2025-03-23 21:57:49.974823 | orchestrator | Sunday 23 March 2025 21:57:16 +0000 (0:00:03.467) 0:00:08.507 ********** 2025-03-23 21:57:49.974839 | orchestrator | changed: [testbed-node-0] 2025-03-23 21:57:49.974871 | orchestrator | changed: [testbed-node-2] 2025-03-23 21:57:49.974886 | orchestrator | changed: [testbed-node-1] 2025-03-23 21:57:49.974902 | orchestrator | 2025-03-23 21:57:49.974917 | orchestrator | RUNNING HANDLER [memcached : Restart memcached container] ********************** 2025-03-23 21:57:49.974932 | orchestrator | Sunday 23 March 2025 21:57:20 +0000 (0:00:04.127) 0:00:12.634 ********** 2025-03-23 21:57:49.974947 | orchestrator | changed: [testbed-node-1] 2025-03-23 21:57:49.974963 | orchestrator | changed: [testbed-node-0] 2025-03-23 21:57:49.974978 | orchestrator | changed: [testbed-node-2] 2025-03-23 21:57:49.974993 | orchestrator | 2025-03-23 21:57:49.975013 | orchestrator | PLAY RECAP ********************************************************************* 2025-03-23 21:57:49.975029 | orchestrator | testbed-node-0 : ok=7  changed=4  unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 2025-03-23 21:57:49.975046 | orchestrator | testbed-node-1 : ok=7  changed=4  unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 2025-03-23 21:57:49.975062 | orchestrator | testbed-node-2 : ok=7  changed=4  unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 2025-03-23 21:57:49.975077 | orchestrator | 2025-03-23 21:57:49.975092 | orchestrator | 2025-03-23 21:57:49.975108 | orchestrator | TASKS RECAP ******************************************************************** 2025-03-23 21:57:49.975123 | orchestrator | Sunday 23 March 2025 21:57:28 +0000 (0:00:08.181) 0:00:20.816 ********** 2025-03-23 21:57:49.975139 | orchestrator | =============================================================================== 2025-03-23 21:57:49.975154 | orchestrator | memcached : Restart memcached container --------------------------------- 8.18s 2025-03-23 21:57:49.975170 | orchestrator | memcached : Check memcached container ----------------------------------- 4.13s 2025-03-23 21:57:49.975185 | orchestrator | memcached : Copying over config.json files for services ----------------- 3.47s 2025-03-23 21:57:49.975200 | orchestrator | memcached : include_tasks ----------------------------------------------- 1.86s 2025-03-23 21:57:49.975213 | orchestrator | memcached : Ensuring config directories exist --------------------------- 1.20s 2025-03-23 21:57:49.975227 | orchestrator | Group hosts based on Kolla action --------------------------------------- 0.78s 2025-03-23 21:57:49.975241 | orchestrator | Group hosts based on enabled services 
----------------------------------- 0.67s 2025-03-23 21:57:49.975255 | orchestrator | 2025-03-23 21:57:49.975269 | orchestrator | 2025-03-23 21:57:49.975283 | orchestrator | PLAY [Group hosts based on configuration] ************************************** 2025-03-23 21:57:49.975297 | orchestrator | 2025-03-23 21:57:49.975310 | orchestrator | TASK [Group hosts based on Kolla action] *************************************** 2025-03-23 21:57:49.975324 | orchestrator | Sunday 23 March 2025 21:57:08 +0000 (0:00:00.762) 0:00:00.762 ********** 2025-03-23 21:57:49.975338 | orchestrator | ok: [testbed-node-0] 2025-03-23 21:57:49.975352 | orchestrator | ok: [testbed-node-1] 2025-03-23 21:57:49.975366 | orchestrator | ok: [testbed-node-2] 2025-03-23 21:57:49.975380 | orchestrator | 2025-03-23 21:57:49.975394 | orchestrator | TASK [Group hosts based on enabled services] *********************************** 2025-03-23 21:57:49.975418 | orchestrator | Sunday 23 March 2025 21:57:09 +0000 (0:00:01.025) 0:00:01.788 ********** 2025-03-23 21:57:49.975433 | orchestrator | ok: [testbed-node-0] => (item=enable_redis_True) 2025-03-23 21:57:49.975447 | orchestrator | ok: [testbed-node-1] => (item=enable_redis_True) 2025-03-23 21:57:49.975461 | orchestrator | ok: [testbed-node-2] => (item=enable_redis_True) 2025-03-23 21:57:49.975475 | orchestrator | 2025-03-23 21:57:49.975489 | orchestrator | PLAY [Apply role redis] ******************************************************** 2025-03-23 21:57:49.975511 | orchestrator | 2025-03-23 21:57:49.975525 | orchestrator | TASK [redis : include_tasks] *************************************************** 2025-03-23 21:57:49.975539 | orchestrator | Sunday 23 March 2025 21:57:09 +0000 (0:00:00.542) 0:00:02.330 ********** 2025-03-23 21:57:49.975570 | orchestrator | included: /ansible/roles/redis/tasks/deploy.yml for testbed-node-0, testbed-node-1, testbed-node-2 2025-03-23 21:57:49.975586 | orchestrator | 2025-03-23 21:57:49.975600 | orchestrator | TASK [redis : Ensuring config directories exist] ******************************* 2025-03-23 21:57:49.975614 | orchestrator | Sunday 23 March 2025 21:57:11 +0000 (0:00:01.388) 0:00:03.718 ********** 2025-03-23 21:57:49.975631 | orchestrator | changed: [testbed-node-1] => (item={'key': 'redis', 'value': {'container_name': 'redis', 'group': 'redis', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/redis:6.0.16.20241206', 'volumes': ['/etc/kolla/redis/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'redis:/var/lib/redis/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen redis-server 6379'], 'timeout': '30'}}}) 2025-03-23 21:57:49.975651 | orchestrator | changed: [testbed-node-2] => (item={'key': 'redis', 'value': {'container_name': 'redis', 'group': 'redis', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/redis:6.0.16.20241206', 'volumes': ['/etc/kolla/redis/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'redis:/var/lib/redis/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen redis-server 6379'], 'timeout': '30'}}}) 2025-03-23 21:57:49.975666 | orchestrator | changed: [testbed-node-0] => (item={'key': 'redis', 'value': {'container_name': 'redis', 'group': 'redis', 'enabled': True, 
'image': 'registry.osism.tech/kolla/release/redis:6.0.16.20241206', 'volumes': ['/etc/kolla/redis/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'redis:/var/lib/redis/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen redis-server 6379'], 'timeout': '30'}}}) 2025-03-23 21:57:49.975681 | orchestrator | changed: [testbed-node-1] => (item={'key': 'redis-sentinel', 'value': {'container_name': 'redis_sentinel', 'group': 'redis', 'environment': {'REDIS_CONF': '/etc/redis/redis.conf', 'REDIS_GEN_CONF': '/etc/redis/redis-regenerated-by-config-rewrite.conf'}, 'enabled': True, 'image': 'registry.osism.tech/kolla/release/redis-sentinel:6.0.16.20241206', 'volumes': ['/etc/kolla/redis-sentinel/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen redis-sentinel 26379'], 'timeout': '30'}}}) 2025-03-23 21:57:49.975696 | orchestrator | changed: [testbed-node-0] => (item={'key': 'redis-sentinel', 'value': {'container_name': 'redis_sentinel', 'group': 'redis', 'environment': {'REDIS_CONF': '/etc/redis/redis.conf', 'REDIS_GEN_CONF': '/etc/redis/redis-regenerated-by-config-rewrite.conf'}, 'enabled': True, 'image': 'registry.osism.tech/kolla/release/redis-sentinel:6.0.16.20241206', 'volumes': ['/etc/kolla/redis-sentinel/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen redis-sentinel 26379'], 'timeout': '30'}}}) 2025-03-23 21:57:49.975725 | orchestrator | changed: [testbed-node-2] => (item={'key': 'redis-sentinel', 'value': {'container_name': 'redis_sentinel', 'group': 'redis', 'environment': {'REDIS_CONF': '/etc/redis/redis.conf', 'REDIS_GEN_CONF': '/etc/redis/redis-regenerated-by-config-rewrite.conf'}, 'enabled': True, 'image': 'registry.osism.tech/kolla/release/redis-sentinel:6.0.16.20241206', 'volumes': ['/etc/kolla/redis-sentinel/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen redis-sentinel 26379'], 'timeout': '30'}}}) 2025-03-23 21:57:49.975747 | orchestrator | 2025-03-23 21:57:49.975762 | orchestrator | TASK [redis : Copying over default config.json files] ************************** 2025-03-23 21:57:49.975776 | orchestrator | Sunday 23 March 2025 21:57:13 +0000 (0:00:02.320) 0:00:06.039 ********** 2025-03-23 21:57:49.975790 | orchestrator | changed: [testbed-node-2] => (item={'key': 'redis', 'value': {'container_name': 'redis', 'group': 'redis', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/redis:6.0.16.20241206', 'volumes': ['/etc/kolla/redis/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'redis:/var/lib/redis/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen redis-server 6379'], 'timeout': '30'}}}) 2025-03-23 
21:57:49.975805 | orchestrator | changed: [testbed-node-1] => (item={'key': 'redis', 'value': {'container_name': 'redis', 'group': 'redis', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/redis:6.0.16.20241206', 'volumes': ['/etc/kolla/redis/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'redis:/var/lib/redis/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen redis-server 6379'], 'timeout': '30'}}}) 2025-03-23 21:57:49.975820 | orchestrator | changed: [testbed-node-0] => (item={'key': 'redis', 'value': {'container_name': 'redis', 'group': 'redis', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/redis:6.0.16.20241206', 'volumes': ['/etc/kolla/redis/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'redis:/var/lib/redis/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen redis-server 6379'], 'timeout': '30'}}}) 2025-03-23 21:57:49.975835 | orchestrator | changed: [testbed-node-2] => (item={'key': 'redis-sentinel', 'value': {'container_name': 'redis_sentinel', 'group': 'redis', 'environment': {'REDIS_CONF': '/etc/redis/redis.conf', 'REDIS_GEN_CONF': '/etc/redis/redis-regenerated-by-config-rewrite.conf'}, 'enabled': True, 'image': 'registry.osism.tech/kolla/release/redis-sentinel:6.0.16.20241206', 'volumes': ['/etc/kolla/redis-sentinel/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen redis-sentinel 26379'], 'timeout': '30'}}}) 2025-03-23 21:57:49.975850 | orchestrator | changed: [testbed-node-1] => (item={'key': 'redis-sentinel', 'value': {'container_name': 'redis_sentinel', 'group': 'redis', 'environment': {'REDIS_CONF': '/etc/redis/redis.conf', 'REDIS_GEN_CONF': '/etc/redis/redis-regenerated-by-config-rewrite.conf'}, 'enabled': True, 'image': 'registry.osism.tech/kolla/release/redis-sentinel:6.0.16.20241206', 'volumes': ['/etc/kolla/redis-sentinel/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen redis-sentinel 26379'], 'timeout': '30'}}}) 2025-03-23 21:57:49.975872 | orchestrator | changed: [testbed-node-0] => (item={'key': 'redis-sentinel', 'value': {'container_name': 'redis_sentinel', 'group': 'redis', 'environment': {'REDIS_CONF': '/etc/redis/redis.conf', 'REDIS_GEN_CONF': '/etc/redis/redis-regenerated-by-config-rewrite.conf'}, 'enabled': True, 'image': 'registry.osism.tech/kolla/release/redis-sentinel:6.0.16.20241206', 'volumes': ['/etc/kolla/redis-sentinel/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen redis-sentinel 26379'], 'timeout': '30'}}}) 2025-03-23 21:57:49.975891 | orchestrator | 2025-03-23 21:57:49.975905 | orchestrator | TASK [redis : Copying over redis config files] 
********************************* 2025-03-23 21:57:49.975920 | orchestrator | Sunday 23 March 2025 21:57:17 +0000 (0:00:03.752) 0:00:09.791 ********** 2025-03-23 21:57:49.975934 | orchestrator | changed: [testbed-node-0] => (item={'key': 'redis', 'value': {'container_name': 'redis', 'group': 'redis', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/redis:6.0.16.20241206', 'volumes': ['/etc/kolla/redis/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'redis:/var/lib/redis/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen redis-server 6379'], 'timeout': '30'}}}) 2025-03-23 21:57:49.975948 | orchestrator | changed: [testbed-node-1] => (item={'key': 'redis', 'value': {'container_name': 'redis', 'group': 'redis', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/redis:6.0.16.20241206', 'volumes': ['/etc/kolla/redis/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'redis:/var/lib/redis/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen redis-server 6379'], 'timeout': '30'}}}) 2025-03-23 21:57:49.975963 | orchestrator | changed: [testbed-node-2] => (item={'key': 'redis', 'value': {'container_name': 'redis', 'group': 'redis', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/redis:6.0.16.20241206', 'volumes': ['/etc/kolla/redis/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'redis:/var/lib/redis/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen redis-server 6379'], 'timeout': '30'}}}) 2025-03-23 21:57:49.975978 | orchestrator | changed: [testbed-node-0] => (item={'key': 'redis-sentinel', 'value': {'container_name': 'redis_sentinel', 'group': 'redis', 'environment': {'REDIS_CONF': '/etc/redis/redis.conf', 'REDIS_GEN_CONF': '/etc/redis/redis-regenerated-by-config-rewrite.conf'}, 'enabled': True, 'image': 'registry.osism.tech/kolla/release/redis-sentinel:6.0.16.20241206', 'volumes': ['/etc/kolla/redis-sentinel/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen redis-sentinel 26379'], 'timeout': '30'}}}) 2025-03-23 21:57:49.975992 | orchestrator | changed: [testbed-node-1] => (item={'key': 'redis-sentinel', 'value': {'container_name': 'redis_sentinel', 'group': 'redis', 'environment': {'REDIS_CONF': '/etc/redis/redis.conf', 'REDIS_GEN_CONF': '/etc/redis/redis-regenerated-by-config-rewrite.conf'}, 'enabled': True, 'image': 'registry.osism.tech/kolla/release/redis-sentinel:6.0.16.20241206', 'volumes': ['/etc/kolla/redis-sentinel/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen redis-sentinel 26379'], 'timeout': '30'}}}) 2025-03-23 21:57:49.976020 | orchestrator | changed: [testbed-node-2] => (item={'key': 'redis-sentinel', 'value': 
{'container_name': 'redis_sentinel', 'group': 'redis', 'environment': {'REDIS_CONF': '/etc/redis/redis.conf', 'REDIS_GEN_CONF': '/etc/redis/redis-regenerated-by-config-rewrite.conf'}, 'enabled': True, 'image': 'registry.osism.tech/kolla/release/redis-sentinel:6.0.16.20241206', 'volumes': ['/etc/kolla/redis-sentinel/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen redis-sentinel 26379'], 'timeout': '30'}}}) 2025-03-23 21:57:49.976035 | orchestrator | 2025-03-23 21:57:49.976050 | orchestrator | TASK [redis : Check redis containers] ****************************************** 2025-03-23 21:57:49.976064 | orchestrator | Sunday 23 March 2025 21:57:21 +0000 (0:00:04.158) 0:00:13.950 ********** 2025-03-23 21:57:49.976078 | orchestrator | changed: [testbed-node-1] => (item={'key': 'redis', 'value': {'container_name': 'redis', 'group': 'redis', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/redis:6.0.16.20241206', 'volumes': ['/etc/kolla/redis/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'redis:/var/lib/redis/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen redis-server 6379'], 'timeout': '30'}}}) 2025-03-23 21:57:49.976093 | orchestrator | changed: [testbed-node-2] => (item={'key': 'redis', 'value': {'container_name': 'redis', 'group': 'redis', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/redis:6.0.16.20241206', 'volumes': ['/etc/kolla/redis/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'redis:/var/lib/redis/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen redis-server 6379'], 'timeout': '30'}}}) 2025-03-23 21:57:49.976108 | orchestrator | changed: [testbed-node-0] => (item={'key': 'redis', 'value': {'container_name': 'redis', 'group': 'redis', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/redis:6.0.16.20241206', 'volumes': ['/etc/kolla/redis/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'redis:/var/lib/redis/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen redis-server 6379'], 'timeout': '30'}}}) 2025-03-23 21:57:49.976122 | orchestrator | changed: [testbed-node-1] => (item={'key': 'redis-sentinel', 'value': {'container_name': 'redis_sentinel', 'group': 'redis', 'environment': {'REDIS_CONF': '/etc/redis/redis.conf', 'REDIS_GEN_CONF': '/etc/redis/redis-regenerated-by-config-rewrite.conf'}, 'enabled': True, 'image': 'registry.osism.tech/kolla/release/redis-sentinel:6.0.16.20241206', 'volumes': ['/etc/kolla/redis-sentinel/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen redis-sentinel 26379'], 'timeout': '30'}}}) 2025-03-23 21:57:49.976137 | orchestrator | changed: [testbed-node-2] => (item={'key': 
'redis-sentinel', 'value': {'container_name': 'redis_sentinel', 'group': 'redis', 'environment': {'REDIS_CONF': '/etc/redis/redis.conf', 'REDIS_GEN_CONF': '/etc/redis/redis-regenerated-by-config-rewrite.conf'}, 'enabled': True, 'image': 'registry.osism.tech/kolla/release/redis-sentinel:6.0.16.20241206', 'volumes': ['/etc/kolla/redis-sentinel/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen redis-sentinel 26379'], 'timeout': '30'}}}) 2025-03-23 21:57:49.976164 | orchestrator | changed: [testbed-node-0] => (item={'key': 'redis-sentinel', 'value': {'container_name': 'redis_sentinel', 'group': 'redis', 'environment': {'REDIS_CONF': '/etc/redis/redis.conf', 'REDIS_GEN_CONF': '/etc/redis/redis-regenerated-by-config-rewrite.conf'}, 'enabled': True, 'image': 'registry.osism.tech/kolla/release/redis-sentinel:6.0.16.20241206', 'volumes': ['/etc/kolla/redis-sentinel/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen redis-sentinel 26379'], 'timeout': '30'}}}) 2025-03-23 21:57:53.009688 | orchestrator | 2025-03-23 21:57:53.009777 | orchestrator | TASK [redis : Flush handlers] ************************************************** 2025-03-23 21:57:53.009788 | orchestrator | Sunday 23 March 2025 21:57:24 +0000 (0:00:02.848) 0:00:16.799 ********** 2025-03-23 21:57:53.009795 | orchestrator | 2025-03-23 21:57:53.009802 | orchestrator | TASK [redis : Flush handlers] ************************************************** 2025-03-23 21:57:53.009809 | orchestrator | Sunday 23 March 2025 21:57:24 +0000 (0:00:00.285) 0:00:17.085 ********** 2025-03-23 21:57:53.009816 | orchestrator | 2025-03-23 21:57:53.009823 | orchestrator | TASK [redis : Flush handlers] ************************************************** 2025-03-23 21:57:53.009830 | orchestrator | Sunday 23 March 2025 21:57:24 +0000 (0:00:00.210) 0:00:17.296 ********** 2025-03-23 21:57:53.009836 | orchestrator | 2025-03-23 21:57:53.009843 | orchestrator | RUNNING HANDLER [redis : Restart redis container] ****************************** 2025-03-23 21:57:53.009850 | orchestrator | Sunday 23 March 2025 21:57:25 +0000 (0:00:00.237) 0:00:17.533 ********** 2025-03-23 21:57:53.009857 | orchestrator | changed: [testbed-node-0] 2025-03-23 21:57:53.009864 | orchestrator | changed: [testbed-node-2] 2025-03-23 21:57:53.009871 | orchestrator | changed: [testbed-node-1] 2025-03-23 21:57:53.009878 | orchestrator | 2025-03-23 21:57:53.009885 | orchestrator | RUNNING HANDLER [redis : Restart redis-sentinel container] ********************* 2025-03-23 21:57:53.009892 | orchestrator | Sunday 23 March 2025 21:57:34 +0000 (0:00:09.687) 0:00:27.221 ********** 2025-03-23 21:57:53.009898 | orchestrator | changed: [testbed-node-1] 2025-03-23 21:57:53.009905 | orchestrator | changed: [testbed-node-0] 2025-03-23 21:57:53.009913 | orchestrator | changed: [testbed-node-2] 2025-03-23 21:57:53.009920 | orchestrator | 2025-03-23 21:57:53.009926 | orchestrator | PLAY RECAP ********************************************************************* 2025-03-23 21:57:53.009933 | orchestrator | testbed-node-0 : ok=9  changed=6  unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 
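The interleaved "Task … is in state STARTED" and "Wait 1 second(s) until the next check" lines throughout this log come from a watcher that repeatedly queries each deployment task's state and sleeps briefly between rounds until every task reports SUCCESS. A minimal sketch of that pattern follows; it is illustrative only, and get_task_state() is a hypothetical stand-in for however the real OSISM tooling queries its task backend.

    # Minimal polling sketch (assumption: get_task_state() is a hypothetical
    # helper returning "STARTED" or "SUCCESS"; the real tooling may differ).
    import time

    def get_task_state(task_id: str) -> str:
        # Placeholder: a real implementation would query the task backend.
        return "SUCCESS"

    def wait_for_tasks(task_ids, interval=1):
        """Poll each task until all report SUCCESS, logging progress like the job log above."""
        pending = set(task_ids)
        while pending:
            for task_id in sorted(pending):
                state = get_task_state(task_id)
                print(f"Task {task_id} is in state {state}")
                if state == "SUCCESS":
                    pending.discard(task_id)
            if pending:
                print(f"Wait {interval} second(s) until the next check")
                time.sleep(interval)

    # Example: wait on one of the task IDs seen in this log.
    wait_for_tasks(["f712b796-38d0-4a95-8f80-410bcda57c10"])
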
2025-03-23 21:57:53.009941 | orchestrator | testbed-node-1 : ok=9  changed=6  unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 2025-03-23 21:57:53.009948 | orchestrator | testbed-node-2 : ok=9  changed=6  unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 2025-03-23 21:57:53.009955 | orchestrator | 2025-03-23 21:57:53.009961 | orchestrator | 2025-03-23 21:57:53.009968 | orchestrator | TASKS RECAP ******************************************************************** 2025-03-23 21:57:53.009975 | orchestrator | Sunday 23 March 2025 21:57:46 +0000 (0:00:11.909) 0:00:39.130 ********** 2025-03-23 21:57:53.009981 | orchestrator | =============================================================================== 2025-03-23 21:57:53.009988 | orchestrator | redis : Restart redis-sentinel container ------------------------------- 11.91s 2025-03-23 21:57:53.009995 | orchestrator | redis : Restart redis container ----------------------------------------- 9.69s 2025-03-23 21:57:53.010001 | orchestrator | redis : Copying over redis config files --------------------------------- 4.16s 2025-03-23 21:57:53.010008 | orchestrator | redis : Copying over default config.json files -------------------------- 3.75s 2025-03-23 21:57:53.010073 | orchestrator | redis : Check redis containers ------------------------------------------ 2.85s 2025-03-23 21:57:53.010081 | orchestrator | redis : Ensuring config directories exist ------------------------------- 2.32s 2025-03-23 21:57:53.010088 | orchestrator | redis : include_tasks --------------------------------------------------- 1.39s 2025-03-23 21:57:53.010095 | orchestrator | Group hosts based on Kolla action --------------------------------------- 1.03s 2025-03-23 21:57:53.010101 | orchestrator | redis : Flush handlers -------------------------------------------------- 0.73s 2025-03-23 21:57:53.010108 | orchestrator | Group hosts based on enabled services ----------------------------------- 0.54s 2025-03-23 21:57:53.010127 | orchestrator | 2025-03-23 21:57:53 | INFO  | Task f712b796-38d0-4a95-8f80-410bcda57c10 is in state STARTED 2025-03-23 21:57:53.011285 | orchestrator | 2025-03-23 21:57:53 | INFO  | Task f3cd4eb5-5489-4d5d-9d13-5fb3ddbc8b2b is in state STARTED 2025-03-23 21:57:53.011457 | orchestrator | 2025-03-23 21:57:53 | INFO  | Task aa6bad5e-97be-40e3-98ae-1bbfade1dac3 is in state STARTED 2025-03-23 21:57:53.012608 | orchestrator | 2025-03-23 21:57:53 | INFO  | Task 6f5ec00a-9f2e-40eb-a1d2-26a43473a853 is in state STARTED 2025-03-23 21:57:53.014092 | orchestrator | 2025-03-23 21:57:53 | INFO  | Task 371e1403-f3a7-4305-bd63-6e197a55c018 is in state STARTED 2025-03-23 21:57:53.014615 | orchestrator | 2025-03-23 21:57:53 | INFO  | Wait 1 second(s) until the next check 2025-03-23 21:57:56.065313 | orchestrator | 2025-03-23 21:57:56 | INFO  | Task f712b796-38d0-4a95-8f80-410bcda57c10 is in state STARTED 2025-03-23 21:57:56.065683 | orchestrator | 2025-03-23 21:57:56 | INFO  | Task f3cd4eb5-5489-4d5d-9d13-5fb3ddbc8b2b is in state STARTED 2025-03-23 21:57:56.067906 | orchestrator | 2025-03-23 21:57:56 | INFO  | Task aa6bad5e-97be-40e3-98ae-1bbfade1dac3 is in state STARTED 2025-03-23 21:57:56.072632 | orchestrator | 2025-03-23 21:57:56 | INFO  | Task 6f5ec00a-9f2e-40eb-a1d2-26a43473a853 is in state STARTED 2025-03-23 21:57:59.129917 | orchestrator | 2025-03-23 21:57:56 | INFO  | Task 371e1403-f3a7-4305-bd63-6e197a55c018 is in state STARTED 2025-03-23 21:57:59.130083 | orchestrator | 2025-03-23 21:57:56 | INFO  | Wait 1 second(s) until the next check 2025-03-23 
21:57:59.130124 | orchestrator | 2025-03-23 21:57:59 | INFO  | Task f712b796-38d0-4a95-8f80-410bcda57c10 is in state STARTED
2025-03-23 21:57:59.133125 | orchestrator | 2025-03-23 21:57:59 | INFO  | Task f3cd4eb5-5489-4d5d-9d13-5fb3ddbc8b2b is in state STARTED
2025-03-23 21:57:59.133159 | orchestrator | 2025-03-23 21:57:59 | INFO  | Task aa6bad5e-97be-40e3-98ae-1bbfade1dac3 is in state STARTED
2025-03-23 21:57:59.140243 | orchestrator | 2025-03-23 21:57:59 | INFO  | Task 6f5ec00a-9f2e-40eb-a1d2-26a43473a853 is in state STARTED
2025-03-23 21:58:02.189235 | orchestrator | 2025-03-23 21:57:59 | INFO  | Task 371e1403-f3a7-4305-bd63-6e197a55c018 is in state STARTED
2025-03-23 21:58:02.189349 | orchestrator | 2025-03-23 21:57:59 | INFO  | Wait 1 second(s) until the next check
[... repetitive polling output elided: between 21:58:02 and 21:58:35 the same five tasks are re-checked every few seconds and remain in state STARTED ...]
2025-03-23 21:58:38.832837 | orchestrator | 2025-03-23 21:58:38 | INFO  | Task f712b796-38d0-4a95-8f80-410bcda57c10 is in state STARTED
2025-03-23 21:58:38.834365 | orchestrator | 2025-03-23 21:58:38 | INFO  | Task f3cd4eb5-5489-4d5d-9d13-5fb3ddbc8b2b is in state STARTED
2025-03-23 21:58:38.835420 | orchestrator | 2025-03-23 21:58:38 | INFO  | Task 
aa6bad5e-97be-40e3-98ae-1bbfade1dac3 is in state STARTED 2025-03-23 21:58:38.836847 | orchestrator | 2025-03-23 21:58:38 | INFO  | Task 6f5ec00a-9f2e-40eb-a1d2-26a43473a853 is in state STARTED 2025-03-23 21:58:38.839180 | orchestrator | 2025-03-23 21:58:38 | INFO  | Task 371e1403-f3a7-4305-bd63-6e197a55c018 is in state STARTED 2025-03-23 21:58:41.880833 | orchestrator | 2025-03-23 21:58:38 | INFO  | Wait 1 second(s) until the next check 2025-03-23 21:58:41.880985 | orchestrator | 2025-03-23 21:58:41 | INFO  | Task f712b796-38d0-4a95-8f80-410bcda57c10 is in state STARTED 2025-03-23 21:58:41.881858 | orchestrator | 2025-03-23 21:58:41 | INFO  | Task f3cd4eb5-5489-4d5d-9d13-5fb3ddbc8b2b is in state STARTED 2025-03-23 21:58:41.884649 | orchestrator | 2025-03-23 21:58:41 | INFO  | Task aa6bad5e-97be-40e3-98ae-1bbfade1dac3 is in state STARTED 2025-03-23 21:58:41.885959 | orchestrator | 2025-03-23 21:58:41 | INFO  | Task 6f5ec00a-9f2e-40eb-a1d2-26a43473a853 is in state STARTED 2025-03-23 21:58:41.887764 | orchestrator | 2025-03-23 21:58:41 | INFO  | Task 371e1403-f3a7-4305-bd63-6e197a55c018 is in state SUCCESS 2025-03-23 21:58:41.889290 | orchestrator | 2025-03-23 21:58:41.889328 | orchestrator | 2025-03-23 21:58:41.889344 | orchestrator | PLAY [Group hosts based on configuration] ************************************** 2025-03-23 21:58:41.889360 | orchestrator | 2025-03-23 21:58:41.889375 | orchestrator | TASK [Group hosts based on Kolla action] *************************************** 2025-03-23 21:58:41.889390 | orchestrator | Sunday 23 March 2025 21:57:07 +0000 (0:00:00.556) 0:00:00.556 ********** 2025-03-23 21:58:41.889405 | orchestrator | ok: [testbed-node-3] 2025-03-23 21:58:41.889433 | orchestrator | ok: [testbed-node-4] 2025-03-23 21:58:41.889448 | orchestrator | ok: [testbed-node-5] 2025-03-23 21:58:41.889463 | orchestrator | ok: [testbed-node-0] 2025-03-23 21:58:41.889478 | orchestrator | ok: [testbed-node-1] 2025-03-23 21:58:41.889492 | orchestrator | ok: [testbed-node-2] 2025-03-23 21:58:41.889507 | orchestrator | 2025-03-23 21:58:41.889522 | orchestrator | TASK [Group hosts based on enabled services] *********************************** 2025-03-23 21:58:41.889537 | orchestrator | Sunday 23 March 2025 21:57:08 +0000 (0:00:01.343) 0:00:01.899 ********** 2025-03-23 21:58:41.889606 | orchestrator | ok: [testbed-node-3] => (item=enable_openvswitch_True_enable_ovs_dpdk_False) 2025-03-23 21:58:41.889623 | orchestrator | ok: [testbed-node-4] => (item=enable_openvswitch_True_enable_ovs_dpdk_False) 2025-03-23 21:58:41.889637 | orchestrator | ok: [testbed-node-5] => (item=enable_openvswitch_True_enable_ovs_dpdk_False) 2025-03-23 21:58:41.889652 | orchestrator | ok: [testbed-node-0] => (item=enable_openvswitch_True_enable_ovs_dpdk_False) 2025-03-23 21:58:41.889666 | orchestrator | ok: [testbed-node-1] => (item=enable_openvswitch_True_enable_ovs_dpdk_False) 2025-03-23 21:58:41.889680 | orchestrator | ok: [testbed-node-2] => (item=enable_openvswitch_True_enable_ovs_dpdk_False) 2025-03-23 21:58:41.889695 | orchestrator | 2025-03-23 21:58:41.889709 | orchestrator | PLAY [Apply role openvswitch] ************************************************** 2025-03-23 21:58:41.889723 | orchestrator | 2025-03-23 21:58:41.889742 | orchestrator | TASK [openvswitch : include_tasks] ********************************************* 2025-03-23 21:58:41.889757 | orchestrator | Sunday 23 March 2025 21:57:10 +0000 (0:00:01.632) 0:00:03.532 ********** 2025-03-23 21:58:41.889772 | orchestrator | included: 
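
The orchestrator lines above poll a set of task IDs, printing each task's state and sleeping one second between rounds until the tasks leave STARTED (task 371e1403-f3a7-4305-bd63-6e197a55c018 flips to SUCCESS at 21:58:41). A loop with the same shape is sketched below; get_task_state is a hypothetical callable standing in for whatever API reports the Celery-style task state, not an actual osism function.

```python
# Sketch: wait for a set of asynchronous tasks to finish, logging each check,
# in the same shape as the polling output above. get_task_state is a
# hypothetical callable returning states such as "STARTED" or "SUCCESS".
import logging
import time
from typing import Callable, Iterable

logging.basicConfig(format="%(asctime)s | %(levelname)s  | %(message)s", level=logging.INFO)

def wait_for_tasks(task_ids: Iterable[str],
                   get_task_state: Callable[[str], str],
                   interval: float = 1.0) -> None:
    pending = set(task_ids)
    while pending:
        for task_id in sorted(pending):
            state = get_task_state(task_id)
            logging.info("Task %s is in state %s", task_id, state)
            if state in ("SUCCESS", "FAILURE"):
                pending.discard(task_id)  # stop polling finished tasks
        if pending:
            logging.info("Wait %d second(s) until the next check", int(interval))
            time.sleep(interval)
```
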
/ansible/roles/openvswitch/tasks/deploy.yml for testbed-node-3, testbed-node-4, testbed-node-5, testbed-node-0, testbed-node-1, testbed-node-2 2025-03-23 21:58:41.889789 | orchestrator | 2025-03-23 21:58:41.889803 | orchestrator | TASK [module-load : Load modules] ********************************************** 2025-03-23 21:58:41.889818 | orchestrator | Sunday 23 March 2025 21:57:12 +0000 (0:00:02.227) 0:00:05.759 ********** 2025-03-23 21:58:41.889832 | orchestrator | changed: [testbed-node-3] => (item=openvswitch) 2025-03-23 21:58:41.889847 | orchestrator | changed: [testbed-node-4] => (item=openvswitch) 2025-03-23 21:58:41.889861 | orchestrator | changed: [testbed-node-5] => (item=openvswitch) 2025-03-23 21:58:41.889876 | orchestrator | changed: [testbed-node-0] => (item=openvswitch) 2025-03-23 21:58:41.889892 | orchestrator | changed: [testbed-node-1] => (item=openvswitch) 2025-03-23 21:58:41.889907 | orchestrator | changed: [testbed-node-2] => (item=openvswitch) 2025-03-23 21:58:41.889923 | orchestrator | 2025-03-23 21:58:41.889938 | orchestrator | TASK [module-load : Persist modules via modules-load.d] ************************ 2025-03-23 21:58:41.889954 | orchestrator | Sunday 23 March 2025 21:57:15 +0000 (0:00:02.944) 0:00:08.704 ********** 2025-03-23 21:58:41.889969 | orchestrator | changed: [testbed-node-5] => (item=openvswitch) 2025-03-23 21:58:41.889990 | orchestrator | changed: [testbed-node-4] => (item=openvswitch) 2025-03-23 21:58:41.890006 | orchestrator | changed: [testbed-node-3] => (item=openvswitch) 2025-03-23 21:58:41.890114 | orchestrator | changed: [testbed-node-0] => (item=openvswitch) 2025-03-23 21:58:41.890130 | orchestrator | changed: [testbed-node-1] => (item=openvswitch) 2025-03-23 21:58:41.890146 | orchestrator | changed: [testbed-node-2] => (item=openvswitch) 2025-03-23 21:58:41.890161 | orchestrator | 2025-03-23 21:58:41.890177 | orchestrator | TASK [module-load : Drop module persistence] *********************************** 2025-03-23 21:58:41.890192 | orchestrator | Sunday 23 March 2025 21:57:18 +0000 (0:00:02.986) 0:00:11.691 ********** 2025-03-23 21:58:41.890208 | orchestrator | skipping: [testbed-node-3] => (item=openvswitch)  2025-03-23 21:58:41.890223 | orchestrator | skipping: [testbed-node-3] 2025-03-23 21:58:41.890239 | orchestrator | skipping: [testbed-node-4] => (item=openvswitch)  2025-03-23 21:58:41.890253 | orchestrator | skipping: [testbed-node-4] 2025-03-23 21:58:41.890267 | orchestrator | skipping: [testbed-node-5] => (item=openvswitch)  2025-03-23 21:58:41.890281 | orchestrator | skipping: [testbed-node-5] 2025-03-23 21:58:41.890295 | orchestrator | skipping: [testbed-node-0] => (item=openvswitch)  2025-03-23 21:58:41.890309 | orchestrator | skipping: [testbed-node-0] 2025-03-23 21:58:41.890323 | orchestrator | skipping: [testbed-node-1] => (item=openvswitch)  2025-03-23 21:58:41.890337 | orchestrator | skipping: [testbed-node-1] 2025-03-23 21:58:41.890351 | orchestrator | skipping: [testbed-node-2] => (item=openvswitch)  2025-03-23 21:58:41.890365 | orchestrator | skipping: [testbed-node-2] 2025-03-23 21:58:41.890388 | orchestrator | 2025-03-23 21:58:41.890402 | orchestrator | TASK [openvswitch : Create /run/openvswitch directory on host] ***************** 2025-03-23 21:58:41.890417 | orchestrator | Sunday 23 March 2025 21:57:22 +0000 (0:00:03.519) 0:00:15.210 ********** 2025-03-23 21:58:41.890430 | orchestrator | skipping: [testbed-node-3] 2025-03-23 21:58:41.890444 | orchestrator | skipping: [testbed-node-4] 2025-03-23 21:58:41.890458 | 
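
The module-load tasks above first load the openvswitch kernel module on every node and then persist it through modules-load.d so it survives a reboot. The sketch below reproduces the effect of those two steps with modprobe and a drop-in file; it assumes root privileges on the target node and is an illustration, not the Kolla Ansible module-load role itself.

```python
# Sketch: load a kernel module now and persist it via /etc/modules-load.d/.
# Mirrors the effect of the "Load modules" / "Persist modules via modules-load.d"
# tasks above; assumes it runs as root on the target node.
import subprocess
from pathlib import Path

def load_and_persist_module(name: str) -> None:
    # Load the module immediately (a no-op if it is already loaded).
    subprocess.run(["modprobe", name], check=True)
    # Persist it so systemd-modules-load picks it up on the next boot.
    conf = Path("/etc/modules-load.d") / f"{name}.conf"
    conf.write_text(f"{name}\n")

if __name__ == "__main__":
    load_and_persist_module("openvswitch")
```
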
orchestrator | skipping: [testbed-node-5] 2025-03-23 21:58:41.890472 | orchestrator | skipping: [testbed-node-0] 2025-03-23 21:58:41.890486 | orchestrator | skipping: [testbed-node-1] 2025-03-23 21:58:41.890500 | orchestrator | skipping: [testbed-node-2] 2025-03-23 21:58:41.890514 | orchestrator | 2025-03-23 21:58:41.890528 | orchestrator | TASK [openvswitch : Ensuring config directories exist] ************************* 2025-03-23 21:58:41.890542 | orchestrator | Sunday 23 March 2025 21:57:23 +0000 (0:00:01.015) 0:00:16.226 ********** 2025-03-23 21:58:41.890591 | orchestrator | changed: [testbed-node-4] => (item={'key': 'openvswitch-db-server', 'value': {'container_name': 'openvswitch_db', 'image': 'registry.osism.tech/kolla/release/openvswitch-db-server:3.3.0.20241206', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'volumes': ['/etc/kolla/openvswitch-db-server/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/', 'openvswitch_db:/var/lib/openvswitch/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovsdb-client list-dbs'], 'timeout': '30'}}}) 2025-03-23 21:58:41.890613 | orchestrator | changed: [testbed-node-3] => (item={'key': 'openvswitch-db-server', 'value': {'container_name': 'openvswitch_db', 'image': 'registry.osism.tech/kolla/release/openvswitch-db-server:3.3.0.20241206', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'volumes': ['/etc/kolla/openvswitch-db-server/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/', 'openvswitch_db:/var/lib/openvswitch/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovsdb-client list-dbs'], 'timeout': '30'}}}) 2025-03-23 21:58:41.890629 | orchestrator | changed: [testbed-node-5] => (item={'key': 'openvswitch-db-server', 'value': {'container_name': 'openvswitch_db', 'image': 'registry.osism.tech/kolla/release/openvswitch-db-server:3.3.0.20241206', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'volumes': ['/etc/kolla/openvswitch-db-server/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/', 'openvswitch_db:/var/lib/openvswitch/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovsdb-client list-dbs'], 'timeout': '30'}}}) 2025-03-23 21:58:41.890643 | orchestrator | changed: [testbed-node-4] => (item={'key': 'openvswitch-vswitchd', 'value': {'container_name': 'openvswitch_vswitchd', 'image': 'registry.osism.tech/kolla/release/openvswitch-vswitchd:3.3.0.20241206', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'privileged': True, 'volumes': ['/etc/kolla/openvswitch-vswitchd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovs-appctl version'], 
'timeout': '30'}}}) 2025-03-23 21:58:41.890658 | orchestrator | changed: [testbed-node-3] => (item={'key': 'openvswitch-vswitchd', 'value': {'container_name': 'openvswitch_vswitchd', 'image': 'registry.osism.tech/kolla/release/openvswitch-vswitchd:3.3.0.20241206', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'privileged': True, 'volumes': ['/etc/kolla/openvswitch-vswitchd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovs-appctl version'], 'timeout': '30'}}}) 2025-03-23 21:58:41.890684 | orchestrator | changed: [testbed-node-5] => (item={'key': 'openvswitch-vswitchd', 'value': {'container_name': 'openvswitch_vswitchd', 'image': 'registry.osism.tech/kolla/release/openvswitch-vswitchd:3.3.0.20241206', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'privileged': True, 'volumes': ['/etc/kolla/openvswitch-vswitchd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovs-appctl version'], 'timeout': '30'}}}) 2025-03-23 21:58:41.890705 | orchestrator | changed: [testbed-node-0] => (item={'key': 'openvswitch-db-server', 'value': {'container_name': 'openvswitch_db', 'image': 'registry.osism.tech/kolla/release/openvswitch-db-server:3.3.0.20241206', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'volumes': ['/etc/kolla/openvswitch-db-server/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/', 'openvswitch_db:/var/lib/openvswitch/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovsdb-client list-dbs'], 'timeout': '30'}}}) 2025-03-23 21:58:41.890728 | orchestrator | changed: [testbed-node-1] => (item={'key': 'openvswitch-db-server', 'value': {'container_name': 'openvswitch_db', 'image': 'registry.osism.tech/kolla/release/openvswitch-db-server:3.3.0.20241206', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'volumes': ['/etc/kolla/openvswitch-db-server/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/', 'openvswitch_db:/var/lib/openvswitch/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovsdb-client list-dbs'], 'timeout': '30'}}}) 2025-03-23 21:58:41.890743 | orchestrator | changed: [testbed-node-2] => (item={'key': 'openvswitch-db-server', 'value': {'container_name': 'openvswitch_db', 'image': 'registry.osism.tech/kolla/release/openvswitch-db-server:3.3.0.20241206', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'volumes': ['/etc/kolla/openvswitch-db-server/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 
'/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/', 'openvswitch_db:/var/lib/openvswitch/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovsdb-client list-dbs'], 'timeout': '30'}}}) 2025-03-23 21:58:41.890758 | orchestrator | changed: [testbed-node-0] => (item={'key': 'openvswitch-vswitchd', 'value': {'container_name': 'openvswitch_vswitchd', 'image': 'registry.osism.tech/kolla/release/openvswitch-vswitchd:3.3.0.20241206', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'privileged': True, 'volumes': ['/etc/kolla/openvswitch-vswitchd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovs-appctl version'], 'timeout': '30'}}}) 2025-03-23 21:58:41.890780 | orchestrator | changed: [testbed-node-1] => (item={'key': 'openvswitch-vswitchd', 'value': {'container_name': 'openvswitch_vswitchd', 'image': 'registry.osism.tech/kolla/release/openvswitch-vswitchd:3.3.0.20241206', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'privileged': True, 'volumes': ['/etc/kolla/openvswitch-vswitchd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovs-appctl version'], 'timeout': '30'}}}) 2025-03-23 21:58:41.890807 | orchestrator | changed: [testbed-node-2] => (item={'key': 'openvswitch-vswitchd', 'value': {'container_name': 'openvswitch_vswitchd', 'image': 'registry.osism.tech/kolla/release/openvswitch-vswitchd:3.3.0.20241206', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'privileged': True, 'volumes': ['/etc/kolla/openvswitch-vswitchd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovs-appctl version'], 'timeout': '30'}}}) 2025-03-23 21:58:41.890823 | orchestrator | 2025-03-23 21:58:41.890837 | orchestrator | TASK [openvswitch : Copying over config.json files for services] *************** 2025-03-23 21:58:41.890851 | orchestrator | Sunday 23 March 2025 21:57:26 +0000 (0:00:02.898) 0:00:19.124 ********** 2025-03-23 21:58:41.890866 | orchestrator | changed: [testbed-node-3] => (item={'key': 'openvswitch-db-server', 'value': {'container_name': 'openvswitch_db', 'image': 'registry.osism.tech/kolla/release/openvswitch-db-server:3.3.0.20241206', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'volumes': ['/etc/kolla/openvswitch-db-server/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/', 'openvswitch_db:/var/lib/openvswitch/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovsdb-client list-dbs'], 'timeout': '30'}}}) 2025-03-23 
21:58:41.890880 | orchestrator | changed: [testbed-node-4] => (item={'key': 'openvswitch-db-server', 'value': {'container_name': 'openvswitch_db', 'image': 'registry.osism.tech/kolla/release/openvswitch-db-server:3.3.0.20241206', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'volumes': ['/etc/kolla/openvswitch-db-server/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/', 'openvswitch_db:/var/lib/openvswitch/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovsdb-client list-dbs'], 'timeout': '30'}}}) 2025-03-23 21:58:41.890895 | orchestrator | changed: [testbed-node-5] => (item={'key': 'openvswitch-db-server', 'value': {'container_name': 'openvswitch_db', 'image': 'registry.osism.tech/kolla/release/openvswitch-db-server:3.3.0.20241206', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'volumes': ['/etc/kolla/openvswitch-db-server/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/', 'openvswitch_db:/var/lib/openvswitch/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovsdb-client list-dbs'], 'timeout': '30'}}}) 2025-03-23 21:58:41.890916 | orchestrator | changed: [testbed-node-0] => (item={'key': 'openvswitch-db-server', 'value': {'container_name': 'openvswitch_db', 'image': 'registry.osism.tech/kolla/release/openvswitch-db-server:3.3.0.20241206', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'volumes': ['/etc/kolla/openvswitch-db-server/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/', 'openvswitch_db:/var/lib/openvswitch/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovsdb-client list-dbs'], 'timeout': '30'}}}) 2025-03-23 21:58:41.890931 | orchestrator | changed: [testbed-node-3] => (item={'key': 'openvswitch-vswitchd', 'value': {'container_name': 'openvswitch_vswitchd', 'image': 'registry.osism.tech/kolla/release/openvswitch-vswitchd:3.3.0.20241206', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'privileged': True, 'volumes': ['/etc/kolla/openvswitch-vswitchd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovs-appctl version'], 'timeout': '30'}}}) 2025-03-23 21:58:41.890983 | orchestrator | changed: [testbed-node-4] => (item={'key': 'openvswitch-vswitchd', 'value': {'container_name': 'openvswitch_vswitchd', 'image': 'registry.osism.tech/kolla/release/openvswitch-vswitchd:3.3.0.20241206', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'privileged': True, 'volumes': ['/etc/kolla/openvswitch-vswitchd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 
'/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovs-appctl version'], 'timeout': '30'}}}) 2025-03-23 21:58:41.891000 | orchestrator | changed: [testbed-node-1] => (item={'key': 'openvswitch-db-server', 'value': {'container_name': 'openvswitch_db', 'image': 'registry.osism.tech/kolla/release/openvswitch-db-server:3.3.0.20241206', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'volumes': ['/etc/kolla/openvswitch-db-server/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/', 'openvswitch_db:/var/lib/openvswitch/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovsdb-client list-dbs'], 'timeout': '30'}}}) 2025-03-23 21:58:41.891015 | orchestrator | changed: [testbed-node-2] => (item={'key': 'openvswitch-db-server', 'value': {'container_name': 'openvswitch_db', 'image': 'registry.osism.tech/kolla/release/openvswitch-db-server:3.3.0.20241206', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'volumes': ['/etc/kolla/openvswitch-db-server/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/', 'openvswitch_db:/var/lib/openvswitch/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovsdb-client list-dbs'], 'timeout': '30'}}}) 2025-03-23 21:58:41.891029 | orchestrator | changed: [testbed-node-5] => (item={'key': 'openvswitch-vswitchd', 'value': {'container_name': 'openvswitch_vswitchd', 'image': 'registry.osism.tech/kolla/release/openvswitch-vswitchd:3.3.0.20241206', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'privileged': True, 'volumes': ['/etc/kolla/openvswitch-vswitchd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovs-appctl version'], 'timeout': '30'}}}) 2025-03-23 21:58:41.891061 | orchestrator | changed: [testbed-node-0] => (item={'key': 'openvswitch-vswitchd', 'value': {'container_name': 'openvswitch_vswitchd', 'image': 'registry.osism.tech/kolla/release/openvswitch-vswitchd:3.3.0.20241206', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'privileged': True, 'volumes': ['/etc/kolla/openvswitch-vswitchd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovs-appctl version'], 'timeout': '30'}}}) 2025-03-23 21:58:41.891083 | orchestrator | changed: [testbed-node-1] => (item={'key': 'openvswitch-vswitchd', 'value': {'container_name': 'openvswitch_vswitchd', 'image': 'registry.osism.tech/kolla/release/openvswitch-vswitchd:3.3.0.20241206', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 
'privileged': True, 'volumes': ['/etc/kolla/openvswitch-vswitchd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovs-appctl version'], 'timeout': '30'}}}) 2025-03-23 21:58:41.891098 | orchestrator | changed: [testbed-node-2] => (item={'key': 'openvswitch-vswitchd', 'value': {'container_name': 'openvswitch_vswitchd', 'image': 'registry.osism.tech/kolla/release/openvswitch-vswitchd:3.3.0.20241206', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'privileged': True, 'volumes': ['/etc/kolla/openvswitch-vswitchd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovs-appctl version'], 'timeout': '30'}}}) 2025-03-23 21:58:41.891113 | orchestrator | 2025-03-23 21:58:41.891127 | orchestrator | TASK [openvswitch : Copying over start-ovs file for openvswitch-vswitchd] ****** 2025-03-23 21:58:41.891142 | orchestrator | Sunday 23 March 2025 21:57:30 +0000 (0:00:04.463) 0:00:23.587 ********** 2025-03-23 21:58:41.891156 | orchestrator | changed: [testbed-node-4] 2025-03-23 21:58:41.891171 | orchestrator | changed: [testbed-node-3] 2025-03-23 21:58:41.891185 | orchestrator | changed: [testbed-node-5] 2025-03-23 21:58:41.891199 | orchestrator | changed: [testbed-node-1] 2025-03-23 21:58:41.891213 | orchestrator | changed: [testbed-node-0] 2025-03-23 21:58:41.891227 | orchestrator | changed: [testbed-node-2] 2025-03-23 21:58:41.891241 | orchestrator | 2025-03-23 21:58:41.891255 | orchestrator | TASK [openvswitch : Copying over start-ovsdb-server files for openvswitch-db-server] *** 2025-03-23 21:58:41.891269 | orchestrator | Sunday 23 March 2025 21:57:35 +0000 (0:00:04.436) 0:00:28.024 ********** 2025-03-23 21:58:41.891283 | orchestrator | changed: [testbed-node-4] 2025-03-23 21:58:41.891297 | orchestrator | changed: [testbed-node-3] 2025-03-23 21:58:41.891311 | orchestrator | changed: [testbed-node-5] 2025-03-23 21:58:41.891325 | orchestrator | changed: [testbed-node-0] 2025-03-23 21:58:41.891339 | orchestrator | changed: [testbed-node-2] 2025-03-23 21:58:41.891353 | orchestrator | changed: [testbed-node-1] 2025-03-23 21:58:41.891367 | orchestrator | 2025-03-23 21:58:41.891382 | orchestrator | TASK [openvswitch : Copying over ovs-vsctl wrapper] **************************** 2025-03-23 21:58:41.891408 | orchestrator | Sunday 23 March 2025 21:57:40 +0000 (0:00:05.466) 0:00:33.491 ********** 2025-03-23 21:58:41.891423 | orchestrator | skipping: [testbed-node-3] 2025-03-23 21:58:41.891437 | orchestrator | skipping: [testbed-node-4] 2025-03-23 21:58:41.891451 | orchestrator | skipping: [testbed-node-0] 2025-03-23 21:58:41.891466 | orchestrator | skipping: [testbed-node-5] 2025-03-23 21:58:41.891479 | orchestrator | skipping: [testbed-node-1] 2025-03-23 21:58:41.891493 | orchestrator | skipping: [testbed-node-2] 2025-03-23 21:58:41.891507 | orchestrator | 2025-03-23 21:58:41.891521 | orchestrator | TASK [openvswitch : Check openvswitch containers] ****************************** 2025-03-23 21:58:41.891536 | orchestrator | Sunday 23 March 2025 21:57:42 +0000 
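
Each of the openvswitch tasks above loops over the same map of service definitions (openvswitch-db-server, openvswitch-vswitchd), first ensuring /etc/kolla/<service>/ exists and then dropping a config.json into it, matching the bind mounts visible in the container definitions. A stripped-down version of that per-service loop might look like the sketch below; the config-templates directory and the helper itself are hypothetical stand-ins for the real role logic.

```python
# Sketch: ensure per-service config directories exist under /etc/kolla/ and
# copy a prepared config.json into each, looping over a service map like the
# one printed in the log above. Hypothetical stand-in for the role's tasks.
from pathlib import Path

services = {
    "openvswitch-db-server": {"container_name": "openvswitch_db", "enabled": True},
    "openvswitch-vswitchd": {"container_name": "openvswitch_vswitchd", "enabled": True},
}

def ensure_config(base: Path = Path("/etc/kolla"),
                  templates: Path = Path("config-templates")) -> None:
    for name, service in services.items():
        if not service["enabled"]:
            continue  # disabled services get no config directory
        conf_dir = base / name
        # "Ensuring config directories exist"
        conf_dir.mkdir(mode=0o770, parents=True, exist_ok=True)
        # "Copying over config.json files for services"
        payload = (templates / name / "config.json").read_text()
        (conf_dir / "config.json").write_text(payload)

if __name__ == "__main__":
    ensure_config()
```
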
(0:00:02.017) 0:00:35.509 ********** 2025-03-23 21:58:41.891623 | orchestrator | changed: [testbed-node-3] => (item={'key': 'openvswitch-db-server', 'value': {'container_name': 'openvswitch_db', 'image': 'registry.osism.tech/kolla/release/openvswitch-db-server:3.3.0.20241206', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'volumes': ['/etc/kolla/openvswitch-db-server/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/', 'openvswitch_db:/var/lib/openvswitch/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovsdb-client list-dbs'], 'timeout': '30'}}}) 2025-03-23 21:58:41.891641 | orchestrator | changed: [testbed-node-5] => (item={'key': 'openvswitch-db-server', 'value': {'container_name': 'openvswitch_db', 'image': 'registry.osism.tech/kolla/release/openvswitch-db-server:3.3.0.20241206', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'volumes': ['/etc/kolla/openvswitch-db-server/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/', 'openvswitch_db:/var/lib/openvswitch/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovsdb-client list-dbs'], 'timeout': '30'}}}) 2025-03-23 21:58:41.891674 | orchestrator | changed: [testbed-node-4] => (item={'key': 'openvswitch-db-server', 'value': {'container_name': 'openvswitch_db', 'image': 'registry.osism.tech/kolla/release/openvswitch-db-server:3.3.0.20241206', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'volumes': ['/etc/kolla/openvswitch-db-server/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/', 'openvswitch_db:/var/lib/openvswitch/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovsdb-client list-dbs'], 'timeout': '30'}}}) 2025-03-23 21:58:41.891691 | orchestrator | changed: [testbed-node-0] => (item={'key': 'openvswitch-db-server', 'value': {'container_name': 'openvswitch_db', 'image': 'registry.osism.tech/kolla/release/openvswitch-db-server:3.3.0.20241206', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'volumes': ['/etc/kolla/openvswitch-db-server/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/', 'openvswitch_db:/var/lib/openvswitch/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovsdb-client list-dbs'], 'timeout': '30'}}}) 2025-03-23 21:58:41.891706 | orchestrator | changed: [testbed-node-3] => (item={'key': 'openvswitch-vswitchd', 'value': {'container_name': 'openvswitch_vswitchd', 'image': 'registry.osism.tech/kolla/release/openvswitch-vswitchd:3.3.0.20241206', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'privileged': True, 'volumes': ['/etc/kolla/openvswitch-vswitchd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovs-appctl version'], 'timeout': '30'}}}) 2025-03-23 21:58:41.891729 | orchestrator | changed: [testbed-node-5] => (item={'key': 'openvswitch-vswitchd', 'value': {'container_name': 'openvswitch_vswitchd', 'image': 'registry.osism.tech/kolla/release/openvswitch-vswitchd:3.3.0.20241206', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'privileged': True, 'volumes': ['/etc/kolla/openvswitch-vswitchd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovs-appctl version'], 'timeout': '30'}}}) 2025-03-23 21:58:41.891748 | orchestrator | changed: [testbed-node-4] => (item={'key': 'openvswitch-vswitchd', 'value': {'container_name': 'openvswitch_vswitchd', 'image': 'registry.osism.tech/kolla/release/openvswitch-vswitchd:3.3.0.20241206', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'privileged': True, 'volumes': ['/etc/kolla/openvswitch-vswitchd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovs-appctl version'], 'timeout': '30'}}}) 2025-03-23 21:58:41.891764 | orchestrator | changed: [testbed-node-1] => (item={'key': 'openvswitch-db-server', 'value': {'container_name': 'openvswitch_db', 'image': 'registry.osism.tech/kolla/release/openvswitch-db-server:3.3.0.20241206', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'volumes': ['/etc/kolla/openvswitch-db-server/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/', 'openvswitch_db:/var/lib/openvswitch/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovsdb-client list-dbs'], 'timeout': '30'}}}) 2025-03-23 21:58:41.891792 | orchestrator | changed: [testbed-node-2] => (item={'key': 'openvswitch-db-server', 'value': {'container_name': 'openvswitch_db', 'image': 'registry.osism.tech/kolla/release/openvswitch-db-server:3.3.0.20241206', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'volumes': ['/etc/kolla/openvswitch-db-server/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/', 'openvswitch_db:/var/lib/openvswitch/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovsdb-client list-dbs'], 'timeout': '30'}}}) 2025-03-23 21:58:41.891808 | orchestrator | changed: [testbed-node-0] => (item={'key': 'openvswitch-vswitchd', 'value': {'container_name': 'openvswitch_vswitchd', 'image': 'registry.osism.tech/kolla/release/openvswitch-vswitchd:3.3.0.20241206', 'enabled': 
True, 'group': 'openvswitch', 'host_in_groups': True, 'privileged': True, 'volumes': ['/etc/kolla/openvswitch-vswitchd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovs-appctl version'], 'timeout': '30'}}}) 2025-03-23 21:58:41.891831 | orchestrator | changed: [testbed-node-1] => (item={'key': 'openvswitch-vswitchd', 'value': {'container_name': 'openvswitch_vswitchd', 'image': 'registry.osism.tech/kolla/release/openvswitch-vswitchd:3.3.0.20241206', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'privileged': True, 'volumes': ['/etc/kolla/openvswitch-vswitchd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovs-appctl version'], 'timeout': '30'}}}) 2025-03-23 21:58:41.891846 | orchestrator | changed: [testbed-node-2] => (item={'key': 'openvswitch-vswitchd', 'value': {'container_name': 'openvswitch_vswitchd', 'image': 'registry.osism.tech/kolla/release/openvswitch-vswitchd:3.3.0.20241206', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'privileged': True, 'volumes': ['/etc/kolla/openvswitch-vswitchd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovs-appctl version'], 'timeout': '30'}}}) 2025-03-23 21:58:41.891861 | orchestrator | 2025-03-23 21:58:41.891875 | orchestrator | TASK [openvswitch : Flush Handlers] ******************************************** 2025-03-23 21:58:41.891890 | orchestrator | Sunday 23 March 2025 21:57:45 +0000 (0:00:03.280) 0:00:38.789 ********** 2025-03-23 21:58:41.891904 | orchestrator | 2025-03-23 21:58:41.891918 | orchestrator | TASK [openvswitch : Flush Handlers] ******************************************** 2025-03-23 21:58:41.891932 | orchestrator | Sunday 23 March 2025 21:57:46 +0000 (0:00:00.212) 0:00:39.001 ********** 2025-03-23 21:58:41.891947 | orchestrator | 2025-03-23 21:58:41.891961 | orchestrator | TASK [openvswitch : Flush Handlers] ******************************************** 2025-03-23 21:58:41.891975 | orchestrator | Sunday 23 March 2025 21:57:46 +0000 (0:00:00.376) 0:00:39.377 ********** 2025-03-23 21:58:41.891990 | orchestrator | 2025-03-23 21:58:41.892005 | orchestrator | TASK [openvswitch : Flush Handlers] ******************************************** 2025-03-23 21:58:41.892020 | orchestrator | Sunday 23 March 2025 21:57:46 +0000 (0:00:00.175) 0:00:39.553 ********** 2025-03-23 21:58:41.892034 | orchestrator | 2025-03-23 21:58:41.892048 | orchestrator | TASK [openvswitch : Flush Handlers] ******************************************** 2025-03-23 21:58:41.892062 | orchestrator | Sunday 23 March 2025 21:57:47 +0000 (0:00:00.374) 0:00:39.927 ********** 2025-03-23 21:58:41.892076 | orchestrator | 2025-03-23 21:58:41.892095 | orchestrator | TASK [openvswitch : Flush Handlers] 
******************************************** 2025-03-23 21:58:41.892110 | orchestrator | Sunday 23 March 2025 21:57:47 +0000 (0:00:00.348) 0:00:40.276 ********** 2025-03-23 21:58:41.892124 | orchestrator | 2025-03-23 21:58:41.892139 | orchestrator | RUNNING HANDLER [openvswitch : Restart openvswitch-db-server container] ******** 2025-03-23 21:58:41.892154 | orchestrator | Sunday 23 March 2025 21:57:47 +0000 (0:00:00.611) 0:00:40.887 ********** 2025-03-23 21:58:41.892168 | orchestrator | changed: [testbed-node-3] 2025-03-23 21:58:41.892183 | orchestrator | changed: [testbed-node-0] 2025-03-23 21:58:41.892197 | orchestrator | changed: [testbed-node-1] 2025-03-23 21:58:41.892211 | orchestrator | changed: [testbed-node-2] 2025-03-23 21:58:41.892225 | orchestrator | changed: [testbed-node-4] 2025-03-23 21:58:41.892239 | orchestrator | changed: [testbed-node-5] 2025-03-23 21:58:41.892253 | orchestrator | 2025-03-23 21:58:41.892267 | orchestrator | RUNNING HANDLER [openvswitch : Waiting for openvswitch_db service to be ready] *** 2025-03-23 21:58:41.892281 | orchestrator | Sunday 23 March 2025 21:57:59 +0000 (0:00:11.486) 0:00:52.374 ********** 2025-03-23 21:58:41.892301 | orchestrator | ok: [testbed-node-3] 2025-03-23 21:58:41.892316 | orchestrator | ok: [testbed-node-4] 2025-03-23 21:58:41.892339 | orchestrator | ok: [testbed-node-5] 2025-03-23 21:58:41.892353 | orchestrator | ok: [testbed-node-0] 2025-03-23 21:58:41.892367 | orchestrator | ok: [testbed-node-1] 2025-03-23 21:58:41.892381 | orchestrator | ok: [testbed-node-2] 2025-03-23 21:58:41.892395 | orchestrator | 2025-03-23 21:58:41.892410 | orchestrator | RUNNING HANDLER [openvswitch : Restart openvswitch-vswitchd container] ********* 2025-03-23 21:58:41.892424 | orchestrator | Sunday 23 March 2025 21:58:02 +0000 (0:00:03.343) 0:00:55.717 ********** 2025-03-23 21:58:41.892438 | orchestrator | changed: [testbed-node-5] 2025-03-23 21:58:41.892452 | orchestrator | changed: [testbed-node-1] 2025-03-23 21:58:41.892466 | orchestrator | changed: [testbed-node-2] 2025-03-23 21:58:41.892481 | orchestrator | changed: [testbed-node-3] 2025-03-23 21:58:41.892495 | orchestrator | changed: [testbed-node-4] 2025-03-23 21:58:41.892518 | orchestrator | changed: [testbed-node-0] 2025-03-23 21:58:41.892533 | orchestrator | 2025-03-23 21:58:41.892548 | orchestrator | TASK [openvswitch : Set system-id, hostname and hw-offload] ******************** 2025-03-23 21:58:41.892618 | orchestrator | Sunday 23 March 2025 21:58:13 +0000 (0:00:10.482) 0:01:06.200 ********** 2025-03-23 21:58:41.892633 | orchestrator | changed: [testbed-node-3] => (item={'col': 'external_ids', 'name': 'system-id', 'value': 'testbed-node-3'}) 2025-03-23 21:58:41.892648 | orchestrator | changed: [testbed-node-4] => (item={'col': 'external_ids', 'name': 'system-id', 'value': 'testbed-node-4'}) 2025-03-23 21:58:41.892662 | orchestrator | changed: [testbed-node-5] => (item={'col': 'external_ids', 'name': 'system-id', 'value': 'testbed-node-5'}) 2025-03-23 21:58:41.892675 | orchestrator | changed: [testbed-node-1] => (item={'col': 'external_ids', 'name': 'system-id', 'value': 'testbed-node-1'}) 2025-03-23 21:58:41.892688 | orchestrator | changed: [testbed-node-0] => (item={'col': 'external_ids', 'name': 'system-id', 'value': 'testbed-node-0'}) 2025-03-23 21:58:41.892701 | orchestrator | changed: [testbed-node-2] => (item={'col': 'external_ids', 'name': 'system-id', 'value': 'testbed-node-2'}) 2025-03-23 21:58:41.892714 | orchestrator | changed: [testbed-node-3] => (item={'col': 'external_ids', 
'name': 'hostname', 'value': 'testbed-node-3'}) 2025-03-23 21:58:41.892727 | orchestrator | changed: [testbed-node-4] => (item={'col': 'external_ids', 'name': 'hostname', 'value': 'testbed-node-4'}) 2025-03-23 21:58:41.892740 | orchestrator | changed: [testbed-node-5] => (item={'col': 'external_ids', 'name': 'hostname', 'value': 'testbed-node-5'}) 2025-03-23 21:58:41.892752 | orchestrator | changed: [testbed-node-1] => (item={'col': 'external_ids', 'name': 'hostname', 'value': 'testbed-node-1'}) 2025-03-23 21:58:41.892765 | orchestrator | changed: [testbed-node-0] => (item={'col': 'external_ids', 'name': 'hostname', 'value': 'testbed-node-0'}) 2025-03-23 21:58:41.892777 | orchestrator | changed: [testbed-node-2] => (item={'col': 'external_ids', 'name': 'hostname', 'value': 'testbed-node-2'}) 2025-03-23 21:58:41.892790 | orchestrator | ok: [testbed-node-3] => (item={'col': 'other_config', 'name': 'hw-offload', 'value': True, 'state': 'absent'}) 2025-03-23 21:58:41.892803 | orchestrator | ok: [testbed-node-4] => (item={'col': 'other_config', 'name': 'hw-offload', 'value': True, 'state': 'absent'}) 2025-03-23 21:58:41.892815 | orchestrator | ok: [testbed-node-5] => (item={'col': 'other_config', 'name': 'hw-offload', 'value': True, 'state': 'absent'}) 2025-03-23 21:58:41.892828 | orchestrator | ok: [testbed-node-1] => (item={'col': 'other_config', 'name': 'hw-offload', 'value': True, 'state': 'absent'}) 2025-03-23 21:58:41.892921 | orchestrator | ok: [testbed-node-2] => (item={'col': 'other_config', 'name': 'hw-offload', 'value': True, 'state': 'absent'}) 2025-03-23 21:58:41.892937 | orchestrator | ok: [testbed-node-0] => (item={'col': 'other_config', 'name': 'hw-offload', 'value': True, 'state': 'absent'}) 2025-03-23 21:58:41.892950 | orchestrator | 2025-03-23 21:58:41.892963 | orchestrator | TASK [openvswitch : Ensuring OVS bridge is properly setup] ********************* 2025-03-23 21:58:41.892975 | orchestrator | Sunday 23 March 2025 21:58:23 +0000 (0:00:10.012) 0:01:16.212 ********** 2025-03-23 21:58:41.892988 | orchestrator | skipping: [testbed-node-3] => (item=br-ex)  2025-03-23 21:58:41.893014 | orchestrator | skipping: [testbed-node-3] 2025-03-23 21:58:41.893027 | orchestrator | skipping: [testbed-node-4] => (item=br-ex)  2025-03-23 21:58:41.893040 | orchestrator | skipping: [testbed-node-4] 2025-03-23 21:58:41.893053 | orchestrator | skipping: [testbed-node-5] => (item=br-ex)  2025-03-23 21:58:41.893065 | orchestrator | skipping: [testbed-node-5] 2025-03-23 21:58:41.893078 | orchestrator | changed: [testbed-node-0] => (item=br-ex) 2025-03-23 21:58:41.893090 | orchestrator | changed: [testbed-node-1] => (item=br-ex) 2025-03-23 21:58:41.893103 | orchestrator | changed: [testbed-node-2] => (item=br-ex) 2025-03-23 21:58:41.893115 | orchestrator | 2025-03-23 21:58:41.893128 | orchestrator | TASK [openvswitch : Ensuring OVS ports are properly setup] ********************* 2025-03-23 21:58:41.893140 | orchestrator | Sunday 23 March 2025 21:58:26 +0000 (0:00:03.086) 0:01:19.298 ********** 2025-03-23 21:58:41.893153 | orchestrator | skipping: [testbed-node-3] => (item=['br-ex', 'vxlan0'])  2025-03-23 21:58:41.893165 | orchestrator | skipping: [testbed-node-3] 2025-03-23 21:58:41.893178 | orchestrator | skipping: [testbed-node-4] => (item=['br-ex', 'vxlan0'])  2025-03-23 21:58:41.893190 | orchestrator | skipping: [testbed-node-4] 2025-03-23 21:58:41.893202 | orchestrator | skipping: [testbed-node-5] => (item=['br-ex', 'vxlan0'])  2025-03-23 21:58:41.893215 | orchestrator | skipping: 
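
The "Set system-id, hostname and hw-offload" task writes external_ids into the Open_vSwitch table and ensures the hw-offload key is absent, and the bridge and port tasks that follow create br-ex with a vxlan0 port on the controller nodes. Expressed as plain ovs-vsctl calls, the per-node effect is roughly the sketch below; bridge, port and key names come from the log, while the exact invocation is an approximation rather than the module calls the role performs.

```python
# Sketch: approximate the OVS settings applied above with plain ovs-vsctl calls.
# Values (system-id, br-ex, vxlan0) come from the log; this is an approximation
# of the role's net effect, not the role itself, and must run on the node as root.
import socket
import subprocess

def ovs_vsctl(*args: str) -> None:
    subprocess.run(["ovs-vsctl", *args], check=True)

hostname = socket.gethostname()  # e.g. testbed-node-0

# external_ids:system-id and external_ids:hostname on the Open_vSwitch table
ovs_vsctl("set", "Open_vSwitch", ".", f"external_ids:system-id={hostname}")
ovs_vsctl("set", "Open_vSwitch", ".", f"external_ids:hostname={hostname}")
# hw-offload was ensured absent in the log
ovs_vsctl("remove", "Open_vSwitch", ".", "other_config", "hw-offload")

# provider bridge and its port, as created on the controller/network nodes
ovs_vsctl("--may-exist", "add-br", "br-ex")
ovs_vsctl("--may-exist", "add-port", "br-ex", "vxlan0")
```
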
[testbed-node-5] 2025-03-23 21:58:41.893227 | orchestrator | changed: [testbed-node-0] => (item=['br-ex', 'vxlan0']) 2025-03-23 21:58:41.893247 | orchestrator | changed: [testbed-node-2] => (item=['br-ex', 'vxlan0']) 2025-03-23 21:58:44.926641 | orchestrator | changed: [testbed-node-1] => (item=['br-ex', 'vxlan0']) 2025-03-23 21:58:44.926764 | orchestrator | 2025-03-23 21:58:44.926785 | orchestrator | RUNNING HANDLER [openvswitch : Restart openvswitch-vswitchd container] ********* 2025-03-23 21:58:44.926801 | orchestrator | Sunday 23 March 2025 21:58:31 +0000 (0:00:05.346) 0:01:24.644 ********** 2025-03-23 21:58:44.926816 | orchestrator | changed: [testbed-node-3] 2025-03-23 21:58:44.926831 | orchestrator | changed: [testbed-node-4] 2025-03-23 21:58:44.926845 | orchestrator | changed: [testbed-node-5] 2025-03-23 21:58:44.926859 | orchestrator | changed: [testbed-node-1] 2025-03-23 21:58:44.926873 | orchestrator | changed: [testbed-node-0] 2025-03-23 21:58:44.926888 | orchestrator | changed: [testbed-node-2] 2025-03-23 21:58:44.926902 | orchestrator | 2025-03-23 21:58:44.926916 | orchestrator | PLAY RECAP ********************************************************************* 2025-03-23 21:58:44.926931 | orchestrator | testbed-node-0 : ok=17  changed=13  unreachable=0 failed=0 skipped=3  rescued=0 ignored=0 2025-03-23 21:58:44.926947 | orchestrator | testbed-node-1 : ok=17  changed=13  unreachable=0 failed=0 skipped=3  rescued=0 ignored=0 2025-03-23 21:58:44.926961 | orchestrator | testbed-node-2 : ok=17  changed=13  unreachable=0 failed=0 skipped=3  rescued=0 ignored=0 2025-03-23 21:58:44.926975 | orchestrator | testbed-node-3 : ok=15  changed=11  unreachable=0 failed=0 skipped=5  rescued=0 ignored=0 2025-03-23 21:58:44.926988 | orchestrator | testbed-node-4 : ok=15  changed=11  unreachable=0 failed=0 skipped=5  rescued=0 ignored=0 2025-03-23 21:58:44.927022 | orchestrator | testbed-node-5 : ok=15  changed=11  unreachable=0 failed=0 skipped=5  rescued=0 ignored=0 2025-03-23 21:58:44.927037 | orchestrator | 2025-03-23 21:58:44.927050 | orchestrator | 2025-03-23 21:58:44.927064 | orchestrator | TASKS RECAP ******************************************************************** 2025-03-23 21:58:44.927079 | orchestrator | Sunday 23 March 2025 21:58:41 +0000 (0:00:09.408) 0:01:34.053 ********** 2025-03-23 21:58:44.927093 | orchestrator | =============================================================================== 2025-03-23 21:58:44.927106 | orchestrator | openvswitch : Restart openvswitch-vswitchd container ------------------- 19.89s 2025-03-23 21:58:44.927145 | orchestrator | openvswitch : Restart openvswitch-db-server container ------------------ 11.49s 2025-03-23 21:58:44.927162 | orchestrator | openvswitch : Set system-id, hostname and hw-offload ------------------- 10.01s 2025-03-23 21:58:44.927177 | orchestrator | openvswitch : Copying over start-ovsdb-server files for openvswitch-db-server --- 5.47s 2025-03-23 21:58:44.927193 | orchestrator | openvswitch : Ensuring OVS ports are properly setup --------------------- 5.35s 2025-03-23 21:58:44.927208 | orchestrator | openvswitch : Copying over config.json files for services --------------- 4.46s 2025-03-23 21:58:44.927223 | orchestrator | openvswitch : Copying over start-ovs file for openvswitch-vswitchd ------ 4.44s 2025-03-23 21:58:44.927239 | orchestrator | module-load : Drop module persistence ----------------------------------- 3.52s 2025-03-23 21:58:44.927254 | orchestrator | openvswitch : Waiting for openvswitch_db service to be ready 
------------ 3.34s 2025-03-23 21:58:44.927269 | orchestrator | openvswitch : Check openvswitch containers ------------------------------ 3.28s 2025-03-23 21:58:44.927283 | orchestrator | openvswitch : Ensuring OVS bridge is properly setup --------------------- 3.09s 2025-03-23 21:58:44.927298 | orchestrator | module-load : Persist modules via modules-load.d ------------------------ 2.99s 2025-03-23 21:58:44.927318 | orchestrator | module-load : Load modules ---------------------------------------------- 2.94s 2025-03-23 21:58:44.927334 | orchestrator | openvswitch : Ensuring config directories exist ------------------------- 2.90s 2025-03-23 21:58:44.927349 | orchestrator | openvswitch : include_tasks --------------------------------------------- 2.23s 2025-03-23 21:58:44.927364 | orchestrator | openvswitch : Flush Handlers -------------------------------------------- 2.10s 2025-03-23 21:58:44.927379 | orchestrator | openvswitch : Copying over ovs-vsctl wrapper ---------------------------- 2.02s 2025-03-23 21:58:44.927394 | orchestrator | Group hosts based on enabled services ----------------------------------- 1.63s 2025-03-23 21:58:44.927409 | orchestrator | Group hosts based on Kolla action --------------------------------------- 1.34s 2025-03-23 21:58:44.927424 | orchestrator | openvswitch : Create /run/openvswitch directory on host ----------------- 1.02s 2025-03-23 21:58:44.927439 | orchestrator | 2025-03-23 21:58:41 | INFO  | Wait 1 second(s) until the next check 2025-03-23 21:58:44.927471 | orchestrator | 2025-03-23 21:58:44 | INFO  | Task f712b796-38d0-4a95-8f80-410bcda57c10 is in state STARTED 2025-03-23 21:58:44.930299 | orchestrator | 2025-03-23 21:58:44 | INFO  | Task f3cd4eb5-5489-4d5d-9d13-5fb3ddbc8b2b is in state STARTED 2025-03-23 21:58:44.932754 | orchestrator | 2025-03-23 21:58:44 | INFO  | Task aa6bad5e-97be-40e3-98ae-1bbfade1dac3 is in state STARTED 2025-03-23 21:58:44.935175 | orchestrator | 2025-03-23 21:58:44 | INFO  | Task 6f5ec00a-9f2e-40eb-a1d2-26a43473a853 is in state STARTED 2025-03-23 21:58:44.936437 | orchestrator | 2025-03-23 21:58:44 | INFO  | Task 61ff2397-ccea-429b-b92a-df8186dc3a5d is in state STARTED 2025-03-23 21:58:44.936791 | orchestrator | 2025-03-23 21:58:44 | INFO  | Wait 1 second(s) until the next check 2025-03-23 21:58:47.988036 | orchestrator | 2025-03-23 21:58:47 | INFO  | Task f712b796-38d0-4a95-8f80-410bcda57c10 is in state STARTED 2025-03-23 21:58:47.989128 | orchestrator | 2025-03-23 21:58:47 | INFO  | Task f3cd4eb5-5489-4d5d-9d13-5fb3ddbc8b2b is in state STARTED 2025-03-23 21:58:47.989955 | orchestrator | 2025-03-23 21:58:47 | INFO  | Task aa6bad5e-97be-40e3-98ae-1bbfade1dac3 is in state STARTED 2025-03-23 21:58:47.993917 | orchestrator | 2025-03-23 21:58:47 | INFO  | Task 6f5ec00a-9f2e-40eb-a1d2-26a43473a853 is in state STARTED 2025-03-23 21:58:47.994679 | orchestrator | 2025-03-23 21:58:47 | INFO  | Task 61ff2397-ccea-429b-b92a-df8186dc3a5d is in state STARTED 2025-03-23 21:58:51.063777 | orchestrator | 2025-03-23 21:58:47 | INFO  | Wait 1 second(s) until the next check 2025-03-23 21:58:51.063913 | orchestrator | 2025-03-23 21:58:51 | INFO  | Task f712b796-38d0-4a95-8f80-410bcda57c10 is in state STARTED 2025-03-23 21:58:51.066440 | orchestrator | 2025-03-23 21:58:51 | INFO  | Task f3cd4eb5-5489-4d5d-9d13-5fb3ddbc8b2b is in state STARTED 2025-03-23 21:58:51.067178 | orchestrator | 2025-03-23 21:58:51 | INFO  | Task aa6bad5e-97be-40e3-98ae-1bbfade1dac3 is in state STARTED 2025-03-23 21:58:51.069334 | orchestrator | 2025-03-23 21:58:51 | 
INFO  | Task 6f5ec00a-9f2e-40eb-a1d2-26a43473a853 is in state STARTED 2025-03-23 21:58:51.071468 | orchestrator | 2025-03-23 21:58:51 | INFO  | Task 61ff2397-ccea-429b-b92a-df8186dc3a5d is in state STARTED 2025-03-23 21:58:54.105067 | orchestrator | 2025-03-23 21:58:51 | INFO  | Wait 1 second(s) until the next check 2025-03-23 21:58:54.105203 | orchestrator | 2025-03-23 21:58:54 | INFO  | Task f712b796-38d0-4a95-8f80-410bcda57c10 is in state STARTED 2025-03-23 21:58:54.106369 | orchestrator | 2025-03-23 21:58:54 | INFO  | Task f3cd4eb5-5489-4d5d-9d13-5fb3ddbc8b2b is in state STARTED 2025-03-23 21:58:54.109598 | orchestrator | 2025-03-23 21:58:54 | INFO  | Task aa6bad5e-97be-40e3-98ae-1bbfade1dac3 is in state STARTED 2025-03-23 21:58:54.113928 | orchestrator | 2025-03-23 21:58:54 | INFO  | Task 6f5ec00a-9f2e-40eb-a1d2-26a43473a853 is in state STARTED 2025-03-23 21:58:54.121175 | orchestrator | 2025-03-23 21:58:54 | INFO  | Task 61ff2397-ccea-429b-b92a-df8186dc3a5d is in state STARTED 2025-03-23 21:58:57.193499 | orchestrator | 2025-03-23 21:58:54 | INFO  | Wait 1 second(s) until the next check 2025-03-23 21:58:57.193711 | orchestrator | 2025-03-23 21:58:57 | INFO  | Task f712b796-38d0-4a95-8f80-410bcda57c10 is in state STARTED 2025-03-23 21:58:57.196030 | orchestrator | 2025-03-23 21:58:57 | INFO  | Task f3cd4eb5-5489-4d5d-9d13-5fb3ddbc8b2b is in state STARTED 2025-03-23 21:58:57.201003 | orchestrator | 2025-03-23 21:58:57 | INFO  | Task aa6bad5e-97be-40e3-98ae-1bbfade1dac3 is in state STARTED 2025-03-23 21:58:57.205487 | orchestrator | 2025-03-23 21:58:57 | INFO  | Task 6f5ec00a-9f2e-40eb-a1d2-26a43473a853 is in state STARTED 2025-03-23 21:58:57.208724 | orchestrator | 2025-03-23 21:58:57 | INFO  | Task 61ff2397-ccea-429b-b92a-df8186dc3a5d is in state STARTED 2025-03-23 21:59:00.259747 | orchestrator | 2025-03-23 21:58:57 | INFO  | Wait 1 second(s) until the next check 2025-03-23 21:59:00.259875 | orchestrator | 2025-03-23 21:59:00 | INFO  | Task f712b796-38d0-4a95-8f80-410bcda57c10 is in state STARTED 2025-03-23 21:59:00.261913 | orchestrator | 2025-03-23 21:59:00 | INFO  | Task f3cd4eb5-5489-4d5d-9d13-5fb3ddbc8b2b is in state STARTED 2025-03-23 21:59:00.261947 | orchestrator | 2025-03-23 21:59:00 | INFO  | Task aa6bad5e-97be-40e3-98ae-1bbfade1dac3 is in state STARTED 2025-03-23 21:59:00.264312 | orchestrator | 2025-03-23 21:59:00 | INFO  | Task 6f5ec00a-9f2e-40eb-a1d2-26a43473a853 is in state STARTED 2025-03-23 21:59:00.265427 | orchestrator | 2025-03-23 21:59:00 | INFO  | Task 61ff2397-ccea-429b-b92a-df8186dc3a5d is in state STARTED 2025-03-23 21:59:00.265745 | orchestrator | 2025-03-23 21:59:00 | INFO  | Wait 1 second(s) until the next check 2025-03-23 21:59:03.307827 | orchestrator | 2025-03-23 21:59:03 | INFO  | Task f712b796-38d0-4a95-8f80-410bcda57c10 is in state STARTED 2025-03-23 21:59:03.311117 | orchestrator | 2025-03-23 21:59:03 | INFO  | Task f3cd4eb5-5489-4d5d-9d13-5fb3ddbc8b2b is in state STARTED 2025-03-23 21:59:03.311548 | orchestrator | 2025-03-23 21:59:03 | INFO  | Task aa6bad5e-97be-40e3-98ae-1bbfade1dac3 is in state STARTED 2025-03-23 21:59:03.311609 | orchestrator | 2025-03-23 21:59:03 | INFO  | Task 6f5ec00a-9f2e-40eb-a1d2-26a43473a853 is in state STARTED 2025-03-23 21:59:03.311630 | orchestrator | 2025-03-23 21:59:03 | INFO  | Task 61ff2397-ccea-429b-b92a-df8186dc3a5d is in state STARTED 2025-03-23 21:59:06.370517 | orchestrator | 2025-03-23 21:59:03 | INFO  | Wait 1 second(s) until the next check 2025-03-23 21:59:06.370687 | orchestrator | 2025-03-23 21:59:06 | 
INFO  | Task f712b796-38d0-4a95-8f80-410bcda57c10 is in state STARTED 2025-03-23 21:59:06.380334 | orchestrator | 2025-03-23 21:59:06 | INFO  | Task f3cd4eb5-5489-4d5d-9d13-5fb3ddbc8b2b is in state STARTED 2025-03-23 21:59:06.381047 | orchestrator | 2025-03-23 21:59:06 | INFO  | Task aa6bad5e-97be-40e3-98ae-1bbfade1dac3 is in state STARTED 2025-03-23 21:59:06.381880 | orchestrator | 2025-03-23 21:59:06 | INFO  | Task 6f5ec00a-9f2e-40eb-a1d2-26a43473a853 is in state STARTED 2025-03-23 21:59:06.382961 | orchestrator | 2025-03-23 21:59:06 | INFO  | Task 61ff2397-ccea-429b-b92a-df8186dc3a5d is in state STARTED 2025-03-23 21:59:09.426274 | orchestrator | 2025-03-23 21:59:06 | INFO  | Wait 1 second(s) until the next check 2025-03-23 21:59:09.426408 | orchestrator | 2025-03-23 21:59:09 | INFO  | Task f712b796-38d0-4a95-8f80-410bcda57c10 is in state STARTED 2025-03-23 21:59:09.432678 | orchestrator | 2025-03-23 21:59:09 | INFO  | Task f3cd4eb5-5489-4d5d-9d13-5fb3ddbc8b2b is in state STARTED 2025-03-23 21:59:09.438405 | orchestrator | 2025-03-23 21:59:09 | INFO  | Task aa6bad5e-97be-40e3-98ae-1bbfade1dac3 is in state STARTED 2025-03-23 21:59:09.444459 | orchestrator | 2025-03-23 21:59:09 | INFO  | Task 6f5ec00a-9f2e-40eb-a1d2-26a43473a853 is in state STARTED 2025-03-23 21:59:09.449874 | orchestrator | 2025-03-23 21:59:09 | INFO  | Task 61ff2397-ccea-429b-b92a-df8186dc3a5d is in state STARTED 2025-03-23 21:59:12.508959 | orchestrator | 2025-03-23 21:59:09 | INFO  | Wait 1 second(s) until the next check 2025-03-23 21:59:12.509095 | orchestrator | 2025-03-23 21:59:12 | INFO  | Task f712b796-38d0-4a95-8f80-410bcda57c10 is in state STARTED 2025-03-23 21:59:12.511026 | orchestrator | 2025-03-23 21:59:12 | INFO  | Task f3cd4eb5-5489-4d5d-9d13-5fb3ddbc8b2b is in state STARTED 2025-03-23 21:59:12.512240 | orchestrator | 2025-03-23 21:59:12 | INFO  | Task aa6bad5e-97be-40e3-98ae-1bbfade1dac3 is in state STARTED 2025-03-23 21:59:12.513989 | orchestrator | 2025-03-23 21:59:12 | INFO  | Task 6f5ec00a-9f2e-40eb-a1d2-26a43473a853 is in state STARTED 2025-03-23 21:59:12.515182 | orchestrator | 2025-03-23 21:59:12 | INFO  | Task 61ff2397-ccea-429b-b92a-df8186dc3a5d is in state STARTED 2025-03-23 21:59:12.515740 | orchestrator | 2025-03-23 21:59:12 | INFO  | Wait 1 second(s) until the next check 2025-03-23 21:59:15.570983 | orchestrator | 2025-03-23 21:59:15 | INFO  | Task f712b796-38d0-4a95-8f80-410bcda57c10 is in state STARTED 2025-03-23 21:59:15.573256 | orchestrator | 2025-03-23 21:59:15 | INFO  | Task f3cd4eb5-5489-4d5d-9d13-5fb3ddbc8b2b is in state STARTED 2025-03-23 21:59:15.576376 | orchestrator | 2025-03-23 21:59:15 | INFO  | Task aa6bad5e-97be-40e3-98ae-1bbfade1dac3 is in state STARTED 2025-03-23 21:59:15.579585 | orchestrator | 2025-03-23 21:59:15 | INFO  | Task 6f5ec00a-9f2e-40eb-a1d2-26a43473a853 is in state STARTED 2025-03-23 21:59:15.581997 | orchestrator | 2025-03-23 21:59:15 | INFO  | Task 61ff2397-ccea-429b-b92a-df8186dc3a5d is in state STARTED 2025-03-23 21:59:15.582682 | orchestrator | 2025-03-23 21:59:15 | INFO  | Wait 1 second(s) until the next check 2025-03-23 21:59:18.628684 | orchestrator | 2025-03-23 21:59:18 | INFO  | Task f712b796-38d0-4a95-8f80-410bcda57c10 is in state STARTED 2025-03-23 21:59:18.632662 | orchestrator | 2025-03-23 21:59:18 | INFO  | Task f3cd4eb5-5489-4d5d-9d13-5fb3ddbc8b2b is in state STARTED 2025-03-23 21:59:18.635466 | orchestrator | 2025-03-23 21:59:18 | INFO  | Task aa6bad5e-97be-40e3-98ae-1bbfade1dac3 is in state STARTED 2025-03-23 21:59:18.636173 | orchestrator | 
2025-03-23 21:59:18 | INFO  | Task 6f5ec00a-9f2e-40eb-a1d2-26a43473a853 is in state STARTED 2025-03-23 21:59:18.636238 | orchestrator | 2025-03-23 21:59:18 | INFO  | Task 61ff2397-ccea-429b-b92a-df8186dc3a5d is in state STARTED 2025-03-23 21:59:18.637075 | orchestrator | 2025-03-23 21:59:18 | INFO  | Wait 1 second(s) until the next check 2025-03-23 21:59:21.680972 | orchestrator | 2025-03-23 21:59:21 | INFO  | Task f712b796-38d0-4a95-8f80-410bcda57c10 is in state STARTED 2025-03-23 21:59:21.684874 | orchestrator | 2025-03-23 21:59:21 | INFO  | Task f3cd4eb5-5489-4d5d-9d13-5fb3ddbc8b2b is in state STARTED 2025-03-23 21:59:21.685811 | orchestrator | 2025-03-23 21:59:21 | INFO  | Task aa6bad5e-97be-40e3-98ae-1bbfade1dac3 is in state STARTED 2025-03-23 21:59:21.685846 | orchestrator | 2025-03-23 21:59:21 | INFO  | Task 6f5ec00a-9f2e-40eb-a1d2-26a43473a853 is in state STARTED 2025-03-23 21:59:21.686336 | orchestrator | 2025-03-23 21:59:21 | INFO  | Task 61ff2397-ccea-429b-b92a-df8186dc3a5d is in state STARTED 2025-03-23 21:59:24.747831 | orchestrator | 2025-03-23 21:59:21 | INFO  | Wait 1 second(s) until the next check 2025-03-23 21:59:24.747934 | orchestrator | 2025-03-23 21:59:24 | INFO  | Task f712b796-38d0-4a95-8f80-410bcda57c10 is in state STARTED 2025-03-23 21:59:24.748977 | orchestrator | 2025-03-23 21:59:24 | INFO  | Task f3cd4eb5-5489-4d5d-9d13-5fb3ddbc8b2b is in state STARTED 2025-03-23 21:59:24.750733 | orchestrator | 2025-03-23 21:59:24 | INFO  | Task aa6bad5e-97be-40e3-98ae-1bbfade1dac3 is in state STARTED 2025-03-23 21:59:24.752790 | orchestrator | 2025-03-23 21:59:24 | INFO  | Task 6f5ec00a-9f2e-40eb-a1d2-26a43473a853 is in state STARTED 2025-03-23 21:59:24.754448 | orchestrator | 2025-03-23 21:59:24 | INFO  | Task 61ff2397-ccea-429b-b92a-df8186dc3a5d is in state STARTED 2025-03-23 21:59:27.806799 | orchestrator | 2025-03-23 21:59:24 | INFO  | Wait 1 second(s) until the next check 2025-03-23 21:59:27.806933 | orchestrator | 2025-03-23 21:59:27 | INFO  | Task f712b796-38d0-4a95-8f80-410bcda57c10 is in state STARTED 2025-03-23 21:59:27.807233 | orchestrator | 2025-03-23 21:59:27 | INFO  | Task f3cd4eb5-5489-4d5d-9d13-5fb3ddbc8b2b is in state STARTED 2025-03-23 21:59:27.807862 | orchestrator | 2025-03-23 21:59:27 | INFO  | Task aa6bad5e-97be-40e3-98ae-1bbfade1dac3 is in state STARTED 2025-03-23 21:59:27.808736 | orchestrator | 2025-03-23 21:59:27 | INFO  | Task 6f5ec00a-9f2e-40eb-a1d2-26a43473a853 is in state STARTED 2025-03-23 21:59:27.809581 | orchestrator | 2025-03-23 21:59:27 | INFO  | Task 61ff2397-ccea-429b-b92a-df8186dc3a5d is in state STARTED 2025-03-23 21:59:30.851661 | orchestrator | 2025-03-23 21:59:27 | INFO  | Wait 1 second(s) until the next check 2025-03-23 21:59:30.851789 | orchestrator | 2025-03-23 21:59:30 | INFO  | Task f712b796-38d0-4a95-8f80-410bcda57c10 is in state STARTED 2025-03-23 21:59:30.852045 | orchestrator | 2025-03-23 21:59:30 | INFO  | Task f3cd4eb5-5489-4d5d-9d13-5fb3ddbc8b2b is in state STARTED 2025-03-23 21:59:30.852930 | orchestrator | 2025-03-23 21:59:30 | INFO  | Task aa6bad5e-97be-40e3-98ae-1bbfade1dac3 is in state STARTED 2025-03-23 21:59:30.853831 | orchestrator | 2025-03-23 21:59:30 | INFO  | Task 6f5ec00a-9f2e-40eb-a1d2-26a43473a853 is in state STARTED 2025-03-23 21:59:30.854769 | orchestrator | 2025-03-23 21:59:30 | INFO  | Task 61ff2397-ccea-429b-b92a-df8186dc3a5d is in state STARTED 2025-03-23 21:59:33.894328 | orchestrator | 2025-03-23 21:59:30 | INFO  | Wait 1 second(s) until the next check 2025-03-23 21:59:33.894461 | orchestrator | 
2025-03-23 21:59:33 | INFO  | Task f712b796-38d0-4a95-8f80-410bcda57c10 is in state STARTED 2025-03-23 21:59:33.894682 | orchestrator | 2025-03-23 21:59:33 | INFO  | Task f3cd4eb5-5489-4d5d-9d13-5fb3ddbc8b2b is in state STARTED 2025-03-23 21:59:33.900109 | orchestrator | 2025-03-23 21:59:33 | INFO  | Task aa6bad5e-97be-40e3-98ae-1bbfade1dac3 is in state STARTED 2025-03-23 21:59:33.902317 | orchestrator | 2025-03-23 21:59:33 | INFO  | Task 6f5ec00a-9f2e-40eb-a1d2-26a43473a853 is in state STARTED 2025-03-23 21:59:33.902351 | orchestrator | 2025-03-23 21:59:33 | INFO  | Task 61ff2397-ccea-429b-b92a-df8186dc3a5d is in state STARTED 2025-03-23 21:59:33.902447 | orchestrator | 2025-03-23 21:59:33 | INFO  | Wait 1 second(s) until the next check 2025-03-23 21:59:36.974706 | orchestrator | 2025-03-23 21:59:36 | INFO  | Task f712b796-38d0-4a95-8f80-410bcda57c10 is in state STARTED 2025-03-23 21:59:36.979639 | orchestrator | 2025-03-23 21:59:36 | INFO  | Task f3cd4eb5-5489-4d5d-9d13-5fb3ddbc8b2b is in state STARTED 2025-03-23 21:59:36.981825 | orchestrator | 2025-03-23 21:59:36 | INFO  | Task aa6bad5e-97be-40e3-98ae-1bbfade1dac3 is in state STARTED 2025-03-23 21:59:36.985606 | orchestrator | 2025-03-23 21:59:36 | INFO  | Task 6f5ec00a-9f2e-40eb-a1d2-26a43473a853 is in state STARTED 2025-03-23 21:59:36.990181 | orchestrator | 2025-03-23 21:59:36 | INFO  | Task 61ff2397-ccea-429b-b92a-df8186dc3a5d is in state STARTED 2025-03-23 21:59:36.990624 | orchestrator | 2025-03-23 21:59:36 | INFO  | Wait 1 second(s) until the next check 2025-03-23 21:59:40.066470 | orchestrator | 2025-03-23 21:59:40 | INFO  | Task f712b796-38d0-4a95-8f80-410bcda57c10 is in state STARTED 2025-03-23 21:59:40.066998 | orchestrator | 2025-03-23 21:59:40 | INFO  | Task f3cd4eb5-5489-4d5d-9d13-5fb3ddbc8b2b is in state STARTED 2025-03-23 21:59:40.068667 | orchestrator | 2025-03-23 21:59:40 | INFO  | Task aa6bad5e-97be-40e3-98ae-1bbfade1dac3 is in state STARTED 2025-03-23 21:59:40.068747 | orchestrator | 2025-03-23 21:59:40 | INFO  | Task 6f5ec00a-9f2e-40eb-a1d2-26a43473a853 is in state STARTED 2025-03-23 21:59:40.069456 | orchestrator | 2025-03-23 21:59:40 | INFO  | Task 61ff2397-ccea-429b-b92a-df8186dc3a5d is in state STARTED 2025-03-23 21:59:43.123085 | orchestrator | 2025-03-23 21:59:40 | INFO  | Wait 1 second(s) until the next check 2025-03-23 21:59:43.123215 | orchestrator | 2025-03-23 21:59:43 | INFO  | Task f712b796-38d0-4a95-8f80-410bcda57c10 is in state STARTED 2025-03-23 21:59:43.127025 | orchestrator | 2025-03-23 21:59:43 | INFO  | Task f3cd4eb5-5489-4d5d-9d13-5fb3ddbc8b2b is in state STARTED 2025-03-23 21:59:43.128485 | orchestrator | 2025-03-23 21:59:43 | INFO  | Task aa6bad5e-97be-40e3-98ae-1bbfade1dac3 is in state STARTED 2025-03-23 21:59:43.133261 | orchestrator | 2025-03-23 21:59:43 | INFO  | Task 6f5ec00a-9f2e-40eb-a1d2-26a43473a853 is in state STARTED 2025-03-23 21:59:46.210419 | orchestrator | 2025-03-23 21:59:43 | INFO  | Task 61ff2397-ccea-429b-b92a-df8186dc3a5d is in state STARTED 2025-03-23 21:59:46.210543 | orchestrator | 2025-03-23 21:59:43 | INFO  | Wait 1 second(s) until the next check 2025-03-23 21:59:46.210640 | orchestrator | 2025-03-23 21:59:46 | INFO  | Task f712b796-38d0-4a95-8f80-410bcda57c10 is in state STARTED 2025-03-23 21:59:46.212578 | orchestrator | 2025-03-23 21:59:46 | INFO  | Task f3cd4eb5-5489-4d5d-9d13-5fb3ddbc8b2b is in state STARTED 2025-03-23 21:59:46.213607 | orchestrator | 2025-03-23 21:59:46 | INFO  | Task aa6bad5e-97be-40e3-98ae-1bbfade1dac3 is in state STARTED 2025-03-23 
21:59:46.215272 | orchestrator | 2025-03-23 21:59:46 | INFO  | Task 6f5ec00a-9f2e-40eb-a1d2-26a43473a853 is in state STARTED 2025-03-23 21:59:46.217367 | orchestrator | 2025-03-23 21:59:46 | INFO  | Task 61ff2397-ccea-429b-b92a-df8186dc3a5d is in state STARTED 2025-03-23 21:59:49.268685 | orchestrator | 2025-03-23 21:59:46 | INFO  | Wait 1 second(s) until the next check 2025-03-23 21:59:49.268856 | orchestrator | 2025-03-23 21:59:49 | INFO  | Task f712b796-38d0-4a95-8f80-410bcda57c10 is in state STARTED 2025-03-23 21:59:49.271335 | orchestrator | 2025-03-23 21:59:49 | INFO  | Task f3cd4eb5-5489-4d5d-9d13-5fb3ddbc8b2b is in state STARTED 2025-03-23 21:59:49.274638 | orchestrator | 2025-03-23 21:59:49 | INFO  | Task aa6bad5e-97be-40e3-98ae-1bbfade1dac3 is in state STARTED 2025-03-23 21:59:49.276014 | orchestrator | 2025-03-23 21:59:49 | INFO  | Task 6f5ec00a-9f2e-40eb-a1d2-26a43473a853 is in state STARTED 2025-03-23 21:59:49.277800 | orchestrator | 2025-03-23 21:59:49 | INFO  | Task 61ff2397-ccea-429b-b92a-df8186dc3a5d is in state STARTED 2025-03-23 21:59:52.319719 | orchestrator | 2025-03-23 21:59:49 | INFO  | Wait 1 second(s) until the next check 2025-03-23 21:59:52.319868 | orchestrator | 2025-03-23 21:59:52 | INFO  | Task f712b796-38d0-4a95-8f80-410bcda57c10 is in state STARTED 2025-03-23 21:59:52.321771 | orchestrator | 2025-03-23 21:59:52 | INFO  | Task f3cd4eb5-5489-4d5d-9d13-5fb3ddbc8b2b is in state STARTED 2025-03-23 21:59:52.323740 | orchestrator | 2025-03-23 21:59:52 | INFO  | Task aa6bad5e-97be-40e3-98ae-1bbfade1dac3 is in state STARTED 2025-03-23 21:59:52.325460 | orchestrator | 2025-03-23 21:59:52 | INFO  | Task 6f5ec00a-9f2e-40eb-a1d2-26a43473a853 is in state STARTED 2025-03-23 21:59:52.326686 | orchestrator | 2025-03-23 21:59:52 | INFO  | Task 61ff2397-ccea-429b-b92a-df8186dc3a5d is in state STARTED 2025-03-23 21:59:55.380023 | orchestrator | 2025-03-23 21:59:52 | INFO  | Wait 1 second(s) until the next check 2025-03-23 21:59:55.380157 | orchestrator | 2025-03-23 21:59:55 | INFO  | Task f712b796-38d0-4a95-8f80-410bcda57c10 is in state STARTED 2025-03-23 21:59:55.382797 | orchestrator | 2025-03-23 21:59:55 | INFO  | Task f3cd4eb5-5489-4d5d-9d13-5fb3ddbc8b2b is in state STARTED 2025-03-23 21:59:55.383655 | orchestrator | 2025-03-23 21:59:55 | INFO  | Task aa6bad5e-97be-40e3-98ae-1bbfade1dac3 is in state STARTED 2025-03-23 21:59:55.384223 | orchestrator | 2025-03-23 21:59:55 | INFO  | Task 6f5ec00a-9f2e-40eb-a1d2-26a43473a853 is in state STARTED 2025-03-23 21:59:55.384914 | orchestrator | 2025-03-23 21:59:55 | INFO  | Task 61ff2397-ccea-429b-b92a-df8186dc3a5d is in state STARTED 2025-03-23 21:59:58.442155 | orchestrator | 2025-03-23 21:59:55 | INFO  | Wait 1 second(s) until the next check 2025-03-23 21:59:58.442292 | orchestrator | 2025-03-23 21:59:58 | INFO  | Task f712b796-38d0-4a95-8f80-410bcda57c10 is in state STARTED 2025-03-23 21:59:58.442521 | orchestrator | 2025-03-23 21:59:58 | INFO  | Task f3cd4eb5-5489-4d5d-9d13-5fb3ddbc8b2b is in state STARTED 2025-03-23 21:59:58.442549 | orchestrator | 2025-03-23 21:59:58 | INFO  | Task aa6bad5e-97be-40e3-98ae-1bbfade1dac3 is in state STARTED 2025-03-23 21:59:58.442621 | orchestrator | 2025-03-23 21:59:58 | INFO  | Task 6f5ec00a-9f2e-40eb-a1d2-26a43473a853 is in state STARTED 2025-03-23 21:59:58.443338 | orchestrator | 2025-03-23 21:59:58 | INFO  | Task 61ff2397-ccea-429b-b92a-df8186dc3a5d is in state STARTED 2025-03-23 22:00:01.492620 | orchestrator | 2025-03-23 21:59:58 | INFO  | Wait 1 second(s) until the next check 2025-03-23 
22:00:01.492759 | orchestrator | 2025-03-23 22:00:01 | INFO  | Task f712b796-38d0-4a95-8f80-410bcda57c10 is in state STARTED 2025-03-23 22:00:01.496105 | orchestrator | 2025-03-23 22:00:01 | INFO  | Task f3cd4eb5-5489-4d5d-9d13-5fb3ddbc8b2b is in state STARTED 2025-03-23 22:00:01.496223 | orchestrator | 2025-03-23 22:00:01 | INFO  | Task aa6bad5e-97be-40e3-98ae-1bbfade1dac3 is in state STARTED 2025-03-23 22:00:01.496308 | orchestrator | 2025-03-23 22:00:01 | INFO  | Task 6f5ec00a-9f2e-40eb-a1d2-26a43473a853 is in state STARTED 2025-03-23 22:00:01.498175 | orchestrator | 2025-03-23 22:00:01 | INFO  | Task 61ff2397-ccea-429b-b92a-df8186dc3a5d is in state STARTED 2025-03-23 22:00:04.538838 | orchestrator | 2025-03-23 22:00:01 | INFO  | Wait 1 second(s) until the next check 2025-03-23 22:00:04.539003 | orchestrator | 2025-03-23 22:00:04 | INFO  | Task f712b796-38d0-4a95-8f80-410bcda57c10 is in state STARTED 2025-03-23 22:00:04.542857 | orchestrator | 2025-03-23 22:00:04 | INFO  | Task f3cd4eb5-5489-4d5d-9d13-5fb3ddbc8b2b is in state STARTED 2025-03-23 22:00:04.544835 | orchestrator | 2025-03-23 22:00:04 | INFO  | Task aa6bad5e-97be-40e3-98ae-1bbfade1dac3 is in state STARTED 2025-03-23 22:00:04.549120 | orchestrator | 2025-03-23 22:00:04 | INFO  | Task 6f5ec00a-9f2e-40eb-a1d2-26a43473a853 is in state STARTED 2025-03-23 22:00:04.550303 | orchestrator | 2025-03-23 22:00:04 | INFO  | Task 61ff2397-ccea-429b-b92a-df8186dc3a5d is in state STARTED 2025-03-23 22:00:07.589890 | orchestrator | 2025-03-23 22:00:04 | INFO  | Wait 1 second(s) until the next check 2025-03-23 22:00:07.590157 | orchestrator | 2025-03-23 22:00:07 | INFO  | Task f712b796-38d0-4a95-8f80-410bcda57c10 is in state STARTED 2025-03-23 22:00:07.590267 | orchestrator | 2025-03-23 22:00:07 | INFO  | Task f3cd4eb5-5489-4d5d-9d13-5fb3ddbc8b2b is in state STARTED 2025-03-23 22:00:07.590962 | orchestrator | 2025-03-23 22:00:07 | INFO  | Task aa6bad5e-97be-40e3-98ae-1bbfade1dac3 is in state STARTED 2025-03-23 22:00:07.591339 | orchestrator | 2025-03-23 22:00:07 | INFO  | Task 6f5ec00a-9f2e-40eb-a1d2-26a43473a853 is in state STARTED 2025-03-23 22:00:07.592545 | orchestrator | 2025-03-23 22:00:07 | INFO  | Task 61ff2397-ccea-429b-b92a-df8186dc3a5d is in state STARTED 2025-03-23 22:00:10.645464 | orchestrator | 2025-03-23 22:00:07 | INFO  | Wait 1 second(s) until the next check 2025-03-23 22:00:10.645691 | orchestrator | 2025-03-23 22:00:10 | INFO  | Task f712b796-38d0-4a95-8f80-410bcda57c10 is in state STARTED 2025-03-23 22:00:10.645785 | orchestrator | 2025-03-23 22:00:10 | INFO  | Task f3cd4eb5-5489-4d5d-9d13-5fb3ddbc8b2b is in state STARTED 2025-03-23 22:00:10.646425 | orchestrator | 2025-03-23 22:00:10 | INFO  | Task aa6bad5e-97be-40e3-98ae-1bbfade1dac3 is in state STARTED 2025-03-23 22:00:10.647241 | orchestrator | 2025-03-23 22:00:10 | INFO  | Task 6f5ec00a-9f2e-40eb-a1d2-26a43473a853 is in state STARTED 2025-03-23 22:00:10.648181 | orchestrator | 2025-03-23 22:00:10 | INFO  | Task 61ff2397-ccea-429b-b92a-df8186dc3a5d is in state STARTED 2025-03-23 22:00:13.690708 | orchestrator | 2025-03-23 22:00:10 | INFO  | Wait 1 second(s) until the next check 2025-03-23 22:00:13.690851 | orchestrator | 2025-03-23 22:00:13 | INFO  | Task f712b796-38d0-4a95-8f80-410bcda57c10 is in state STARTED 2025-03-23 22:00:13.691291 | orchestrator | 2025-03-23 22:00:13 | INFO  | Task f3cd4eb5-5489-4d5d-9d13-5fb3ddbc8b2b is in state STARTED 2025-03-23 22:00:13.692749 | orchestrator | 2025-03-23 22:00:13 | INFO  | Task aa6bad5e-97be-40e3-98ae-1bbfade1dac3 is in 
state SUCCESS 2025-03-23 22:00:13.694235 | orchestrator | 2025-03-23 22:00:13.694275 | orchestrator | 2025-03-23 22:00:13.694291 | orchestrator | PLAY [Set kolla_action_rabbitmq] *********************************************** 2025-03-23 22:00:13.694307 | orchestrator | 2025-03-23 22:00:13.694322 | orchestrator | TASK [Inform the user about the following task] ******************************** 2025-03-23 22:00:13.694337 | orchestrator | Sunday 23 March 2025 21:57:40 +0000 (0:00:00.255) 0:00:00.255 ********** 2025-03-23 22:00:13.694351 | orchestrator | ok: [localhost] => { 2025-03-23 22:00:13.694369 | orchestrator |  "msg": "The task 'Check RabbitMQ service' fails if the RabbitMQ service has not yet been deployed. This is fine." 2025-03-23 22:00:13.694384 | orchestrator | } 2025-03-23 22:00:13.694399 | orchestrator | 2025-03-23 22:00:13.694415 | orchestrator | TASK [Check RabbitMQ service] ************************************************** 2025-03-23 22:00:13.694454 | orchestrator | Sunday 23 March 2025 21:57:40 +0000 (0:00:00.104) 0:00:00.360 ********** 2025-03-23 22:00:13.694470 | orchestrator | fatal: [localhost]: FAILED! => {"changed": false, "elapsed": 2, "msg": "Timeout when waiting for search string RabbitMQ Management in 192.168.16.9:15672"} 2025-03-23 22:00:13.694486 | orchestrator | ...ignoring 2025-03-23 22:00:13.694501 | orchestrator | 2025-03-23 22:00:13.694516 | orchestrator | TASK [Set kolla_action_rabbitmq = upgrade if RabbitMQ is already running] ****** 2025-03-23 22:00:13.694530 | orchestrator | Sunday 23 March 2025 21:57:44 +0000 (0:00:03.708) 0:00:04.069 ********** 2025-03-23 22:00:13.694544 | orchestrator | skipping: [localhost] 2025-03-23 22:00:13.694588 | orchestrator | 2025-03-23 22:00:13.694602 | orchestrator | TASK [Set kolla_action_rabbitmq = kolla_action_ng] ***************************** 2025-03-23 22:00:13.694617 | orchestrator | Sunday 23 March 2025 21:57:44 +0000 (0:00:00.110) 0:00:04.179 ********** 2025-03-23 22:00:13.694631 | orchestrator | ok: [localhost] 2025-03-23 22:00:13.694644 | orchestrator | 2025-03-23 22:00:13.694659 | orchestrator | PLAY [Group hosts based on configuration] ************************************** 2025-03-23 22:00:13.694672 | orchestrator | 2025-03-23 22:00:13.694686 | orchestrator | TASK [Group hosts based on Kolla action] *************************************** 2025-03-23 22:00:13.694700 | orchestrator | Sunday 23 March 2025 21:57:44 +0000 (0:00:00.228) 0:00:04.408 ********** 2025-03-23 22:00:13.694714 | orchestrator | ok: [testbed-node-0] 2025-03-23 22:00:13.694728 | orchestrator | ok: [testbed-node-1] 2025-03-23 22:00:13.694742 | orchestrator | ok: [testbed-node-2] 2025-03-23 22:00:13.694756 | orchestrator | 2025-03-23 22:00:13.694770 | orchestrator | TASK [Group hosts based on enabled services] *********************************** 2025-03-23 22:00:13.694904 | orchestrator | Sunday 23 March 2025 21:57:44 +0000 (0:00:00.357) 0:00:04.765 ********** 2025-03-23 22:00:13.694923 | orchestrator | ok: [testbed-node-0] => (item=enable_rabbitmq_True) 2025-03-23 22:00:13.694939 | orchestrator | ok: [testbed-node-1] => (item=enable_rabbitmq_True) 2025-03-23 22:00:13.694955 | orchestrator | ok: [testbed-node-2] => (item=enable_rabbitmq_True) 2025-03-23 22:00:13.694971 | orchestrator | 2025-03-23 22:00:13.694987 | orchestrator | PLAY [Apply role rabbitmq] ***************************************************** 2025-03-23 22:00:13.695001 | orchestrator | 2025-03-23 22:00:13.695017 | orchestrator | TASK [rabbitmq : include_tasks] 
************************************************ 2025-03-23 22:00:13.695032 | orchestrator | Sunday 23 March 2025 21:57:45 +0000 (0:00:00.575) 0:00:05.341 ********** 2025-03-23 22:00:13.695048 | orchestrator | included: /ansible/roles/rabbitmq/tasks/deploy.yml for testbed-node-0, testbed-node-1, testbed-node-2 2025-03-23 22:00:13.695064 | orchestrator | 2025-03-23 22:00:13.695079 | orchestrator | TASK [rabbitmq : Get container facts] ****************************************** 2025-03-23 22:00:13.695094 | orchestrator | Sunday 23 March 2025 21:57:47 +0000 (0:00:02.119) 0:00:07.460 ********** 2025-03-23 22:00:13.695110 | orchestrator | ok: [testbed-node-0] 2025-03-23 22:00:13.695125 | orchestrator | 2025-03-23 22:00:13.695140 | orchestrator | TASK [rabbitmq : Get current RabbitMQ version] ********************************* 2025-03-23 22:00:13.695156 | orchestrator | Sunday 23 March 2025 21:57:50 +0000 (0:00:02.853) 0:00:10.314 ********** 2025-03-23 22:00:13.695172 | orchestrator | skipping: [testbed-node-0] 2025-03-23 22:00:13.695187 | orchestrator | 2025-03-23 22:00:13.695201 | orchestrator | TASK [rabbitmq : Get new RabbitMQ version] ************************************* 2025-03-23 22:00:13.695215 | orchestrator | Sunday 23 March 2025 21:57:51 +0000 (0:00:00.689) 0:00:11.003 ********** 2025-03-23 22:00:13.695229 | orchestrator | skipping: [testbed-node-0] 2025-03-23 22:00:13.695243 | orchestrator | 2025-03-23 22:00:13.695257 | orchestrator | TASK [rabbitmq : Check if running RabbitMQ is at most one version behind] ****** 2025-03-23 22:00:13.695278 | orchestrator | Sunday 23 March 2025 21:57:51 +0000 (0:00:00.780) 0:00:11.784 ********** 2025-03-23 22:00:13.695292 | orchestrator | skipping: [testbed-node-0] 2025-03-23 22:00:13.695306 | orchestrator | 2025-03-23 22:00:13.695320 | orchestrator | TASK [rabbitmq : Catch when RabbitMQ is being downgraded] ********************** 2025-03-23 22:00:13.695334 | orchestrator | Sunday 23 March 2025 21:57:52 +0000 (0:00:00.389) 0:00:12.173 ********** 2025-03-23 22:00:13.695357 | orchestrator | skipping: [testbed-node-0] 2025-03-23 22:00:13.695371 | orchestrator | 2025-03-23 22:00:13.695385 | orchestrator | TASK [rabbitmq : include_tasks] ************************************************ 2025-03-23 22:00:13.695399 | orchestrator | Sunday 23 March 2025 21:57:52 +0000 (0:00:00.382) 0:00:12.556 ********** 2025-03-23 22:00:13.695414 | orchestrator | included: /ansible/roles/rabbitmq/tasks/remove-ha-all-policy.yml for testbed-node-0, testbed-node-1, testbed-node-2 2025-03-23 22:00:13.695428 | orchestrator | 2025-03-23 22:00:13.695442 | orchestrator | TASK [rabbitmq : Get container facts] ****************************************** 2025-03-23 22:00:13.695456 | orchestrator | Sunday 23 March 2025 21:57:53 +0000 (0:00:01.126) 0:00:13.682 ********** 2025-03-23 22:00:13.695470 | orchestrator | ok: [testbed-node-0] 2025-03-23 22:00:13.695484 | orchestrator | 2025-03-23 22:00:13.695498 | orchestrator | TASK [rabbitmq : List RabbitMQ policies] *************************************** 2025-03-23 22:00:13.695512 | orchestrator | Sunday 23 March 2025 21:57:54 +0000 (0:00:00.880) 0:00:14.563 ********** 2025-03-23 22:00:13.695526 | orchestrator | skipping: [testbed-node-0] 2025-03-23 22:00:13.695539 | orchestrator | 2025-03-23 22:00:13.695576 | orchestrator | TASK [rabbitmq : Remove ha-all policy from RabbitMQ] *************************** 2025-03-23 22:00:13.695591 | orchestrator | Sunday 23 March 2025 21:57:55 +0000 (0:00:00.430) 0:00:14.993 ********** 2025-03-23 
22:00:13.695606 | orchestrator | skipping: [testbed-node-0] 2025-03-23 22:00:13.695620 | orchestrator | 2025-03-23 22:00:13.695643 | orchestrator | TASK [rabbitmq : Ensuring config directories exist] **************************** 2025-03-23 22:00:13.695658 | orchestrator | Sunday 23 March 2025 21:57:55 +0000 (0:00:00.398) 0:00:15.392 ********** 2025-03-23 22:00:13.695675 | orchestrator | changed: [testbed-node-2] => (item={'key': 'rabbitmq', 'value': {'container_name': 'rabbitmq', 'group': 'rabbitmq', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/rabbitmq:3.13.7.20241206', 'bootstrap_environment': {'KOLLA_BOOTSTRAP': None, 'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': 'zdd6geSBXefcI7IoHnP1U1fxtRWS3u5QtnPCvQTT', 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': 'zdd6geSBXefcI7IoHnP1U1fxtRWS3u5QtnPCvQTT', 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'volumes': ['/etc/kolla/rabbitmq/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'rabbitmq:/var/lib/rabbitmq/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_rabbitmq'], 'timeout': '30'}, 'haproxy': {'rabbitmq_management': {'enabled': 'yes', 'mode': 'http', 'port': '15672', 'host_group': 'rabbitmq'}}}}) 2025-03-23 22:00:13.695694 | orchestrator | changed: [testbed-node-1] => (item={'key': 'rabbitmq', 'value': {'container_name': 'rabbitmq', 'group': 'rabbitmq', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/rabbitmq:3.13.7.20241206', 'bootstrap_environment': {'KOLLA_BOOTSTRAP': None, 'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': 'zdd6geSBXefcI7IoHnP1U1fxtRWS3u5QtnPCvQTT', 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': 'zdd6geSBXefcI7IoHnP1U1fxtRWS3u5QtnPCvQTT', 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'volumes': ['/etc/kolla/rabbitmq/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'rabbitmq:/var/lib/rabbitmq/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_rabbitmq'], 'timeout': '30'}, 'haproxy': {'rabbitmq_management': {'enabled': 'yes', 'mode': 'http', 'port': '15672', 'host_group': 'rabbitmq'}}}}) 2025-03-23 22:00:13.695709 | orchestrator | changed: [testbed-node-0] => (item={'key': 'rabbitmq', 'value': {'container_name': 'rabbitmq', 'group': 'rabbitmq', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/rabbitmq:3.13.7.20241206', 'bootstrap_environment': {'KOLLA_BOOTSTRAP': None, 'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': 'zdd6geSBXefcI7IoHnP1U1fxtRWS3u5QtnPCvQTT', 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': 'zdd6geSBXefcI7IoHnP1U1fxtRWS3u5QtnPCvQTT', 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'volumes': ['/etc/kolla/rabbitmq/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'rabbitmq:/var/lib/rabbitmq/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_rabbitmq'], 'timeout': '30'}, 'haproxy': 
{'rabbitmq_management': {'enabled': 'yes', 'mode': 'http', 'port': '15672', 'host_group': 'rabbitmq'}}}}) 2025-03-23 22:00:13.695737 | orchestrator | 2025-03-23 22:00:13.695752 | orchestrator | TASK [rabbitmq : Copying over config.json files for services] ****************** 2025-03-23 22:00:13.695766 | orchestrator | Sunday 23 March 2025 21:57:57 +0000 (0:00:01.627) 0:00:17.020 ********** 2025-03-23 22:00:13.695791 | orchestrator | changed: [testbed-node-1] => (item={'key': 'rabbitmq', 'value': {'container_name': 'rabbitmq', 'group': 'rabbitmq', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/rabbitmq:3.13.7.20241206', 'bootstrap_environment': {'KOLLA_BOOTSTRAP': None, 'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': 'zdd6geSBXefcI7IoHnP1U1fxtRWS3u5QtnPCvQTT', 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': 'zdd6geSBXefcI7IoHnP1U1fxtRWS3u5QtnPCvQTT', 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'volumes': ['/etc/kolla/rabbitmq/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'rabbitmq:/var/lib/rabbitmq/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_rabbitmq'], 'timeout': '30'}, 'haproxy': {'rabbitmq_management': {'enabled': 'yes', 'mode': 'http', 'port': '15672', 'host_group': 'rabbitmq'}}}}) 2025-03-23 22:00:13.695807 | orchestrator | changed: [testbed-node-0] => (item={'key': 'rabbitmq', 'value': {'container_name': 'rabbitmq', 'group': 'rabbitmq', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/rabbitmq:3.13.7.20241206', 'bootstrap_environment': {'KOLLA_BOOTSTRAP': None, 'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': 'zdd6geSBXefcI7IoHnP1U1fxtRWS3u5QtnPCvQTT', 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': 'zdd6geSBXefcI7IoHnP1U1fxtRWS3u5QtnPCvQTT', 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'volumes': ['/etc/kolla/rabbitmq/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'rabbitmq:/var/lib/rabbitmq/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_rabbitmq'], 'timeout': '30'}, 'haproxy': {'rabbitmq_management': {'enabled': 'yes', 'mode': 'http', 'port': '15672', 'host_group': 'rabbitmq'}}}}) 2025-03-23 22:00:13.695822 | orchestrator | changed: [testbed-node-2] => (item={'key': 'rabbitmq', 'value': {'container_name': 'rabbitmq', 'group': 'rabbitmq', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/rabbitmq:3.13.7.20241206', 'bootstrap_environment': {'KOLLA_BOOTSTRAP': None, 'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': 'zdd6geSBXefcI7IoHnP1U1fxtRWS3u5QtnPCvQTT', 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': 'zdd6geSBXefcI7IoHnP1U1fxtRWS3u5QtnPCvQTT', 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'volumes': ['/etc/kolla/rabbitmq/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'rabbitmq:/var/lib/rabbitmq/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 
'healthcheck_rabbitmq'], 'timeout': '30'}, 'haproxy': {'rabbitmq_management': {'enabled': 'yes', 'mode': 'http', 'port': '15672', 'host_group': 'rabbitmq'}}}}) 2025-03-23 22:00:13.695843 | orchestrator | 2025-03-23 22:00:13.695858 | orchestrator | TASK [rabbitmq : Copying over rabbitmq-env.conf] ******************************* 2025-03-23 22:00:13.695872 | orchestrator | Sunday 23 March 2025 21:57:59 +0000 (0:00:02.140) 0:00:19.161 ********** 2025-03-23 22:00:13.695886 | orchestrator | changed: [testbed-node-0] => (item=/ansible/roles/rabbitmq/templates/rabbitmq-env.conf.j2) 2025-03-23 22:00:13.695900 | orchestrator | changed: [testbed-node-1] => (item=/ansible/roles/rabbitmq/templates/rabbitmq-env.conf.j2) 2025-03-23 22:00:13.695914 | orchestrator | changed: [testbed-node-2] => (item=/ansible/roles/rabbitmq/templates/rabbitmq-env.conf.j2) 2025-03-23 22:00:13.695928 | orchestrator | 2025-03-23 22:00:13.695942 | orchestrator | TASK [rabbitmq : Copying over rabbitmq.conf] *********************************** 2025-03-23 22:00:13.695963 | orchestrator | Sunday 23 March 2025 21:58:02 +0000 (0:00:03.647) 0:00:22.808 ********** 2025-03-23 22:00:13.695977 | orchestrator | changed: [testbed-node-0] => (item=/ansible/roles/rabbitmq/templates/rabbitmq.conf.j2) 2025-03-23 22:00:13.695992 | orchestrator | changed: [testbed-node-2] => (item=/ansible/roles/rabbitmq/templates/rabbitmq.conf.j2) 2025-03-23 22:00:13.696006 | orchestrator | changed: [testbed-node-1] => (item=/ansible/roles/rabbitmq/templates/rabbitmq.conf.j2) 2025-03-23 22:00:13.696020 | orchestrator | 2025-03-23 22:00:13.696034 | orchestrator | TASK [rabbitmq : Copying over erl_inetrc] ************************************** 2025-03-23 22:00:13.696048 | orchestrator | Sunday 23 March 2025 21:58:07 +0000 (0:00:04.133) 0:00:26.942 ********** 2025-03-23 22:00:13.696062 | orchestrator | changed: [testbed-node-0] => (item=/ansible/roles/rabbitmq/templates/erl_inetrc.j2) 2025-03-23 22:00:13.696075 | orchestrator | changed: [testbed-node-1] => (item=/ansible/roles/rabbitmq/templates/erl_inetrc.j2) 2025-03-23 22:00:13.696090 | orchestrator | changed: [testbed-node-2] => (item=/ansible/roles/rabbitmq/templates/erl_inetrc.j2) 2025-03-23 22:00:13.696103 | orchestrator | 2025-03-23 22:00:13.696123 | orchestrator | TASK [rabbitmq : Copying over advanced.config] ********************************* 2025-03-23 22:00:13.696138 | orchestrator | Sunday 23 March 2025 21:58:09 +0000 (0:00:02.326) 0:00:29.269 ********** 2025-03-23 22:00:13.696152 | orchestrator | changed: [testbed-node-0] => (item=/ansible/roles/rabbitmq/templates/advanced.config.j2) 2025-03-23 22:00:13.696166 | orchestrator | changed: [testbed-node-1] => (item=/ansible/roles/rabbitmq/templates/advanced.config.j2) 2025-03-23 22:00:13.696180 | orchestrator | changed: [testbed-node-2] => (item=/ansible/roles/rabbitmq/templates/advanced.config.j2) 2025-03-23 22:00:13.696194 | orchestrator | 2025-03-23 22:00:13.696208 | orchestrator | TASK [rabbitmq : Copying over definitions.json] ******************************** 2025-03-23 22:00:13.696222 | orchestrator | Sunday 23 March 2025 21:58:12 +0000 (0:00:02.590) 0:00:31.860 ********** 2025-03-23 22:00:13.696236 | orchestrator | changed: [testbed-node-0] => (item=/ansible/roles/rabbitmq/templates/definitions.json.j2) 2025-03-23 22:00:13.696250 | orchestrator | changed: [testbed-node-2] => (item=/ansible/roles/rabbitmq/templates/definitions.json.j2) 2025-03-23 22:00:13.696264 | orchestrator | changed: [testbed-node-1] => 
(item=/ansible/roles/rabbitmq/templates/definitions.json.j2) 2025-03-23 22:00:13.696278 | orchestrator | 2025-03-23 22:00:13.696292 | orchestrator | TASK [rabbitmq : Copying over enabled_plugins] ********************************* 2025-03-23 22:00:13.696306 | orchestrator | Sunday 23 March 2025 21:58:14 +0000 (0:00:02.509) 0:00:34.369 ********** 2025-03-23 22:00:13.696320 | orchestrator | changed: [testbed-node-1] => (item=/ansible/roles/rabbitmq/templates/enabled_plugins.j2) 2025-03-23 22:00:13.696334 | orchestrator | changed: [testbed-node-0] => (item=/ansible/roles/rabbitmq/templates/enabled_plugins.j2) 2025-03-23 22:00:13.696355 | orchestrator | changed: [testbed-node-2] => (item=/ansible/roles/rabbitmq/templates/enabled_plugins.j2) 2025-03-23 22:00:13.696369 | orchestrator | 2025-03-23 22:00:13.696383 | orchestrator | TASK [rabbitmq : include_tasks] ************************************************ 2025-03-23 22:00:13.696401 | orchestrator | Sunday 23 March 2025 21:58:16 +0000 (0:00:02.378) 0:00:36.747 ********** 2025-03-23 22:00:13.696415 | orchestrator | skipping: [testbed-node-1] 2025-03-23 22:00:13.696430 | orchestrator | skipping: [testbed-node-0] 2025-03-23 22:00:13.696495 | orchestrator | skipping: [testbed-node-2] 2025-03-23 22:00:13.696509 | orchestrator | 2025-03-23 22:00:13.696523 | orchestrator | TASK [rabbitmq : Check rabbitmq containers] ************************************ 2025-03-23 22:00:13.696537 | orchestrator | Sunday 23 March 2025 21:58:17 +0000 (0:00:01.015) 0:00:37.763 ********** 2025-03-23 22:00:13.696570 | orchestrator | changed: [testbed-node-1] => (item={'key': 'rabbitmq', 'value': {'container_name': 'rabbitmq', 'group': 'rabbitmq', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/rabbitmq:3.13.7.20241206', 'bootstrap_environment': {'KOLLA_BOOTSTRAP': None, 'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': 'zdd6geSBXefcI7IoHnP1U1fxtRWS3u5QtnPCvQTT', 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': 'zdd6geSBXefcI7IoHnP1U1fxtRWS3u5QtnPCvQTT', 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'volumes': ['/etc/kolla/rabbitmq/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'rabbitmq:/var/lib/rabbitmq/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_rabbitmq'], 'timeout': '30'}, 'haproxy': {'rabbitmq_management': {'enabled': 'yes', 'mode': 'http', 'port': '15672', 'host_group': 'rabbitmq'}}}}) 2025-03-23 22:00:13.696587 | orchestrator | changed: [testbed-node-2] => (item={'key': 'rabbitmq', 'value': {'container_name': 'rabbitmq', 'group': 'rabbitmq', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/rabbitmq:3.13.7.20241206', 'bootstrap_environment': {'KOLLA_BOOTSTRAP': None, 'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': 'zdd6geSBXefcI7IoHnP1U1fxtRWS3u5QtnPCvQTT', 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': 'zdd6geSBXefcI7IoHnP1U1fxtRWS3u5QtnPCvQTT', 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'volumes': ['/etc/kolla/rabbitmq/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'rabbitmq:/var/lib/rabbitmq/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': 
'5', 'test': ['CMD-SHELL', 'healthcheck_rabbitmq'], 'timeout': '30'}, 'haproxy': {'rabbitmq_management': {'enabled': 'yes', 'mode': 'http', 'port': '15672', 'host_group': 'rabbitmq'}}}}) 2025-03-23 22:00:13.696612 | orchestrator | changed: [testbed-node-0] => (item={'key': 'rabbitmq', 'value': {'container_name': 'rabbitmq', 'group': 'rabbitmq', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/rabbitmq:3.13.7.20241206', 'bootstrap_environment': {'KOLLA_BOOTSTRAP': None, 'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': 'zdd6geSBXefcI7IoHnP1U1fxtRWS3u5QtnPCvQTT', 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': 'zdd6geSBXefcI7IoHnP1U1fxtRWS3u5QtnPCvQTT', 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'volumes': ['/etc/kolla/rabbitmq/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'rabbitmq:/var/lib/rabbitmq/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_rabbitmq'], 'timeout': '30'}, 'haproxy': {'rabbitmq_management': {'enabled': 'yes', 'mode': 'http', 'port': '15672', 'host_group': 'rabbitmq'}}}}) 2025-03-23 22:00:13.696637 | orchestrator | 2025-03-23 22:00:13.696651 | orchestrator | TASK [rabbitmq : Creating rabbitmq volume] ************************************* 2025-03-23 22:00:13.696665 | orchestrator | Sunday 23 March 2025 21:58:19 +0000 (0:00:02.008) 0:00:39.771 ********** 2025-03-23 22:00:13.696679 | orchestrator | changed: [testbed-node-0] 2025-03-23 22:00:13.696693 | orchestrator | changed: [testbed-node-1] 2025-03-23 22:00:13.696707 | orchestrator | changed: [testbed-node-2] 2025-03-23 22:00:13.696721 | orchestrator | 2025-03-23 22:00:13.696735 | orchestrator | TASK [rabbitmq : Running RabbitMQ bootstrap container] ************************* 2025-03-23 22:00:13.696749 | orchestrator | Sunday 23 March 2025 21:58:21 +0000 (0:00:01.137) 0:00:40.908 ********** 2025-03-23 22:00:13.696763 | orchestrator | changed: [testbed-node-0] 2025-03-23 22:00:13.696777 | orchestrator | changed: [testbed-node-1] 2025-03-23 22:00:13.696791 | orchestrator | changed: [testbed-node-2] 2025-03-23 22:00:13.696805 | orchestrator | 2025-03-23 22:00:13.696819 | orchestrator | RUNNING HANDLER [rabbitmq : Restart rabbitmq container] ************************ 2025-03-23 22:00:13.696834 | orchestrator | Sunday 23 March 2025 21:58:27 +0000 (0:00:06.098) 0:00:47.007 ********** 2025-03-23 22:00:13.696848 | orchestrator | changed: [testbed-node-0] 2025-03-23 22:00:13.696862 | orchestrator | changed: [testbed-node-1] 2025-03-23 22:00:13.696876 | orchestrator | changed: [testbed-node-2] 2025-03-23 22:00:13.696890 | orchestrator | 2025-03-23 22:00:13.696904 | orchestrator | PLAY [Restart rabbitmq services] *********************************************** 2025-03-23 22:00:13.696918 | orchestrator | 2025-03-23 22:00:13.696932 | orchestrator | TASK [rabbitmq : Get info on RabbitMQ container] ******************************* 2025-03-23 22:00:13.696946 | orchestrator | Sunday 23 March 2025 21:58:27 +0000 (0:00:00.343) 0:00:47.350 ********** 2025-03-23 22:00:13.696960 | orchestrator | ok: [testbed-node-0] 2025-03-23 22:00:13.696974 | orchestrator | 2025-03-23 22:00:13.696988 | orchestrator | TASK [rabbitmq : Put RabbitMQ node into maintenance mode] ********************** 2025-03-23 22:00:13.697002 | orchestrator | Sunday 23 March 2025 21:58:28 +0000 
(0:00:00.866) 0:00:48.217 ********** 2025-03-23 22:00:13.697016 | orchestrator | skipping: [testbed-node-0] 2025-03-23 22:00:13.697030 | orchestrator | 2025-03-23 22:00:13.697045 | orchestrator | TASK [rabbitmq : Restart rabbitmq container] *********************************** 2025-03-23 22:00:13.697059 | orchestrator | Sunday 23 March 2025 21:58:28 +0000 (0:00:00.258) 0:00:48.476 ********** 2025-03-23 22:00:13.697073 | orchestrator | changed: [testbed-node-0] 2025-03-23 22:00:13.697087 | orchestrator | 2025-03-23 22:00:13.697102 | orchestrator | TASK [rabbitmq : Waiting for rabbitmq to start] ******************************** 2025-03-23 22:00:13.697116 | orchestrator | Sunday 23 March 2025 21:58:30 +0000 (0:00:02.150) 0:00:50.626 ********** 2025-03-23 22:00:13.697130 | orchestrator | changed: [testbed-node-0] 2025-03-23 22:00:13.697144 | orchestrator | 2025-03-23 22:00:13.697158 | orchestrator | PLAY [Restart rabbitmq services] *********************************************** 2025-03-23 22:00:13.697172 | orchestrator | 2025-03-23 22:00:13.697186 | orchestrator | TASK [rabbitmq : Get info on RabbitMQ container] ******************************* 2025-03-23 22:00:13.697200 | orchestrator | Sunday 23 March 2025 21:59:30 +0000 (0:00:59.242) 0:01:49.869 ********** 2025-03-23 22:00:13.697214 | orchestrator | ok: [testbed-node-1] 2025-03-23 22:00:13.697228 | orchestrator | 2025-03-23 22:00:13.697242 | orchestrator | TASK [rabbitmq : Put RabbitMQ node into maintenance mode] ********************** 2025-03-23 22:00:13.697256 | orchestrator | Sunday 23 March 2025 21:59:30 +0000 (0:00:00.747) 0:01:50.616 ********** 2025-03-23 22:00:13.697270 | orchestrator | skipping: [testbed-node-1] 2025-03-23 22:00:13.697284 | orchestrator | 2025-03-23 22:00:13.697298 | orchestrator | TASK [rabbitmq : Restart rabbitmq container] *********************************** 2025-03-23 22:00:13.697312 | orchestrator | Sunday 23 March 2025 21:59:31 +0000 (0:00:00.311) 0:01:50.928 ********** 2025-03-23 22:00:13.697326 | orchestrator | changed: [testbed-node-1] 2025-03-23 22:00:13.697340 | orchestrator | 2025-03-23 22:00:13.697354 | orchestrator | TASK [rabbitmq : Waiting for rabbitmq to start] ******************************** 2025-03-23 22:00:13.697369 | orchestrator | Sunday 23 March 2025 21:59:33 +0000 (0:00:02.478) 0:01:53.407 ********** 2025-03-23 22:00:13.697390 | orchestrator | changed: [testbed-node-1] 2025-03-23 22:00:13.697404 | orchestrator | 2025-03-23 22:00:13.697418 | orchestrator | PLAY [Restart rabbitmq services] *********************************************** 2025-03-23 22:00:13.697433 | orchestrator | 2025-03-23 22:00:13.697447 | orchestrator | TASK [rabbitmq : Get info on RabbitMQ container] ******************************* 2025-03-23 22:00:13.697461 | orchestrator | Sunday 23 March 2025 21:59:48 +0000 (0:00:15.385) 0:02:08.793 ********** 2025-03-23 22:00:13.697475 | orchestrator | ok: [testbed-node-2] 2025-03-23 22:00:13.697489 | orchestrator | 2025-03-23 22:00:13.697503 | orchestrator | TASK [rabbitmq : Put RabbitMQ node into maintenance mode] ********************** 2025-03-23 22:00:13.697517 | orchestrator | Sunday 23 March 2025 21:59:49 +0000 (0:00:00.775) 0:02:09.568 ********** 2025-03-23 22:00:13.697531 | orchestrator | skipping: [testbed-node-2] 2025-03-23 22:00:13.697545 | orchestrator | 2025-03-23 22:00:13.697594 | orchestrator | TASK [rabbitmq : Restart rabbitmq container] *********************************** 2025-03-23 22:00:13.697615 | orchestrator | Sunday 23 March 2025 21:59:50 +0000 (0:00:00.302) 
0:02:09.870 ********** 2025-03-23 22:00:13.697630 | orchestrator | changed: [testbed-node-2] 2025-03-23 22:00:13.697644 | orchestrator | 2025-03-23 22:00:13.697658 | orchestrator | TASK [rabbitmq : Waiting for rabbitmq to start] ******************************** 2025-03-23 22:00:13.697672 | orchestrator | Sunday 23 March 2025 21:59:52 +0000 (0:00:02.338) 0:02:12.209 ********** 2025-03-23 22:00:13.697686 | orchestrator | changed: [testbed-node-2] 2025-03-23 22:00:13.697705 | orchestrator | 2025-03-23 22:00:13.697720 | orchestrator | PLAY [Apply rabbitmq post-configuration] *************************************** 2025-03-23 22:00:13.697734 | orchestrator | 2025-03-23 22:00:13.697747 | orchestrator | TASK [Include rabbitmq post-deploy.yml] **************************************** 2025-03-23 22:00:13.697761 | orchestrator | Sunday 23 March 2025 22:00:08 +0000 (0:00:15.981) 0:02:28.190 ********** 2025-03-23 22:00:13.697776 | orchestrator | included: rabbitmq for testbed-node-0, testbed-node-1, testbed-node-2 2025-03-23 22:00:13.697789 | orchestrator | 2025-03-23 22:00:13.697803 | orchestrator | TASK [rabbitmq : Enable all stable feature flags] ****************************** 2025-03-23 22:00:13.697817 | orchestrator | Sunday 23 March 2025 22:00:10 +0000 (0:00:01.852) 0:02:30.043 ********** 2025-03-23 22:00:13.697831 | orchestrator | [WARNING]: Could not match supplied host pattern, ignoring: 2025-03-23 22:00:13.697845 | orchestrator | enable_outward_rabbitmq_True 2025-03-23 22:00:13.697859 | orchestrator | [WARNING]: Could not match supplied host pattern, ignoring: 2025-03-23 22:00:13.697873 | orchestrator | outward_rabbitmq_restart 2025-03-23 22:00:13.697888 | orchestrator | ok: [testbed-node-0] 2025-03-23 22:00:13.697901 | orchestrator | ok: [testbed-node-1] 2025-03-23 22:00:13.697915 | orchestrator | ok: [testbed-node-2] 2025-03-23 22:00:13.697929 | orchestrator | 2025-03-23 22:00:13.697943 | orchestrator | PLAY [Apply role rabbitmq (outward)] ******************************************* 2025-03-23 22:00:13.697957 | orchestrator | skipping: no hosts matched 2025-03-23 22:00:13.697971 | orchestrator | 2025-03-23 22:00:13.697985 | orchestrator | PLAY [Restart rabbitmq (outward) services] ************************************* 2025-03-23 22:00:13.697999 | orchestrator | skipping: no hosts matched 2025-03-23 22:00:13.698013 | orchestrator | 2025-03-23 22:00:13.698069 | orchestrator | PLAY [Apply rabbitmq (outward) post-configuration] ***************************** 2025-03-23 22:00:13.698084 | orchestrator | skipping: no hosts matched 2025-03-23 22:00:13.698098 | orchestrator | 2025-03-23 22:00:13.698112 | orchestrator | PLAY RECAP ********************************************************************* 2025-03-23 22:00:13.698127 | orchestrator | localhost : ok=3  changed=0 unreachable=0 failed=0 skipped=1  rescued=0 ignored=1  2025-03-23 22:00:13.698142 | orchestrator | testbed-node-0 : ok=23  changed=14  unreachable=0 failed=0 skipped=8  rescued=0 ignored=0 2025-03-23 22:00:13.698157 | orchestrator | testbed-node-1 : ok=21  changed=14  unreachable=0 failed=0 skipped=2  rescued=0 ignored=0 2025-03-23 22:00:13.698171 | orchestrator | testbed-node-2 : ok=21  changed=14  unreachable=0 failed=0 skipped=2  rescued=0 ignored=0 2025-03-23 22:00:13.698194 | orchestrator | 2025-03-23 22:00:13.698208 | orchestrator | 2025-03-23 22:00:13.698222 | orchestrator | TASKS RECAP ******************************************************************** 2025-03-23 22:00:13.698236 | orchestrator | Sunday 23 March 2025 22:00:13 
+0000 (0:00:03.065) 0:02:33.108 ********** 2025-03-23 22:00:13.698250 | orchestrator | =============================================================================== 2025-03-23 22:00:13.698265 | orchestrator | rabbitmq : Waiting for rabbitmq to start ------------------------------- 90.61s 2025-03-23 22:00:13.698279 | orchestrator | rabbitmq : Restart rabbitmq container ----------------------------------- 6.97s 2025-03-23 22:00:13.698293 | orchestrator | rabbitmq : Running RabbitMQ bootstrap container ------------------------- 6.10s 2025-03-23 22:00:13.698307 | orchestrator | rabbitmq : Copying over rabbitmq.conf ----------------------------------- 4.13s 2025-03-23 22:00:13.698321 | orchestrator | Check RabbitMQ service -------------------------------------------------- 3.71s 2025-03-23 22:00:13.698336 | orchestrator | rabbitmq : Copying over rabbitmq-env.conf ------------------------------- 3.65s 2025-03-23 22:00:13.698350 | orchestrator | rabbitmq : Enable all stable feature flags ------------------------------ 3.06s 2025-03-23 22:00:13.698364 | orchestrator | rabbitmq : Get container facts ------------------------------------------ 2.85s 2025-03-23 22:00:13.698378 | orchestrator | rabbitmq : Copying over advanced.config --------------------------------- 2.59s 2025-03-23 22:00:13.698392 | orchestrator | rabbitmq : Copying over definitions.json -------------------------------- 2.51s 2025-03-23 22:00:13.698406 | orchestrator | rabbitmq : Get info on RabbitMQ container ------------------------------- 2.39s 2025-03-23 22:00:13.698419 | orchestrator | rabbitmq : Copying over enabled_plugins --------------------------------- 2.38s 2025-03-23 22:00:13.698434 | orchestrator | rabbitmq : Copying over erl_inetrc -------------------------------------- 2.33s 2025-03-23 22:00:13.698448 | orchestrator | rabbitmq : Copying over config.json files for services ------------------ 2.14s 2025-03-23 22:00:13.698462 | orchestrator | rabbitmq : include_tasks ------------------------------------------------ 2.13s 2025-03-23 22:00:13.698481 | orchestrator | rabbitmq : Check rabbitmq containers ------------------------------------ 2.01s 2025-03-23 22:00:13.698495 | orchestrator | Include rabbitmq post-deploy.yml ---------------------------------------- 1.86s 2025-03-23 22:00:13.698510 | orchestrator | rabbitmq : Ensuring config directories exist ---------------------------- 1.63s 2025-03-23 22:00:13.698524 | orchestrator | rabbitmq : Creating rabbitmq volume ------------------------------------- 1.14s 2025-03-23 22:00:13.698537 | orchestrator | rabbitmq : include_tasks ------------------------------------------------ 1.13s 2025-03-23 22:00:13.698575 | orchestrator | 2025-03-23 22:00:13 | INFO  | Task 6f5ec00a-9f2e-40eb-a1d2-26a43473a853 is in state STARTED 2025-03-23 22:00:16.742245 | orchestrator | 2025-03-23 22:00:13 | INFO  | Task 61ff2397-ccea-429b-b92a-df8186dc3a5d is in state STARTED 2025-03-23 22:00:16.742357 | orchestrator | 2025-03-23 22:00:13 | INFO  | Wait 1 second(s) until the next check 2025-03-23 22:00:16.742392 | orchestrator | 2025-03-23 22:00:16 | INFO  | Task f712b796-38d0-4a95-8f80-410bcda57c10 is in state STARTED 2025-03-23 22:00:16.744730 | orchestrator | 2025-03-23 22:00:16 | INFO  | Task f3cd4eb5-5489-4d5d-9d13-5fb3ddbc8b2b is in state STARTED 2025-03-23 22:00:16.745757 | orchestrator | 2025-03-23 22:00:16 | INFO  | Task 6f5ec00a-9f2e-40eb-a1d2-26a43473a853 is in state STARTED 2025-03-23 22:00:16.745782 | orchestrator | 2025-03-23 22:00:16 | INFO  | Task 61ff2397-ccea-429b-b92a-df8186dc3a5d 
is in state STARTED 2025-03-23 22:00:19.795228 | orchestrator | 2025-03-23 22:00:16 | INFO  | Wait 1 second(s) until the next check 2025-03-23 22:00:19.795390 | orchestrator | 2025-03-23 22:00:19 | INFO  | Task f712b796-38d0-4a95-8f80-410bcda57c10 is in state STARTED 2025-03-23 22:00:19.795578 | orchestrator | 2025-03-23 22:00:19 | INFO  | Task f3cd4eb5-5489-4d5d-9d13-5fb3ddbc8b2b is in state STARTED 2025-03-23 22:00:19.795732 | orchestrator | 2025-03-23 22:00:19 | INFO  | Task 6f5ec00a-9f2e-40eb-a1d2-26a43473a853 is in state STARTED 2025-03-23 22:00:19.796026 | orchestrator | 2025-03-23 22:00:19 | INFO  | Task 61ff2397-ccea-429b-b92a-df8186dc3a5d is in state STARTED 2025-03-23 22:00:22.845935 | orchestrator | 2025-03-23 22:00:19 | INFO  | Wait 1 second(s) until the next check 2025-03-23 22:00:22.846114 | orchestrator | 2025-03-23 22:00:22 | INFO  | Task f712b796-38d0-4a95-8f80-410bcda57c10 is in state STARTED 2025-03-23 22:00:22.848862 | orchestrator | 2025-03-23 22:00:22 | INFO  | Task f3cd4eb5-5489-4d5d-9d13-5fb3ddbc8b2b is in state STARTED 2025-03-23 22:00:22.848897 | orchestrator | 2025-03-23 22:00:22 | INFO  | Task 6f5ec00a-9f2e-40eb-a1d2-26a43473a853 is in state STARTED 2025-03-23 22:00:22.850455 | orchestrator | 2025-03-23 22:00:22 | INFO  | Task 61ff2397-ccea-429b-b92a-df8186dc3a5d is in state STARTED 2025-03-23 22:00:22.850659 | orchestrator | 2025-03-23 22:00:22 | INFO  | Wait 1 second(s) until the next check 2025-03-23 22:00:25.904186 | orchestrator | 2025-03-23 22:00:25 | INFO  | Task f712b796-38d0-4a95-8f80-410bcda57c10 is in state STARTED 2025-03-23 22:00:25.906222 | orchestrator | 2025-03-23 22:00:25 | INFO  | Task f3cd4eb5-5489-4d5d-9d13-5fb3ddbc8b2b is in state STARTED 2025-03-23 22:00:25.908719 | orchestrator | 2025-03-23 22:00:25 | INFO  | Task 6f5ec00a-9f2e-40eb-a1d2-26a43473a853 is in state STARTED 2025-03-23 22:00:25.911364 | orchestrator | 2025-03-23 22:00:25 | INFO  | Task 61ff2397-ccea-429b-b92a-df8186dc3a5d is in state STARTED 2025-03-23 22:00:28.957405 | orchestrator | 2025-03-23 22:00:25 | INFO  | Wait 1 second(s) until the next check 2025-03-23 22:00:28.957606 | orchestrator | 2025-03-23 22:00:28 | INFO  | Task f712b796-38d0-4a95-8f80-410bcda57c10 is in state STARTED 2025-03-23 22:00:28.960109 | orchestrator | 2025-03-23 22:00:28 | INFO  | Task f3cd4eb5-5489-4d5d-9d13-5fb3ddbc8b2b is in state STARTED 2025-03-23 22:00:28.961083 | orchestrator | 2025-03-23 22:00:28 | INFO  | Task 6f5ec00a-9f2e-40eb-a1d2-26a43473a853 is in state STARTED 2025-03-23 22:00:28.961114 | orchestrator | 2025-03-23 22:00:28 | INFO  | Task 61ff2397-ccea-429b-b92a-df8186dc3a5d is in state STARTED 2025-03-23 22:00:32.004387 | orchestrator | 2025-03-23 22:00:28 | INFO  | Wait 1 second(s) until the next check 2025-03-23 22:00:32.004581 | orchestrator | 2025-03-23 22:00:32 | INFO  | Task f712b796-38d0-4a95-8f80-410bcda57c10 is in state STARTED 2025-03-23 22:00:32.005526 | orchestrator | 2025-03-23 22:00:32 | INFO  | Task f3cd4eb5-5489-4d5d-9d13-5fb3ddbc8b2b is in state STARTED 2025-03-23 22:00:32.007260 | orchestrator | 2025-03-23 22:00:32 | INFO  | Task 6f5ec00a-9f2e-40eb-a1d2-26a43473a853 is in state STARTED 2025-03-23 22:00:32.009615 | orchestrator | 2025-03-23 22:00:32 | INFO  | Task 61ff2397-ccea-429b-b92a-df8186dc3a5d is in state STARTED 2025-03-23 22:00:35.062923 | orchestrator | 2025-03-23 22:00:32 | INFO  | Wait 1 second(s) until the next check 2025-03-23 22:00:35.063091 | orchestrator | 2025-03-23 22:00:35 | INFO  | Task f712b796-38d0-4a95-8f80-410bcda57c10 is in state STARTED 
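Annotation: the repeating entries above and below are the manager's wait loop — it re-checks the state of each outstanding task ID and sleeps one second between rounds until the tasks leave the STARTED state. A minimal sketch of such a loop follows; this is not the actual OSISM implementation, and get_task_state() is a hypothetical placeholder for whatever task-state lookup the manager really performs.

    # Minimal sketch of the wait loop reflected in the log entries above (illustrative only).
    import time

    def wait_for_tasks(task_ids, get_task_state, interval=1):
        """Poll the given task IDs until all of them reach a terminal state."""
        pending = set(task_ids)
        while pending:
            for task_id in sorted(pending):
                state = get_task_state(task_id)          # e.g. "STARTED", "SUCCESS", ...
                print(f"Task {task_id} is in state {state}")
                if state not in ("PENDING", "STARTED"):  # anything else treated as terminal here
                    pending.discard(task_id)
            if pending:
                print(f"Wait {interval} second(s) until the next check")
                time.sleep(interval)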
2025-03-23 22:00:35.063718 | orchestrator | 2025-03-23 22:00:35 | INFO  | Task f3cd4eb5-5489-4d5d-9d13-5fb3ddbc8b2b is in state STARTED 2025-03-23 22:00:35.065804 | orchestrator | 2025-03-23 22:00:35 | INFO  | Task 6f5ec00a-9f2e-40eb-a1d2-26a43473a853 is in state STARTED 2025-03-23 22:00:35.066895 | orchestrator | 2025-03-23 22:00:35 | INFO  | Task 61ff2397-ccea-429b-b92a-df8186dc3a5d is in state STARTED 2025-03-23 22:00:35.067325 | orchestrator | 2025-03-23 22:00:35 | INFO  | Wait 1 second(s) until the next check 2025-03-23 22:00:38.114738 | orchestrator | 2025-03-23 22:00:38 | INFO  | Task f712b796-38d0-4a95-8f80-410bcda57c10 is in state STARTED 2025-03-23 22:00:38.115506 | orchestrator | 2025-03-23 22:00:38 | INFO  | Task f3cd4eb5-5489-4d5d-9d13-5fb3ddbc8b2b is in state STARTED 2025-03-23 22:00:38.116611 | orchestrator | 2025-03-23 22:00:38 | INFO  | Task 6f5ec00a-9f2e-40eb-a1d2-26a43473a853 is in state STARTED 2025-03-23 22:00:38.118195 | orchestrator | 2025-03-23 22:00:38 | INFO  | Task 61ff2397-ccea-429b-b92a-df8186dc3a5d is in state STARTED 2025-03-23 22:00:41.162316 | orchestrator | 2025-03-23 22:00:38 | INFO  | Wait 1 second(s) until the next check 2025-03-23 22:00:41.162461 | orchestrator | 2025-03-23 22:00:41 | INFO  | Task f712b796-38d0-4a95-8f80-410bcda57c10 is in state STARTED 2025-03-23 22:00:41.163303 | orchestrator | 2025-03-23 22:00:41 | INFO  | Task f3cd4eb5-5489-4d5d-9d13-5fb3ddbc8b2b is in state STARTED 2025-03-23 22:00:41.166456 | orchestrator | 2025-03-23 22:00:41 | INFO  | Task 6f5ec00a-9f2e-40eb-a1d2-26a43473a853 is in state STARTED 2025-03-23 22:00:41.167765 | orchestrator | 2025-03-23 22:00:41 | INFO  | Task 61ff2397-ccea-429b-b92a-df8186dc3a5d is in state STARTED 2025-03-23 22:00:41.167870 | orchestrator | 2025-03-23 22:00:41 | INFO  | Wait 1 second(s) until the next check 2025-03-23 22:00:44.216999 | orchestrator | 2025-03-23 22:00:44 | INFO  | Task f712b796-38d0-4a95-8f80-410bcda57c10 is in state STARTED 2025-03-23 22:00:44.219067 | orchestrator | 2025-03-23 22:00:44 | INFO  | Task f3cd4eb5-5489-4d5d-9d13-5fb3ddbc8b2b is in state STARTED 2025-03-23 22:00:44.219106 | orchestrator | 2025-03-23 22:00:44 | INFO  | Task 6f5ec00a-9f2e-40eb-a1d2-26a43473a853 is in state STARTED 2025-03-23 22:00:44.219130 | orchestrator | 2025-03-23 22:00:44 | INFO  | Task 61ff2397-ccea-429b-b92a-df8186dc3a5d is in state STARTED 2025-03-23 22:00:47.268313 | orchestrator | 2025-03-23 22:00:44 | INFO  | Wait 1 second(s) until the next check 2025-03-23 22:00:47.268469 | orchestrator | 2025-03-23 22:00:47 | INFO  | Task f712b796-38d0-4a95-8f80-410bcda57c10 is in state STARTED 2025-03-23 22:00:47.269007 | orchestrator | 2025-03-23 22:00:47 | INFO  | Task f3cd4eb5-5489-4d5d-9d13-5fb3ddbc8b2b is in state STARTED 2025-03-23 22:00:47.269040 | orchestrator | 2025-03-23 22:00:47 | INFO  | Task 6f5ec00a-9f2e-40eb-a1d2-26a43473a853 is in state STARTED 2025-03-23 22:00:47.269760 | orchestrator | 2025-03-23 22:00:47 | INFO  | Task 61ff2397-ccea-429b-b92a-df8186dc3a5d is in state STARTED 2025-03-23 22:00:50.327454 | orchestrator | 2025-03-23 22:00:47 | INFO  | Wait 1 second(s) until the next check 2025-03-23 22:00:50.327680 | orchestrator | 2025-03-23 22:00:50 | INFO  | Task f712b796-38d0-4a95-8f80-410bcda57c10 is in state STARTED 2025-03-23 22:00:50.327940 | orchestrator | 2025-03-23 22:00:50 | INFO  | Task f3cd4eb5-5489-4d5d-9d13-5fb3ddbc8b2b is in state STARTED 2025-03-23 22:00:50.329359 | orchestrator | 2025-03-23 22:00:50 | INFO  | Task 6f5ec00a-9f2e-40eb-a1d2-26a43473a853 is in state STARTED 
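Annotation: each polling entry has a fixed shape — executor timestamp, node name, then the manager's own "<timestamp> | INFO | Task <uuid> is in state <STATE>" message — and several entries can share one physical line in this log. A small, self-contained sketch (assuming the console output has been saved to a plain-text file) for extracting when each task was first and last seen in a given state:

    # Sketch for post-processing this console log: record the first and last time each
    # task UUID was reported in a given state. Assumes the log is a plain-text file.
    import re
    from collections import defaultdict
    from datetime import datetime

    ENTRY = re.compile(
        r"(?P<ts>\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2}) \| INFO\s+\| "
        r"Task (?P<uuid>[0-9a-f-]{36}) is in state (?P<state>[A-Z]+)"
    )

    def task_state_spans(path):
        spans = defaultdict(dict)  # uuid -> state -> (first_seen, last_seen)
        with open(path, encoding="utf-8") as fh:
            for line in fh:
                for m in ENTRY.finditer(line):  # several entries may share one physical line
                    ts = datetime.strptime(m["ts"], "%Y-%m-%d %H:%M:%S")
                    first, _ = spans[m["uuid"]].get(m["state"], (ts, ts))
                    spans[m["uuid"]][m["state"]] = (first, ts)
        return spans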
2025-03-23 22:00:50.331066 | orchestrator | 2025-03-23 22:00:50 | INFO  | Task 61ff2397-ccea-429b-b92a-df8186dc3a5d is in state STARTED 2025-03-23 22:00:53.368886 | orchestrator | 2025-03-23 22:00:50 | INFO  | Wait 1 second(s) until the next check 2025-03-23 22:00:53.369052 | orchestrator | 2025-03-23 22:00:53 | INFO  | Task f712b796-38d0-4a95-8f80-410bcda57c10 is in state STARTED 2025-03-23 22:00:53.369605 | orchestrator | 2025-03-23 22:00:53 | INFO  | Task f3cd4eb5-5489-4d5d-9d13-5fb3ddbc8b2b is in state STARTED 2025-03-23 22:00:53.371229 | orchestrator | 2025-03-23 22:00:53 | INFO  | Task 6f5ec00a-9f2e-40eb-a1d2-26a43473a853 is in state STARTED 2025-03-23 22:00:53.372204 | orchestrator | 2025-03-23 22:00:53 | INFO  | Task 61ff2397-ccea-429b-b92a-df8186dc3a5d is in state STARTED 2025-03-23 22:00:56.415322 | orchestrator | 2025-03-23 22:00:53 | INFO  | Wait 1 second(s) until the next check 2025-03-23 22:00:56.415601 | orchestrator | 2025-03-23 22:00:56 | INFO  | Task f712b796-38d0-4a95-8f80-410bcda57c10 is in state STARTED 2025-03-23 22:00:56.415702 | orchestrator | 2025-03-23 22:00:56 | INFO  | Task f3cd4eb5-5489-4d5d-9d13-5fb3ddbc8b2b is in state STARTED 2025-03-23 22:00:56.419801 | orchestrator | 2025-03-23 22:00:56 | INFO  | Task 6f5ec00a-9f2e-40eb-a1d2-26a43473a853 is in state STARTED 2025-03-23 22:00:56.421222 | orchestrator | 2025-03-23 22:00:56 | INFO  | Task 61ff2397-ccea-429b-b92a-df8186dc3a5d is in state STARTED 2025-03-23 22:00:56.421390 | orchestrator | 2025-03-23 22:00:56 | INFO  | Wait 1 second(s) until the next check 2025-03-23 22:00:59.471834 | orchestrator | 2025-03-23 22:00:59 | INFO  | Task f712b796-38d0-4a95-8f80-410bcda57c10 is in state STARTED 2025-03-23 22:00:59.473249 | orchestrator | 2025-03-23 22:00:59 | INFO  | Task f3cd4eb5-5489-4d5d-9d13-5fb3ddbc8b2b is in state STARTED 2025-03-23 22:00:59.475997 | orchestrator | 2025-03-23 22:00:59 | INFO  | Task 6f5ec00a-9f2e-40eb-a1d2-26a43473a853 is in state STARTED 2025-03-23 22:00:59.477677 | orchestrator | 2025-03-23 22:00:59 | INFO  | Task 61ff2397-ccea-429b-b92a-df8186dc3a5d is in state STARTED 2025-03-23 22:00:59.477881 | orchestrator | 2025-03-23 22:00:59 | INFO  | Wait 1 second(s) until the next check 2025-03-23 22:01:02.527538 | orchestrator | 2025-03-23 22:01:02 | INFO  | Task f712b796-38d0-4a95-8f80-410bcda57c10 is in state STARTED 2025-03-23 22:01:02.529024 | orchestrator | 2025-03-23 22:01:02 | INFO  | Task f3cd4eb5-5489-4d5d-9d13-5fb3ddbc8b2b is in state STARTED 2025-03-23 22:01:02.531947 | orchestrator | 2025-03-23 22:01:02 | INFO  | Task 6f5ec00a-9f2e-40eb-a1d2-26a43473a853 is in state STARTED 2025-03-23 22:01:02.533941 | orchestrator | 2025-03-23 22:01:02 | INFO  | Task 61ff2397-ccea-429b-b92a-df8186dc3a5d is in state STARTED 2025-03-23 22:01:02.534202 | orchestrator | 2025-03-23 22:01:02 | INFO  | Wait 1 second(s) until the next check 2025-03-23 22:01:05.586875 | orchestrator | 2025-03-23 22:01:05 | INFO  | Task f712b796-38d0-4a95-8f80-410bcda57c10 is in state STARTED 2025-03-23 22:01:05.587766 | orchestrator | 2025-03-23 22:01:05 | INFO  | Task f3cd4eb5-5489-4d5d-9d13-5fb3ddbc8b2b is in state STARTED 2025-03-23 22:01:05.589811 | orchestrator | 2025-03-23 22:01:05 | INFO  | Task 6f5ec00a-9f2e-40eb-a1d2-26a43473a853 is in state STARTED 2025-03-23 22:01:05.591616 | orchestrator | 2025-03-23 22:01:05 | INFO  | Task 61ff2397-ccea-429b-b92a-df8186dc3a5d is in state STARTED 2025-03-23 22:01:08.632332 | orchestrator | 2025-03-23 22:01:05 | INFO  | Wait 1 second(s) until the next check 2025-03-23 22:01:08.632491 
| orchestrator | 2025-03-23 22:01:08 | INFO  | Task f712b796-38d0-4a95-8f80-410bcda57c10 is in state STARTED 2025-03-23 22:01:08.633838 | orchestrator | 2025-03-23 22:01:08 | INFO  | Task f3cd4eb5-5489-4d5d-9d13-5fb3ddbc8b2b is in state STARTED 2025-03-23 22:01:08.637277 | orchestrator | 2025-03-23 22:01:08 | INFO  | Task 6f5ec00a-9f2e-40eb-a1d2-26a43473a853 is in state STARTED 2025-03-23 22:01:08.639800 | orchestrator | 2025-03-23 22:01:08 | INFO  | Task 61ff2397-ccea-429b-b92a-df8186dc3a5d is in state STARTED 2025-03-23 22:01:08.639916 | orchestrator | 2025-03-23 22:01:08 | INFO  | Wait 1 second(s) until the next check 2025-03-23 22:01:11.695989 | orchestrator | 2025-03-23 22:01:11 | INFO  | Task f712b796-38d0-4a95-8f80-410bcda57c10 is in state STARTED 2025-03-23 22:01:11.697073 | orchestrator | 2025-03-23 22:01:11 | INFO  | Task f3cd4eb5-5489-4d5d-9d13-5fb3ddbc8b2b is in state STARTED 2025-03-23 22:01:11.697822 | orchestrator | 2025-03-23 22:01:11 | INFO  | Task 6f5ec00a-9f2e-40eb-a1d2-26a43473a853 is in state STARTED 2025-03-23 22:01:11.699068 | orchestrator | 2025-03-23 22:01:11 | INFO  | Task 61ff2397-ccea-429b-b92a-df8186dc3a5d is in state STARTED 2025-03-23 22:01:11.699412 | orchestrator | 2025-03-23 22:01:11 | INFO  | Wait 1 second(s) until the next check 2025-03-23 22:01:14.741500 | orchestrator | 2025-03-23 22:01:14 | INFO  | Task f712b796-38d0-4a95-8f80-410bcda57c10 is in state STARTED 2025-03-23 22:01:14.743374 | orchestrator | 2025-03-23 22:01:14 | INFO  | Task f3cd4eb5-5489-4d5d-9d13-5fb3ddbc8b2b is in state STARTED 2025-03-23 22:01:14.745626 | orchestrator | 2025-03-23 22:01:14 | INFO  | Task 6f5ec00a-9f2e-40eb-a1d2-26a43473a853 is in state STARTED 2025-03-23 22:01:14.748272 | orchestrator | 2025-03-23 22:01:14 | INFO  | Task 61ff2397-ccea-429b-b92a-df8186dc3a5d is in state STARTED 2025-03-23 22:01:17.789960 | orchestrator | 2025-03-23 22:01:14 | INFO  | Wait 1 second(s) until the next check 2025-03-23 22:01:17.790194 | orchestrator | 2025-03-23 22:01:17 | INFO  | Task f712b796-38d0-4a95-8f80-410bcda57c10 is in state STARTED 2025-03-23 22:01:17.791861 | orchestrator | 2025-03-23 22:01:17 | INFO  | Task f3cd4eb5-5489-4d5d-9d13-5fb3ddbc8b2b is in state STARTED 2025-03-23 22:01:17.793434 | orchestrator | 2025-03-23 22:01:17 | INFO  | Task 6f5ec00a-9f2e-40eb-a1d2-26a43473a853 is in state STARTED 2025-03-23 22:01:17.795262 | orchestrator | 2025-03-23 22:01:17 | INFO  | Task 61ff2397-ccea-429b-b92a-df8186dc3a5d is in state STARTED 2025-03-23 22:01:20.843281 | orchestrator | 2025-03-23 22:01:17 | INFO  | Wait 1 second(s) until the next check 2025-03-23 22:01:20.843441 | orchestrator | 2025-03-23 22:01:20 | INFO  | Task f712b796-38d0-4a95-8f80-410bcda57c10 is in state STARTED 2025-03-23 22:01:20.845104 | orchestrator | 2025-03-23 22:01:20 | INFO  | Task f3cd4eb5-5489-4d5d-9d13-5fb3ddbc8b2b is in state STARTED 2025-03-23 22:01:20.849722 | orchestrator | 2025-03-23 22:01:20 | INFO  | Task 6f5ec00a-9f2e-40eb-a1d2-26a43473a853 is in state STARTED 2025-03-23 22:01:20.851888 | orchestrator | 2025-03-23 22:01:20 | INFO  | Task 61ff2397-ccea-429b-b92a-df8186dc3a5d is in state STARTED 2025-03-23 22:01:23.901705 | orchestrator | 2025-03-23 22:01:20 | INFO  | Wait 1 second(s) until the next check 2025-03-23 22:01:23.901869 | orchestrator | 2025-03-23 22:01:23 | INFO  | Task f712b796-38d0-4a95-8f80-410bcda57c10 is in state STARTED 2025-03-23 22:01:23.902750 | orchestrator | 2025-03-23 22:01:23 | INFO  | Task f3cd4eb5-5489-4d5d-9d13-5fb3ddbc8b2b is in state STARTED 2025-03-23 22:01:23.902786 | 
orchestrator | 2025-03-23 22:01:23 | INFO  | Task 6f5ec00a-9f2e-40eb-a1d2-26a43473a853 is in state STARTED 2025-03-23 22:01:23.902942 | orchestrator | 2025-03-23 22:01:23 | INFO  | Task 61ff2397-ccea-429b-b92a-df8186dc3a5d is in state STARTED 2025-03-23 22:01:26.941168 | orchestrator | 2025-03-23 22:01:23 | INFO  | Wait 1 second(s) until the next check 2025-03-23 22:01:26.941314 | orchestrator | 2025-03-23 22:01:26 | INFO  | Task f712b796-38d0-4a95-8f80-410bcda57c10 is in state STARTED 2025-03-23 22:01:26.941397 | orchestrator | 2025-03-23 22:01:26 | INFO  | Task f3cd4eb5-5489-4d5d-9d13-5fb3ddbc8b2b is in state STARTED 2025-03-23 22:01:26.942490 | orchestrator | 2025-03-23 22:01:26 | INFO  | Task 6f5ec00a-9f2e-40eb-a1d2-26a43473a853 is in state STARTED 2025-03-23 22:01:26.943133 | orchestrator | 2025-03-23 22:01:26 | INFO  | Task 61ff2397-ccea-429b-b92a-df8186dc3a5d is in state STARTED 2025-03-23 22:01:26.943630 | orchestrator | 2025-03-23 22:01:26 | INFO  | Wait 1 second(s) until the next check 2025-03-23 22:01:29.981580 | orchestrator | 2025-03-23 22:01:29 | INFO  | Task f712b796-38d0-4a95-8f80-410bcda57c10 is in state STARTED 2025-03-23 22:01:29.983280 | orchestrator | 2025-03-23 22:01:29 | INFO  | Task f3cd4eb5-5489-4d5d-9d13-5fb3ddbc8b2b is in state STARTED 2025-03-23 22:01:29.986573 | orchestrator | 2025-03-23 22:01:29 | INFO  | Task 6f5ec00a-9f2e-40eb-a1d2-26a43473a853 is in state STARTED 2025-03-23 22:01:29.991131 | orchestrator | 2025-03-23 22:01:29.991221 | orchestrator | 2025-03-23 22:01:29.991240 | orchestrator | PLAY [Group hosts based on configuration] ************************************** 2025-03-23 22:01:29.991257 | orchestrator | 2025-03-23 22:01:29.991271 | orchestrator | TASK [Group hosts based on Kolla action] *************************************** 2025-03-23 22:01:29.991286 | orchestrator | Sunday 23 March 2025 21:58:45 +0000 (0:00:00.330) 0:00:00.330 ********** 2025-03-23 22:01:29.991300 | orchestrator | ok: [testbed-node-3] 2025-03-23 22:01:29.991315 | orchestrator | ok: [testbed-node-4] 2025-03-23 22:01:29.991329 | orchestrator | ok: [testbed-node-5] 2025-03-23 22:01:29.991343 | orchestrator | ok: [testbed-node-0] 2025-03-23 22:01:29.991357 | orchestrator | ok: [testbed-node-1] 2025-03-23 22:01:29.991371 | orchestrator | ok: [testbed-node-2] 2025-03-23 22:01:29.991385 | orchestrator | 2025-03-23 22:01:29.991400 | orchestrator | TASK [Group hosts based on enabled services] *********************************** 2025-03-23 22:01:29.991414 | orchestrator | Sunday 23 March 2025 21:58:46 +0000 (0:00:01.252) 0:00:01.582 ********** 2025-03-23 22:01:29.991428 | orchestrator | ok: [testbed-node-3] => (item=enable_ovn_True) 2025-03-23 22:01:29.991443 | orchestrator | ok: [testbed-node-4] => (item=enable_ovn_True) 2025-03-23 22:01:29.991457 | orchestrator | ok: [testbed-node-5] => (item=enable_ovn_True) 2025-03-23 22:01:29.991471 | orchestrator | ok: [testbed-node-0] => (item=enable_ovn_True) 2025-03-23 22:01:29.991485 | orchestrator | ok: [testbed-node-1] => (item=enable_ovn_True) 2025-03-23 22:01:29.991499 | orchestrator | ok: [testbed-node-2] => (item=enable_ovn_True) 2025-03-23 22:01:29.991513 | orchestrator | 2025-03-23 22:01:29.991527 | orchestrator | PLAY [Apply role ovn-controller] *********************************************** 2025-03-23 22:01:29.991541 | orchestrator | 2025-03-23 22:01:29.991597 | orchestrator | TASK [ovn-controller : include_tasks] ****************************************** 2025-03-23 22:01:29.991614 | orchestrator | Sunday 23 March 2025 21:58:48 
+0000 (0:00:01.981) 0:00:03.564 ********** 2025-03-23 22:01:29.991629 | orchestrator | included: /ansible/roles/ovn-controller/tasks/deploy.yml for testbed-node-3, testbed-node-4, testbed-node-5, testbed-node-0, testbed-node-1, testbed-node-2 2025-03-23 22:01:29.991645 | orchestrator | 2025-03-23 22:01:29.991660 | orchestrator | TASK [ovn-controller : Ensuring config directories exist] ********************** 2025-03-23 22:01:29.991674 | orchestrator | Sunday 23 March 2025 21:58:50 +0000 (0:00:01.942) 0:00:05.507 ********** 2025-03-23 22:01:29.991689 | orchestrator | changed: [testbed-node-3] => (item={'key': 'ovn-controller', 'value': {'container_name': 'ovn_controller', 'group': 'ovn-controller', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/ovn-controller:24.3.4.20241206', 'volumes': ['/etc/kolla/ovn-controller/:/var/lib/kolla/config_files/:ro', '/run/openvswitch:/run/openvswitch:shared', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-23 22:01:29.991707 | orchestrator | changed: [testbed-node-4] => (item={'key': 'ovn-controller', 'value': {'container_name': 'ovn_controller', 'group': 'ovn-controller', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/ovn-controller:24.3.4.20241206', 'volumes': ['/etc/kolla/ovn-controller/:/var/lib/kolla/config_files/:ro', '/run/openvswitch:/run/openvswitch:shared', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-23 22:01:29.991721 | orchestrator | changed: [testbed-node-5] => (item={'key': 'ovn-controller', 'value': {'container_name': 'ovn_controller', 'group': 'ovn-controller', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/ovn-controller:24.3.4.20241206', 'volumes': ['/etc/kolla/ovn-controller/:/var/lib/kolla/config_files/:ro', '/run/openvswitch:/run/openvswitch:shared', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-23 22:01:29.991758 | orchestrator | changed: [testbed-node-0] => (item={'key': 'ovn-controller', 'value': {'container_name': 'ovn_controller', 'group': 'ovn-controller', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/ovn-controller:24.3.4.20241206', 'volumes': ['/etc/kolla/ovn-controller/:/var/lib/kolla/config_files/:ro', '/run/openvswitch:/run/openvswitch:shared', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-23 22:01:29.991773 | orchestrator | changed: [testbed-node-1] => (item={'key': 'ovn-controller', 'value': {'container_name': 'ovn_controller', 'group': 'ovn-controller', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/ovn-controller:24.3.4.20241206', 'volumes': ['/etc/kolla/ovn-controller/:/var/lib/kolla/config_files/:ro', '/run/openvswitch:/run/openvswitch:shared', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-23 22:01:29.991814 | orchestrator | changed: [testbed-node-2] => (item={'key': 'ovn-controller', 'value': {'container_name': 'ovn_controller', 'group': 'ovn-controller', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/ovn-controller:24.3.4.20241206', 'volumes': ['/etc/kolla/ovn-controller/:/var/lib/kolla/config_files/:ro', '/run/openvswitch:/run/openvswitch:shared', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-23 22:01:29.991830 | orchestrator | 2025-03-23 22:01:29.991844 | orchestrator | TASK [ovn-controller : Copying over config.json files for services] 
************ 2025-03-23 22:01:29.991858 | orchestrator | Sunday 23 March 2025 21:58:52 +0000 (0:00:01.732) 0:00:07.239 ********** 2025-03-23 22:01:29.991878 | orchestrator | changed: [testbed-node-4] => (item={'key': 'ovn-controller', 'value': {'container_name': 'ovn_controller', 'group': 'ovn-controller', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/ovn-controller:24.3.4.20241206', 'volumes': ['/etc/kolla/ovn-controller/:/var/lib/kolla/config_files/:ro', '/run/openvswitch:/run/openvswitch:shared', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-23 22:01:29.991893 | orchestrator | changed: [testbed-node-5] => (item={'key': 'ovn-controller', 'value': {'container_name': 'ovn_controller', 'group': 'ovn-controller', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/ovn-controller:24.3.4.20241206', 'volumes': ['/etc/kolla/ovn-controller/:/var/lib/kolla/config_files/:ro', '/run/openvswitch:/run/openvswitch:shared', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-23 22:01:29.991908 | orchestrator | changed: [testbed-node-3] => (item={'key': 'ovn-controller', 'value': {'container_name': 'ovn_controller', 'group': 'ovn-controller', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/ovn-controller:24.3.4.20241206', 'volumes': ['/etc/kolla/ovn-controller/:/var/lib/kolla/config_files/:ro', '/run/openvswitch:/run/openvswitch:shared', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-23 22:01:29.991922 | orchestrator | changed: [testbed-node-1] => (item={'key': 'ovn-controller', 'value': {'container_name': 'ovn_controller', 'group': 'ovn-controller', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/ovn-controller:24.3.4.20241206', 'volumes': ['/etc/kolla/ovn-controller/:/var/lib/kolla/config_files/:ro', '/run/openvswitch:/run/openvswitch:shared', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-23 22:01:29.991937 | orchestrator | changed: [testbed-node-2] => (item={'key': 'ovn-controller', 'value': {'container_name': 'ovn_controller', 'group': 'ovn-controller', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/ovn-controller:24.3.4.20241206', 'volumes': ['/etc/kolla/ovn-controller/:/var/lib/kolla/config_files/:ro', '/run/openvswitch:/run/openvswitch:shared', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-23 22:01:29.991958 | orchestrator | changed: [testbed-node-0] => (item={'key': 'ovn-controller', 'value': {'container_name': 'ovn_controller', 'group': 'ovn-controller', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/ovn-controller:24.3.4.20241206', 'volumes': ['/etc/kolla/ovn-controller/:/var/lib/kolla/config_files/:ro', '/run/openvswitch:/run/openvswitch:shared', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-23 22:01:29.991973 | orchestrator | 2025-03-23 22:01:29.991987 | orchestrator | TASK [ovn-controller : Ensuring systemd override directory exists] ************* 2025-03-23 22:01:29.992001 | orchestrator | Sunday 23 March 2025 21:58:55 +0000 (0:00:03.115) 0:00:10.355 ********** 2025-03-23 22:01:29.992016 | orchestrator | changed: [testbed-node-3] => (item={'key': 'ovn-controller', 'value': {'container_name': 'ovn_controller', 'group': 'ovn-controller', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/ovn-controller:24.3.4.20241206', 
'volumes': ['/etc/kolla/ovn-controller/:/var/lib/kolla/config_files/:ro', '/run/openvswitch:/run/openvswitch:shared', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-23 22:01:29.992030 | orchestrator | changed: [testbed-node-4] => (item={'key': 'ovn-controller', 'value': {'container_name': 'ovn_controller', 'group': 'ovn-controller', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/ovn-controller:24.3.4.20241206', 'volumes': ['/etc/kolla/ovn-controller/:/var/lib/kolla/config_files/:ro', '/run/openvswitch:/run/openvswitch:shared', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-23 22:01:29.992059 | orchestrator | changed: [testbed-node-5] => (item={'key': 'ovn-controller', 'value': {'container_name': 'ovn_controller', 'group': 'ovn-controller', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/ovn-controller:24.3.4.20241206', 'volumes': ['/etc/kolla/ovn-controller/:/var/lib/kolla/config_files/:ro', '/run/openvswitch:/run/openvswitch:shared', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-23 22:01:29.992075 | orchestrator | changed: [testbed-node-0] => (item={'key': 'ovn-controller', 'value': {'container_name': 'ovn_controller', 'group': 'ovn-controller', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/ovn-controller:24.3.4.20241206', 'volumes': ['/etc/kolla/ovn-controller/:/var/lib/kolla/config_files/:ro', '/run/openvswitch:/run/openvswitch:shared', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-23 22:01:29.992089 | orchestrator | changed: [testbed-node-1] => (item={'key': 'ovn-controller', 'value': {'container_name': 'ovn_controller', 'group': 'ovn-controller', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/ovn-controller:24.3.4.20241206', 'volumes': ['/etc/kolla/ovn-controller/:/var/lib/kolla/config_files/:ro', '/run/openvswitch:/run/openvswitch:shared', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-23 22:01:29.992103 | orchestrator | changed: [testbed-node-2] => (item={'key': 'ovn-controller', 'value': {'container_name': 'ovn_controller', 'group': 'ovn-controller', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/ovn-controller:24.3.4.20241206', 'volumes': ['/etc/kolla/ovn-controller/:/var/lib/kolla/config_files/:ro', '/run/openvswitch:/run/openvswitch:shared', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-23 22:01:29.992118 | orchestrator | 2025-03-23 22:01:29.992132 | orchestrator | TASK [ovn-controller : Copying over systemd override] ************************** 2025-03-23 22:01:29.992146 | orchestrator | Sunday 23 March 2025 21:58:57 +0000 (0:00:01.423) 0:00:11.779 ********** 2025-03-23 22:01:29.992160 | orchestrator | changed: [testbed-node-3] => (item={'key': 'ovn-controller', 'value': {'container_name': 'ovn_controller', 'group': 'ovn-controller', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/ovn-controller:24.3.4.20241206', 'volumes': ['/etc/kolla/ovn-controller/:/var/lib/kolla/config_files/:ro', '/run/openvswitch:/run/openvswitch:shared', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-23 22:01:29.992181 | orchestrator | changed: [testbed-node-4] => (item={'key': 'ovn-controller', 'value': {'container_name': 'ovn_controller', 'group': 'ovn-controller', 'enabled': True, 
'image': 'registry.osism.tech/kolla/release/ovn-controller:24.3.4.20241206', 'volumes': ['/etc/kolla/ovn-controller/:/var/lib/kolla/config_files/:ro', '/run/openvswitch:/run/openvswitch:shared', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-23 22:01:29.992196 | orchestrator | changed: [testbed-node-5] => (item={'key': 'ovn-controller', 'value': {'container_name': 'ovn_controller', 'group': 'ovn-controller', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/ovn-controller:24.3.4.20241206', 'volumes': ['/etc/kolla/ovn-controller/:/var/lib/kolla/config_files/:ro', '/run/openvswitch:/run/openvswitch:shared', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-23 22:01:29.992210 | orchestrator | changed: [testbed-node-0] => (item={'key': 'ovn-controller', 'value': {'container_name': 'ovn_controller', 'group': 'ovn-controller', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/ovn-controller:24.3.4.20241206', 'volumes': ['/etc/kolla/ovn-controller/:/var/lib/kolla/config_files/:ro', '/run/openvswitch:/run/openvswitch:shared', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-23 22:01:29.992224 | orchestrator | changed: [testbed-node-1] => (item={'key': 'ovn-controller', 'value': {'container_name': 'ovn_controller', 'group': 'ovn-controller', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/ovn-controller:24.3.4.20241206', 'volumes': ['/etc/kolla/ovn-controller/:/var/lib/kolla/config_files/:ro', '/run/openvswitch:/run/openvswitch:shared', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-23 22:01:29.992250 | orchestrator | changed: [testbed-node-2] => (item={'key': 'ovn-controller', 'value': {'container_name': 'ovn_controller', 'group': 'ovn-controller', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/ovn-controller:24.3.4.20241206', 'volumes': ['/etc/kolla/ovn-controller/:/var/lib/kolla/config_files/:ro', '/run/openvswitch:/run/openvswitch:shared', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-23 22:01:29.992265 | orchestrator | 2025-03-23 22:01:29.992279 | orchestrator | TASK [ovn-controller : Check ovn-controller containers] ************************ 2025-03-23 22:01:29.992294 | orchestrator | Sunday 23 March 2025 21:58:59 +0000 (0:00:02.274) 0:00:14.053 ********** 2025-03-23 22:01:29.992308 | orchestrator | changed: [testbed-node-4] => (item={'key': 'ovn-controller', 'value': {'container_name': 'ovn_controller', 'group': 'ovn-controller', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/ovn-controller:24.3.4.20241206', 'volumes': ['/etc/kolla/ovn-controller/:/var/lib/kolla/config_files/:ro', '/run/openvswitch:/run/openvswitch:shared', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-23 22:01:29.992322 | orchestrator | changed: [testbed-node-5] => (item={'key': 'ovn-controller', 'value': {'container_name': 'ovn_controller', 'group': 'ovn-controller', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/ovn-controller:24.3.4.20241206', 'volumes': ['/etc/kolla/ovn-controller/:/var/lib/kolla/config_files/:ro', '/run/openvswitch:/run/openvswitch:shared', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-23 22:01:29.992337 | orchestrator | changed: [testbed-node-3] => (item={'key': 'ovn-controller', 'value': 
{'container_name': 'ovn_controller', 'group': 'ovn-controller', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/ovn-controller:24.3.4.20241206', 'volumes': ['/etc/kolla/ovn-controller/:/var/lib/kolla/config_files/:ro', '/run/openvswitch:/run/openvswitch:shared', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-23 22:01:29.992357 | orchestrator | changed: [testbed-node-0] => (item={'key': 'ovn-controller', 'value': {'container_name': 'ovn_controller', 'group': 'ovn-controller', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/ovn-controller:24.3.4.20241206', 'volumes': ['/etc/kolla/ovn-controller/:/var/lib/kolla/config_files/:ro', '/run/openvswitch:/run/openvswitch:shared', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-23 22:01:29.992372 | orchestrator | changed: [testbed-node-1] => (item={'key': 'ovn-controller', 'value': {'container_name': 'ovn_controller', 'group': 'ovn-controller', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/ovn-controller:24.3.4.20241206', 'volumes': ['/etc/kolla/ovn-controller/:/var/lib/kolla/config_files/:ro', '/run/openvswitch:/run/openvswitch:shared', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-23 22:01:29.992386 | orchestrator | changed: [testbed-node-2] => (item={'key': 'ovn-controller', 'value': {'container_name': 'ovn_controller', 'group': 'ovn-controller', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/ovn-controller:24.3.4.20241206', 'volumes': ['/etc/kolla/ovn-controller/:/var/lib/kolla/config_files/:ro', '/run/openvswitch:/run/openvswitch:shared', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-23 22:01:29.992401 | orchestrator | 2025-03-23 22:01:29.992415 | orchestrator | TASK [ovn-controller : Create br-int bridge on OpenvSwitch] ******************** 2025-03-23 22:01:29.992429 | orchestrator | Sunday 23 March 2025 21:59:01 +0000 (0:00:02.500) 0:00:16.553 ********** 2025-03-23 22:01:29.992443 | orchestrator | changed: [testbed-node-4] 2025-03-23 22:01:29.992458 | orchestrator | changed: [testbed-node-3] 2025-03-23 22:01:29.992472 | orchestrator | changed: [testbed-node-5] 2025-03-23 22:01:29.992486 | orchestrator | changed: [testbed-node-1] 2025-03-23 22:01:29.992500 | orchestrator | changed: [testbed-node-0] 2025-03-23 22:01:29.992514 | orchestrator | changed: [testbed-node-2] 2025-03-23 22:01:29.992527 | orchestrator | 2025-03-23 22:01:29.992541 | orchestrator | TASK [ovn-controller : Configure OVN in OVSDB] ********************************* 2025-03-23 22:01:29.992609 | orchestrator | Sunday 23 March 2025 21:59:05 +0000 (0:00:03.853) 0:00:20.407 ********** 2025-03-23 22:01:29.992626 | orchestrator | changed: [testbed-node-3] => (item={'name': 'ovn-encap-ip', 'value': '192.168.16.13'}) 2025-03-23 22:01:29.992640 | orchestrator | changed: [testbed-node-4] => (item={'name': 'ovn-encap-ip', 'value': '192.168.16.14'}) 2025-03-23 22:01:29.992654 | orchestrator | changed: [testbed-node-0] => (item={'name': 'ovn-encap-ip', 'value': '192.168.16.10'}) 2025-03-23 22:01:29.992675 | orchestrator | changed: [testbed-node-5] => (item={'name': 'ovn-encap-ip', 'value': '192.168.16.15'}) 2025-03-23 22:01:29.992690 | orchestrator | changed: [testbed-node-1] => (item={'name': 'ovn-encap-ip', 'value': '192.168.16.11'}) 2025-03-23 22:01:29.992703 | orchestrator | changed: [testbed-node-2] => (item={'name': 'ovn-encap-ip', 'value': 
'192.168.16.12'}) 2025-03-23 22:01:29.992715 | orchestrator | changed: [testbed-node-3] => (item={'name': 'ovn-encap-type', 'value': 'geneve'}) 2025-03-23 22:01:29.992728 | orchestrator | changed: [testbed-node-4] => (item={'name': 'ovn-encap-type', 'value': 'geneve'}) 2025-03-23 22:01:29.992740 | orchestrator | changed: [testbed-node-5] => (item={'name': 'ovn-encap-type', 'value': 'geneve'}) 2025-03-23 22:01:29.992753 | orchestrator | changed: [testbed-node-0] => (item={'name': 'ovn-encap-type', 'value': 'geneve'}) 2025-03-23 22:01:29.992765 | orchestrator | changed: [testbed-node-1] => (item={'name': 'ovn-encap-type', 'value': 'geneve'}) 2025-03-23 22:01:29.992783 | orchestrator | changed: [testbed-node-2] => (item={'name': 'ovn-encap-type', 'value': 'geneve'}) 2025-03-23 22:01:29.992796 | orchestrator | changed: [testbed-node-3] => (item={'name': 'ovn-remote', 'value': 'tcp:192.168.16.10:6642,tcp:192.168.16.11:6642,tcp:192.168.16.12:6642'}) 2025-03-23 22:01:29.992817 | orchestrator | changed: [testbed-node-4] => (item={'name': 'ovn-remote', 'value': 'tcp:192.168.16.10:6642,tcp:192.168.16.11:6642,tcp:192.168.16.12:6642'}) 2025-03-23 22:01:29.992830 | orchestrator | changed: [testbed-node-5] => (item={'name': 'ovn-remote', 'value': 'tcp:192.168.16.10:6642,tcp:192.168.16.11:6642,tcp:192.168.16.12:6642'}) 2025-03-23 22:01:29.992843 | orchestrator | changed: [testbed-node-0] => (item={'name': 'ovn-remote', 'value': 'tcp:192.168.16.10:6642,tcp:192.168.16.11:6642,tcp:192.168.16.12:6642'}) 2025-03-23 22:01:29.992855 | orchestrator | changed: [testbed-node-2] => (item={'name': 'ovn-remote', 'value': 'tcp:192.168.16.10:6642,tcp:192.168.16.11:6642,tcp:192.168.16.12:6642'}) 2025-03-23 22:01:29.992868 | orchestrator | changed: [testbed-node-1] => (item={'name': 'ovn-remote', 'value': 'tcp:192.168.16.10:6642,tcp:192.168.16.11:6642,tcp:192.168.16.12:6642'}) 2025-03-23 22:01:29.992881 | orchestrator | changed: [testbed-node-3] => (item={'name': 'ovn-remote-probe-interval', 'value': '60000'}) 2025-03-23 22:01:29.992894 | orchestrator | changed: [testbed-node-4] => (item={'name': 'ovn-remote-probe-interval', 'value': '60000'}) 2025-03-23 22:01:29.992907 | orchestrator | changed: [testbed-node-5] => (item={'name': 'ovn-remote-probe-interval', 'value': '60000'}) 2025-03-23 22:01:29.992920 | orchestrator | changed: [testbed-node-0] => (item={'name': 'ovn-remote-probe-interval', 'value': '60000'}) 2025-03-23 22:01:29.992932 | orchestrator | changed: [testbed-node-1] => (item={'name': 'ovn-remote-probe-interval', 'value': '60000'}) 2025-03-23 22:01:29.992945 | orchestrator | changed: [testbed-node-2] => (item={'name': 'ovn-remote-probe-interval', 'value': '60000'}) 2025-03-23 22:01:29.992957 | orchestrator | changed: [testbed-node-3] => (item={'name': 'ovn-openflow-probe-interval', 'value': '60'}) 2025-03-23 22:01:29.992969 | orchestrator | changed: [testbed-node-4] => (item={'name': 'ovn-openflow-probe-interval', 'value': '60'}) 2025-03-23 22:01:29.992982 | orchestrator | changed: [testbed-node-5] => (item={'name': 'ovn-openflow-probe-interval', 'value': '60'}) 2025-03-23 22:01:29.992994 | orchestrator | changed: [testbed-node-0] => (item={'name': 'ovn-openflow-probe-interval', 'value': '60'}) 2025-03-23 22:01:29.993007 | orchestrator | changed: [testbed-node-2] => (item={'name': 'ovn-openflow-probe-interval', 'value': '60'}) 2025-03-23 22:01:29.993019 | orchestrator | changed: [testbed-node-1] => (item={'name': 'ovn-openflow-probe-interval', 'value': '60'}) 2025-03-23 22:01:29.993032 | orchestrator | 
changed: [testbed-node-3] => (item={'name': 'ovn-monitor-all', 'value': False}) 2025-03-23 22:01:29.993044 | orchestrator | changed: [testbed-node-4] => (item={'name': 'ovn-monitor-all', 'value': False}) 2025-03-23 22:01:29.993057 | orchestrator | changed: [testbed-node-5] => (item={'name': 'ovn-monitor-all', 'value': False}) 2025-03-23 22:01:29.993070 | orchestrator | changed: [testbed-node-2] => (item={'name': 'ovn-monitor-all', 'value': False}) 2025-03-23 22:01:29.993082 | orchestrator | changed: [testbed-node-1] => (item={'name': 'ovn-monitor-all', 'value': False}) 2025-03-23 22:01:29.993095 | orchestrator | changed: [testbed-node-0] => (item={'name': 'ovn-monitor-all', 'value': False}) 2025-03-23 22:01:29.993107 | orchestrator | ok: [testbed-node-3] => (item={'name': 'ovn-bridge-mappings', 'value': 'physnet1:br-ex', 'state': 'absent'}) 2025-03-23 22:01:29.993120 | orchestrator | ok: [testbed-node-4] => (item={'name': 'ovn-bridge-mappings', 'value': 'physnet1:br-ex', 'state': 'absent'}) 2025-03-23 22:01:29.993133 | orchestrator | ok: [testbed-node-5] => (item={'name': 'ovn-bridge-mappings', 'value': 'physnet1:br-ex', 'state': 'absent'}) 2025-03-23 22:01:29.993146 | orchestrator | changed: [testbed-node-2] => (item={'name': 'ovn-bridge-mappings', 'value': 'physnet1:br-ex', 'state': 'present'}) 2025-03-23 22:01:29.993164 | orchestrator | changed: [testbed-node-1] => (item={'name': 'ovn-bridge-mappings', 'value': 'physnet1:br-ex', 'state': 'present'}) 2025-03-23 22:01:29.993177 | orchestrator | changed: [testbed-node-3] => (item={'name': 'ovn-chassis-mac-mappings', 'value': 'physnet1:52:54:00:89:18:56', 'state': 'present'}) 2025-03-23 22:01:29.993195 | orchestrator | changed: [testbed-node-0] => (item={'name': 'ovn-bridge-mappings', 'value': 'physnet1:br-ex', 'state': 'present'}) 2025-03-23 22:01:29.993208 | orchestrator | changed: [testbed-node-4] => (item={'name': 'ovn-chassis-mac-mappings', 'value': 'physnet1:52:54:00:2f:fa:44', 'state': 'present'}) 2025-03-23 22:01:29.993221 | orchestrator | changed: [testbed-node-5] => (item={'name': 'ovn-chassis-mac-mappings', 'value': 'physnet1:52:54:00:71:3a:c3', 'state': 'present'}) 2025-03-23 22:01:29.993233 | orchestrator | ok: [testbed-node-1] => (item={'name': 'ovn-chassis-mac-mappings', 'value': 'physnet1:52:54:00:33:12:50', 'state': 'absent'}) 2025-03-23 22:01:29.993246 | orchestrator | ok: [testbed-node-2] => (item={'name': 'ovn-chassis-mac-mappings', 'value': 'physnet1:52:54:00:29:4a:9b', 'state': 'absent'}) 2025-03-23 22:01:29.993259 | orchestrator | ok: [testbed-node-3] => (item={'name': 'ovn-cms-options', 'value': '', 'state': 'absent'}) 2025-03-23 22:01:29.993271 | orchestrator | ok: [testbed-node-0] => (item={'name': 'ovn-chassis-mac-mappings', 'value': 'physnet1:52:54:00:52:c1:40', 'state': 'absent'}) 2025-03-23 22:01:29.993288 | orchestrator | ok: [testbed-node-4] => (item={'name': 'ovn-cms-options', 'value': '', 'state': 'absent'}) 2025-03-23 22:01:29.993301 | orchestrator | ok: [testbed-node-5] => (item={'name': 'ovn-cms-options', 'value': '', 'state': 'absent'}) 2025-03-23 22:01:29.993314 | orchestrator | changed: [testbed-node-1] => (item={'name': 'ovn-cms-options', 'value': 'enable-chassis-as-gw,availability-zones=nova', 'state': 'present'}) 2025-03-23 22:01:29.993327 | orchestrator | changed: [testbed-node-2] => (item={'name': 'ovn-cms-options', 'value': 'enable-chassis-as-gw,availability-zones=nova', 'state': 'present'}) 2025-03-23 22:01:29.993339 | orchestrator | changed: [testbed-node-0] => (item={'name': 
'ovn-cms-options', 'value': 'enable-chassis-as-gw,availability-zones=nova', 'state': 'present'}) 2025-03-23 22:01:29.993352 | orchestrator | 2025-03-23 22:01:29.993364 | orchestrator | TASK [ovn-controller : Flush handlers] ***************************************** 2025-03-23 22:01:29.993377 | orchestrator | Sunday 23 March 2025 21:59:27 +0000 (0:00:22.120) 0:00:42.527 ********** 2025-03-23 22:01:29.993389 | orchestrator | 2025-03-23 22:01:29.993402 | orchestrator | TASK [ovn-controller : Flush handlers] ***************************************** 2025-03-23 22:01:29.993414 | orchestrator | Sunday 23 March 2025 21:59:27 +0000 (0:00:00.066) 0:00:42.594 ********** 2025-03-23 22:01:29.993427 | orchestrator | 2025-03-23 22:01:29.993439 | orchestrator | TASK [ovn-controller : Flush handlers] ***************************************** 2025-03-23 22:01:29.993451 | orchestrator | Sunday 23 March 2025 21:59:28 +0000 (0:00:00.269) 0:00:42.863 ********** 2025-03-23 22:01:29.993464 | orchestrator | 2025-03-23 22:01:29.993476 | orchestrator | TASK [ovn-controller : Flush handlers] ***************************************** 2025-03-23 22:01:29.993489 | orchestrator | Sunday 23 March 2025 21:59:28 +0000 (0:00:00.106) 0:00:42.970 ********** 2025-03-23 22:01:29.993501 | orchestrator | 2025-03-23 22:01:29.993514 | orchestrator | TASK [ovn-controller : Flush handlers] ***************************************** 2025-03-23 22:01:29.993526 | orchestrator | Sunday 23 March 2025 21:59:28 +0000 (0:00:00.088) 0:00:43.059 ********** 2025-03-23 22:01:29.993538 | orchestrator | 2025-03-23 22:01:29.993568 | orchestrator | TASK [ovn-controller : Flush handlers] ***************************************** 2025-03-23 22:01:29.993582 | orchestrator | Sunday 23 March 2025 21:59:28 +0000 (0:00:00.141) 0:00:43.200 ********** 2025-03-23 22:01:29.993594 | orchestrator | 2025-03-23 22:01:29.993607 | orchestrator | RUNNING HANDLER [ovn-controller : Reload systemd config] *********************** 2025-03-23 22:01:29.993619 | orchestrator | Sunday 23 March 2025 21:59:29 +0000 (0:00:00.581) 0:00:43.782 ********** 2025-03-23 22:01:29.993632 | orchestrator | ok: [testbed-node-4] 2025-03-23 22:01:29.993644 | orchestrator | ok: [testbed-node-2] 2025-03-23 22:01:29.993656 | orchestrator | ok: [testbed-node-5] 2025-03-23 22:01:29.993669 | orchestrator | ok: [testbed-node-3] 2025-03-23 22:01:29.993681 | orchestrator | ok: [testbed-node-1] 2025-03-23 22:01:29.993699 | orchestrator | ok: [testbed-node-0] 2025-03-23 22:01:29.993712 | orchestrator | 2025-03-23 22:01:29.993724 | orchestrator | RUNNING HANDLER [ovn-controller : Restart ovn-controller container] ************ 2025-03-23 22:01:29.993737 | orchestrator | Sunday 23 March 2025 21:59:31 +0000 (0:00:02.705) 0:00:46.488 ********** 2025-03-23 22:01:29.993749 | orchestrator | changed: [testbed-node-0] 2025-03-23 22:01:29.993761 | orchestrator | changed: [testbed-node-5] 2025-03-23 22:01:29.993774 | orchestrator | changed: [testbed-node-1] 2025-03-23 22:01:29.993786 | orchestrator | changed: [testbed-node-3] 2025-03-23 22:01:29.993798 | orchestrator | changed: [testbed-node-4] 2025-03-23 22:01:29.993811 | orchestrator | changed: [testbed-node-2] 2025-03-23 22:01:29.993823 | orchestrator | 2025-03-23 22:01:29.993835 | orchestrator | PLAY [Apply role ovn-db] ******************************************************* 2025-03-23 22:01:29.993848 | orchestrator | 2025-03-23 22:01:29.993860 | orchestrator | TASK [ovn-db : include_tasks] ************************************************** 2025-03-23 
22:01:29.993872 | orchestrator | Sunday 23 March 2025 21:59:51 +0000 (0:00:19.527) 0:01:06.015 ********** 2025-03-23 22:01:29.993885 | orchestrator | included: /ansible/roles/ovn-db/tasks/deploy.yml for testbed-node-0, testbed-node-1, testbed-node-2 2025-03-23 22:01:29.993898 | orchestrator | 2025-03-23 22:01:29.993910 | orchestrator | TASK [ovn-db : include_tasks] ************************************************** 2025-03-23 22:01:29.993923 | orchestrator | Sunday 23 March 2025 21:59:52 +0000 (0:00:01.062) 0:01:07.077 ********** 2025-03-23 22:01:29.993935 | orchestrator | included: /ansible/roles/ovn-db/tasks/lookup_cluster.yml for testbed-node-0, testbed-node-1, testbed-node-2 2025-03-23 22:01:29.993948 | orchestrator | 2025-03-23 22:01:29.993966 | orchestrator | TASK [ovn-db : Checking for any existing OVN DB container volumes] ************* 2025-03-23 22:01:29.993979 | orchestrator | Sunday 23 March 2025 21:59:53 +0000 (0:00:01.035) 0:01:08.112 ********** 2025-03-23 22:01:29.993991 | orchestrator | ok: [testbed-node-1] 2025-03-23 22:01:29.994004 | orchestrator | ok: [testbed-node-0] 2025-03-23 22:01:29.994079 | orchestrator | ok: [testbed-node-2] 2025-03-23 22:01:29.994096 | orchestrator | 2025-03-23 22:01:29.994109 | orchestrator | TASK [ovn-db : Divide hosts by their OVN NB volume availability] *************** 2025-03-23 22:01:29.994126 | orchestrator | Sunday 23 March 2025 21:59:54 +0000 (0:00:01.146) 0:01:09.259 ********** 2025-03-23 22:01:29.994139 | orchestrator | ok: [testbed-node-0] 2025-03-23 22:01:29.994151 | orchestrator | ok: [testbed-node-1] 2025-03-23 22:01:29.994164 | orchestrator | ok: [testbed-node-2] 2025-03-23 22:01:29.994176 | orchestrator | 2025-03-23 22:01:29.994189 | orchestrator | TASK [ovn-db : Divide hosts by their OVN SB volume availability] *************** 2025-03-23 22:01:29.994201 | orchestrator | Sunday 23 March 2025 21:59:55 +0000 (0:00:00.519) 0:01:09.778 ********** 2025-03-23 22:01:29.994214 | orchestrator | ok: [testbed-node-0] 2025-03-23 22:01:29.994226 | orchestrator | ok: [testbed-node-1] 2025-03-23 22:01:29.994238 | orchestrator | ok: [testbed-node-2] 2025-03-23 22:01:29.994250 | orchestrator | 2025-03-23 22:01:29.994263 | orchestrator | TASK [ovn-db : Establish whether the OVN NB cluster has already existed] ******* 2025-03-23 22:01:29.994275 | orchestrator | Sunday 23 March 2025 21:59:55 +0000 (0:00:00.566) 0:01:10.345 ********** 2025-03-23 22:01:29.994287 | orchestrator | ok: [testbed-node-0] 2025-03-23 22:01:29.994299 | orchestrator | ok: [testbed-node-1] 2025-03-23 22:01:29.994312 | orchestrator | ok: [testbed-node-2] 2025-03-23 22:01:29.994324 | orchestrator | 2025-03-23 22:01:29.994336 | orchestrator | TASK [ovn-db : Establish whether the OVN SB cluster has already existed] ******* 2025-03-23 22:01:29.994349 | orchestrator | Sunday 23 March 2025 21:59:56 +0000 (0:00:00.470) 0:01:10.816 ********** 2025-03-23 22:01:29.994361 | orchestrator | ok: [testbed-node-0] 2025-03-23 22:01:29.994373 | orchestrator | ok: [testbed-node-1] 2025-03-23 22:01:29.994386 | orchestrator | ok: [testbed-node-2] 2025-03-23 22:01:29.994398 | orchestrator | 2025-03-23 22:01:29.994410 | orchestrator | TASK [ovn-db : Check if running on all OVN NB DB hosts] ************************ 2025-03-23 22:01:29.994423 | orchestrator | Sunday 23 March 2025 21:59:56 +0000 (0:00:00.401) 0:01:11.217 ********** 2025-03-23 22:01:29.994435 | orchestrator | skipping: [testbed-node-0] 2025-03-23 22:01:29.994456 | orchestrator | skipping: [testbed-node-1] 2025-03-23 22:01:29.994469 | 
orchestrator | skipping: [testbed-node-2] 2025-03-23 22:01:29.994482 | orchestrator | 2025-03-23 22:01:29.994494 | orchestrator | TASK [ovn-db : Check OVN NB service port liveness] ***************************** 2025-03-23 22:01:29.994507 | orchestrator | Sunday 23 March 2025 21:59:57 +0000 (0:00:00.537) 0:01:11.755 ********** 2025-03-23 22:01:29.994519 | orchestrator | skipping: [testbed-node-0] 2025-03-23 22:01:29.994585 | orchestrator | skipping: [testbed-node-1] 2025-03-23 22:01:29.994601 | orchestrator | skipping: [testbed-node-2] 2025-03-23 22:01:29.994614 | orchestrator | 2025-03-23 22:01:29.994627 | orchestrator | TASK [ovn-db : Divide hosts by their OVN NB service port liveness] ************* 2025-03-23 22:01:29.994639 | orchestrator | Sunday 23 March 2025 21:59:57 +0000 (0:00:00.578) 0:01:12.334 ********** 2025-03-23 22:01:29.994652 | orchestrator | skipping: [testbed-node-0] 2025-03-23 22:01:29.994664 | orchestrator | skipping: [testbed-node-1] 2025-03-23 22:01:29.994677 | orchestrator | skipping: [testbed-node-2] 2025-03-23 22:01:29.994689 | orchestrator | 2025-03-23 22:01:29.994702 | orchestrator | TASK [ovn-db : Get OVN NB database information] ******************************** 2025-03-23 22:01:29.994714 | orchestrator | Sunday 23 March 2025 21:59:58 +0000 (0:00:00.475) 0:01:12.810 ********** 2025-03-23 22:01:29.994726 | orchestrator | skipping: [testbed-node-0] 2025-03-23 22:01:29.994739 | orchestrator | skipping: [testbed-node-1] 2025-03-23 22:01:29.994751 | orchestrator | skipping: [testbed-node-2] 2025-03-23 22:01:29.994764 | orchestrator | 2025-03-23 22:01:29.994776 | orchestrator | TASK [ovn-db : Divide hosts by their OVN NB leader/follower role] ************** 2025-03-23 22:01:29.994789 | orchestrator | Sunday 23 March 2025 21:59:58 +0000 (0:00:00.325) 0:01:13.135 ********** 2025-03-23 22:01:29.994801 | orchestrator | skipping: [testbed-node-0] 2025-03-23 22:01:29.994814 | orchestrator | skipping: [testbed-node-1] 2025-03-23 22:01:29.994826 | orchestrator | skipping: [testbed-node-2] 2025-03-23 22:01:29.994838 | orchestrator | 2025-03-23 22:01:29.994851 | orchestrator | TASK [ovn-db : Fail on existing OVN NB cluster with no leader] ***************** 2025-03-23 22:01:29.994863 | orchestrator | Sunday 23 March 2025 21:59:58 +0000 (0:00:00.553) 0:01:13.688 ********** 2025-03-23 22:01:29.994875 | orchestrator | skipping: [testbed-node-0] 2025-03-23 22:01:29.994888 | orchestrator | skipping: [testbed-node-1] 2025-03-23 22:01:29.994900 | orchestrator | skipping: [testbed-node-2] 2025-03-23 22:01:29.994912 | orchestrator | 2025-03-23 22:01:29.994925 | orchestrator | TASK [ovn-db : Check if running on all OVN SB DB hosts] ************************ 2025-03-23 22:01:29.994938 | orchestrator | Sunday 23 March 2025 22:00:00 +0000 (0:00:01.196) 0:01:14.885 ********** 2025-03-23 22:01:29.994950 | orchestrator | skipping: [testbed-node-0] 2025-03-23 22:01:29.994963 | orchestrator | skipping: [testbed-node-1] 2025-03-23 22:01:29.994975 | orchestrator | skipping: [testbed-node-2] 2025-03-23 22:01:29.994987 | orchestrator | 2025-03-23 22:01:29.995000 | orchestrator | TASK [ovn-db : Check OVN SB service port liveness] ***************************** 2025-03-23 22:01:29.995012 | orchestrator | Sunday 23 March 2025 22:00:00 +0000 (0:00:00.834) 0:01:15.719 ********** 2025-03-23 22:01:29.995024 | orchestrator | skipping: [testbed-node-0] 2025-03-23 22:01:29.995037 | orchestrator | skipping: [testbed-node-1] 2025-03-23 22:01:29.995049 | orchestrator | skipping: [testbed-node-2] 2025-03-23 
22:01:29.995061 | orchestrator | 2025-03-23 22:01:29.995074 | orchestrator | TASK [ovn-db : Divide hosts by their OVN SB service port liveness] ************* 2025-03-23 22:01:29.995086 | orchestrator | Sunday 23 March 2025 22:00:01 +0000 (0:00:00.554) 0:01:16.274 ********** 2025-03-23 22:01:29.995099 | orchestrator | skipping: [testbed-node-0] 2025-03-23 22:01:29.995111 | orchestrator | skipping: [testbed-node-1] 2025-03-23 22:01:29.995124 | orchestrator | skipping: [testbed-node-2] 2025-03-23 22:01:29.995136 | orchestrator | 2025-03-23 22:01:29.995148 | orchestrator | TASK [ovn-db : Get OVN SB database information] ******************************** 2025-03-23 22:01:29.995161 | orchestrator | Sunday 23 March 2025 22:00:02 +0000 (0:00:00.951) 0:01:17.225 ********** 2025-03-23 22:01:29.995173 | orchestrator | skipping: [testbed-node-0] 2025-03-23 22:01:29.995192 | orchestrator | skipping: [testbed-node-1] 2025-03-23 22:01:29.995205 | orchestrator | skipping: [testbed-node-2] 2025-03-23 22:01:29.995217 | orchestrator | 2025-03-23 22:01:29.995237 | orchestrator | TASK [ovn-db : Divide hosts by their OVN SB leader/follower role] ************** 2025-03-23 22:01:29.995250 | orchestrator | Sunday 23 March 2025 22:00:03 +0000 (0:00:00.574) 0:01:17.800 ********** 2025-03-23 22:01:29.995262 | orchestrator | skipping: [testbed-node-0] 2025-03-23 22:01:29.995275 | orchestrator | skipping: [testbed-node-1] 2025-03-23 22:01:29.995287 | orchestrator | skipping: [testbed-node-2] 2025-03-23 22:01:29.995299 | orchestrator | 2025-03-23 22:01:29.995312 | orchestrator | TASK [ovn-db : Fail on existing OVN SB cluster with no leader] ***************** 2025-03-23 22:01:29.995324 | orchestrator | Sunday 23 March 2025 22:00:03 +0000 (0:00:00.860) 0:01:18.660 ********** 2025-03-23 22:01:29.995337 | orchestrator | skipping: [testbed-node-0] 2025-03-23 22:01:29.995349 | orchestrator | skipping: [testbed-node-1] 2025-03-23 22:01:29.995361 | orchestrator | skipping: [testbed-node-2] 2025-03-23 22:01:29.995374 | orchestrator | 2025-03-23 22:01:29.995386 | orchestrator | TASK [ovn-db : include_tasks] ************************************************** 2025-03-23 22:01:29.995402 | orchestrator | Sunday 23 March 2025 22:00:04 +0000 (0:00:00.711) 0:01:19.372 ********** 2025-03-23 22:01:29.995419 | orchestrator | included: /ansible/roles/ovn-db/tasks/bootstrap-initial.yml for testbed-node-0, testbed-node-1, testbed-node-2 2025-03-23 22:01:29.995432 | orchestrator | 2025-03-23 22:01:29.995444 | orchestrator | TASK [ovn-db : Set bootstrap args fact for NB (new cluster)] ******************* 2025-03-23 22:01:29.995456 | orchestrator | Sunday 23 March 2025 22:00:06 +0000 (0:00:01.437) 0:01:20.810 ********** 2025-03-23 22:01:29.995469 | orchestrator | ok: [testbed-node-0] 2025-03-23 22:01:29.995481 | orchestrator | ok: [testbed-node-1] 2025-03-23 22:01:29.995494 | orchestrator | ok: [testbed-node-2] 2025-03-23 22:01:29.995506 | orchestrator | 2025-03-23 22:01:29.995519 | orchestrator | TASK [ovn-db : Set bootstrap args fact for SB (new cluster)] ******************* 2025-03-23 22:01:29.995531 | orchestrator | Sunday 23 March 2025 22:00:06 +0000 (0:00:00.659) 0:01:21.469 ********** 2025-03-23 22:01:29.995544 | orchestrator | ok: [testbed-node-0] 2025-03-23 22:01:29.995604 | orchestrator | ok: [testbed-node-1] 2025-03-23 22:01:29.995618 | orchestrator | ok: [testbed-node-2] 2025-03-23 22:01:29.995631 | orchestrator | 2025-03-23 22:01:29.995643 | orchestrator | TASK [ovn-db : Check NB cluster status] 
**************************************** 2025-03-23 22:01:29.995656 | orchestrator | Sunday 23 March 2025 22:00:07 +0000 (0:00:00.832) 0:01:22.302 ********** 2025-03-23 22:01:29.995668 | orchestrator | skipping: [testbed-node-0] 2025-03-23 22:01:29.995681 | orchestrator | skipping: [testbed-node-1] 2025-03-23 22:01:29.995693 | orchestrator | skipping: [testbed-node-2] 2025-03-23 22:01:29.995706 | orchestrator | 2025-03-23 22:01:29.995718 | orchestrator | TASK [ovn-db : Check SB cluster status] **************************************** 2025-03-23 22:01:29.995731 | orchestrator | Sunday 23 March 2025 22:00:08 +0000 (0:00:01.281) 0:01:23.584 ********** 2025-03-23 22:01:29.995743 | orchestrator | skipping: [testbed-node-0] 2025-03-23 22:01:29.995755 | orchestrator | skipping: [testbed-node-1] 2025-03-23 22:01:29.995768 | orchestrator | skipping: [testbed-node-2] 2025-03-23 22:01:29.995780 | orchestrator | 2025-03-23 22:01:29.995793 | orchestrator | TASK [ovn-db : Remove an old node with the same ip address as the new node in NB DB] *** 2025-03-23 22:01:29.995805 | orchestrator | Sunday 23 March 2025 22:00:10 +0000 (0:00:01.638) 0:01:25.223 ********** 2025-03-23 22:01:29.995818 | orchestrator | skipping: [testbed-node-0] 2025-03-23 22:01:29.995830 | orchestrator | skipping: [testbed-node-1] 2025-03-23 22:01:29.995843 | orchestrator | skipping: [testbed-node-2] 2025-03-23 22:01:29.995855 | orchestrator | 2025-03-23 22:01:29.995867 | orchestrator | TASK [ovn-db : Remove an old node with the same ip address as the new node in SB DB] *** 2025-03-23 22:01:29.995880 | orchestrator | Sunday 23 March 2025 22:00:11 +0000 (0:00:00.651) 0:01:25.874 ********** 2025-03-23 22:01:29.995892 | orchestrator | skipping: [testbed-node-0] 2025-03-23 22:01:29.995905 | orchestrator | skipping: [testbed-node-1] 2025-03-23 22:01:29.995917 | orchestrator | skipping: [testbed-node-2] 2025-03-23 22:01:29.995940 | orchestrator | 2025-03-23 22:01:29.995952 | orchestrator | TASK [ovn-db : Set bootstrap args fact for NB (new member)] ******************** 2025-03-23 22:01:29.995965 | orchestrator | Sunday 23 March 2025 22:00:12 +0000 (0:00:01.003) 0:01:26.878 ********** 2025-03-23 22:01:29.995977 | orchestrator | skipping: [testbed-node-0] 2025-03-23 22:01:29.995994 | orchestrator | skipping: [testbed-node-1] 2025-03-23 22:01:29.996007 | orchestrator | skipping: [testbed-node-2] 2025-03-23 22:01:29.996020 | orchestrator | 2025-03-23 22:01:29.996033 | orchestrator | TASK [ovn-db : Set bootstrap args fact for SB (new member)] ******************** 2025-03-23 22:01:29.996045 | orchestrator | Sunday 23 March 2025 22:00:12 +0000 (0:00:00.643) 0:01:27.521 ********** 2025-03-23 22:01:29.996057 | orchestrator | skipping: [testbed-node-0] 2025-03-23 22:01:29.996070 | orchestrator | skipping: [testbed-node-1] 2025-03-23 22:01:29.996082 | orchestrator | skipping: [testbed-node-2] 2025-03-23 22:01:29.996095 | orchestrator | 2025-03-23 22:01:29.996107 | orchestrator | TASK [ovn-db : Ensuring config directories exist] ****************************** 2025-03-23 22:01:29.996119 | orchestrator | Sunday 23 March 2025 22:00:13 +0000 (0:00:00.680) 0:01:28.202 ********** 2025-03-23 22:01:29.996129 | orchestrator | changed: [testbed-node-0] => (item={'key': 'ovn-northd', 'value': {'container_name': 'ovn_northd', 'group': 'ovn-northd', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/ovn-northd:24.3.4.20241206', 'volumes': ['/etc/kolla/ovn-northd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 
'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-23 22:01:29.996142 | orchestrator | changed: [testbed-node-1] => (item={'key': 'ovn-northd', 'value': {'container_name': 'ovn_northd', 'group': 'ovn-northd', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/ovn-northd:24.3.4.20241206', 'volumes': ['/etc/kolla/ovn-northd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-23 22:01:29.996157 | orchestrator | changed: [testbed-node-2] => (item={'key': 'ovn-northd', 'value': {'container_name': 'ovn_northd', 'group': 'ovn-northd', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/ovn-northd:24.3.4.20241206', 'volumes': ['/etc/kolla/ovn-northd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-23 22:01:29.996170 | orchestrator | changed: [testbed-node-0] => (item={'key': 'ovn-nb-db', 'value': {'container_name': 'ovn_nb_db', 'group': 'ovn-nb-db', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/ovn-nb-db-server:24.3.4.20241206', 'volumes': ['/etc/kolla/ovn-nb-db/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 'ovn_nb_db:/var/lib/openvswitch/ovn-nb/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-23 22:01:29.996184 | orchestrator | changed: [testbed-node-2] => (item={'key': 'ovn-nb-db', 'value': {'container_name': 'ovn_nb_db', 'group': 'ovn-nb-db', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/ovn-nb-db-server:24.3.4.20241206', 'volumes': ['/etc/kolla/ovn-nb-db/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 'ovn_nb_db:/var/lib/openvswitch/ovn-nb/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-23 22:01:29.996195 | orchestrator | changed: [testbed-node-1] => (item={'key': 'ovn-nb-db', 'value': {'container_name': 'ovn_nb_db', 'group': 'ovn-nb-db', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/ovn-nb-db-server:24.3.4.20241206', 'volumes': ['/etc/kolla/ovn-nb-db/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 'ovn_nb_db:/var/lib/openvswitch/ovn-nb/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-23 22:01:29.996205 | orchestrator | changed: [testbed-node-2] => (item={'key': 'ovn-sb-db', 'value': {'container_name': 'ovn_sb_db', 'group': 'ovn-sb-db', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/ovn-sb-db-server:24.3.4.20241206', 'volumes': ['/etc/kolla/ovn-sb-db/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 'ovn_sb_db:/var/lib/openvswitch/ovn-sb/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-23 22:01:29.996221 | orchestrator | changed: [testbed-node-0] => (item={'key': 'ovn-sb-db', 'value': {'container_name': 'ovn_sb_db', 'group': 'ovn-sb-db', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/ovn-sb-db-server:24.3.4.20241206', 'volumes': ['/etc/kolla/ovn-sb-db/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 'ovn_sb_db:/var/lib/openvswitch/ovn-sb/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-23 22:01:29.996231 | orchestrator | changed: [testbed-node-1] => (item={'key': 'ovn-sb-db', 'value': {'container_name': 'ovn_sb_db', 'group': 'ovn-sb-db', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/ovn-sb-db-server:24.3.4.20241206', 'volumes': ['/etc/kolla/ovn-sb-db/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 
'ovn_sb_db:/var/lib/openvswitch/ovn-sb/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-23 22:01:29.996242 | orchestrator | 2025-03-23 22:01:29.996252 | orchestrator | TASK [ovn-db : Copying over config.json files for services] ******************** 2025-03-23 22:01:29.996263 | orchestrator | Sunday 23 March 2025 22:00:15 +0000 (0:00:01.776) 0:01:29.979 ********** 2025-03-23 22:01:29.996273 | orchestrator | changed: [testbed-node-0] => (item={'key': 'ovn-northd', 'value': {'container_name': 'ovn_northd', 'group': 'ovn-northd', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/ovn-northd:24.3.4.20241206', 'volumes': ['/etc/kolla/ovn-northd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-23 22:01:29.996283 | orchestrator | changed: [testbed-node-1] => (item={'key': 'ovn-northd', 'value': {'container_name': 'ovn_northd', 'group': 'ovn-northd', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/ovn-northd:24.3.4.20241206', 'volumes': ['/etc/kolla/ovn-northd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-23 22:01:29.996294 | orchestrator | changed: [testbed-node-2] => (item={'key': 'ovn-northd', 'value': {'container_name': 'ovn_northd', 'group': 'ovn-northd', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/ovn-northd:24.3.4.20241206', 'volumes': ['/etc/kolla/ovn-northd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-23 22:01:29.996308 | orchestrator | changed: [testbed-node-0] => (item={'key': 'ovn-nb-db', 'value': {'container_name': 'ovn_nb_db', 'group': 'ovn-nb-db', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/ovn-nb-db-server:24.3.4.20241206', 'volumes': ['/etc/kolla/ovn-nb-db/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 'ovn_nb_db:/var/lib/openvswitch/ovn-nb/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-23 22:01:29.996323 | orchestrator | changed: [testbed-node-1] => (item={'key': 'ovn-nb-db', 'value': {'container_name': 'ovn_nb_db', 'group': 'ovn-nb-db', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/ovn-nb-db-server:24.3.4.20241206', 'volumes': ['/etc/kolla/ovn-nb-db/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 'ovn_nb_db:/var/lib/openvswitch/ovn-nb/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-23 22:01:29.996334 | orchestrator | changed: [testbed-node-2] => (item={'key': 'ovn-nb-db', 'value': {'container_name': 'ovn_nb_db', 'group': 'ovn-nb-db', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/ovn-nb-db-server:24.3.4.20241206', 'volumes': ['/etc/kolla/ovn-nb-db/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 'ovn_nb_db:/var/lib/openvswitch/ovn-nb/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-23 22:01:29.996344 | orchestrator | changed: [testbed-node-0] => (item={'key': 'ovn-sb-db', 'value': {'container_name': 'ovn_sb_db', 'group': 'ovn-sb-db', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/ovn-sb-db-server:24.3.4.20241206', 'volumes': ['/etc/kolla/ovn-sb-db/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 'ovn_sb_db:/var/lib/openvswitch/ovn-sb/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-23 22:01:29.996359 | orchestrator | changed: [testbed-node-2] => (item={'key': 'ovn-sb-db', 'value': 
{'container_name': 'ovn_sb_db', 'group': 'ovn-sb-db', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/ovn-sb-db-server:24.3.4.20241206', 'volumes': ['/etc/kolla/ovn-sb-db/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 'ovn_sb_db:/var/lib/openvswitch/ovn-sb/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-23 22:01:29.996370 | orchestrator | changed: [testbed-node-1] => (item={'key': 'ovn-sb-db', 'value': {'container_name': 'ovn_sb_db', 'group': 'ovn-sb-db', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/ovn-sb-db-server:24.3.4.20241206', 'volumes': ['/etc/kolla/ovn-sb-db/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 'ovn_sb_db:/var/lib/openvswitch/ovn-sb/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-23 22:01:29.996380 | orchestrator | 2025-03-23 22:01:29.996390 | orchestrator | TASK [ovn-db : Check ovn containers] ******************************************* 2025-03-23 22:01:29.996401 | orchestrator | Sunday 23 March 2025 22:00:21 +0000 (0:00:05.932) 0:01:35.912 ********** 2025-03-23 22:01:29.996411 | orchestrator | changed: [testbed-node-0] => (item={'key': 'ovn-northd', 'value': {'container_name': 'ovn_northd', 'group': 'ovn-northd', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/ovn-northd:24.3.4.20241206', 'volumes': ['/etc/kolla/ovn-northd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-23 22:01:29.996425 | orchestrator | changed: [testbed-node-1] => (item={'key': 'ovn-northd', 'value': {'container_name': 'ovn_northd', 'group': 'ovn-northd', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/ovn-northd:24.3.4.20241206', 'volumes': ['/etc/kolla/ovn-northd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-23 22:01:29.996435 | orchestrator | changed: [testbed-node-2] => (item={'key': 'ovn-northd', 'value': {'container_name': 'ovn_northd', 'group': 'ovn-northd', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/ovn-northd:24.3.4.20241206', 'volumes': ['/etc/kolla/ovn-northd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-23 22:01:29.996451 | orchestrator | changed: [testbed-node-1] => (item={'key': 'ovn-nb-db', 'value': {'container_name': 'ovn_nb_db', 'group': 'ovn-nb-db', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/ovn-nb-db-server:24.3.4.20241206', 'volumes': ['/etc/kolla/ovn-nb-db/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 'ovn_nb_db:/var/lib/openvswitch/ovn-nb/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-23 22:01:29.996462 | orchestrator | changed: [testbed-node-0] => (item={'key': 'ovn-nb-db', 'value': {'container_name': 'ovn_nb_db', 'group': 'ovn-nb-db', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/ovn-nb-db-server:24.3.4.20241206', 'volumes': ['/etc/kolla/ovn-nb-db/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 'ovn_nb_db:/var/lib/openvswitch/ovn-nb/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-23 22:01:29.996472 | orchestrator | changed: [testbed-node-2] => (item={'key': 'ovn-nb-db', 'value': {'container_name': 'ovn_nb_db', 'group': 'ovn-nb-db', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/ovn-nb-db-server:24.3.4.20241206', 'volumes': 
['/etc/kolla/ovn-nb-db/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 'ovn_nb_db:/var/lib/openvswitch/ovn-nb/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-23 22:01:29.996483 | orchestrator | changed: [testbed-node-1] => (item={'key': 'ovn-sb-db', 'value': {'container_name': 'ovn_sb_db', 'group': 'ovn-sb-db', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/ovn-sb-db-server:24.3.4.20241206', 'volumes': ['/etc/kolla/ovn-sb-db/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 'ovn_sb_db:/var/lib/openvswitch/ovn-sb/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-23 22:01:29.996498 | orchestrator | changed: [testbed-node-0] => (item={'key': 'ovn-sb-db', 'value': {'container_name': 'ovn_sb_db', 'group': 'ovn-sb-db', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/ovn-sb-db-server:24.3.4.20241206', 'volumes': ['/etc/kolla/ovn-sb-db/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 'ovn_sb_db:/var/lib/openvswitch/ovn-sb/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-23 22:01:29.996513 | orchestrator | changed: [testbed-node-2] => (item={'key': 'ovn-sb-db', 'value': {'container_name': 'ovn_sb_db', 'group': 'ovn-sb-db', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/ovn-sb-db-server:24.3.4.20241206', 'volumes': ['/etc/kolla/ovn-sb-db/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 'ovn_sb_db:/var/lib/openvswitch/ovn-sb/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-23 22:01:29.996523 | orchestrator | 2025-03-23 22:01:29.996533 | orchestrator | TASK [ovn-db : Flush handlers] ************************************************* 2025-03-23 22:01:29.996544 | orchestrator | Sunday 23 March 2025 22:00:23 +0000 (0:00:02.746) 0:01:38.659 ********** 2025-03-23 22:01:29.996567 | orchestrator | 2025-03-23 22:01:29.996578 | orchestrator | TASK [ovn-db : Flush handlers] ************************************************* 2025-03-23 22:01:29.996589 | orchestrator | Sunday 23 March 2025 22:00:23 +0000 (0:00:00.074) 0:01:38.733 ********** 2025-03-23 22:01:29.996599 | orchestrator | 2025-03-23 22:01:29.996609 | orchestrator | TASK [ovn-db : Flush handlers] ************************************************* 2025-03-23 22:01:29.996619 | orchestrator | Sunday 23 March 2025 22:00:24 +0000 (0:00:00.069) 0:01:38.803 ********** 2025-03-23 22:01:29.996630 | orchestrator | 2025-03-23 22:01:29.996640 | orchestrator | RUNNING HANDLER [ovn-db : Restart ovn-nb-db container] ************************* 2025-03-23 22:01:29.996650 | orchestrator | Sunday 23 March 2025 22:00:24 +0000 (0:00:00.427) 0:01:39.231 ********** 2025-03-23 22:01:29.996660 | orchestrator | changed: [testbed-node-2] 2025-03-23 22:01:29.996670 | orchestrator | changed: [testbed-node-0] 2025-03-23 22:01:29.996680 | orchestrator | changed: [testbed-node-1] 2025-03-23 22:01:29.996691 | orchestrator | 2025-03-23 22:01:29.996701 | orchestrator | RUNNING HANDLER [ovn-db : Restart ovn-sb-db container] ************************* 2025-03-23 22:01:29.996714 | orchestrator | Sunday 23 March 2025 22:00:32 +0000 (0:00:07.777) 0:01:47.008 ********** 2025-03-23 22:01:29.996725 | orchestrator | changed: [testbed-node-0] 2025-03-23 22:01:29.996735 | orchestrator | changed: [testbed-node-1] 2025-03-23 22:01:29.996745 | orchestrator | changed: [testbed-node-2] 2025-03-23 22:01:29.996755 | orchestrator | 2025-03-23 22:01:29.996766 | orchestrator | RUNNING HANDLER [ovn-db : Restart ovn-northd container] 
************************ 2025-03-23 22:01:29.996776 | orchestrator | Sunday 23 March 2025 22:00:35 +0000 (0:00:03.070) 0:01:50.079 ********** 2025-03-23 22:01:29.996786 | orchestrator | changed: [testbed-node-0] 2025-03-23 22:01:29.996796 | orchestrator | changed: [testbed-node-1] 2025-03-23 22:01:29.996806 | orchestrator | changed: [testbed-node-2] 2025-03-23 22:01:29.996816 | orchestrator | 2025-03-23 22:01:29.996826 | orchestrator | TASK [ovn-db : Wait for leader election] *************************************** 2025-03-23 22:01:29.996836 | orchestrator | Sunday 23 March 2025 22:00:43 +0000 (0:00:08.022) 0:01:58.102 ********** 2025-03-23 22:01:29.996846 | orchestrator | skipping: [testbed-node-0] 2025-03-23 22:01:29.996857 | orchestrator | 2025-03-23 22:01:29.996867 | orchestrator | TASK [ovn-db : Get OVN_Northbound cluster leader] ****************************** 2025-03-23 22:01:29.996877 | orchestrator | Sunday 23 March 2025 22:00:43 +0000 (0:00:00.149) 0:01:58.252 ********** 2025-03-23 22:01:29.996887 | orchestrator | ok: [testbed-node-1] 2025-03-23 22:01:29.996897 | orchestrator | ok: [testbed-node-0] 2025-03-23 22:01:29.996907 | orchestrator | ok: [testbed-node-2] 2025-03-23 22:01:29.996922 | orchestrator | 2025-03-23 22:01:29.996937 | orchestrator | TASK [ovn-db : Configure OVN NB connection settings] *************************** 2025-03-23 22:01:29.996948 | orchestrator | Sunday 23 March 2025 22:00:44 +0000 (0:00:01.352) 0:01:59.605 ********** 2025-03-23 22:01:29.996958 | orchestrator | skipping: [testbed-node-1] 2025-03-23 22:01:29.996968 | orchestrator | skipping: [testbed-node-2] 2025-03-23 22:01:29.996978 | orchestrator | changed: [testbed-node-0] 2025-03-23 22:01:29.996989 | orchestrator | 2025-03-23 22:01:29.996999 | orchestrator | TASK [ovn-db : Get OVN_Southbound cluster leader] ****************************** 2025-03-23 22:01:29.997009 | orchestrator | Sunday 23 March 2025 22:00:45 +0000 (0:00:00.740) 0:02:00.345 ********** 2025-03-23 22:01:29.997019 | orchestrator | ok: [testbed-node-0] 2025-03-23 22:01:29.997029 | orchestrator | ok: [testbed-node-1] 2025-03-23 22:01:29.997039 | orchestrator | ok: [testbed-node-2] 2025-03-23 22:01:29.997050 | orchestrator | 2025-03-23 22:01:29.997060 | orchestrator | TASK [ovn-db : Configure OVN SB connection settings] *************************** 2025-03-23 22:01:29.997070 | orchestrator | Sunday 23 March 2025 22:00:46 +0000 (0:00:00.992) 0:02:01.337 ********** 2025-03-23 22:01:29.997080 | orchestrator | skipping: [testbed-node-1] 2025-03-23 22:01:29.997090 | orchestrator | skipping: [testbed-node-2] 2025-03-23 22:01:29.997100 | orchestrator | changed: [testbed-node-0] 2025-03-23 22:01:29.997110 | orchestrator | 2025-03-23 22:01:29.997120 | orchestrator | TASK [ovn-db : Wait for ovn-nb-db] ********************************************* 2025-03-23 22:01:29.997130 | orchestrator | Sunday 23 March 2025 22:00:47 +0000 (0:00:00.704) 0:02:02.042 ********** 2025-03-23 22:01:29.997141 | orchestrator | ok: [testbed-node-2] 2025-03-23 22:01:29.997151 | orchestrator | ok: [testbed-node-1] 2025-03-23 22:01:29.997161 | orchestrator | ok: [testbed-node-0] 2025-03-23 22:01:29.997171 | orchestrator | 2025-03-23 22:01:29.997181 | orchestrator | TASK [ovn-db : Wait for ovn-sb-db] ********************************************* 2025-03-23 22:01:29.997192 | orchestrator | Sunday 23 March 2025 22:00:48 +0000 (0:00:01.200) 0:02:03.243 ********** 2025-03-23 22:01:29.997202 | orchestrator | ok: [testbed-node-0] 2025-03-23 22:01:29.997212 | orchestrator | ok: 
[testbed-node-1] 2025-03-23 22:01:29.997222 | orchestrator | ok: [testbed-node-2] 2025-03-23 22:01:29.997232 | orchestrator | 2025-03-23 22:01:29.997242 | orchestrator | TASK [ovn-db : Unset bootstrap args fact] ************************************** 2025-03-23 22:01:29.997253 | orchestrator | Sunday 23 March 2025 22:00:49 +0000 (0:00:00.830) 0:02:04.073 ********** 2025-03-23 22:01:29.997263 | orchestrator | ok: [testbed-node-0] 2025-03-23 22:01:29.997273 | orchestrator | ok: [testbed-node-1] 2025-03-23 22:01:29.997283 | orchestrator | ok: [testbed-node-2] 2025-03-23 22:01:29.997293 | orchestrator | 2025-03-23 22:01:29.997303 | orchestrator | TASK [ovn-db : Ensuring config directories exist] ****************************** 2025-03-23 22:01:29.997313 | orchestrator | Sunday 23 March 2025 22:00:49 +0000 (0:00:00.559) 0:02:04.632 ********** 2025-03-23 22:01:29.997324 | orchestrator | ok: [testbed-node-0] => (item={'key': 'ovn-northd', 'value': {'container_name': 'ovn_northd', 'group': 'ovn-northd', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/ovn-northd:24.3.4.20241206', 'volumes': ['/etc/kolla/ovn-northd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-23 22:01:29.997334 | orchestrator | ok: [testbed-node-1] => (item={'key': 'ovn-northd', 'value': {'container_name': 'ovn_northd', 'group': 'ovn-northd', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/ovn-northd:24.3.4.20241206', 'volumes': ['/etc/kolla/ovn-northd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-23 22:01:29.997345 | orchestrator | ok: [testbed-node-2] => (item={'key': 'ovn-northd', 'value': {'container_name': 'ovn_northd', 'group': 'ovn-northd', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/ovn-northd:24.3.4.20241206', 'volumes': ['/etc/kolla/ovn-northd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-23 22:01:29.997363 | orchestrator | ok: [testbed-node-0] => (item={'key': 'ovn-nb-db', 'value': {'container_name': 'ovn_nb_db', 'group': 'ovn-nb-db', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/ovn-nb-db-server:24.3.4.20241206', 'volumes': ['/etc/kolla/ovn-nb-db/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 'ovn_nb_db:/var/lib/openvswitch/ovn-nb/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-23 22:01:29.997374 | orchestrator | ok: [testbed-node-1] => (item={'key': 'ovn-nb-db', 'value': {'container_name': 'ovn_nb_db', 'group': 'ovn-nb-db', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/ovn-nb-db-server:24.3.4.20241206', 'volumes': ['/etc/kolla/ovn-nb-db/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 'ovn_nb_db:/var/lib/openvswitch/ovn-nb/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-23 22:01:29.997385 | orchestrator | ok: [testbed-node-2] => (item={'key': 'ovn-nb-db', 'value': {'container_name': 'ovn_nb_db', 'group': 'ovn-nb-db', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/ovn-nb-db-server:24.3.4.20241206', 'volumes': ['/etc/kolla/ovn-nb-db/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 'ovn_nb_db:/var/lib/openvswitch/ovn-nb/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-23 22:01:29.997400 | orchestrator | ok: [testbed-node-0] => (item={'key': 'ovn-sb-db', 'value': {'container_name': 
'ovn_sb_db', 'group': 'ovn-sb-db', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/ovn-sb-db-server:24.3.4.20241206', 'volumes': ['/etc/kolla/ovn-sb-db/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 'ovn_sb_db:/var/lib/openvswitch/ovn-sb/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-23 22:01:29.997410 | orchestrator | ok: [testbed-node-1] => (item={'key': 'ovn-sb-db', 'value': {'container_name': 'ovn_sb_db', 'group': 'ovn-sb-db', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/ovn-sb-db-server:24.3.4.20241206', 'volumes': ['/etc/kolla/ovn-sb-db/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 'ovn_sb_db:/var/lib/openvswitch/ovn-sb/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-23 22:01:29.997421 | orchestrator | ok: [testbed-node-2] => (item={'key': 'ovn-sb-db', 'value': {'container_name': 'ovn_sb_db', 'group': 'ovn-sb-db', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/ovn-sb-db-server:24.3.4.20241206', 'volumes': ['/etc/kolla/ovn-sb-db/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 'ovn_sb_db:/var/lib/openvswitch/ovn-sb/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-23 22:01:29.997431 | orchestrator | 2025-03-23 22:01:29.997442 | orchestrator | TASK [ovn-db : Copying over config.json files for services] ******************** 2025-03-23 22:01:29.997452 | orchestrator | Sunday 23 March 2025 22:00:51 +0000 (0:00:01.660) 0:02:06.292 ********** 2025-03-23 22:01:29.997462 | orchestrator | ok: [testbed-node-0] => (item={'key': 'ovn-northd', 'value': {'container_name': 'ovn_northd', 'group': 'ovn-northd', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/ovn-northd:24.3.4.20241206', 'volumes': ['/etc/kolla/ovn-northd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-23 22:01:29.997473 | orchestrator | ok: [testbed-node-1] => (item={'key': 'ovn-northd', 'value': {'container_name': 'ovn_northd', 'group': 'ovn-northd', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/ovn-northd:24.3.4.20241206', 'volumes': ['/etc/kolla/ovn-northd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-23 22:01:29.997483 | orchestrator | ok: [testbed-node-2] => (item={'key': 'ovn-northd', 'value': {'container_name': 'ovn_northd', 'group': 'ovn-northd', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/ovn-northd:24.3.4.20241206', 'volumes': ['/etc/kolla/ovn-northd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-23 22:01:29.997502 | orchestrator | ok: [testbed-node-0] => (item={'key': 'ovn-nb-db', 'value': {'container_name': 'ovn_nb_db', 'group': 'ovn-nb-db', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/ovn-nb-db-server:24.3.4.20241206', 'volumes': ['/etc/kolla/ovn-nb-db/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 'ovn_nb_db:/var/lib/openvswitch/ovn-nb/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-23 22:01:29.997513 | orchestrator | changed: [testbed-node-1] => (item={'key': 'ovn-nb-db', 'value': {'container_name': 'ovn_nb_db', 'group': 'ovn-nb-db', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/ovn-nb-db-server:24.3.4.20241206', 'volumes': ['/etc/kolla/ovn-nb-db/:/var/lib/kolla/config_files/:ro', 
'/etc/localtime:/etc/localtime:ro', 'ovn_nb_db:/var/lib/openvswitch/ovn-nb/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-23 22:01:29.997524 | orchestrator | changed: [testbed-node-2] => (item={'key': 'ovn-nb-db', 'value': {'container_name': 'ovn_nb_db', 'group': 'ovn-nb-db', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/ovn-nb-db-server:24.3.4.20241206', 'volumes': ['/etc/kolla/ovn-nb-db/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 'ovn_nb_db:/var/lib/openvswitch/ovn-nb/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-23 22:01:29.997541 | orchestrator | ok: [testbed-node-0] => (item={'key': 'ovn-sb-db', 'value': {'container_name': 'ovn_sb_db', 'group': 'ovn-sb-db', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/ovn-sb-db-server:24.3.4.20241206', 'volumes': ['/etc/kolla/ovn-sb-db/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 'ovn_sb_db:/var/lib/openvswitch/ovn-sb/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-23 22:01:29.997564 | orchestrator | changed: [testbed-node-1] => (item={'key': 'ovn-sb-db', 'value': {'container_name': 'ovn_sb_db', 'group': 'ovn-sb-db', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/ovn-sb-db-server:24.3.4.20241206', 'volumes': ['/etc/kolla/ovn-sb-db/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 'ovn_sb_db:/var/lib/openvswitch/ovn-sb/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-23 22:01:29.997575 | orchestrator | changed: [testbed-node-2] => (item={'key': 'ovn-sb-db', 'value': {'container_name': 'ovn_sb_db', 'group': 'ovn-sb-db', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/ovn-sb-db-server:24.3.4.20241206', 'volumes': ['/etc/kolla/ovn-sb-db/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 'ovn_sb_db:/var/lib/openvswitch/ovn-sb/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-23 22:01:29.997586 | orchestrator | 2025-03-23 22:01:29.997596 | orchestrator | TASK [ovn-db : Check ovn containers] ******************************************* 2025-03-23 22:01:29.997607 | orchestrator | Sunday 23 March 2025 22:00:55 +0000 (0:00:04.307) 0:02:10.600 ********** 2025-03-23 22:01:29.997617 | orchestrator | ok: [testbed-node-1] => (item={'key': 'ovn-northd', 'value': {'container_name': 'ovn_northd', 'group': 'ovn-northd', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/ovn-northd:24.3.4.20241206', 'volumes': ['/etc/kolla/ovn-northd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-23 22:01:29.997628 | orchestrator | ok: [testbed-node-0] => (item={'key': 'ovn-northd', 'value': {'container_name': 'ovn_northd', 'group': 'ovn-northd', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/ovn-northd:24.3.4.20241206', 'volumes': ['/etc/kolla/ovn-northd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-23 22:01:29.997638 | orchestrator | ok: [testbed-node-2] => (item={'key': 'ovn-northd', 'value': {'container_name': 'ovn_northd', 'group': 'ovn-northd', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/ovn-northd:24.3.4.20241206', 'volumes': ['/etc/kolla/ovn-northd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-23 22:01:29.997654 | orchestrator | ok: [testbed-node-1] => (item={'key': 
'ovn-nb-db', 'value': {'container_name': 'ovn_nb_db', 'group': 'ovn-nb-db', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/ovn-nb-db-server:24.3.4.20241206', 'volumes': ['/etc/kolla/ovn-nb-db/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 'ovn_nb_db:/var/lib/openvswitch/ovn-nb/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-23 22:01:29.997668 | orchestrator | ok: [testbed-node-0] => (item={'key': 'ovn-nb-db', 'value': {'container_name': 'ovn_nb_db', 'group': 'ovn-nb-db', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/ovn-nb-db-server:24.3.4.20241206', 'volumes': ['/etc/kolla/ovn-nb-db/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 'ovn_nb_db:/var/lib/openvswitch/ovn-nb/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-23 22:01:29.997679 | orchestrator | ok: [testbed-node-2] => (item={'key': 'ovn-nb-db', 'value': {'container_name': 'ovn_nb_db', 'group': 'ovn-nb-db', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/ovn-nb-db-server:24.3.4.20241206', 'volumes': ['/etc/kolla/ovn-nb-db/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 'ovn_nb_db:/var/lib/openvswitch/ovn-nb/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-23 22:01:29.997693 | orchestrator | ok: [testbed-node-1] => (item={'key': 'ovn-sb-db', 'value': {'container_name': 'ovn_sb_db', 'group': 'ovn-sb-db', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/ovn-sb-db-server:24.3.4.20241206', 'volumes': ['/etc/kolla/ovn-sb-db/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 'ovn_sb_db:/var/lib/openvswitch/ovn-sb/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-23 22:01:29.997708 | orchestrator | ok: [testbed-node-2] => (item={'key': 'ovn-sb-db', 'value': {'container_name': 'ovn_sb_db', 'group': 'ovn-sb-db', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/ovn-sb-db-server:24.3.4.20241206', 'volumes': ['/etc/kolla/ovn-sb-db/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 'ovn_sb_db:/var/lib/openvswitch/ovn-sb/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-23 22:01:29.997719 | orchestrator | ok: [testbed-node-0] => (item={'key': 'ovn-sb-db', 'value': {'container_name': 'ovn_sb_db', 'group': 'ovn-sb-db', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/ovn-sb-db-server:24.3.4.20241206', 'volumes': ['/etc/kolla/ovn-sb-db/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 'ovn_sb_db:/var/lib/openvswitch/ovn-sb/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-23 22:01:29.997729 | orchestrator | 2025-03-23 22:01:29.997740 | orchestrator | TASK [ovn-db : Flush handlers] ************************************************* 2025-03-23 22:01:29.997750 | orchestrator | Sunday 23 March 2025 22:00:59 +0000 (0:00:03.339) 0:02:13.940 ********** 2025-03-23 22:01:29.997760 | orchestrator | 2025-03-23 22:01:29.997771 | orchestrator | TASK [ovn-db : Flush handlers] ************************************************* 2025-03-23 22:01:29.997781 | orchestrator | Sunday 23 March 2025 22:00:59 +0000 (0:00:00.281) 0:02:14.222 ********** 2025-03-23 22:01:29.997791 | orchestrator | 2025-03-23 22:01:29.997801 | orchestrator | TASK [ovn-db : Flush handlers] ************************************************* 2025-03-23 22:01:29.997811 | orchestrator | Sunday 23 March 2025 22:00:59 +0000 (0:00:00.079) 0:02:14.302 ********** 2025-03-23 22:01:29.997821 | orchestrator | 2025-03-23 
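The config.json files copied a few tasks above are what each kolla container reads at startup to lay its configuration into place; the host directories match the volumes listed in the items (for example /etc/kolla/ovn-nb-db/). A sketch of the general file shape only, with the command and file entries below as placeholders rather than the deployed content:

  cat /etc/kolla/ovn-nb-db/config.json   # illustrative shape, not the deployed file
  {
    "command": "ovn-ctl run_nb_ovsdb ...",
    "config_files": [
      {"source": "/var/lib/kolla/config_files/...", "dest": "/...", "owner": "root", "perm": "0600"}
    ]
  }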
22:01:29.997832 | orchestrator | RUNNING HANDLER [ovn-db : Restart ovn-nb-db container] ************************* 2025-03-23 22:01:29.997842 | orchestrator | Sunday 23 March 2025 22:00:59 +0000 (0:00:00.057) 0:02:14.359 ********** 2025-03-23 22:01:29.997852 | orchestrator | changed: [testbed-node-1] 2025-03-23 22:01:29.997862 | orchestrator | changed: [testbed-node-2] 2025-03-23 22:01:29.997872 | orchestrator | 2025-03-23 22:01:29.997882 | orchestrator | RUNNING HANDLER [ovn-db : Restart ovn-sb-db container] ************************* 2025-03-23 22:01:29.997893 | orchestrator | Sunday 23 March 2025 22:01:06 +0000 (0:00:06.745) 0:02:21.104 ********** 2025-03-23 22:01:29.997907 | orchestrator | changed: [testbed-node-2] 2025-03-23 22:01:29.997918 | orchestrator | changed: [testbed-node-1] 2025-03-23 22:01:29.997928 | orchestrator | 2025-03-23 22:01:29.997938 | orchestrator | RUNNING HANDLER [ovn-db : Restart ovn-northd container] ************************ 2025-03-23 22:01:29.997948 | orchestrator | Sunday 23 March 2025 22:01:12 +0000 (0:00:06.439) 0:02:27.543 ********** 2025-03-23 22:01:29.997959 | orchestrator | changed: [testbed-node-1] 2025-03-23 22:01:29.997969 | orchestrator | changed: [testbed-node-2] 2025-03-23 22:01:29.997979 | orchestrator | 2025-03-23 22:01:29.997989 | orchestrator | TASK [ovn-db : Wait for leader election] *************************************** 2025-03-23 22:01:29.998000 | orchestrator | Sunday 23 March 2025 22:01:19 +0000 (0:00:06.625) 0:02:34.168 ********** 2025-03-23 22:01:29.998010 | orchestrator | skipping: [testbed-node-0] 2025-03-23 22:01:29.998044 | orchestrator | 2025-03-23 22:01:29.998056 | orchestrator | TASK [ovn-db : Get OVN_Northbound cluster leader] ****************************** 2025-03-23 22:01:29.998066 | orchestrator | Sunday 23 March 2025 22:01:19 +0000 (0:00:00.391) 0:02:34.560 ********** 2025-03-23 22:01:29.998077 | orchestrator | ok: [testbed-node-0] 2025-03-23 22:01:29.998087 | orchestrator | ok: [testbed-node-1] 2025-03-23 22:01:29.998097 | orchestrator | ok: [testbed-node-2] 2025-03-23 22:01:29.998107 | orchestrator | 2025-03-23 22:01:29.998118 | orchestrator | TASK [ovn-db : Configure OVN NB connection settings] *************************** 2025-03-23 22:01:29.998127 | orchestrator | Sunday 23 March 2025 22:01:20 +0000 (0:00:00.817) 0:02:35.378 ********** 2025-03-23 22:01:29.998138 | orchestrator | skipping: [testbed-node-1] 2025-03-23 22:01:29.998148 | orchestrator | skipping: [testbed-node-2] 2025-03-23 22:01:29.998158 | orchestrator | changed: [testbed-node-0] 2025-03-23 22:01:29.998168 | orchestrator | 2025-03-23 22:01:29.998178 | orchestrator | TASK [ovn-db : Get OVN_Southbound cluster leader] ****************************** 2025-03-23 22:01:29.998188 | orchestrator | Sunday 23 March 2025 22:01:21 +0000 (0:00:00.698) 0:02:36.076 ********** 2025-03-23 22:01:29.998198 | orchestrator | ok: [testbed-node-0] 2025-03-23 22:01:29.998208 | orchestrator | ok: [testbed-node-1] 2025-03-23 22:01:29.998219 | orchestrator | ok: [testbed-node-2] 2025-03-23 22:01:29.998235 | orchestrator | 2025-03-23 22:01:29.998246 | orchestrator | TASK [ovn-db : Configure OVN SB connection settings] *************************** 2025-03-23 22:01:29.998256 | orchestrator | Sunday 23 March 2025 22:01:22 +0000 (0:00:00.990) 0:02:37.067 ********** 2025-03-23 22:01:29.998267 | orchestrator | skipping: [testbed-node-1] 2025-03-23 22:01:29.998278 | orchestrator | skipping: [testbed-node-2] 2025-03-23 22:01:29.998288 | orchestrator | changed: [testbed-node-0] 2025-03-23 
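As in the first pass, the "Configure OVN NB/SB connection settings" tasks above change only testbed-node-0, presumably because it currently holds the Raft leadership found by the preceding "Get OVN_Northbound/Southbound cluster leader" tasks. A rough hand-run equivalent, with the listen address and probe interval below being assumptions rather than values from this job:

  # expose the NB/SB databases on their TCP ports and set the inactivity probe
  ovn-nbctl --inactivity-probe=60000 set-connection ptcp:6641:0.0.0.0
  ovn-sbctl --inactivity-probe=60000 set-connection ptcp:6642:0.0.0.0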
22:01:29.998299 | orchestrator |
2025-03-23 22:01:29.998309 | orchestrator | TASK [ovn-db : Wait for ovn-nb-db] *********************************************
2025-03-23 22:01:29.998319 | orchestrator | Sunday 23 March 2025 22:01:23 +0000 (0:00:01.025) 0:02:38.092 **********
2025-03-23 22:01:29.998329 | orchestrator | ok: [testbed-node-0]
2025-03-23 22:01:29.998339 | orchestrator | ok: [testbed-node-1]
2025-03-23 22:01:29.998349 | orchestrator | ok: [testbed-node-2]
2025-03-23 22:01:29.998359 | orchestrator |
2025-03-23 22:01:29.998370 | orchestrator | TASK [ovn-db : Wait for ovn-sb-db] *********************************************
2025-03-23 22:01:29.998380 | orchestrator | Sunday 23 March 2025 22:01:24 +0000 (0:00:01.161) 0:02:39.254 **********
2025-03-23 22:01:29.998390 | orchestrator | ok: [testbed-node-0]
2025-03-23 22:01:29.998400 | orchestrator | ok: [testbed-node-1]
2025-03-23 22:01:29.998410 | orchestrator | ok: [testbed-node-2]
2025-03-23 22:01:29.998420 | orchestrator |
2025-03-23 22:01:29.998430 | orchestrator | PLAY RECAP *********************************************************************
2025-03-23 22:01:29.998441 | orchestrator | testbed-node-0 : ok=44  changed=18  unreachable=0 failed=0 skipped=20  rescued=0 ignored=0
2025-03-23 22:01:29.998451 | orchestrator | testbed-node-1 : ok=43  changed=18  unreachable=0 failed=0 skipped=22  rescued=0 ignored=0
2025-03-23 22:01:29.998466 | orchestrator | testbed-node-2 : ok=43  changed=18  unreachable=0 failed=0 skipped=22  rescued=0 ignored=0
2025-03-23 22:01:33.027854 | orchestrator | testbed-node-3 : ok=12  changed=8  unreachable=0 failed=0 skipped=0 rescued=0 ignored=0
2025-03-23 22:01:33.027994 | orchestrator | testbed-node-4 : ok=12  changed=8  unreachable=0 failed=0 skipped=0 rescued=0 ignored=0
2025-03-23 22:01:33.028013 | orchestrator | testbed-node-5 : ok=12  changed=8  unreachable=0 failed=0 skipped=0 rescued=0 ignored=0
2025-03-23 22:01:33.028028 | orchestrator |
2025-03-23 22:01:33.028043 | orchestrator |
2025-03-23 22:01:33.028058 | orchestrator | TASKS RECAP ********************************************************************
2025-03-23 22:01:33.028073 | orchestrator | Sunday 23 March 2025 22:01:26 +0000 (0:00:02.220) 0:02:41.474 **********
2025-03-23 22:01:33.028088 | orchestrator | ===============================================================================
2025-03-23 22:01:33.028103 | orchestrator | ovn-controller : Configure OVN in OVSDB -------------------------------- 22.12s
2025-03-23 22:01:33.028117 | orchestrator | ovn-controller : Restart ovn-controller container ---------------------- 19.53s
2025-03-23 22:01:33.028131 | orchestrator | ovn-db : Restart ovn-northd container ---------------------------------- 14.65s
2025-03-23 22:01:33.028145 | orchestrator | ovn-db : Restart ovn-nb-db container ----------------------------------- 14.52s
2025-03-23 22:01:33.028159 | orchestrator | ovn-db : Restart ovn-sb-db container ------------------------------------ 9.51s
2025-03-23 22:01:33.028173 | orchestrator | ovn-db : Copying over config.json files for services -------------------- 5.93s
2025-03-23 22:01:33.028187 | orchestrator | ovn-db : Copying over config.json files for services -------------------- 4.31s
2025-03-23 22:01:33.028201 | orchestrator | ovn-controller : Create br-int bridge on OpenvSwitch -------------------- 3.85s
2025-03-23 22:01:33.028221 | orchestrator | ovn-db : Check ovn containers ------------------------------------------- 3.34s
2025-03-23 22:01:33.028235 | orchestrator | ovn-controller : Copying over config.json files for services ------------ 3.12s
2025-03-23 22:01:33.028250 | orchestrator | ovn-db : Check ovn containers ------------------------------------------- 2.75s
2025-03-23 22:01:33.028264 | orchestrator | ovn-controller : Reload systemd config ---------------------------------- 2.71s
2025-03-23 22:01:33.028277 | orchestrator | ovn-controller : Check ovn-controller containers ------------------------ 2.50s
2025-03-23 22:01:33.028291 | orchestrator | ovn-controller : Copying over systemd override -------------------------- 2.27s
2025-03-23 22:01:33.028305 | orchestrator | ovn-db : Wait for ovn-sb-db --------------------------------------------- 2.22s
2025-03-23 22:01:33.028319 | orchestrator | Group hosts based on enabled services ----------------------------------- 1.98s
2025-03-23 22:01:33.028333 | orchestrator | ovn-controller : include_tasks ------------------------------------------ 1.94s
2025-03-23 22:01:33.028347 | orchestrator | ovn-db : Ensuring config directories exist ------------------------------ 1.78s
2025-03-23 22:01:33.028361 | orchestrator | ovn-controller : Ensuring config directories exist ---------------------- 1.73s
2025-03-23 22:01:33.028375 | orchestrator | ovn-db : Ensuring config directories exist ------------------------------ 1.66s
2025-03-23 22:01:33.028391 | orchestrator | 2025-03-23 22:01:29 | INFO  | Task 61ff2397-ccea-429b-b92a-df8186dc3a5d is in state SUCCESS
2025-03-23 22:01:33.028408 | orchestrator | 2025-03-23 22:01:29 | INFO  | Wait 1 second(s) until the next check
2025-03-23 22:01:33.028439 | orchestrator | 2025-03-23 22:01:33 | INFO  | Task f712b796-38d0-4a95-8f80-410bcda57c10 is in state STARTED
2025-03-23 22:01:33.030444 | orchestrator | 2025-03-23 22:01:33 | INFO  | Task f3cd4eb5-5489-4d5d-9d13-5fb3ddbc8b2b is in state STARTED
2025-03-23 22:01:33.032852 | orchestrator | 2025-03-23 22:01:33 | INFO  | Task 6f5ec00a-9f2e-40eb-a1d2-26a43473a853 is in state STARTED
2025-03-23 22:01:36.075311 | orchestrator | 2025-03-23 22:01:33 | INFO  | Wait 1 second(s) until the next check
2025-03-23 22:01:36.075463 | orchestrator | 2025-03-23 22:01:36 | INFO  | Task f712b796-38d0-4a95-8f80-410bcda57c10 is in state STARTED
2025-03-23 22:01:36.076953 | orchestrator | 2025-03-23 22:01:36 | INFO  | Task f3cd4eb5-5489-4d5d-9d13-5fb3ddbc8b2b is in state STARTED
2025-03-23 22:01:36.076996 | orchestrator | 2025-03-23 22:01:36 | INFO  | Task 6f5ec00a-9f2e-40eb-a1d2-26a43473a853 is in state STARTED
2025-03-23 22:01:36.077018 | orchestrator | 2025-03-23 22:01:36 | INFO  | Wait 1 second(s) until the next check
2025-03-23 22:01:39.161322 | orchestrator | 2025-03-23 22:01:39 | INFO  | Task f712b796-38d0-4a95-8f80-410bcda57c10 is in state STARTED
2025-03-23 22:01:39.164600 | orchestrator | 2025-03-23 22:01:39 | INFO  | Task f3cd4eb5-5489-4d5d-9d13-5fb3ddbc8b2b is in state STARTED
2025-03-23 22:01:39.165676 | orchestrator | 2025-03-23 22:01:39 | INFO  | Task 6f5ec00a-9f2e-40eb-a1d2-26a43473a853 is in state STARTED
2025-03-23 22:01:42.207694 | orchestrator | 2025-03-23 22:01:39 | INFO  | Wait 1 second(s) until the next check
2025-03-23 22:01:42.207818 | orchestrator | 2025-03-23 22:01:42 | INFO  | Task f712b796-38d0-4a95-8f80-410bcda57c10 is in state STARTED
2025-03-23 22:01:42.208359 | orchestrator | 2025-03-23 22:01:42 | INFO  | Task f3cd4eb5-5489-4d5d-9d13-5fb3ddbc8b2b is in state STARTED
2025-03-23 22:01:42.209363 | orchestrator | 2025-03-23 22:01:42 | INFO  | Task 6f5ec00a-9f2e-40eb-a1d2-26a43473a853 is in state STARTED
2025-03-23 22:01:45.250378 |
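With the OVN play finished (recap above), a quick way to confirm that both database clusters settled on a single leader is to ask the servers directly. A minimal sketch, assuming a Docker-based deployment with the container names shown in the log and the default OVN control-socket paths, which can differ between OVN builds:

  # show Raft role, term and membership for the NB and SB databases
  docker exec ovn_nb_db ovs-appctl -t /var/run/ovn/ovnnb_db.ctl cluster/status OVN_Northbound
  docker exec ovn_sb_db ovs-appctl -t /var/run/ovn/ovnsb_db.ctl cluster/status OVN_Southbound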
orchestrator | 2025-03-23 22:01:42 | INFO  | Wait 1 second(s) until the next check 2025-03-23 22:01:45.250513 | orchestrator | 2025-03-23 22:01:45 | INFO  | Task f712b796-38d0-4a95-8f80-410bcda57c10 is in state STARTED 2025-03-23 22:01:45.251917 | orchestrator | 2025-03-23 22:01:45 | INFO  | Task f3cd4eb5-5489-4d5d-9d13-5fb3ddbc8b2b is in state STARTED 2025-03-23 22:01:45.256045 | orchestrator | 2025-03-23 22:01:45 | INFO  | Task 6f5ec00a-9f2e-40eb-a1d2-26a43473a853 is in state STARTED 2025-03-23 22:01:48.306315 | orchestrator | 2025-03-23 22:01:45 | INFO  | Wait 1 second(s) until the next check 2025-03-23 22:01:48.306440 | orchestrator | 2025-03-23 22:01:48 | INFO  | Task f712b796-38d0-4a95-8f80-410bcda57c10 is in state STARTED 2025-03-23 22:01:48.307883 | orchestrator | 2025-03-23 22:01:48 | INFO  | Task f3cd4eb5-5489-4d5d-9d13-5fb3ddbc8b2b is in state STARTED 2025-03-23 22:01:48.310506 | orchestrator | 2025-03-23 22:01:48 | INFO  | Task 6f5ec00a-9f2e-40eb-a1d2-26a43473a853 is in state STARTED 2025-03-23 22:01:48.310658 | orchestrator | 2025-03-23 22:01:48 | INFO  | Wait 1 second(s) until the next check 2025-03-23 22:01:51.356467 | orchestrator | 2025-03-23 22:01:51 | INFO  | Task f712b796-38d0-4a95-8f80-410bcda57c10 is in state STARTED 2025-03-23 22:01:51.357273 | orchestrator | 2025-03-23 22:01:51 | INFO  | Task f3cd4eb5-5489-4d5d-9d13-5fb3ddbc8b2b is in state STARTED 2025-03-23 22:01:51.359139 | orchestrator | 2025-03-23 22:01:51 | INFO  | Task 6f5ec00a-9f2e-40eb-a1d2-26a43473a853 is in state STARTED 2025-03-23 22:01:54.403657 | orchestrator | 2025-03-23 22:01:51 | INFO  | Wait 1 second(s) until the next check 2025-03-23 22:01:54.403793 | orchestrator | 2025-03-23 22:01:54 | INFO  | Task f712b796-38d0-4a95-8f80-410bcda57c10 is in state STARTED 2025-03-23 22:01:54.407735 | orchestrator | 2025-03-23 22:01:54 | INFO  | Task f3cd4eb5-5489-4d5d-9d13-5fb3ddbc8b2b is in state STARTED 2025-03-23 22:01:54.409681 | orchestrator | 2025-03-23 22:01:54 | INFO  | Task 6f5ec00a-9f2e-40eb-a1d2-26a43473a853 is in state STARTED 2025-03-23 22:01:57.485001 | orchestrator | 2025-03-23 22:01:54 | INFO  | Wait 1 second(s) until the next check 2025-03-23 22:01:57.485138 | orchestrator | 2025-03-23 22:01:57 | INFO  | Task f712b796-38d0-4a95-8f80-410bcda57c10 is in state STARTED 2025-03-23 22:01:57.485882 | orchestrator | 2025-03-23 22:01:57 | INFO  | Task f3cd4eb5-5489-4d5d-9d13-5fb3ddbc8b2b is in state STARTED 2025-03-23 22:01:57.487205 | orchestrator | 2025-03-23 22:01:57 | INFO  | Task 6f5ec00a-9f2e-40eb-a1d2-26a43473a853 is in state STARTED 2025-03-23 22:02:00.534735 | orchestrator | 2025-03-23 22:01:57 | INFO  | Wait 1 second(s) until the next check 2025-03-23 22:02:00.534862 | orchestrator | 2025-03-23 22:02:00 | INFO  | Task f712b796-38d0-4a95-8f80-410bcda57c10 is in state STARTED 2025-03-23 22:02:00.537349 | orchestrator | 2025-03-23 22:02:00 | INFO  | Task f3cd4eb5-5489-4d5d-9d13-5fb3ddbc8b2b is in state STARTED 2025-03-23 22:02:00.538459 | orchestrator | 2025-03-23 22:02:00 | INFO  | Task 6f5ec00a-9f2e-40eb-a1d2-26a43473a853 is in state STARTED 2025-03-23 22:02:03.595928 | orchestrator | 2025-03-23 22:02:00 | INFO  | Wait 1 second(s) until the next check 2025-03-23 22:02:03.596071 | orchestrator | 2025-03-23 22:02:03 | INFO  | Task f712b796-38d0-4a95-8f80-410bcda57c10 is in state STARTED 2025-03-23 22:02:03.596253 | orchestrator | 2025-03-23 22:02:03 | INFO  | Task f3cd4eb5-5489-4d5d-9d13-5fb3ddbc8b2b is in state STARTED 2025-03-23 22:02:03.597243 | orchestrator | 2025-03-23 22:02:03 | INFO  | Task 
c949ed33-9a01-4dfe-8b21-20760eb1d432 is in state STARTED 2025-03-23 22:02:03.600514 | orchestrator | 2025-03-23 22:02:03 | INFO  | Task 6f5ec00a-9f2e-40eb-a1d2-26a43473a853 is in state STARTED 2025-03-23 22:02:06.659256 | orchestrator | 2025-03-23 22:02:03 | INFO  | Wait 1 second(s) until the next check 2025-03-23 22:02:06.659396 | orchestrator | 2025-03-23 22:02:06 | INFO  | Task f712b796-38d0-4a95-8f80-410bcda57c10 is in state STARTED 2025-03-23 22:02:06.660743 | orchestrator | 2025-03-23 22:02:06 | INFO  | Task f3cd4eb5-5489-4d5d-9d13-5fb3ddbc8b2b is in state STARTED 2025-03-23 22:02:06.664946 | orchestrator | 2025-03-23 22:02:06 | INFO  | Task c949ed33-9a01-4dfe-8b21-20760eb1d432 is in state STARTED 2025-03-23 22:02:06.665720 | orchestrator | 2025-03-23 22:02:06 | INFO  | Task 6f5ec00a-9f2e-40eb-a1d2-26a43473a853 is in state STARTED 2025-03-23 22:02:09.717995 | orchestrator | 2025-03-23 22:02:06 | INFO  | Wait 1 second(s) until the next check 2025-03-23 22:02:09.718318 | orchestrator | 2025-03-23 22:02:09 | INFO  | Task f712b796-38d0-4a95-8f80-410bcda57c10 is in state STARTED 2025-03-23 22:02:09.723697 | orchestrator | 2025-03-23 22:02:09 | INFO  | Task f3cd4eb5-5489-4d5d-9d13-5fb3ddbc8b2b is in state STARTED 2025-03-23 22:02:09.723739 | orchestrator | 2025-03-23 22:02:09 | INFO  | Task c949ed33-9a01-4dfe-8b21-20760eb1d432 is in state STARTED 2025-03-23 22:02:09.733818 | orchestrator | 2025-03-23 22:02:09 | INFO  | Task 6f5ec00a-9f2e-40eb-a1d2-26a43473a853 is in state STARTED 2025-03-23 22:02:12.803181 | orchestrator | 2025-03-23 22:02:09 | INFO  | Wait 1 second(s) until the next check 2025-03-23 22:02:12.803315 | orchestrator | 2025-03-23 22:02:12 | INFO  | Task f712b796-38d0-4a95-8f80-410bcda57c10 is in state STARTED 2025-03-23 22:02:12.804905 | orchestrator | 2025-03-23 22:02:12 | INFO  | Task f3cd4eb5-5489-4d5d-9d13-5fb3ddbc8b2b is in state STARTED 2025-03-23 22:02:12.806723 | orchestrator | 2025-03-23 22:02:12 | INFO  | Task c949ed33-9a01-4dfe-8b21-20760eb1d432 is in state STARTED 2025-03-23 22:02:12.808125 | orchestrator | 2025-03-23 22:02:12 | INFO  | Task 6f5ec00a-9f2e-40eb-a1d2-26a43473a853 is in state STARTED 2025-03-23 22:02:12.808244 | orchestrator | 2025-03-23 22:02:12 | INFO  | Wait 1 second(s) until the next check 2025-03-23 22:02:15.876600 | orchestrator | 2025-03-23 22:02:15 | INFO  | Task f712b796-38d0-4a95-8f80-410bcda57c10 is in state STARTED 2025-03-23 22:02:15.878896 | orchestrator | 2025-03-23 22:02:15 | INFO  | Task f3cd4eb5-5489-4d5d-9d13-5fb3ddbc8b2b is in state STARTED 2025-03-23 22:02:15.882159 | orchestrator | 2025-03-23 22:02:15 | INFO  | Task c949ed33-9a01-4dfe-8b21-20760eb1d432 is in state STARTED 2025-03-23 22:02:15.884211 | orchestrator | 2025-03-23 22:02:15 | INFO  | Task 6f5ec00a-9f2e-40eb-a1d2-26a43473a853 is in state STARTED 2025-03-23 22:02:15.884517 | orchestrator | 2025-03-23 22:02:15 | INFO  | Wait 1 second(s) until the next check 2025-03-23 22:02:18.929752 | orchestrator | 2025-03-23 22:02:18 | INFO  | Task f712b796-38d0-4a95-8f80-410bcda57c10 is in state STARTED 2025-03-23 22:02:18.932369 | orchestrator | 2025-03-23 22:02:18 | INFO  | Task f3cd4eb5-5489-4d5d-9d13-5fb3ddbc8b2b is in state STARTED 2025-03-23 22:02:18.934248 | orchestrator | 2025-03-23 22:02:18 | INFO  | Task c949ed33-9a01-4dfe-8b21-20760eb1d432 is in state SUCCESS 2025-03-23 22:02:18.936067 | orchestrator | 2025-03-23 22:02:18 | INFO  | Task 6f5ec00a-9f2e-40eb-a1d2-26a43473a853 is in state STARTED 2025-03-23 22:02:21.992447 | orchestrator | 2025-03-23 22:02:18 | INFO  | Wait 1 
second(s) until the next check 2025-03-23 22:02:21.992641 | orchestrator | 2025-03-23 22:02:21 | INFO  | Task f712b796-38d0-4a95-8f80-410bcda57c10 is in state STARTED 2025-03-23 22:02:21.994282 | orchestrator | 2025-03-23 22:02:21 | INFO  | Task f3cd4eb5-5489-4d5d-9d13-5fb3ddbc8b2b is in state STARTED 2025-03-23 22:02:21.996354 | orchestrator | 2025-03-23 22:02:21 | INFO  | Task 6f5ec00a-9f2e-40eb-a1d2-26a43473a853 is in state STARTED 2025-03-23 22:02:25.062740 | orchestrator | 2025-03-23 22:02:21 | INFO  | Wait 1 second(s) until the next check 2025-03-23 22:02:25.062866 | orchestrator | 2025-03-23 22:02:25 | INFO  | Task f712b796-38d0-4a95-8f80-410bcda57c10 is in state STARTED 2025-03-23 22:02:25.066121 | orchestrator | 2025-03-23 22:02:25 | INFO  | Task f3cd4eb5-5489-4d5d-9d13-5fb3ddbc8b2b is in state STARTED 2025-03-23 22:02:25.066393 | orchestrator | 2025-03-23 22:02:25 | INFO  | Task 6f5ec00a-9f2e-40eb-a1d2-26a43473a853 is in state STARTED 2025-03-23 22:02:25.066910 | orchestrator | 2025-03-23 22:02:25 | INFO  | Wait 1 second(s) until the next check 2025-03-23 22:02:28.131618 | orchestrator | 2025-03-23 22:02:28 | INFO  | Task f712b796-38d0-4a95-8f80-410bcda57c10 is in state STARTED 2025-03-23 22:02:28.136132 | orchestrator | 2025-03-23 22:02:28 | INFO  | Task f3cd4eb5-5489-4d5d-9d13-5fb3ddbc8b2b is in state STARTED 2025-03-23 22:02:28.136999 | orchestrator | 2025-03-23 22:02:28 | INFO  | Task 6f5ec00a-9f2e-40eb-a1d2-26a43473a853 is in state STARTED 2025-03-23 22:02:28.137035 | orchestrator | 2025-03-23 22:02:28 | INFO  | Wait 1 second(s) until the next check 2025-03-23 22:02:31.195348 | orchestrator | 2025-03-23 22:02:31 | INFO  | Task f712b796-38d0-4a95-8f80-410bcda57c10 is in state STARTED 2025-03-23 22:02:31.197534 | orchestrator | 2025-03-23 22:02:31 | INFO  | Task f3cd4eb5-5489-4d5d-9d13-5fb3ddbc8b2b is in state STARTED 2025-03-23 22:02:31.197720 | orchestrator | 2025-03-23 22:02:31 | INFO  | Task 6f5ec00a-9f2e-40eb-a1d2-26a43473a853 is in state STARTED 2025-03-23 22:02:34.253139 | orchestrator | 2025-03-23 22:02:31 | INFO  | Wait 1 second(s) until the next check 2025-03-23 22:02:34.253277 | orchestrator | 2025-03-23 22:02:34 | INFO  | Task f712b796-38d0-4a95-8f80-410bcda57c10 is in state STARTED 2025-03-23 22:02:34.257630 | orchestrator | 2025-03-23 22:02:34 | INFO  | Task f3cd4eb5-5489-4d5d-9d13-5fb3ddbc8b2b is in state STARTED 2025-03-23 22:02:34.257669 | orchestrator | 2025-03-23 22:02:34 | INFO  | Task 6f5ec00a-9f2e-40eb-a1d2-26a43473a853 is in state STARTED 2025-03-23 22:02:37.313880 | orchestrator | 2025-03-23 22:02:34 | INFO  | Wait 1 second(s) until the next check 2025-03-23 22:02:37.314099 | orchestrator | 2025-03-23 22:02:37 | INFO  | Task f712b796-38d0-4a95-8f80-410bcda57c10 is in state STARTED 2025-03-23 22:02:37.315242 | orchestrator | 2025-03-23 22:02:37 | INFO  | Task f3cd4eb5-5489-4d5d-9d13-5fb3ddbc8b2b is in state STARTED 2025-03-23 22:02:37.317508 | orchestrator | 2025-03-23 22:02:37 | INFO  | Task 6f5ec00a-9f2e-40eb-a1d2-26a43473a853 is in state STARTED 2025-03-23 22:02:37.318278 | orchestrator | 2025-03-23 22:02:37 | INFO  | Wait 1 second(s) until the next check 2025-03-23 22:02:40.369663 | orchestrator | 2025-03-23 22:02:40 | INFO  | Task f712b796-38d0-4a95-8f80-410bcda57c10 is in state STARTED 2025-03-23 22:02:40.370457 | orchestrator | 2025-03-23 22:02:40 | INFO  | Task f3cd4eb5-5489-4d5d-9d13-5fb3ddbc8b2b is in state STARTED 2025-03-23 22:02:40.371610 | orchestrator | 2025-03-23 22:02:40 | INFO  | Task 6f5ec00a-9f2e-40eb-a1d2-26a43473a853 is in state 
STARTED
2025-03-23 22:02:40 to 22:05:46 | orchestrator | INFO  | Repetitive polling output condensed: tasks f712b796-38d0-4a95-8f80-410bcda57c10, f3cd4eb5-5489-4d5d-9d13-5fb3ddbc8b2b and 6f5ec00a-9f2e-40eb-a1d2-26a43473a853 were checked roughly every three seconds; every check reported all three tasks in state STARTED, each round followed by "Wait 1 second(s) until the next check".
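The repeated status checks condensed above are the deployment tooling waiting for three manager tasks to finish before printing their output. A minimal sketch of what such a wait loop amounts to, with a simulated state lookup standing in for the real task API (assumed here purely for illustration):

```python
import itertools
import time

# Simulated task backend, for illustration only: each task reports STARTED a
# few times and then SUCCESS. In the real job these states come from the
# OSISM manager's task API, which is not reproduced here.
FAKE_STATES = {
    "f712b796-38d0-4a95-8f80-410bcda57c10": itertools.chain(["STARTED"] * 3, itertools.repeat("SUCCESS")),
    "f3cd4eb5-5489-4d5d-9d13-5fb3ddbc8b2b": itertools.chain(["STARTED"] * 4, itertools.repeat("SUCCESS")),
    "6f5ec00a-9f2e-40eb-a1d2-26a43473a853": itertools.chain(["STARTED"] * 2, itertools.repeat("SUCCESS")),
}

def get_task_state(task_id: str) -> str:
    """Stand-in for the task-status lookup used by the deployment CLI."""
    return next(FAKE_STATES[task_id])

def wait_for_tasks(task_ids, interval: float = 1.0) -> None:
    """Poll every task until none of them is still PENDING or STARTED."""
    pending = set(task_ids)
    while pending:
        for task_id in sorted(pending):
            state = get_task_state(task_id)
            print(f"Task {task_id} is in state {state}")
            if state not in ("PENDING", "STARTED"):
                pending.discard(task_id)
        if pending:
            print(f"Wait {int(interval)} second(s) until the next check")
            time.sleep(interval)

if __name__ == "__main__":
    wait_for_tasks(FAKE_STATES)
```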
2025-03-23 22:05:49.852449 | orchestrator | 2025-03-23 22:05:49 | INFO  | Task f712b796-38d0-4a95-8f80-410bcda57c10 is in state STARTED
2025-03-23 22:05:49.854263 | orchestrator | 2025-03-23 22:05:49 | INFO  | Task f3cd4eb5-5489-4d5d-9d13-5fb3ddbc8b2b is in state STARTED
2025-03-23 22:05:49.857030 | orchestrator | 2025-03-23 22:05:49 | INFO  | Task 6f5ec00a-9f2e-40eb-a1d2-26a43473a853 is in state STARTED
2025-03-23 22:05:49.857225 | orchestrator | 2025-03-23 22:05:49 | INFO  | Wait 1 second(s) until the next check
2025-03-23 22:05:52.903840 | orchestrator | 2025-03-23 22:05:52 | INFO  | Task f712b796-38d0-4a95-8f80-410bcda57c10 is in state STARTED
2025-03-23 22:05:52.905840 | orchestrator | 2025-03-23 22:05:52 | INFO  | Task f3cd4eb5-5489-4d5d-9d13-5fb3ddbc8b2b is in state STARTED
2025-03-23 22:05:52.907050 | orchestrator | 2025-03-23 22:05:52 | INFO  | Task 6f5ec00a-9f2e-40eb-a1d2-26a43473a853 is in state STARTED
2025-03-23 22:05:52.907431 | orchestrator | 2025-03-23 22:05:52 | INFO  | Wait 1 second(s) until the next check
2025-03-23 22:05:55.959263 | orchestrator | 2025-03-23 22:05:55 | INFO  | Task f712b796-38d0-4a95-8f80-410bcda57c10 is in state STARTED
2025-03-23 22:05:55.960818 | orchestrator | 2025-03-23 22:05:55 | INFO  | Task f3cd4eb5-5489-4d5d-9d13-5fb3ddbc8b2b is in state STARTED
2025-03-23 22:05:55.969259 | orchestrator | 2025-03-23 22:05:55 | INFO  | Task 6f5ec00a-9f2e-40eb-a1d2-26a43473a853 is in state SUCCESS
2025-03-23 22:05:55.972528 | orchestrator |
2025-03-23 22:05:55.972642 | orchestrator | None
2025-03-23 22:05:55.972660 | orchestrator |
2025-03-23 22:05:55.972675 | orchestrator | PLAY [Group hosts based on configuration] **************************************
2025-03-23 22:05:55.972690 | orchestrator |
2025-03-23 22:05:55.972704 | orchestrator | TASK [Group hosts based on Kolla action] ***************************************
2025-03-23 22:05:55.972719 | orchestrator | Sunday 23 March 2025 21:57:07 +0000 (0:00:00.621) 0:00:00.621 **********
2025-03-23 22:05:55.972733 | orchestrator | ok: [testbed-node-0]
2025-03-23 22:05:55.972749 | orchestrator | ok: [testbed-node-1]
2025-03-23 22:05:55.972763 | orchestrator | ok: [testbed-node-2]
2025-03-23 22:05:55.972777 | orchestrator |
2025-03-23 22:05:55.972792 | orchestrator | TASK [Group hosts based on enabled services] ***********************************
2025-03-23 22:05:55.972806 | orchestrator | Sunday 23 March 2025 21:57:08 +0000 (0:00:00.541) 0:00:01.162 **********
2025-03-23 22:05:55.972821 | orchestrator | ok: [testbed-node-0] => (item=enable_loadbalancer_True)
2025-03-23 22:05:55.972836 | orchestrator | ok: [testbed-node-1] => (item=enable_loadbalancer_True)
2025-03-23 22:05:55.972863 | orchestrator | ok: [testbed-node-2] => (item=enable_loadbalancer_True)
2025-03-23 22:05:55.972879 | orchestrator |
2025-03-23 22:05:55.972893 | orchestrator | PLAY [Apply role loadbalancer] *************************************************
2025-03-23 22:05:55.972908 | orchestrator |
2025-03-23 22:05:55.972974 | orchestrator | TASK [loadbalancer : include_tasks] ********************************************
2025-03-23 22:05:55.972992 | orchestrator | Sunday 23 March 2025 21:57:08 +0000 (0:00:00.634) 0:00:01.797 **********
2025-03-23 22:05:55.973007 | orchestrator | included: /ansible/roles/loadbalancer/tasks/deploy.yml for testbed-node-0, testbed-node-1, testbed-node-2
2025-03-23 22:05:55.973022 | orchestrator |
2025-03-23 22:05:55.973037 | orchestrator | TASK [loadbalancer : Check IPv6 support] ***************************************
2025-03-23 22:05:55.973052 | orchestrator | Sunday 23 March 2025 21:57:10 +0000 (0:00:01.952) 0:00:03.749 ********** 2025-03-
22:05:55.973067 | orchestrator | ok: [testbed-node-1] 2025-03-23 22:05:55.973081 | orchestrator | ok: [testbed-node-2] 2025-03-23 22:05:55.973096 | orchestrator | ok: [testbed-node-0] 2025-03-23 22:05:55.973110 | orchestrator | 2025-03-23 22:05:55.973125 | orchestrator | TASK [Setting sysctl values] *************************************************** 2025-03-23 22:05:55.973139 | orchestrator | Sunday 23 March 2025 21:57:12 +0000 (0:00:01.493) 0:00:05.242 ********** 2025-03-23 22:05:55.973154 | orchestrator | included: sysctl for testbed-node-0, testbed-node-1, testbed-node-2 2025-03-23 22:05:55.973169 | orchestrator | 2025-03-23 22:05:55.973184 | orchestrator | TASK [sysctl : Check IPv6 support] ********************************************* 2025-03-23 22:05:55.973199 | orchestrator | Sunday 23 March 2025 21:57:13 +0000 (0:00:01.394) 0:00:06.637 ********** 2025-03-23 22:05:55.973214 | orchestrator | ok: [testbed-node-0] 2025-03-23 22:05:55.973229 | orchestrator | ok: [testbed-node-1] 2025-03-23 22:05:55.973243 | orchestrator | ok: [testbed-node-2] 2025-03-23 22:05:55.973258 | orchestrator | 2025-03-23 22:05:55.973272 | orchestrator | TASK [sysctl : Setting sysctl values] ****************************************** 2025-03-23 22:05:55.973287 | orchestrator | Sunday 23 March 2025 21:57:15 +0000 (0:00:01.811) 0:00:08.448 ********** 2025-03-23 22:05:55.973301 | orchestrator | changed: [testbed-node-0] => (item={'name': 'net.ipv6.ip_nonlocal_bind', 'value': 1}) 2025-03-23 22:05:55.973337 | orchestrator | changed: [testbed-node-1] => (item={'name': 'net.ipv6.ip_nonlocal_bind', 'value': 1}) 2025-03-23 22:05:55.973353 | orchestrator | changed: [testbed-node-2] => (item={'name': 'net.ipv6.ip_nonlocal_bind', 'value': 1}) 2025-03-23 22:05:55.973368 | orchestrator | changed: [testbed-node-1] => (item={'name': 'net.ipv4.ip_nonlocal_bind', 'value': 1}) 2025-03-23 22:05:55.973383 | orchestrator | changed: [testbed-node-2] => (item={'name': 'net.ipv4.ip_nonlocal_bind', 'value': 1}) 2025-03-23 22:05:55.973398 | orchestrator | changed: [testbed-node-0] => (item={'name': 'net.ipv4.ip_nonlocal_bind', 'value': 1}) 2025-03-23 22:05:55.973413 | orchestrator | ok: [testbed-node-1] => (item={'name': 'net.ipv4.tcp_retries2', 'value': 'KOLLA_UNSET'}) 2025-03-23 22:05:55.973430 | orchestrator | ok: [testbed-node-2] => (item={'name': 'net.ipv4.tcp_retries2', 'value': 'KOLLA_UNSET'}) 2025-03-23 22:05:55.973445 | orchestrator | changed: [testbed-node-1] => (item={'name': 'net.unix.max_dgram_qlen', 'value': 128}) 2025-03-23 22:05:55.973460 | orchestrator | changed: [testbed-node-2] => (item={'name': 'net.unix.max_dgram_qlen', 'value': 128}) 2025-03-23 22:05:55.973474 | orchestrator | ok: [testbed-node-0] => (item={'name': 'net.ipv4.tcp_retries2', 'value': 'KOLLA_UNSET'}) 2025-03-23 22:05:55.973489 | orchestrator | changed: [testbed-node-0] => (item={'name': 'net.unix.max_dgram_qlen', 'value': 128}) 2025-03-23 22:05:55.973504 | orchestrator | 2025-03-23 22:05:55.973519 | orchestrator | TASK [module-load : Load modules] ********************************************** 2025-03-23 22:05:55.973534 | orchestrator | Sunday 23 March 2025 21:57:20 +0000 (0:00:04.486) 0:00:12.935 ********** 2025-03-23 22:05:55.973549 | orchestrator | changed: [testbed-node-1] => (item=ip_vs) 2025-03-23 22:05:55.973595 | orchestrator | changed: [testbed-node-0] => (item=ip_vs) 2025-03-23 22:05:55.973612 | orchestrator | changed: [testbed-node-2] => (item=ip_vs) 2025-03-23 22:05:55.973626 | orchestrator | 2025-03-23 22:05:55.973641 | orchestrator | TASK 
[module-load : Persist modules via modules-load.d] ************************ 2025-03-23 22:05:55.973656 | orchestrator | Sunday 23 March 2025 21:57:21 +0000 (0:00:01.571) 0:00:14.506 ********** 2025-03-23 22:05:55.973671 | orchestrator | changed: [testbed-node-1] => (item=ip_vs) 2025-03-23 22:05:55.973694 | orchestrator | changed: [testbed-node-0] => (item=ip_vs) 2025-03-23 22:05:55.973710 | orchestrator | changed: [testbed-node-2] => (item=ip_vs) 2025-03-23 22:05:55.973724 | orchestrator | 2025-03-23 22:05:55.973739 | orchestrator | TASK [module-load : Drop module persistence] *********************************** 2025-03-23 22:05:55.973754 | orchestrator | Sunday 23 March 2025 21:57:23 +0000 (0:00:02.080) 0:00:16.587 ********** 2025-03-23 22:05:55.973769 | orchestrator | skipping: [testbed-node-0] => (item=ip_vs)  2025-03-23 22:05:55.973784 | orchestrator | skipping: [testbed-node-0] 2025-03-23 22:05:55.973811 | orchestrator | skipping: [testbed-node-1] => (item=ip_vs)  2025-03-23 22:05:55.973828 | orchestrator | skipping: [testbed-node-1] 2025-03-23 22:05:55.973843 | orchestrator | skipping: [testbed-node-2] => (item=ip_vs)  2025-03-23 22:05:55.973857 | orchestrator | skipping: [testbed-node-2] 2025-03-23 22:05:55.973872 | orchestrator | 2025-03-23 22:05:55.973886 | orchestrator | TASK [loadbalancer : Ensuring config directories exist] ************************ 2025-03-23 22:05:55.973901 | orchestrator | Sunday 23 March 2025 21:57:24 +0000 (0:00:01.283) 0:00:17.870 ********** 2025-03-23 22:05:55.973917 | orchestrator | changed: [testbed-node-2] => (item={'key': 'haproxy', 'value': {'container_name': 'haproxy', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/haproxy:2.4.24.20241206', 'privileged': True, 'volumes': ['/etc/kolla/haproxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'letsencrypt_certificates:/etc/haproxy/certificates'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:61313'], 'timeout': '30'}}}) 2025-03-23 22:05:55.973939 | orchestrator | changed: [testbed-node-1] => (item={'key': 'haproxy', 'value': {'container_name': 'haproxy', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/haproxy:2.4.24.20241206', 'privileged': True, 'volumes': ['/etc/kolla/haproxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'letsencrypt_certificates:/etc/haproxy/certificates'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:61313'], 'timeout': '30'}}}) 2025-03-23 22:05:55.973965 | orchestrator | changed: [testbed-node-0] => (item={'key': 'haproxy', 'value': {'container_name': 'haproxy', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/haproxy:2.4.24.20241206', 'privileged': True, 'volumes': ['/etc/kolla/haproxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'letsencrypt_certificates:/etc/haproxy/certificates'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:61313'], 'timeout': '30'}}}) 
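The container definitions in this task attach healthchecks such as healthcheck_curl http://192.168.16.10:61313 for haproxy and healthcheck_listen proxysql 6032 for proxysql. As a rough illustration only (the kolla images ship their own healthcheck_curl script, which is not shown here), a curl-style check reduces to an HTTP probe that exits 0 when the endpoint answers and 1 otherwise:

```python
import sys
import urllib.request

# Illustrative stand-in for a curl-style container healthcheck; not the
# actual script used inside the kolla haproxy image.
def healthcheck_curl(url: str, timeout: float = 30.0) -> int:
    try:
        with urllib.request.urlopen(url, timeout=timeout) as response:
            # Any answer below 400 counts as healthy for this sketch.
            return 0 if response.status < 400 else 1
    except Exception:
        return 1

if __name__ == "__main__":
    # Example endpoint taken from the log above; adjust as needed.
    target = sys.argv[1] if len(sys.argv) > 1 else "http://192.168.16.10:61313"
    sys.exit(healthcheck_curl(target))
```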
2025-03-23 22:05:55.973981 | orchestrator | changed: [testbed-node-2] => (item={'key': 'proxysql', 'value': {'container_name': 'proxysql', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/proxysql:2.6.6.20241206', 'privileged': False, 'volumes': ['/etc/kolla/proxysql/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'proxysql:/var/lib/proxysql/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen proxysql 6032'], 'timeout': '30'}}}) 2025-03-23 22:05:55.973998 | orchestrator | changed: [testbed-node-1] => (item={'key': 'proxysql', 'value': {'container_name': 'proxysql', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/proxysql:2.6.6.20241206', 'privileged': False, 'volumes': ['/etc/kolla/proxysql/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'proxysql:/var/lib/proxysql/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen proxysql 6032'], 'timeout': '30'}}}) 2025-03-23 22:05:55.974112 | orchestrator | changed: [testbed-node-0] => (item={'key': 'proxysql', 'value': {'container_name': 'proxysql', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/proxysql:2.6.6.20241206', 'privileged': False, 'volumes': ['/etc/kolla/proxysql/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'proxysql:/var/lib/proxysql/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen proxysql 6032'], 'timeout': '30'}}}) 2025-03-23 22:05:55.974133 | orchestrator | changed: [testbed-node-2] => (item={'key': 'keepalived', 'value': {'container_name': 'keepalived', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/keepalived:2.2.4.20241206', 'privileged': True, 'volumes': ['/etc/kolla/keepalived/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}}}) 2025-03-23 22:05:55.974149 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'haproxy-ssh', 'value': {'container_name': 'haproxy_ssh', 'group': 'loadbalancer', 'enabled': False, 'image': 'registry.osism.tech/kolla/release/haproxy-ssh:8.9.20241206', 'volumes': ['/etc/kolla/haproxy-ssh/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', '__omit_place_holder__0af645f6b0d649ef4d296a6df82609ef825c7421', '__omit_place_holder__0af645f6b0d649ef4d296a6df82609ef825c7421'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen sshd 2985'], 'timeout': '30'}}})  2025-03-23 22:05:55.974173 | orchestrator | changed: [testbed-node-1] => (item={'key': 'keepalived', 'value': {'container_name': 'keepalived', 'group': 'loadbalancer', 'enabled': True, 'image': 
'registry.osism.tech/kolla/release/keepalived:2.2.4.20241206', 'privileged': True, 'volumes': ['/etc/kolla/keepalived/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}}}) 2025-03-23 22:05:55.974188 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'haproxy-ssh', 'value': {'container_name': 'haproxy_ssh', 'group': 'loadbalancer', 'enabled': False, 'image': 'registry.osism.tech/kolla/release/haproxy-ssh:8.9.20241206', 'volumes': ['/etc/kolla/haproxy-ssh/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', '__omit_place_holder__0af645f6b0d649ef4d296a6df82609ef825c7421', '__omit_place_holder__0af645f6b0d649ef4d296a6df82609ef825c7421'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen sshd 2985'], 'timeout': '30'}}})  2025-03-23 22:05:55.974203 | orchestrator | changed: [testbed-node-0] => (item={'key': 'keepalived', 'value': {'container_name': 'keepalived', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/keepalived:2.2.4.20241206', 'privileged': True, 'volumes': ['/etc/kolla/keepalived/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}}}) 2025-03-23 22:05:55.974218 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'haproxy-ssh', 'value': {'container_name': 'haproxy_ssh', 'group': 'loadbalancer', 'enabled': False, 'image': 'registry.osism.tech/kolla/release/haproxy-ssh:8.9.20241206', 'volumes': ['/etc/kolla/haproxy-ssh/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', '__omit_place_holder__0af645f6b0d649ef4d296a6df82609ef825c7421', '__omit_place_holder__0af645f6b0d649ef4d296a6df82609ef825c7421'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen sshd 2985'], 'timeout': '30'}}})  2025-03-23 22:05:55.974233 | orchestrator | 2025-03-23 22:05:55.974247 | orchestrator | TASK [loadbalancer : Ensuring haproxy service config subdir exists] ************ 2025-03-23 22:05:55.974261 | orchestrator | Sunday 23 March 2025 21:57:28 +0000 (0:00:03.564) 0:00:21.434 ********** 2025-03-23 22:05:55.974276 | orchestrator | changed: [testbed-node-0] 2025-03-23 22:05:55.974290 | orchestrator | changed: [testbed-node-1] 2025-03-23 22:05:55.974304 | orchestrator | changed: [testbed-node-2] 2025-03-23 22:05:55.974318 | orchestrator | 2025-03-23 22:05:55.974337 | orchestrator | TASK [loadbalancer : Ensuring proxysql service config subdirectories exist] **** 2025-03-23 22:05:55.974352 | orchestrator | Sunday 23 March 2025 21:57:32 +0000 (0:00:04.326) 0:00:25.760 ********** 2025-03-23 22:05:55.974366 | orchestrator | changed: [testbed-node-1] => (item=users) 2025-03-23 22:05:55.974380 | orchestrator | changed: [testbed-node-0] => (item=users) 2025-03-23 22:05:55.974394 | orchestrator | changed: [testbed-node-2] => (item=users) 2025-03-23 22:05:55.974408 | orchestrator | changed: [testbed-node-0] => (item=rules) 2025-03-23 22:05:55.974429 | 
orchestrator | changed: [testbed-node-1] => (item=rules) 2025-03-23 22:05:55.974443 | orchestrator | changed: [testbed-node-2] => (item=rules) 2025-03-23 22:05:55.974457 | orchestrator | 2025-03-23 22:05:55.974471 | orchestrator | TASK [loadbalancer : Ensuring keepalived checks subdir exists] ***************** 2025-03-23 22:05:55.974486 | orchestrator | Sunday 23 March 2025 21:57:40 +0000 (0:00:07.534) 0:00:33.295 ********** 2025-03-23 22:05:55.974500 | orchestrator | changed: [testbed-node-0] 2025-03-23 22:05:55.974514 | orchestrator | changed: [testbed-node-1] 2025-03-23 22:05:55.974527 | orchestrator | changed: [testbed-node-2] 2025-03-23 22:05:55.974541 | orchestrator | 2025-03-23 22:05:55.974603 | orchestrator | TASK [loadbalancer : Remove mariadb.cfg if proxysql enabled] ******************* 2025-03-23 22:05:55.974620 | orchestrator | Sunday 23 March 2025 21:57:42 +0000 (0:00:02.104) 0:00:35.400 ********** 2025-03-23 22:05:55.974635 | orchestrator | ok: [testbed-node-0] 2025-03-23 22:05:55.974649 | orchestrator | ok: [testbed-node-1] 2025-03-23 22:05:55.974664 | orchestrator | ok: [testbed-node-2] 2025-03-23 22:05:55.974678 | orchestrator | 2025-03-23 22:05:55.974692 | orchestrator | TASK [loadbalancer : Removing checks for services which are disabled] ********** 2025-03-23 22:05:55.974706 | orchestrator | Sunday 23 March 2025 21:57:45 +0000 (0:00:02.659) 0:00:38.059 ********** 2025-03-23 22:05:55.974721 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'haproxy', 'value': {'container_name': 'haproxy', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/haproxy:2.4.24.20241206', 'privileged': True, 'volumes': ['/etc/kolla/haproxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'letsencrypt_certificates:/etc/haproxy/certificates'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:61313'], 'timeout': '30'}}})  2025-03-23 22:05:55.974736 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'haproxy', 'value': {'container_name': 'haproxy', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/haproxy:2.4.24.20241206', 'privileged': True, 'volumes': ['/etc/kolla/haproxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'letsencrypt_certificates:/etc/haproxy/certificates'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:61313'], 'timeout': '30'}}})  2025-03-23 22:05:55.974751 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'proxysql', 'value': {'container_name': 'proxysql', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/proxysql:2.6.6.20241206', 'privileged': False, 'volumes': ['/etc/kolla/proxysql/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'proxysql:/var/lib/proxysql/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen proxysql 6032'], 'timeout': '30'}}})  2025-03-23 22:05:55.974765 | orchestrator | skipping: [testbed-node-1] => (item={'key': 
'haproxy', 'value': {'container_name': 'haproxy', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/haproxy:2.4.24.20241206', 'privileged': True, 'volumes': ['/etc/kolla/haproxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'letsencrypt_certificates:/etc/haproxy/certificates'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:61313'], 'timeout': '30'}}})  2025-03-23 22:05:55.974803 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'proxysql', 'value': {'container_name': 'proxysql', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/proxysql:2.6.6.20241206', 'privileged': False, 'volumes': ['/etc/kolla/proxysql/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'proxysql:/var/lib/proxysql/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen proxysql 6032'], 'timeout': '30'}}})  2025-03-23 22:05:55.974818 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'proxysql', 'value': {'container_name': 'proxysql', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/proxysql:2.6.6.20241206', 'privileged': False, 'volumes': ['/etc/kolla/proxysql/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'proxysql:/var/lib/proxysql/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen proxysql 6032'], 'timeout': '30'}}})  2025-03-23 22:05:55.974833 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'keepalived', 'value': {'container_name': 'keepalived', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/keepalived:2.2.4.20241206', 'privileged': True, 'volumes': ['/etc/kolla/keepalived/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}}})  2025-03-23 22:05:55.974848 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'keepalived', 'value': {'container_name': 'keepalived', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/keepalived:2.2.4.20241206', 'privileged': True, 'volumes': ['/etc/kolla/keepalived/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}}})  2025-03-23 22:05:55.974862 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'keepalived', 'value': {'container_name': 'keepalived', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/keepalived:2.2.4.20241206', 'privileged': True, 'volumes': ['/etc/kolla/keepalived/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 
'haproxy_socket:/var/lib/kolla/haproxy/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}}})  2025-03-23 22:05:55.974877 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'haproxy-ssh', 'value': {'container_name': 'haproxy_ssh', 'group': 'loadbalancer', 'enabled': False, 'image': 'registry.osism.tech/kolla/release/haproxy-ssh:8.9.20241206', 'volumes': ['/etc/kolla/haproxy-ssh/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', '__omit_place_holder__0af645f6b0d649ef4d296a6df82609ef825c7421', '__omit_place_holder__0af645f6b0d649ef4d296a6df82609ef825c7421'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen sshd 2985'], 'timeout': '30'}}})  2025-03-23 22:05:55.974892 | orchestrator | skipping: [testbed-node-0] 2025-03-23 22:05:55.974906 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'haproxy-ssh', 'value': {'container_name': 'haproxy_ssh', 'group': 'loadbalancer', 'enabled': False, 'image': 'registry.osism.tech/kolla/release/haproxy-ssh:8.9.20241206', 'volumes': ['/etc/kolla/haproxy-ssh/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', '__omit_place_holder__0af645f6b0d649ef4d296a6df82609ef825c7421', '__omit_place_holder__0af645f6b0d649ef4d296a6df82609ef825c7421'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen sshd 2985'], 'timeout': '30'}}})  2025-03-23 22:05:55.974927 | orchestrator | skipping: [testbed-node-1] 2025-03-23 22:05:55.974972 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'haproxy-ssh', 'value': {'container_name': 'haproxy_ssh', 'group': 'loadbalancer', 'enabled': False, 'image': 'registry.osism.tech/kolla/release/haproxy-ssh:8.9.20241206', 'volumes': ['/etc/kolla/haproxy-ssh/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', '__omit_place_holder__0af645f6b0d649ef4d296a6df82609ef825c7421', '__omit_place_holder__0af645f6b0d649ef4d296a6df82609ef825c7421'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen sshd 2985'], 'timeout': '30'}}})  2025-03-23 22:05:55.974989 | orchestrator | skipping: [testbed-node-2] 2025-03-23 22:05:55.975003 | orchestrator | 2025-03-23 22:05:55.975017 | orchestrator | TASK [loadbalancer : Copying checks for services which are enabled] ************ 2025-03-23 22:05:55.975106 | orchestrator | Sunday 23 March 2025 21:57:50 +0000 (0:00:04.877) 0:00:42.936 ********** 2025-03-23 22:05:55.975123 | orchestrator | changed: [testbed-node-0] => (item={'key': 'haproxy', 'value': {'container_name': 'haproxy', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/haproxy:2.4.24.20241206', 'privileged': True, 'volumes': ['/etc/kolla/haproxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'letsencrypt_certificates:/etc/haproxy/certificates'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:61313'], 'timeout': '30'}}}) 2025-03-23 22:05:55.975176 | orchestrator | 
changed: [testbed-node-1] => (item={'key': 'haproxy', 'value': {'container_name': 'haproxy', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/haproxy:2.4.24.20241206', 'privileged': True, 'volumes': ['/etc/kolla/haproxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'letsencrypt_certificates:/etc/haproxy/certificates'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:61313'], 'timeout': '30'}}}) 2025-03-23 22:05:55.975193 | orchestrator | changed: [testbed-node-2] => (item={'key': 'haproxy', 'value': {'container_name': 'haproxy', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/haproxy:2.4.24.20241206', 'privileged': True, 'volumes': ['/etc/kolla/haproxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'letsencrypt_certificates:/etc/haproxy/certificates'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:61313'], 'timeout': '30'}}}) 2025-03-23 22:05:55.975208 | orchestrator | changed: [testbed-node-0] => (item={'key': 'proxysql', 'value': {'container_name': 'proxysql', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/proxysql:2.6.6.20241206', 'privileged': False, 'volumes': ['/etc/kolla/proxysql/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'proxysql:/var/lib/proxysql/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen proxysql 6032'], 'timeout': '30'}}}) 2025-03-23 22:05:55.975237 | orchestrator | changed: [testbed-node-1] => (item={'key': 'proxysql', 'value': {'container_name': 'proxysql', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/proxysql:2.6.6.20241206', 'privileged': False, 'volumes': ['/etc/kolla/proxysql/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'proxysql:/var/lib/proxysql/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen proxysql 6032'], 'timeout': '30'}}}) 2025-03-23 22:05:55.975252 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'keepalived', 'value': {'container_name': 'keepalived', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/keepalived:2.2.4.20241206', 'privileged': True, 'volumes': ['/etc/kolla/keepalived/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}}})  2025-03-23 22:05:55.975267 | orchestrator | changed: [testbed-node-2] => (item={'key': 'proxysql', 'value': {'container_name': 'proxysql', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/proxysql:2.6.6.20241206', 'privileged': False, 'volumes': 
['/etc/kolla/proxysql/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'proxysql:/var/lib/proxysql/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen proxysql 6032'], 'timeout': '30'}}}) 2025-03-23 22:05:55.975281 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'keepalived', 'value': {'container_name': 'keepalived', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/keepalived:2.2.4.20241206', 'privileged': True, 'volumes': ['/etc/kolla/keepalived/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}}})  2025-03-23 22:05:55.975296 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'haproxy-ssh', 'value': {'container_name': 'haproxy_ssh', 'group': 'loadbalancer', 'enabled': False, 'image': 'registry.osism.tech/kolla/release/haproxy-ssh:8.9.20241206', 'volumes': ['/etc/kolla/haproxy-ssh/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', '__omit_place_holder__0af645f6b0d649ef4d296a6df82609ef825c7421', '__omit_place_holder__0af645f6b0d649ef4d296a6df82609ef825c7421'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen sshd 2985'], 'timeout': '30'}}})  2025-03-23 22:05:55.975310 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'keepalived', 'value': {'container_name': 'keepalived', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/keepalived:2.2.4.20241206', 'privileged': True, 'volumes': ['/etc/kolla/keepalived/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}}})  2025-03-23 22:05:55.975325 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'haproxy-ssh', 'value': {'container_name': 'haproxy_ssh', 'group': 'loadbalancer', 'enabled': False, 'image': 'registry.osism.tech/kolla/release/haproxy-ssh:8.9.20241206', 'volumes': ['/etc/kolla/haproxy-ssh/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', '__omit_place_holder__0af645f6b0d649ef4d296a6df82609ef825c7421', '__omit_place_holder__0af645f6b0d649ef4d296a6df82609ef825c7421'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen sshd 2985'], 'timeout': '30'}}})  2025-03-23 22:05:55.975360 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'haproxy-ssh', 'value': {'container_name': 'haproxy_ssh', 'group': 'loadbalancer', 'enabled': False, 'image': 'registry.osism.tech/kolla/release/haproxy-ssh:8.9.20241206', 'volumes': ['/etc/kolla/haproxy-ssh/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', '__omit_place_holder__0af645f6b0d649ef4d296a6df82609ef825c7421', 
'__omit_place_holder__0af645f6b0d649ef4d296a6df82609ef825c7421'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen sshd 2985'], 'timeout': '30'}}})  2025-03-23 22:05:55.975375 | orchestrator | 2025-03-23 22:05:55.975389 | orchestrator | TASK [loadbalancer : Copying over config.json files for services] ************** 2025-03-23 22:05:55.975403 | orchestrator | Sunday 23 March 2025 21:57:55 +0000 (0:00:05.265) 0:00:48.201 ********** 2025-03-23 22:05:55.975418 | orchestrator | changed: [testbed-node-1] => (item={'key': 'haproxy', 'value': {'container_name': 'haproxy', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/haproxy:2.4.24.20241206', 'privileged': True, 'volumes': ['/etc/kolla/haproxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'letsencrypt_certificates:/etc/haproxy/certificates'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:61313'], 'timeout': '30'}}}) 2025-03-23 22:05:55.975432 | orchestrator | changed: [testbed-node-2] => (item={'key': 'haproxy', 'value': {'container_name': 'haproxy', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/haproxy:2.4.24.20241206', 'privileged': True, 'volumes': ['/etc/kolla/haproxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'letsencrypt_certificates:/etc/haproxy/certificates'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:61313'], 'timeout': '30'}}}) 2025-03-23 22:05:55.975447 | orchestrator | changed: [testbed-node-0] => (item={'key': 'haproxy', 'value': {'container_name': 'haproxy', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/haproxy:2.4.24.20241206', 'privileged': True, 'volumes': ['/etc/kolla/haproxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'letsencrypt_certificates:/etc/haproxy/certificates'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:61313'], 'timeout': '30'}}}) 2025-03-23 22:05:55.975461 | orchestrator | changed: [testbed-node-1] => (item={'key': 'proxysql', 'value': {'container_name': 'proxysql', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/proxysql:2.6.6.20241206', 'privileged': False, 'volumes': ['/etc/kolla/proxysql/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'proxysql:/var/lib/proxysql/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen proxysql 6032'], 'timeout': '30'}}}) 2025-03-23 22:05:55.975481 | orchestrator | changed: [testbed-node-2] => (item={'key': 'proxysql', 'value': {'container_name': 'proxysql', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/proxysql:2.6.6.20241206', 'privileged': False, 'volumes': 
['/etc/kolla/proxysql/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'proxysql:/var/lib/proxysql/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen proxysql 6032'], 'timeout': '30'}}}) 2025-03-23 22:05:55.975508 | orchestrator | changed: [testbed-node-0] => (item={'key': 'proxysql', 'value': {'container_name': 'proxysql', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/proxysql:2.6.6.20241206', 'privileged': False, 'volumes': ['/etc/kolla/proxysql/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'proxysql:/var/lib/proxysql/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen proxysql 6032'], 'timeout': '30'}}}) 2025-03-23 22:05:55.975523 | orchestrator | changed: [testbed-node-1] => (item={'key': 'keepalived', 'value': {'container_name': 'keepalived', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/keepalived:2.2.4.20241206', 'privileged': True, 'volumes': ['/etc/kolla/keepalived/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}}}) 2025-03-23 22:05:55.975538 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'haproxy-ssh', 'value': {'container_name': 'haproxy_ssh', 'group': 'loadbalancer', 'enabled': False, 'image': 'registry.osism.tech/kolla/release/haproxy-ssh:8.9.20241206', 'volumes': ['/etc/kolla/haproxy-ssh/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', '__omit_place_holder__0af645f6b0d649ef4d296a6df82609ef825c7421', '__omit_place_holder__0af645f6b0d649ef4d296a6df82609ef825c7421'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen sshd 2985'], 'timeout': '30'}}})  2025-03-23 22:05:55.975553 | orchestrator | changed: [testbed-node-2] => (item={'key': 'keepalived', 'value': {'container_name': 'keepalived', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/keepalived:2.2.4.20241206', 'privileged': True, 'volumes': ['/etc/kolla/keepalived/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}}}) 2025-03-23 22:05:55.975586 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'haproxy-ssh', 'value': {'container_name': 'haproxy_ssh', 'group': 'loadbalancer', 'enabled': False, 'image': 'registry.osism.tech/kolla/release/haproxy-ssh:8.9.20241206', 'volumes': ['/etc/kolla/haproxy-ssh/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', '__omit_place_holder__0af645f6b0d649ef4d296a6df82609ef825c7421', '__omit_place_holder__0af645f6b0d649ef4d296a6df82609ef825c7421'], 'dimensions': {}, 'healthcheck': 
{'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen sshd 2985'], 'timeout': '30'}}})  2025-03-23 22:05:55.975608 | orchestrator | changed: [testbed-node-0] => (item={'key': 'keepalived', 'value': {'container_name': 'keepalived', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/keepalived:2.2.4.20241206', 'privileged': True, 'volumes': ['/etc/kolla/keepalived/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}}}) 2025-03-23 22:05:55.975628 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'haproxy-ssh', 'value': {'container_name': 'haproxy_ssh', 'group': 'loadbalancer', 'enabled': False, 'image': 'registry.osism.tech/kolla/release/haproxy-ssh:8.9.20241206', 'volumes': ['/etc/kolla/haproxy-ssh/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', '__omit_place_holder__0af645f6b0d649ef4d296a6df82609ef825c7421', '__omit_place_holder__0af645f6b0d649ef4d296a6df82609ef825c7421'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen sshd 2985'], 'timeout': '30'}}})  2025-03-23 22:05:55.975656 | orchestrator | 2025-03-23 22:05:55.975671 | orchestrator | TASK [loadbalancer : Copying over haproxy.cfg] ********************************* 2025-03-23 22:05:55.975685 | orchestrator | Sunday 23 March 2025 21:57:59 +0000 (0:00:04.277) 0:00:52.478 ********** 2025-03-23 22:05:55.975770 | orchestrator | changed: [testbed-node-1] => (item=/ansible/roles/loadbalancer/templates/haproxy/haproxy_main.cfg.j2) 2025-03-23 22:05:55.975817 | orchestrator | changed: [testbed-node-0] => (item=/ansible/roles/loadbalancer/templates/haproxy/haproxy_main.cfg.j2) 2025-03-23 22:05:55.975834 | orchestrator | changed: [testbed-node-2] => (item=/ansible/roles/loadbalancer/templates/haproxy/haproxy_main.cfg.j2) 2025-03-23 22:05:55.975848 | orchestrator | 2025-03-23 22:05:55.975862 | orchestrator | TASK [loadbalancer : Copying over proxysql config] ***************************** 2025-03-23 22:05:55.975964 | orchestrator | Sunday 23 March 2025 21:58:03 +0000 (0:00:04.409) 0:00:56.887 ********** 2025-03-23 22:05:55.975979 | orchestrator | changed: [testbed-node-1] => (item=/ansible/roles/loadbalancer/templates/proxysql/proxysql.yaml.j2) 2025-03-23 22:05:55.975994 | orchestrator | changed: [testbed-node-0] => (item=/ansible/roles/loadbalancer/templates/proxysql/proxysql.yaml.j2) 2025-03-23 22:05:55.976009 | orchestrator | changed: [testbed-node-2] => (item=/ansible/roles/loadbalancer/templates/proxysql/proxysql.yaml.j2) 2025-03-23 22:05:55.976023 | orchestrator | 2025-03-23 22:05:55.976048 | orchestrator | TASK [loadbalancer : Copying over haproxy single external frontend config] ***** 2025-03-23 22:05:55.976063 | orchestrator | Sunday 23 March 2025 21:58:10 +0000 (0:00:06.054) 0:01:02.942 ********** 2025-03-23 22:05:55.976077 | orchestrator | skipping: [testbed-node-0] 2025-03-23 22:05:55.976091 | orchestrator | skipping: [testbed-node-1] 2025-03-23 22:05:55.976106 | orchestrator | skipping: [testbed-node-2] 2025-03-23 22:05:55.976120 | orchestrator | 2025-03-23 22:05:55.976134 | orchestrator | TASK [loadbalancer : Copying over custom haproxy services configuration] ******* 2025-03-23 
22:05:55.976148 | orchestrator | Sunday 23 March 2025 21:58:11 +0000 (0:00:01.543) 0:01:04.486 ********** 2025-03-23 22:05:55.976162 | orchestrator | changed: [testbed-node-0] => (item=/opt/configuration/environments/kolla/files/overlays/haproxy/services.d/haproxy.cfg) 2025-03-23 22:05:55.976188 | orchestrator | changed: [testbed-node-1] => (item=/opt/configuration/environments/kolla/files/overlays/haproxy/services.d/haproxy.cfg) 2025-03-23 22:05:55.976203 | orchestrator | changed: [testbed-node-2] => (item=/opt/configuration/environments/kolla/files/overlays/haproxy/services.d/haproxy.cfg) 2025-03-23 22:05:55.976217 | orchestrator | 2025-03-23 22:05:55.976241 | orchestrator | TASK [loadbalancer : Copying over keepalived.conf] ***************************** 2025-03-23 22:05:55.976256 | orchestrator | Sunday 23 March 2025 21:58:16 +0000 (0:00:05.011) 0:01:09.497 ********** 2025-03-23 22:05:55.976282 | orchestrator | changed: [testbed-node-0] => (item=/ansible/roles/loadbalancer/templates/keepalived/keepalived.conf.j2) 2025-03-23 22:05:55.976297 | orchestrator | changed: [testbed-node-1] => (item=/ansible/roles/loadbalancer/templates/keepalived/keepalived.conf.j2) 2025-03-23 22:05:55.976311 | orchestrator | changed: [testbed-node-2] => (item=/ansible/roles/loadbalancer/templates/keepalived/keepalived.conf.j2) 2025-03-23 22:05:55.976336 | orchestrator | 2025-03-23 22:05:55.976351 | orchestrator | TASK [loadbalancer : Copying over haproxy.pem] ********************************* 2025-03-23 22:05:55.976384 | orchestrator | Sunday 23 March 2025 21:58:19 +0000 (0:00:03.172) 0:01:12.669 ********** 2025-03-23 22:05:55.976399 | orchestrator | changed: [testbed-node-1] => (item=haproxy.pem) 2025-03-23 22:05:55.976418 | orchestrator | changed: [testbed-node-0] => (item=haproxy.pem) 2025-03-23 22:05:55.976433 | orchestrator | changed: [testbed-node-2] => (item=haproxy.pem) 2025-03-23 22:05:55.976448 | orchestrator | 2025-03-23 22:05:55.976473 | orchestrator | TASK [loadbalancer : Copying over haproxy-internal.pem] ************************ 2025-03-23 22:05:55.976488 | orchestrator | Sunday 23 March 2025 21:58:22 +0000 (0:00:02.366) 0:01:15.035 ********** 2025-03-23 22:05:55.976503 | orchestrator | changed: [testbed-node-0] => (item=haproxy-internal.pem) 2025-03-23 22:05:55.976527 | orchestrator | changed: [testbed-node-1] => (item=haproxy-internal.pem) 2025-03-23 22:05:55.976575 | orchestrator | changed: [testbed-node-2] => (item=haproxy-internal.pem) 2025-03-23 22:05:55.976592 | orchestrator | 2025-03-23 22:05:55.976611 | orchestrator | TASK [loadbalancer : include_tasks] ******************************************** 2025-03-23 22:05:55.976625 | orchestrator | Sunday 23 March 2025 21:58:24 +0000 (0:00:02.182) 0:01:17.218 ********** 2025-03-23 22:05:55.976640 | orchestrator | included: /ansible/roles/loadbalancer/tasks/copy-certs.yml for testbed-node-0, testbed-node-1, testbed-node-2 2025-03-23 22:05:55.976654 | orchestrator | 2025-03-23 22:05:55.976667 | orchestrator | TASK [service-cert-copy : loadbalancer | Copying over extra CA certificates] *** 2025-03-23 22:05:55.976681 | orchestrator | Sunday 23 March 2025 21:58:25 +0000 (0:00:00.969) 0:01:18.188 ********** 2025-03-23 22:05:55.976696 | orchestrator | changed: [testbed-node-1] => (item={'key': 'haproxy', 'value': {'container_name': 'haproxy', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/haproxy:2.4.24.20241206', 'privileged': True, 'volumes': ['/etc/kolla/haproxy/:/var/lib/kolla/config_files/:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'letsencrypt_certificates:/etc/haproxy/certificates'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:61313'], 'timeout': '30'}}}) 2025-03-23 22:05:55.976720 | orchestrator | changed: [testbed-node-0] => (item={'key': 'haproxy', 'value': {'container_name': 'haproxy', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/haproxy:2.4.24.20241206', 'privileged': True, 'volumes': ['/etc/kolla/haproxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'letsencrypt_certificates:/etc/haproxy/certificates'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:61313'], 'timeout': '30'}}}) 2025-03-23 22:05:55.976735 | orchestrator | changed: [testbed-node-2] => (item={'key': 'haproxy', 'value': {'container_name': 'haproxy', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/haproxy:2.4.24.20241206', 'privileged': True, 'volumes': ['/etc/kolla/haproxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'letsencrypt_certificates:/etc/haproxy/certificates'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:61313'], 'timeout': '30'}}}) 2025-03-23 22:05:55.976763 | orchestrator | changed: [testbed-node-1] => (item={'key': 'proxysql', 'value': {'container_name': 'proxysql', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/proxysql:2.6.6.20241206', 'privileged': False, 'volumes': ['/etc/kolla/proxysql/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'proxysql:/var/lib/proxysql/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen proxysql 6032'], 'timeout': '30'}}}) 2025-03-23 22:05:55.976779 | orchestrator | changed: [testbed-node-0] => (item={'key': 'proxysql', 'value': {'container_name': 'proxysql', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/proxysql:2.6.6.20241206', 'privileged': False, 'volumes': ['/etc/kolla/proxysql/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'proxysql:/var/lib/proxysql/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen proxysql 6032'], 'timeout': '30'}}}) 2025-03-23 22:05:55.976798 | orchestrator | changed: [testbed-node-2] => (item={'key': 'proxysql', 'value': {'container_name': 'proxysql', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/proxysql:2.6.6.20241206', 'privileged': False, 'volumes': ['/etc/kolla/proxysql/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'proxysql:/var/lib/proxysql/', 
'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen proxysql 6032'], 'timeout': '30'}}}) 2025-03-23 22:05:55.976813 | orchestrator | changed: [testbed-node-1] => (item={'key': 'keepalived', 'value': {'container_name': 'keepalived', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/keepalived:2.2.4.20241206', 'privileged': True, 'volumes': ['/etc/kolla/keepalived/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}}}) 2025-03-23 22:05:55.976839 | orchestrator | changed: [testbed-node-2] => (item={'key': 'keepalived', 'value': {'container_name': 'keepalived', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/keepalived:2.2.4.20241206', 'privileged': True, 'volumes': ['/etc/kolla/keepalived/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}}}) 2025-03-23 22:05:55.976855 | orchestrator | changed: [testbed-node-0] => (item={'key': 'keepalived', 'value': {'container_name': 'keepalived', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/keepalived:2.2.4.20241206', 'privileged': True, 'volumes': ['/etc/kolla/keepalived/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}}}) 2025-03-23 22:05:55.976869 | orchestrator | 2025-03-23 22:05:55.976884 | orchestrator | TASK [service-cert-copy : loadbalancer | Copying over backend internal TLS certificate] *** 2025-03-23 22:05:55.976898 | orchestrator | Sunday 23 March 2025 21:58:29 +0000 (0:00:03.816) 0:01:22.004 ********** 2025-03-23 22:05:55.976913 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'haproxy', 'value': {'container_name': 'haproxy', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/haproxy:2.4.24.20241206', 'privileged': True, 'volumes': ['/etc/kolla/haproxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'letsencrypt_certificates:/etc/haproxy/certificates'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:61313'], 'timeout': '30'}}})  2025-03-23 22:05:55.976936 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'proxysql', 'value': {'container_name': 'proxysql', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/proxysql:2.6.6.20241206', 'privileged': False, 'volumes': ['/etc/kolla/proxysql/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'proxysql:/var/lib/proxysql/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen proxysql 6032'], 'timeout': '30'}}})  
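Each item echoed by the loadbalancer tasks above is one entry of the role's service map; that entry supplies the image, volumes and healthcheck used by the config-copy and container-check steps. A minimal YAML sketch of that map, reconstructed only from values visible in this log (the variable name loadbalancer_services and the exact nesting follow the kolla-ansible loadbalancer role and are otherwise assumptions):

    loadbalancer_services:
      haproxy:
        container_name: haproxy
        group: loadbalancer
        enabled: true
        privileged: true
        image: registry.osism.tech/kolla/release/haproxy:2.4.24.20241206
        volumes:
          - /etc/kolla/haproxy/:/var/lib/kolla/config_files/:ro
          - haproxy_socket:/var/lib/kolla/haproxy/
          - letsencrypt_certificates:/etc/haproxy/certificates
        healthcheck:
          test: ["CMD-SHELL", "healthcheck_curl http://192.168.16.10:61313"]  # per-node monitor endpoint (.10/.11/.12 above)
          interval: "30"
          retries: "3"
          timeout: "30"
      proxysql:
        container_name: proxysql
        enabled: true
        privileged: false
        image: registry.osism.tech/kolla/release/proxysql:2.6.6.20241206
        volumes:
          - /etc/kolla/proxysql/:/var/lib/kolla/config_files/:ro
          - proxysql:/var/lib/proxysql/
          - proxysql_socket:/var/lib/kolla/proxysql/
        healthcheck:
          test: ["CMD-SHELL", "healthcheck_listen proxysql 6032"]  # ProxySQL admin port
      keepalived:
        container_name: keepalived
        enabled: true
        privileged: true            # VRRP for the VIP needs raw network access
        image: registry.osism.tech/kolla/release/keepalived:2.2.4.20241206
        volumes:
          - /etc/kolla/keepalived/:/var/lib/kolla/config_files/:ro
          - /lib/modules:/lib/modules:ro
      haproxy-ssh:
        container_name: haproxy_ssh
        enabled: false              # disabled in this deployment, hence the skipped items above

The "skipping" results above are consistent with this: keepalived is skipped by the "Copying checks for services which are enabled" task because its entry carries no healthcheck block, and haproxy-ssh is skipped throughout because it is disabled.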
2025-03-23 22:05:55.976955 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'keepalived', 'value': {'container_name': 'keepalived', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/keepalived:2.2.4.20241206', 'privileged': True, 'volumes': ['/etc/kolla/keepalived/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}}})  2025-03-23 22:05:55.976970 | orchestrator | skipping: [testbed-node-0] 2025-03-23 22:05:55.976985 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'haproxy', 'value': {'container_name': 'haproxy', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/haproxy:2.4.24.20241206', 'privileged': True, 'volumes': ['/etc/kolla/haproxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'letsencrypt_certificates:/etc/haproxy/certificates'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:61313'], 'timeout': '30'}}})  2025-03-23 22:05:55.976999 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'proxysql', 'value': {'container_name': 'proxysql', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/proxysql:2.6.6.20241206', 'privileged': False, 'volumes': ['/etc/kolla/proxysql/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'proxysql:/var/lib/proxysql/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen proxysql 6032'], 'timeout': '30'}}})  2025-03-23 22:05:55.977021 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'keepalived', 'value': {'container_name': 'keepalived', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/keepalived:2.2.4.20241206', 'privileged': True, 'volumes': ['/etc/kolla/keepalived/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}}})  2025-03-23 22:05:55.977036 | orchestrator | skipping: [testbed-node-1] 2025-03-23 22:05:55.977050 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'haproxy', 'value': {'container_name': 'haproxy', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/haproxy:2.4.24.20241206', 'privileged': True, 'volumes': ['/etc/kolla/haproxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'letsencrypt_certificates:/etc/haproxy/certificates'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:61313'], 'timeout': '30'}}})  2025-03-23 22:05:55.977071 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'proxysql', 'value': {'container_name': 'proxysql', 'group': 'loadbalancer', 'enabled': True, 'image': 
'registry.osism.tech/kolla/release/proxysql:2.6.6.20241206', 'privileged': False, 'volumes': ['/etc/kolla/proxysql/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'proxysql:/var/lib/proxysql/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen proxysql 6032'], 'timeout': '30'}}})  2025-03-23 22:05:55.977086 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'keepalived', 'value': {'container_name': 'keepalived', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/keepalived:2.2.4.20241206', 'privileged': True, 'volumes': ['/etc/kolla/keepalived/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}}})  2025-03-23 22:05:55.977101 | orchestrator | skipping: [testbed-node-2] 2025-03-23 22:05:55.977115 | orchestrator | 2025-03-23 22:05:55.977129 | orchestrator | TASK [service-cert-copy : loadbalancer | Copying over backend internal TLS key] *** 2025-03-23 22:05:55.977143 | orchestrator | Sunday 23 March 2025 21:58:30 +0000 (0:00:01.272) 0:01:23.277 ********** 2025-03-23 22:05:55.977162 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'haproxy', 'value': {'container_name': 'haproxy', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/haproxy:2.4.24.20241206', 'privileged': True, 'volumes': ['/etc/kolla/haproxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'letsencrypt_certificates:/etc/haproxy/certificates'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:61313'], 'timeout': '30'}}})  2025-03-23 22:05:55.977177 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'proxysql', 'value': {'container_name': 'proxysql', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/proxysql:2.6.6.20241206', 'privileged': False, 'volumes': ['/etc/kolla/proxysql/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'proxysql:/var/lib/proxysql/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen proxysql 6032'], 'timeout': '30'}}})  2025-03-23 22:05:55.977197 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'keepalived', 'value': {'container_name': 'keepalived', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/keepalived:2.2.4.20241206', 'privileged': True, 'volumes': ['/etc/kolla/keepalived/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}}})  2025-03-23 22:05:55.977212 | orchestrator | skipping: [testbed-node-0] 2025-03-23 22:05:55.977227 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'haproxy', 'value': {'container_name': 'haproxy', 
'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/haproxy:2.4.24.20241206', 'privileged': True, 'volumes': ['/etc/kolla/haproxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'letsencrypt_certificates:/etc/haproxy/certificates'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:61313'], 'timeout': '30'}}})  2025-03-23 22:05:55.977248 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'proxysql', 'value': {'container_name': 'proxysql', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/proxysql:2.6.6.20241206', 'privileged': False, 'volumes': ['/etc/kolla/proxysql/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'proxysql:/var/lib/proxysql/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen proxysql 6032'], 'timeout': '30'}}})  2025-03-23 22:05:55.977262 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'keepalived', 'value': {'container_name': 'keepalived', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/keepalived:2.2.4.20241206', 'privileged': True, 'volumes': ['/etc/kolla/keepalived/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}}})  2025-03-23 22:05:55.977277 | orchestrator | skipping: [testbed-node-1] 2025-03-23 22:05:55.977295 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'haproxy', 'value': {'container_name': 'haproxy', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/haproxy:2.4.24.20241206', 'privileged': True, 'volumes': ['/etc/kolla/haproxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'letsencrypt_certificates:/etc/haproxy/certificates'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:61313'], 'timeout': '30'}}})  2025-03-23 22:05:55.977311 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'proxysql', 'value': {'container_name': 'proxysql', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/proxysql:2.6.6.20241206', 'privileged': False, 'volumes': ['/etc/kolla/proxysql/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'proxysql:/var/lib/proxysql/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen proxysql 6032'], 'timeout': '30'}}})  2025-03-23 22:05:55.977325 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'keepalived', 'value': {'container_name': 'keepalived', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/keepalived:2.2.4.20241206', 'privileged': True, 'volumes': 
['/etc/kolla/keepalived/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}}})  2025-03-23 22:05:55.977340 | orchestrator | skipping: [testbed-node-2] 2025-03-23 22:05:55.977354 | orchestrator | 2025-03-23 22:05:55.977368 | orchestrator | TASK [loadbalancer : Copying over haproxy start script] ************************ 2025-03-23 22:05:55.977387 | orchestrator | Sunday 23 March 2025 21:58:32 +0000 (0:00:01.955) 0:01:25.232 ********** 2025-03-23 22:05:55.977408 | orchestrator | changed: [testbed-node-1] => (item=/ansible/roles/loadbalancer/templates/haproxy/haproxy_run.sh.j2) 2025-03-23 22:05:55.977423 | orchestrator | changed: [testbed-node-0] => (item=/ansible/roles/loadbalancer/templates/haproxy/haproxy_run.sh.j2) 2025-03-23 22:05:55.977437 | orchestrator | changed: [testbed-node-2] => (item=/ansible/roles/loadbalancer/templates/haproxy/haproxy_run.sh.j2) 2025-03-23 22:05:55.977452 | orchestrator | 2025-03-23 22:05:55.977466 | orchestrator | TASK [loadbalancer : Copying over proxysql start script] *********************** 2025-03-23 22:05:55.977480 | orchestrator | Sunday 23 March 2025 21:58:36 +0000 (0:00:03.862) 0:01:29.094 ********** 2025-03-23 22:05:55.977494 | orchestrator | changed: [testbed-node-0] => (item=/ansible/roles/loadbalancer/templates/proxysql/proxysql_run.sh.j2) 2025-03-23 22:05:55.977508 | orchestrator | changed: [testbed-node-1] => (item=/ansible/roles/loadbalancer/templates/proxysql/proxysql_run.sh.j2) 2025-03-23 22:05:55.977523 | orchestrator | changed: [testbed-node-2] => (item=/ansible/roles/loadbalancer/templates/proxysql/proxysql_run.sh.j2) 2025-03-23 22:05:55.977537 | orchestrator | 2025-03-23 22:05:55.977551 | orchestrator | TASK [loadbalancer : Copying files for haproxy-ssh] **************************** 2025-03-23 22:05:55.977621 | orchestrator | Sunday 23 March 2025 21:58:38 +0000 (0:00:02.039) 0:01:31.134 ********** 2025-03-23 22:05:55.977636 | orchestrator | skipping: [testbed-node-0] => (item={'src': 'haproxy-ssh/sshd_config.j2', 'dest': 'sshd_config'})  2025-03-23 22:05:55.977651 | orchestrator | skipping: [testbed-node-1] => (item={'src': 'haproxy-ssh/sshd_config.j2', 'dest': 'sshd_config'})  2025-03-23 22:05:55.977665 | orchestrator | skipping: [testbed-node-2] => (item={'src': 'haproxy-ssh/sshd_config.j2', 'dest': 'sshd_config'})  2025-03-23 22:05:55.977679 | orchestrator | skipping: [testbed-node-0] => (item={'src': 'haproxy-ssh/id_rsa.pub', 'dest': 'id_rsa.pub'})  2025-03-23 22:05:55.977693 | orchestrator | skipping: [testbed-node-0] 2025-03-23 22:05:55.977707 | orchestrator | skipping: [testbed-node-2] => (item={'src': 'haproxy-ssh/id_rsa.pub', 'dest': 'id_rsa.pub'})  2025-03-23 22:05:55.977721 | orchestrator | skipping: [testbed-node-2] 2025-03-23 22:05:55.977737 | orchestrator | skipping: [testbed-node-1] => (item={'src': 'haproxy-ssh/id_rsa.pub', 'dest': 'id_rsa.pub'})  2025-03-23 22:05:55.977750 | orchestrator | skipping: [testbed-node-1] 2025-03-23 22:05:55.977763 | orchestrator | 2025-03-23 22:05:55.977775 | orchestrator | TASK [loadbalancer : Check loadbalancer containers] **************************** 2025-03-23 22:05:55.977788 | orchestrator | Sunday 23 March 2025 21:58:40 +0000 (0:00:02.055) 0:01:33.189 ********** 2025-03-23 22:05:55.977801 | orchestrator | changed: [testbed-node-2] => (item={'key': 'haproxy', 'value': {'container_name': 'haproxy', 
'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/haproxy:2.4.24.20241206', 'privileged': True, 'volumes': ['/etc/kolla/haproxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'letsencrypt_certificates:/etc/haproxy/certificates'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:61313'], 'timeout': '30'}}}) 2025-03-23 22:05:55.977814 | orchestrator | changed: [testbed-node-1] => (item={'key': 'haproxy', 'value': {'container_name': 'haproxy', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/haproxy:2.4.24.20241206', 'privileged': True, 'volumes': ['/etc/kolla/haproxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'letsencrypt_certificates:/etc/haproxy/certificates'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:61313'], 'timeout': '30'}}}) 2025-03-23 22:05:55.977827 | orchestrator | changed: [testbed-node-0] => (item={'key': 'haproxy', 'value': {'container_name': 'haproxy', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/haproxy:2.4.24.20241206', 'privileged': True, 'volumes': ['/etc/kolla/haproxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'letsencrypt_certificates:/etc/haproxy/certificates'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:61313'], 'timeout': '30'}}}) 2025-03-23 22:05:55.977860 | orchestrator | changed: [testbed-node-2] => (item={'key': 'proxysql', 'value': {'container_name': 'proxysql', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/proxysql:2.6.6.20241206', 'privileged': False, 'volumes': ['/etc/kolla/proxysql/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'proxysql:/var/lib/proxysql/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen proxysql 6032'], 'timeout': '30'}}}) 2025-03-23 22:05:55.977874 | orchestrator | changed: [testbed-node-1] => (item={'key': 'proxysql', 'value': {'container_name': 'proxysql', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/proxysql:2.6.6.20241206', 'privileged': False, 'volumes': ['/etc/kolla/proxysql/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'proxysql:/var/lib/proxysql/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen proxysql 6032'], 'timeout': '30'}}}) 2025-03-23 22:05:55.977888 | orchestrator | changed: [testbed-node-0] => (item={'key': 'proxysql', 'value': {'container_name': 'proxysql', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/proxysql:2.6.6.20241206', 'privileged': False, 
'volumes': ['/etc/kolla/proxysql/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'proxysql:/var/lib/proxysql/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen proxysql 6032'], 'timeout': '30'}}}) 2025-03-23 22:05:55.977900 | orchestrator | changed: [testbed-node-2] => (item={'key': 'keepalived', 'value': {'container_name': 'keepalived', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/keepalived:2.2.4.20241206', 'privileged': True, 'volumes': ['/etc/kolla/keepalived/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}}}) 2025-03-23 22:05:55.977913 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'haproxy-ssh', 'value': {'container_name': 'haproxy_ssh', 'group': 'loadbalancer', 'enabled': False, 'image': 'registry.osism.tech/kolla/release/haproxy-ssh:8.9.20241206', 'volumes': ['/etc/kolla/haproxy-ssh/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', '__omit_place_holder__0af645f6b0d649ef4d296a6df82609ef825c7421', '__omit_place_holder__0af645f6b0d649ef4d296a6df82609ef825c7421'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen sshd 2985'], 'timeout': '30'}}})  2025-03-23 22:05:55.977926 | orchestrator | changed: [testbed-node-1] => (item={'key': 'keepalived', 'value': {'container_name': 'keepalived', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/keepalived:2.2.4.20241206', 'privileged': True, 'volumes': ['/etc/kolla/keepalived/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}}}) 2025-03-23 22:05:55.977952 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'haproxy-ssh', 'value': {'container_name': 'haproxy_ssh', 'group': 'loadbalancer', 'enabled': False, 'image': 'registry.osism.tech/kolla/release/haproxy-ssh:8.9.20241206', 'volumes': ['/etc/kolla/haproxy-ssh/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', '__omit_place_holder__0af645f6b0d649ef4d296a6df82609ef825c7421', '__omit_place_holder__0af645f6b0d649ef4d296a6df82609ef825c7421'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen sshd 2985'], 'timeout': '30'}}})  2025-03-23 22:05:55.977965 | orchestrator | changed: [testbed-node-0] => (item={'key': 'keepalived', 'value': {'container_name': 'keepalived', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/keepalived:2.2.4.20241206', 'privileged': True, 'volumes': ['/etc/kolla/keepalived/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': 
{}}}) 2025-03-23 22:05:55.977978 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'haproxy-ssh', 'value': {'container_name': 'haproxy_ssh', 'group': 'loadbalancer', 'enabled': False, 'image': 'registry.osism.tech/kolla/release/haproxy-ssh:8.9.20241206', 'volumes': ['/etc/kolla/haproxy-ssh/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', '__omit_place_holder__0af645f6b0d649ef4d296a6df82609ef825c7421', '__omit_place_holder__0af645f6b0d649ef4d296a6df82609ef825c7421'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen sshd 2985'], 'timeout': '30'}}})  2025-03-23 22:05:55.977991 | orchestrator | 2025-03-23 22:05:55.978004 | orchestrator | TASK [include_role : aodh] ***************************************************** 2025-03-23 22:05:55.978045 | orchestrator | Sunday 23 March 2025 21:58:43 +0000 (0:00:03.486) 0:01:36.675 ********** 2025-03-23 22:05:55.978061 | orchestrator | included: aodh for testbed-node-0, testbed-node-1, testbed-node-2 2025-03-23 22:05:55.978074 | orchestrator | 2025-03-23 22:05:55.978087 | orchestrator | TASK [haproxy-config : Copying over aodh haproxy config] *********************** 2025-03-23 22:05:55.978099 | orchestrator | Sunday 23 March 2025 21:58:45 +0000 (0:00:01.276) 0:01:37.952 ********** 2025-03-23 22:05:55.978112 | orchestrator | changed: [testbed-node-0] => (item={'key': 'aodh-api', 'value': {'container_name': 'aodh_api', 'group': 'aodh-api', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/aodh-api:18.0.1.20241206', 'volumes': ['/etc/kolla/aodh-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'aodh:/var/lib/aodh/', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:8042'], 'timeout': '30'}, 'haproxy': {'aodh_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '8042', 'listen_port': '8042'}, 'aodh_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8042', 'listen_port': '8042'}}}}) 2025-03-23 22:05:55.978127 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'aodh-evaluator', 'value': {'container_name': 'aodh_evaluator', 'group': 'aodh-evaluator', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/aodh-evaluator:18.0.1.20241206', 'volumes': ['/etc/kolla/aodh-evaluator/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port aodh-evaluator 3306'], 'timeout': '30'}}})  2025-03-23 22:05:55.978165 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'aodh-listener', 'value': {'container_name': 'aodh_listener', 'group': 'aodh-listener', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/aodh-listener:18.0.1.20241206', 'volumes': ['/etc/kolla/aodh-listener/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port aodh-listener 5672'], 
'timeout': '30'}}})  2025-03-23 22:05:55.978187 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'aodh-notifier', 'value': {'container_name': 'aodh_notifier', 'group': 'aodh-notifier', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/aodh-notifier:18.0.1.20241206', 'volumes': ['/etc/kolla/aodh-notifier/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port aodh-notifier 5672'], 'timeout': '30'}}})  2025-03-23 22:05:55.978202 | orchestrator | changed: [testbed-node-2] => (item={'key': 'aodh-api', 'value': {'container_name': 'aodh_api', 'group': 'aodh-api', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/aodh-api:18.0.1.20241206', 'volumes': ['/etc/kolla/aodh-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'aodh:/var/lib/aodh/', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:8042'], 'timeout': '30'}, 'haproxy': {'aodh_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '8042', 'listen_port': '8042'}, 'aodh_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8042', 'listen_port': '8042'}}}}) 2025-03-23 22:05:55.978215 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'aodh-evaluator', 'value': {'container_name': 'aodh_evaluator', 'group': 'aodh-evaluator', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/aodh-evaluator:18.0.1.20241206', 'volumes': ['/etc/kolla/aodh-evaluator/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port aodh-evaluator 3306'], 'timeout': '30'}}})  2025-03-23 22:05:55.978228 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'aodh-listener', 'value': {'container_name': 'aodh_listener', 'group': 'aodh-listener', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/aodh-listener:18.0.1.20241206', 'volumes': ['/etc/kolla/aodh-listener/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port aodh-listener 5672'], 'timeout': '30'}}})  2025-03-23 22:05:55.978241 | orchestrator | changed: [testbed-node-1] => (item={'key': 'aodh-api', 'value': {'container_name': 'aodh_api', 'group': 'aodh-api', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/aodh-api:18.0.1.20241206', 'volumes': ['/etc/kolla/aodh-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'aodh:/var/lib/aodh/', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:8042'], 'timeout': '30'}, 'haproxy': {'aodh_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '8042', 'listen_port': '8042'}, 'aodh_api_external': {'enabled': 
'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8042', 'listen_port': '8042'}}}}) 2025-03-23 22:05:55.978268 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'aodh-notifier', 'value': {'container_name': 'aodh_notifier', 'group': 'aodh-notifier', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/aodh-notifier:18.0.1.20241206', 'volumes': ['/etc/kolla/aodh-notifier/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port aodh-notifier 5672'], 'timeout': '30'}}})  2025-03-23 22:05:55.978289 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'aodh-evaluator', 'value': {'container_name': 'aodh_evaluator', 'group': 'aodh-evaluator', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/aodh-evaluator:18.0.1.20241206', 'volumes': ['/etc/kolla/aodh-evaluator/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port aodh-evaluator 3306'], 'timeout': '30'}}})  2025-03-23 22:05:55.978302 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'aodh-listener', 'value': {'container_name': 'aodh_listener', 'group': 'aodh-listener', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/aodh-listener:18.0.1.20241206', 'volumes': ['/etc/kolla/aodh-listener/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port aodh-listener 5672'], 'timeout': '30'}}})  2025-03-23 22:05:55.978315 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'aodh-notifier', 'value': {'container_name': 'aodh_notifier', 'group': 'aodh-notifier', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/aodh-notifier:18.0.1.20241206', 'volumes': ['/etc/kolla/aodh-notifier/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port aodh-notifier 5672'], 'timeout': '30'}}})  2025-03-23 22:05:55.978328 | orchestrator | 2025-03-23 22:05:55.978340 | orchestrator | TASK [haproxy-config : Add configuration for aodh when using single external frontend] *** 2025-03-23 22:05:55.978357 | orchestrator | Sunday 23 March 2025 21:58:51 +0000 (0:00:06.412) 0:01:44.365 ********** 2025-03-23 22:05:55.978371 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'aodh-api', 'value': {'container_name': 'aodh_api', 'group': 'aodh-api', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/aodh-api:18.0.1.20241206', 'volumes': ['/etc/kolla/aodh-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'aodh:/var/lib/aodh/', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:8042'], 'timeout': '30'}, 'haproxy': {'aodh_api': 
{'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '8042', 'listen_port': '8042'}, 'aodh_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8042', 'listen_port': '8042'}}}})  2025-03-23 22:05:55.978389 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'aodh-evaluator', 'value': {'container_name': 'aodh_evaluator', 'group': 'aodh-evaluator', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/aodh-evaluator:18.0.1.20241206', 'volumes': ['/etc/kolla/aodh-evaluator/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port aodh-evaluator 3306'], 'timeout': '30'}}})  2025-03-23 22:05:55.978411 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'aodh-listener', 'value': {'container_name': 'aodh_listener', 'group': 'aodh-listener', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/aodh-listener:18.0.1.20241206', 'volumes': ['/etc/kolla/aodh-listener/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port aodh-listener 5672'], 'timeout': '30'}}})  2025-03-23 22:05:55.978430 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'aodh-notifier', 'value': {'container_name': 'aodh_notifier', 'group': 'aodh-notifier', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/aodh-notifier:18.0.1.20241206', 'volumes': ['/etc/kolla/aodh-notifier/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port aodh-notifier 5672'], 'timeout': '30'}}})  2025-03-23 22:05:55.979084 | orchestrator | skipping: [testbed-node-0] 2025-03-23 22:05:55.979142 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'aodh-api', 'value': {'container_name': 'aodh_api', 'group': 'aodh-api', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/aodh-api:18.0.1.20241206', 'volumes': ['/etc/kolla/aodh-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'aodh:/var/lib/aodh/', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:8042'], 'timeout': '30'}, 'haproxy': {'aodh_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '8042', 'listen_port': '8042'}, 'aodh_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8042', 'listen_port': '8042'}}}})  2025-03-23 22:05:55.979159 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'aodh-evaluator', 'value': {'container_name': 'aodh_evaluator', 'group': 'aodh-evaluator', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/aodh-evaluator:18.0.1.20241206', 'volumes': ['/etc/kolla/aodh-evaluator/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': 
{}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port aodh-evaluator 3306'], 'timeout': '30'}}})  2025-03-23 22:05:55.979172 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'aodh-listener', 'value': {'container_name': 'aodh_listener', 'group': 'aodh-listener', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/aodh-listener:18.0.1.20241206', 'volumes': ['/etc/kolla/aodh-listener/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port aodh-listener 5672'], 'timeout': '30'}}})  2025-03-23 22:05:55.979779 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'aodh-notifier', 'value': {'container_name': 'aodh_notifier', 'group': 'aodh-notifier', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/aodh-notifier:18.0.1.20241206', 'volumes': ['/etc/kolla/aodh-notifier/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port aodh-notifier 5672'], 'timeout': '30'}}})  2025-03-23 22:05:55.979815 | orchestrator | skipping: [testbed-node-1] 2025-03-23 22:05:55.979829 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'aodh-api', 'value': {'container_name': 'aodh_api', 'group': 'aodh-api', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/aodh-api:18.0.1.20241206', 'volumes': ['/etc/kolla/aodh-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'aodh:/var/lib/aodh/', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:8042'], 'timeout': '30'}, 'haproxy': {'aodh_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '8042', 'listen_port': '8042'}, 'aodh_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8042', 'listen_port': '8042'}}}})  2025-03-23 22:05:55.979843 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'aodh-evaluator', 'value': {'container_name': 'aodh_evaluator', 'group': 'aodh-evaluator', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/aodh-evaluator:18.0.1.20241206', 'volumes': ['/etc/kolla/aodh-evaluator/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port aodh-evaluator 3306'], 'timeout': '30'}}})  2025-03-23 22:05:55.979867 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'aodh-listener', 'value': {'container_name': 'aodh_listener', 'group': 'aodh-listener', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/aodh-listener:18.0.1.20241206', 'volumes': ['/etc/kolla/aodh-listener/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 
'healthcheck_port aodh-listener 5672'], 'timeout': '30'}}})  2025-03-23 22:05:55.979881 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'aodh-notifier', 'value': {'container_name': 'aodh_notifier', 'group': 'aodh-notifier', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/aodh-notifier:18.0.1.20241206', 'volumes': ['/etc/kolla/aodh-notifier/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port aodh-notifier 5672'], 'timeout': '30'}}})  2025-03-23 22:05:55.979894 | orchestrator | skipping: [testbed-node-2] 2025-03-23 22:05:55.979907 | orchestrator | 2025-03-23 22:05:55.979920 | orchestrator | TASK [haproxy-config : Configuring firewall for aodh] ************************** 2025-03-23 22:05:55.979932 | orchestrator | Sunday 23 March 2025 21:58:52 +0000 (0:00:01.169) 0:01:45.534 ********** 2025-03-23 22:05:55.979945 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'aodh_api', 'value': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '8042', 'listen_port': '8042'}})  2025-03-23 22:05:55.979959 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'aodh_api_external', 'value': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8042', 'listen_port': '8042'}})  2025-03-23 22:05:55.979983 | orchestrator | skipping: [testbed-node-0] 2025-03-23 22:05:55.979996 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'aodh_api', 'value': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '8042', 'listen_port': '8042'}})  2025-03-23 22:05:55.980009 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'aodh_api_external', 'value': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8042', 'listen_port': '8042'}})  2025-03-23 22:05:55.980021 | orchestrator | skipping: [testbed-node-1] 2025-03-23 22:05:55.980040 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'aodh_api', 'value': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '8042', 'listen_port': '8042'}})  2025-03-23 22:05:55.980053 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'aodh_api_external', 'value': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8042', 'listen_port': '8042'}})  2025-03-23 22:05:55.980066 | orchestrator | skipping: [testbed-node-2] 2025-03-23 22:05:55.980079 | orchestrator | 2025-03-23 22:05:55.980091 | orchestrator | TASK [proxysql-config : Copying over aodh ProxySQL users config] *************** 2025-03-23 22:05:55.980103 | orchestrator | Sunday 23 March 2025 21:58:55 +0000 (0:00:02.490) 0:01:48.024 ********** 2025-03-23 22:05:55.980116 | orchestrator | changed: [testbed-node-0] 2025-03-23 22:05:55.980128 | orchestrator | changed: [testbed-node-1] 2025-03-23 22:05:55.980140 | orchestrator | changed: [testbed-node-2] 2025-03-23 22:05:55.980173 | orchestrator | 2025-03-23 22:05:55.980186 | orchestrator | TASK [proxysql-config : Copying over aodh ProxySQL rules config] *************** 2025-03-23 22:05:55.980198 | orchestrator | Sunday 23 March 2025 21:58:56 +0000 (0:00:01.535) 0:01:49.559 ********** 2025-03-23 22:05:55.980210 | orchestrator | changed: [testbed-node-0] 2025-03-23 22:05:55.980223 | orchestrator | changed: [testbed-node-1] 
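
In each of these haproxy-config tasks the role loops over the service map and only entries that define a 'haproxy' key render load-balancer configuration; that is why aodh-api reports changed on the controller nodes while aodh-evaluator, aodh-listener and aodh-notifier are skipped. A minimal Python sketch of that selection step, built only from the loop items visible above (the helper name services_needing_haproxy and the trimmed-down data are illustrative, not kolla-ansible's actual implementation):

# Illustrative only: mirrors the changed/skipping pattern seen in the log above.
# Field names and values are copied from the loop items; anything else is assumed.
aodh_services = {
    "aodh-api": {
        "group": "aodh-api",
        "enabled": True,
        "haproxy": {
            "aodh_api": {"enabled": "yes", "mode": "http", "external": False, "port": "8042"},
            "aodh_api_external": {"enabled": "yes", "mode": "http", "external": True,
                                  "external_fqdn": "api.testbed.osism.xyz", "port": "8042"},
        },
    },
    "aodh-evaluator": {"group": "aodh-evaluator", "enabled": True},  # no 'haproxy' key -> skipped
    "aodh-listener":  {"group": "aodh-listener",  "enabled": True},  # no 'haproxy' key -> skipped
    "aodh-notifier":  {"group": "aodh-notifier",  "enabled": True},  # no 'haproxy' key -> skipped
}

def services_needing_haproxy(services):
    """Yield (name, haproxy_map) only for enabled services that define frontends."""
    for name, svc in services.items():
        if svc.get("enabled") and svc.get("haproxy"):
            yield name, svc["haproxy"]

for name, frontends in services_needing_haproxy(aodh_services):
    print(name, "->", ", ".join(frontends))  # aodh-api -> aodh_api, aodh_api_external
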
2025-03-23 22:05:55.980246 | orchestrator | changed: [testbed-node-2] 2025-03-23 22:05:55.980260 | orchestrator | 2025-03-23 22:05:55.980272 | orchestrator | TASK [include_role : barbican] ************************************************* 2025-03-23 22:05:55.980307 | orchestrator | Sunday 23 March 2025 21:59:00 +0000 (0:00:03.549) 0:01:53.109 ********** 2025-03-23 22:05:55.980320 | orchestrator | included: barbican for testbed-node-0, testbed-node-1, testbed-node-2 2025-03-23 22:05:55.980333 | orchestrator | 2025-03-23 22:05:55.980345 | orchestrator | TASK [haproxy-config : Copying over barbican haproxy config] ******************* 2025-03-23 22:05:55.980358 | orchestrator | Sunday 23 March 2025 21:59:01 +0000 (0:00:01.319) 0:01:54.429 ********** 2025-03-23 22:05:55.980392 | orchestrator | changed: [testbed-node-0] => (item={'key': 'barbican-api', 'value': {'container_name': 'barbican_api', 'group': 'barbican-api', 'enabled': True, 'environment': {'CS_AUTH_KEYS': ''}, 'image': 'registry.osism.tech/kolla/release/barbican-api:18.0.1.20241206', 'volumes': ['/etc/kolla/barbican-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'barbican:/var/lib/barbican/', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:9311'], 'timeout': '30'}, 'haproxy': {'barbican_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9311', 'listen_port': '9311', 'tls_backend': 'no'}, 'barbican_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9311', 'listen_port': '9311', 'tls_backend': 'no'}}}}) 2025-03-23 22:05:55.980452 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'barbican-keystone-listener', 'value': {'container_name': 'barbican_keystone_listener', 'group': 'barbican-keystone-listener', 'enabled': True, 'environment': {'CS_AUTH_KEYS': ''}, 'image': 'registry.osism.tech/kolla/release/barbican-keystone-listener:18.0.1.20241206', 'volumes': ['/etc/kolla/barbican-keystone-listener/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port barbican-keystone-listener 5672'], 'timeout': '30'}}})  2025-03-23 22:05:55.980473 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'barbican-worker', 'value': {'container_name': 'barbican_worker', 'group': 'barbican-worker', 'enabled': True, 'environment': {'CS_AUTH_KEYS': ''}, 'image': 'registry.osism.tech/kolla/release/barbican-worker:18.0.1.20241206', 'volumes': ['/etc/kolla/barbican-worker/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port barbican-worker 5672'], 'timeout': '30'}}})  2025-03-23 22:05:55.980487 | orchestrator | changed: [testbed-node-2] => (item={'key': 'barbican-api', 'value': {'container_name': 'barbican_api', 'group': 'barbican-api', 'enabled': True, 'environment': {'CS_AUTH_KEYS': ''}, 'image': 'registry.osism.tech/kolla/release/barbican-api:18.0.1.20241206', 'volumes': ['/etc/kolla/barbican-api/:/var/lib/kolla/config_files/:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'barbican:/var/lib/barbican/', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:9311'], 'timeout': '30'}, 'haproxy': {'barbican_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9311', 'listen_port': '9311', 'tls_backend': 'no'}, 'barbican_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9311', 'listen_port': '9311', 'tls_backend': 'no'}}}}) 2025-03-23 22:05:55.980500 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'barbican-keystone-listener', 'value': {'container_name': 'barbican_keystone_listener', 'group': 'barbican-keystone-listener', 'enabled': True, 'environment': {'CS_AUTH_KEYS': ''}, 'image': 'registry.osism.tech/kolla/release/barbican-keystone-listener:18.0.1.20241206', 'volumes': ['/etc/kolla/barbican-keystone-listener/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port barbican-keystone-listener 5672'], 'timeout': '30'}}})  2025-03-23 22:05:55.980513 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'barbican-worker', 'value': {'container_name': 'barbican_worker', 'group': 'barbican-worker', 'enabled': True, 'environment': {'CS_AUTH_KEYS': ''}, 'image': 'registry.osism.tech/kolla/release/barbican-worker:18.0.1.20241206', 'volumes': ['/etc/kolla/barbican-worker/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port barbican-worker 5672'], 'timeout': '30'}}})  2025-03-23 22:05:55.980543 | orchestrator | changed: [testbed-node-1] => (item={'key': 'barbican-api', 'value': {'container_name': 'barbican_api', 'group': 'barbican-api', 'enabled': True, 'environment': {'CS_AUTH_KEYS': ''}, 'image': 'registry.osism.tech/kolla/release/barbican-api:18.0.1.20241206', 'volumes': ['/etc/kolla/barbican-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'barbican:/var/lib/barbican/', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:9311'], 'timeout': '30'}, 'haproxy': {'barbican_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9311', 'listen_port': '9311', 'tls_backend': 'no'}, 'barbican_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9311', 'listen_port': '9311', 'tls_backend': 'no'}}}}) 2025-03-23 22:05:55.980640 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'barbican-keystone-listener', 'value': {'container_name': 'barbican_keystone_listener', 'group': 'barbican-keystone-listener', 'enabled': True, 'environment': {'CS_AUTH_KEYS': ''}, 'image': 'registry.osism.tech/kolla/release/barbican-keystone-listener:18.0.1.20241206', 'volumes': ['/etc/kolla/barbican-keystone-listener/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port barbican-keystone-listener 5672'], 'timeout': '30'}}})  2025-03-23 22:05:55.980654 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'barbican-worker', 'value': {'container_name': 'barbican_worker', 'group': 'barbican-worker', 'enabled': True, 'environment': {'CS_AUTH_KEYS': ''}, 'image': 'registry.osism.tech/kolla/release/barbican-worker:18.0.1.20241206', 'volumes': ['/etc/kolla/barbican-worker/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port barbican-worker 5672'], 'timeout': '30'}}})  2025-03-23 22:05:55.980667 | orchestrator | 2025-03-23 22:05:55.980680 | orchestrator | TASK [haproxy-config : Add configuration for barbican when using single external frontend] *** 2025-03-23 22:05:55.980693 | orchestrator | Sunday 23 March 2025 21:59:07 +0000 (0:00:05.667) 0:02:00.097 ********** 2025-03-23 22:05:55.980706 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'barbican-api', 'value': {'container_name': 'barbican_api', 'group': 'barbican-api', 'enabled': True, 'environment': {'CS_AUTH_KEYS': ''}, 'image': 'registry.osism.tech/kolla/release/barbican-api:18.0.1.20241206', 'volumes': ['/etc/kolla/barbican-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'barbican:/var/lib/barbican/', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:9311'], 'timeout': '30'}, 'haproxy': {'barbican_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9311', 'listen_port': '9311', 'tls_backend': 'no'}, 'barbican_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9311', 'listen_port': '9311', 'tls_backend': 'no'}}}})  2025-03-23 22:05:55.980730 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'barbican-keystone-listener', 'value': {'container_name': 'barbican_keystone_listener', 'group': 'barbican-keystone-listener', 'enabled': True, 'environment': {'CS_AUTH_KEYS': ''}, 'image': 'registry.osism.tech/kolla/release/barbican-keystone-listener:18.0.1.20241206', 'volumes': ['/etc/kolla/barbican-keystone-listener/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port barbican-keystone-listener 5672'], 'timeout': '30'}}})  2025-03-23 22:05:55.980751 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'barbican-worker', 'value': {'container_name': 'barbican_worker', 'group': 'barbican-worker', 'enabled': True, 'environment': {'CS_AUTH_KEYS': ''}, 'image': 'registry.osism.tech/kolla/release/barbican-worker:18.0.1.20241206', 'volumes': ['/etc/kolla/barbican-worker/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 
'test': ['CMD-SHELL', 'healthcheck_port barbican-worker 5672'], 'timeout': '30'}}})  2025-03-23 22:05:55.980771 | orchestrator | skipping: [testbed-node-0] 2025-03-23 22:05:55.980784 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'barbican-api', 'value': {'container_name': 'barbican_api', 'group': 'barbican-api', 'enabled': True, 'environment': {'CS_AUTH_KEYS': ''}, 'image': 'registry.osism.tech/kolla/release/barbican-api:18.0.1.20241206', 'volumes': ['/etc/kolla/barbican-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'barbican:/var/lib/barbican/', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:9311'], 'timeout': '30'}, 'haproxy': {'barbican_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9311', 'listen_port': '9311', 'tls_backend': 'no'}, 'barbican_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9311', 'listen_port': '9311', 'tls_backend': 'no'}}}})  2025-03-23 22:05:55.980798 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'barbican-keystone-listener', 'value': {'container_name': 'barbican_keystone_listener', 'group': 'barbican-keystone-listener', 'enabled': True, 'environment': {'CS_AUTH_KEYS': ''}, 'image': 'registry.osism.tech/kolla/release/barbican-keystone-listener:18.0.1.20241206', 'volumes': ['/etc/kolla/barbican-keystone-listener/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port barbican-keystone-listener 5672'], 'timeout': '30'}}})  2025-03-23 22:05:55.980810 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'barbican-worker', 'value': {'container_name': 'barbican_worker', 'group': 'barbican-worker', 'enabled': True, 'environment': {'CS_AUTH_KEYS': ''}, 'image': 'registry.osism.tech/kolla/release/barbican-worker:18.0.1.20241206', 'volumes': ['/etc/kolla/barbican-worker/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port barbican-worker 5672'], 'timeout': '30'}}})  2025-03-23 22:05:55.980823 | orchestrator | skipping: [testbed-node-1] 2025-03-23 22:05:55.980844 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'barbican-api', 'value': {'container_name': 'barbican_api', 'group': 'barbican-api', 'enabled': True, 'environment': {'CS_AUTH_KEYS': ''}, 'image': 'registry.osism.tech/kolla/release/barbican-api:18.0.1.20241206', 'volumes': ['/etc/kolla/barbican-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'barbican:/var/lib/barbican/', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:9311'], 'timeout': '30'}, 'haproxy': {'barbican_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9311', 'listen_port': '9311', 'tls_backend': 'no'}, 'barbican_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 
'external_fqdn': 'api.testbed.osism.xyz', 'port': '9311', 'listen_port': '9311', 'tls_backend': 'no'}}}})  2025-03-23 22:05:55.980865 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'barbican-keystone-listener', 'value': {'container_name': 'barbican_keystone_listener', 'group': 'barbican-keystone-listener', 'enabled': True, 'environment': {'CS_AUTH_KEYS': ''}, 'image': 'registry.osism.tech/kolla/release/barbican-keystone-listener:18.0.1.20241206', 'volumes': ['/etc/kolla/barbican-keystone-listener/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port barbican-keystone-listener 5672'], 'timeout': '30'}}})  2025-03-23 22:05:55.980885 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'barbican-worker', 'value': {'container_name': 'barbican_worker', 'group': 'barbican-worker', 'enabled': True, 'environment': {'CS_AUTH_KEYS': ''}, 'image': 'registry.osism.tech/kolla/release/barbican-worker:18.0.1.20241206', 'volumes': ['/etc/kolla/barbican-worker/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port barbican-worker 5672'], 'timeout': '30'}}})  2025-03-23 22:05:55.980898 | orchestrator | skipping: [testbed-node-2] 2025-03-23 22:05:55.980911 | orchestrator | 2025-03-23 22:05:55.980923 | orchestrator | TASK [haproxy-config : Configuring firewall for barbican] ********************** 2025-03-23 22:05:55.980936 | orchestrator | Sunday 23 March 2025 21:59:08 +0000 (0:00:01.622) 0:02:01.719 ********** 2025-03-23 22:05:55.980948 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'barbican_api', 'value': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9311', 'listen_port': '9311', 'tls_backend': 'no'}})  2025-03-23 22:05:55.980961 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'barbican_api_external', 'value': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9311', 'listen_port': '9311', 'tls_backend': 'no'}})  2025-03-23 22:05:55.980976 | orchestrator | skipping: [testbed-node-0] 2025-03-23 22:05:55.980988 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'barbican_api', 'value': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9311', 'listen_port': '9311', 'tls_backend': 'no'}})  2025-03-23 22:05:55.981007 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'barbican_api_external', 'value': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9311', 'listen_port': '9311', 'tls_backend': 'no'}})  2025-03-23 22:05:55.981017 | orchestrator | skipping: [testbed-node-1] 2025-03-23 22:05:55.981028 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'barbican_api', 'value': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9311', 'listen_port': '9311', 'tls_backend': 'no'}})  2025-03-23 22:05:55.981039 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'barbican_api_external', 'value': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9311', 'listen_port': '9311', 'tls_backend': 'no'}})  2025-03-23 
22:05:55.981049 | orchestrator | skipping: [testbed-node-2] 2025-03-23 22:05:55.981059 | orchestrator | 2025-03-23 22:05:55.981069 | orchestrator | TASK [proxysql-config : Copying over barbican ProxySQL users config] *********** 2025-03-23 22:05:55.981079 | orchestrator | Sunday 23 March 2025 21:59:10 +0000 (0:00:01.470) 0:02:03.190 ********** 2025-03-23 22:05:55.981090 | orchestrator | changed: [testbed-node-0] 2025-03-23 22:05:55.981100 | orchestrator | changed: [testbed-node-1] 2025-03-23 22:05:55.981110 | orchestrator | changed: [testbed-node-2] 2025-03-23 22:05:55.981120 | orchestrator | 2025-03-23 22:05:55.981130 | orchestrator | TASK [proxysql-config : Copying over barbican ProxySQL rules config] *********** 2025-03-23 22:05:55.981141 | orchestrator | Sunday 23 March 2025 21:59:11 +0000 (0:00:01.476) 0:02:04.667 ********** 2025-03-23 22:05:55.981151 | orchestrator | changed: [testbed-node-0] 2025-03-23 22:05:55.981161 | orchestrator | changed: [testbed-node-1] 2025-03-23 22:05:55.981171 | orchestrator | changed: [testbed-node-2] 2025-03-23 22:05:55.981187 | orchestrator | 2025-03-23 22:05:55.981197 | orchestrator | TASK [include_role : blazar] *************************************************** 2025-03-23 22:05:55.981207 | orchestrator | Sunday 23 March 2025 21:59:14 +0000 (0:00:02.480) 0:02:07.147 ********** 2025-03-23 22:05:55.981218 | orchestrator | skipping: [testbed-node-0] 2025-03-23 22:05:55.981228 | orchestrator | skipping: [testbed-node-1] 2025-03-23 22:05:55.981238 | orchestrator | skipping: [testbed-node-2] 2025-03-23 22:05:55.981249 | orchestrator | 2025-03-23 22:05:55.981259 | orchestrator | TASK [include_role : ceph-rgw] ************************************************* 2025-03-23 22:05:55.981269 | orchestrator | Sunday 23 March 2025 21:59:14 +0000 (0:00:00.390) 0:02:07.537 ********** 2025-03-23 22:05:55.981279 | orchestrator | included: ceph-rgw for testbed-node-0, testbed-node-1, testbed-node-2 2025-03-23 22:05:55.981289 | orchestrator | 2025-03-23 22:05:55.981300 | orchestrator | TASK [haproxy-config : Copying over ceph-rgw haproxy config] ******************* 2025-03-23 22:05:55.981310 | orchestrator | Sunday 23 March 2025 21:59:15 +0000 (0:00:01.003) 0:02:08.540 ********** 2025-03-23 22:05:55.981325 | orchestrator | changed: [testbed-node-0] => (item={'key': 'ceph-rgw', 'value': {'group': 'all', 'enabled': True, 'haproxy': {'radosgw': {'enabled': True, 'mode': 'http', 'external': False, 'port': '6780', 'custom_member_list': ['server testbed-node-3 192.168.16.13:8081 check inter 2000 rise 2 fall 5', 'server testbed-node-4 192.168.16.14:8081 check inter 2000 rise 2 fall 5', 'server testbed-node-5 192.168.16.15:8081 check inter 2000 rise 2 fall 5']}, 'radosgw_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '6780', 'custom_member_list': ['server testbed-node-3 192.168.16.13:8081 check inter 2000 rise 2 fall 5', 'server testbed-node-4 192.168.16.14:8081 check inter 2000 rise 2 fall 5', 'server testbed-node-5 192.168.16.15:8081 check inter 2000 rise 2 fall 5']}}}}) 2025-03-23 22:05:55.981343 | orchestrator | changed: [testbed-node-1] => (item={'key': 'ceph-rgw', 'value': {'group': 'all', 'enabled': True, 'haproxy': {'radosgw': {'enabled': True, 'mode': 'http', 'external': False, 'port': '6780', 'custom_member_list': ['server testbed-node-3 192.168.16.13:8081 check inter 2000 rise 2 fall 5', 'server testbed-node-4 192.168.16.14:8081 check inter 2000 rise 2 fall 5', 'server testbed-node-5 192.168.16.15:8081 
check inter 2000 rise 2 fall 5']}, 'radosgw_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '6780', 'custom_member_list': ['server testbed-node-3 192.168.16.13:8081 check inter 2000 rise 2 fall 5', 'server testbed-node-4 192.168.16.14:8081 check inter 2000 rise 2 fall 5', 'server testbed-node-5 192.168.16.15:8081 check inter 2000 rise 2 fall 5']}}}}) 2025-03-23 22:05:55.981355 | orchestrator | changed: [testbed-node-2] => (item={'key': 'ceph-rgw', 'value': {'group': 'all', 'enabled': True, 'haproxy': {'radosgw': {'enabled': True, 'mode': 'http', 'external': False, 'port': '6780', 'custom_member_list': ['server testbed-node-3 192.168.16.13:8081 check inter 2000 rise 2 fall 5', 'server testbed-node-4 192.168.16.14:8081 check inter 2000 rise 2 fall 5', 'server testbed-node-5 192.168.16.15:8081 check inter 2000 rise 2 fall 5']}, 'radosgw_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '6780', 'custom_member_list': ['server testbed-node-3 192.168.16.13:8081 check inter 2000 rise 2 fall 5', 'server testbed-node-4 192.168.16.14:8081 check inter 2000 rise 2 fall 5', 'server testbed-node-5 192.168.16.15:8081 check inter 2000 rise 2 fall 5']}}}}) 2025-03-23 22:05:55.981366 | orchestrator | 2025-03-23 22:05:55.981376 | orchestrator | TASK [haproxy-config : Add configuration for ceph-rgw when using single external frontend] *** 2025-03-23 22:05:55.981387 | orchestrator | Sunday 23 March 2025 21:59:18 +0000 (0:00:03.258) 0:02:11.799 ********** 2025-03-23 22:05:55.981397 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'ceph-rgw', 'value': {'group': 'all', 'enabled': True, 'haproxy': {'radosgw': {'enabled': True, 'mode': 'http', 'external': False, 'port': '6780', 'custom_member_list': ['server testbed-node-3 192.168.16.13:8081 check inter 2000 rise 2 fall 5', 'server testbed-node-4 192.168.16.14:8081 check inter 2000 rise 2 fall 5', 'server testbed-node-5 192.168.16.15:8081 check inter 2000 rise 2 fall 5']}, 'radosgw_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '6780', 'custom_member_list': ['server testbed-node-3 192.168.16.13:8081 check inter 2000 rise 2 fall 5', 'server testbed-node-4 192.168.16.14:8081 check inter 2000 rise 2 fall 5', 'server testbed-node-5 192.168.16.15:8081 check inter 2000 rise 2 fall 5']}}}})  2025-03-23 22:05:55.981413 | orchestrator | skipping: [testbed-node-1] 2025-03-23 22:05:55.981428 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'ceph-rgw', 'value': {'group': 'all', 'enabled': True, 'haproxy': {'radosgw': {'enabled': True, 'mode': 'http', 'external': False, 'port': '6780', 'custom_member_list': ['server testbed-node-3 192.168.16.13:8081 check inter 2000 rise 2 fall 5', 'server testbed-node-4 192.168.16.14:8081 check inter 2000 rise 2 fall 5', 'server testbed-node-5 192.168.16.15:8081 check inter 2000 rise 2 fall 5']}, 'radosgw_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '6780', 'custom_member_list': ['server testbed-node-3 192.168.16.13:8081 check inter 2000 rise 2 fall 5', 'server testbed-node-4 192.168.16.14:8081 check inter 2000 rise 2 fall 5', 'server testbed-node-5 192.168.16.15:8081 check inter 2000 rise 2 fall 5']}}}})  2025-03-23 22:05:55.981439 | orchestrator | skipping: [testbed-node-0] 2025-03-23 22:05:55.981456 | orchestrator | skipping: [testbed-node-2] => (item={'key': 
'ceph-rgw', 'value': {'group': 'all', 'enabled': True, 'haproxy': {'radosgw': {'enabled': True, 'mode': 'http', 'external': False, 'port': '6780', 'custom_member_list': ['server testbed-node-3 192.168.16.13:8081 check inter 2000 rise 2 fall 5', 'server testbed-node-4 192.168.16.14:8081 check inter 2000 rise 2 fall 5', 'server testbed-node-5 192.168.16.15:8081 check inter 2000 rise 2 fall 5']}, 'radosgw_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '6780', 'custom_member_list': ['server testbed-node-3 192.168.16.13:8081 check inter 2000 rise 2 fall 5', 'server testbed-node-4 192.168.16.14:8081 check inter 2000 rise 2 fall 5', 'server testbed-node-5 192.168.16.15:8081 check inter 2000 rise 2 fall 5']}}}})  2025-03-23 22:05:55.981468 | orchestrator | skipping: [testbed-node-2] 2025-03-23 22:05:55.981478 | orchestrator | 2025-03-23 22:05:55.981488 | orchestrator | TASK [haproxy-config : Configuring firewall for ceph-rgw] ********************** 2025-03-23 22:05:55.981499 | orchestrator | Sunday 23 March 2025 21:59:21 +0000 (0:00:02.134) 0:02:13.933 ********** 2025-03-23 22:05:55.981509 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'radosgw', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '6780', 'custom_member_list': ['server testbed-node-3 192.168.16.13:8081 check inter 2000 rise 2 fall 5', 'server testbed-node-4 192.168.16.14:8081 check inter 2000 rise 2 fall 5', 'server testbed-node-5 192.168.16.15:8081 check inter 2000 rise 2 fall 5']}})  2025-03-23 22:05:55.981521 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'radosgw_external', 'value': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '6780', 'custom_member_list': ['server testbed-node-3 192.168.16.13:8081 check inter 2000 rise 2 fall 5', 'server testbed-node-4 192.168.16.14:8081 check inter 2000 rise 2 fall 5', 'server testbed-node-5 192.168.16.15:8081 check inter 2000 rise 2 fall 5']}})  2025-03-23 22:05:55.981532 | orchestrator | skipping: [testbed-node-0] 2025-03-23 22:05:55.981542 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'radosgw', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '6780', 'custom_member_list': ['server testbed-node-3 192.168.16.13:8081 check inter 2000 rise 2 fall 5', 'server testbed-node-4 192.168.16.14:8081 check inter 2000 rise 2 fall 5', 'server testbed-node-5 192.168.16.15:8081 check inter 2000 rise 2 fall 5']}})  2025-03-23 22:05:55.981575 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'radosgw_external', 'value': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '6780', 'custom_member_list': ['server testbed-node-3 192.168.16.13:8081 check inter 2000 rise 2 fall 5', 'server testbed-node-4 192.168.16.14:8081 check inter 2000 rise 2 fall 5', 'server testbed-node-5 192.168.16.15:8081 check inter 2000 rise 2 fall 5']}})  2025-03-23 22:05:55.981586 | orchestrator | skipping: [testbed-node-1] 2025-03-23 22:05:55.981596 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'radosgw', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '6780', 'custom_member_list': ['server testbed-node-3 192.168.16.13:8081 check inter 2000 rise 2 fall 5', 'server testbed-node-4 192.168.16.14:8081 check inter 2000 rise 2 fall 5', 'server testbed-node-5 192.168.16.15:8081 check inter 2000 rise 2 fall 5']}})  2025-03-23 22:05:55.981612 | 
orchestrator | skipping: [testbed-node-2] => (item={'key': 'radosgw_external', 'value': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '6780', 'custom_member_list': ['server testbed-node-3 192.168.16.13:8081 check inter 2000 rise 2 fall 5', 'server testbed-node-4 192.168.16.14:8081 check inter 2000 rise 2 fall 5', 'server testbed-node-5 192.168.16.15:8081 check inter 2000 rise 2 fall 5']}})  2025-03-23 22:05:55.981622 | orchestrator | skipping: [testbed-node-2] 2025-03-23 22:05:55.981632 | orchestrator | 2025-03-23 22:05:55.981643 | orchestrator | TASK [proxysql-config : Copying over ceph-rgw ProxySQL users config] *********** 2025-03-23 22:05:55.981653 | orchestrator | Sunday 23 March 2025 21:59:23 +0000 (0:00:02.393) 0:02:16.327 ********** 2025-03-23 22:05:55.981668 | orchestrator | skipping: [testbed-node-0] 2025-03-23 22:05:55.981679 | orchestrator | skipping: [testbed-node-1] 2025-03-23 22:05:55.981689 | orchestrator | skipping: [testbed-node-2] 2025-03-23 22:05:55.981699 | orchestrator | 2025-03-23 22:05:55.981710 | orchestrator | TASK [proxysql-config : Copying over ceph-rgw ProxySQL rules config] *********** 2025-03-23 22:05:55.981720 | orchestrator | Sunday 23 March 2025 21:59:24 +0000 (0:00:00.967) 0:02:17.295 ********** 2025-03-23 22:05:55.981731 | orchestrator | skipping: [testbed-node-0] 2025-03-23 22:05:55.981741 | orchestrator | skipping: [testbed-node-1] 2025-03-23 22:05:55.981751 | orchestrator | skipping: [testbed-node-2] 2025-03-23 22:05:55.981761 | orchestrator | 2025-03-23 22:05:55.981771 | orchestrator | TASK [include_role : cinder] *************************************************** 2025-03-23 22:05:55.981781 | orchestrator | Sunday 23 March 2025 21:59:25 +0000 (0:00:01.607) 0:02:18.903 ********** 2025-03-23 22:05:55.981792 | orchestrator | included: cinder for testbed-node-0, testbed-node-1, testbed-node-2 2025-03-23 22:05:55.981802 | orchestrator | 2025-03-23 22:05:55.981812 | orchestrator | TASK [haproxy-config : Copying over cinder haproxy config] ********************* 2025-03-23 22:05:55.981822 | orchestrator | Sunday 23 March 2025 21:59:26 +0000 (0:00:00.976) 0:02:19.880 ********** 2025-03-23 22:05:55.981833 | orchestrator | changed: [testbed-node-0] => (item={'key': 'cinder-api', 'value': {'container_name': 'cinder_api', 'group': 'cinder-api', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/cinder-api:24.2.1.20241206', 'volumes': ['/etc/kolla/cinder-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:8776'], 'timeout': '30'}, 'haproxy': {'cinder_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '8776', 'listen_port': '8776', 'tls_backend': 'no'}, 'cinder_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8776', 'listen_port': '8776', 'tls_backend': 'no'}}}}) 2025-03-23 22:05:55.981849 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'cinder-scheduler', 'value': {'container_name': 'cinder_scheduler', 'group': 'cinder-scheduler', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/cinder-scheduler:24.2.1.20241206', 'volumes': ['/etc/kolla/cinder-scheduler/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port cinder-scheduler 5672'], 'timeout': '30'}}})  2025-03-23 22:05:55.981860 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'cinder-volume', 'value': {'container_name': 'cinder_volume', 'group': 'cinder-volume', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/cinder-volume:24.2.1.20241206', 'privileged': True, 'ipc_mode': 'host', 'tmpfs': [''], 'volumes': ['/etc/kolla/cinder-volume/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/lib/modules:/lib/modules:ro', '/run:/run:shared', 'cinder:/var/lib/cinder', 'iscsi_info:/etc/iscsi', '', 'kolla_logs:/var/log/kolla/', '', '/opt/cinder-driver-dm-clone:/var/lib/kolla/venv/lib/python3.10/site-packages/cinder-driver-dm-clone'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port cinder-volume 5672'], 'timeout': '30'}}})  2025-03-23 22:05:55.981871 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'cinder-backup', 'value': {'container_name': 'cinder_backup', 'group': 'cinder-backup', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/cinder-backup:24.2.1.20241206', 'privileged': True, 'volumes': ['/etc/kolla/cinder-backup/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/lib/modules:/lib/modules:ro', '/run:/run:shared', 'cinder:/var/lib/cinder', 'iscsi_info:/etc/iscsi', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port cinder-backup 5672'], 'timeout': '30'}}})  2025-03-23 22:05:55.981888 | orchestrator | changed: [testbed-node-1] => (item={'key': 'cinder-api', 'value': {'container_name': 'cinder_api', 'group': 'cinder-api', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/cinder-api:24.2.1.20241206', 'volumes': ['/etc/kolla/cinder-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:8776'], 'timeout': '30'}, 'haproxy': {'cinder_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '8776', 'listen_port': '8776', 'tls_backend': 'no'}, 'cinder_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8776', 'listen_port': '8776', 'tls_backend': 'no'}}}}) 2025-03-23 22:05:55.981899 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'cinder-scheduler', 'value': {'container_name': 'cinder_scheduler', 'group': 'cinder-scheduler', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/cinder-scheduler:24.2.1.20241206', 'volumes': ['/etc/kolla/cinder-scheduler/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port cinder-scheduler 5672'], 'timeout': '30'}}})  2025-03-23 22:05:55.981910 | orchestrator | skipping: [testbed-node-1] => (item={'key': 
'cinder-volume', 'value': {'container_name': 'cinder_volume', 'group': 'cinder-volume', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/cinder-volume:24.2.1.20241206', 'privileged': True, 'ipc_mode': 'host', 'tmpfs': [''], 'volumes': ['/etc/kolla/cinder-volume/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/lib/modules:/lib/modules:ro', '/run:/run:shared', 'cinder:/var/lib/cinder', 'iscsi_info:/etc/iscsi', '', 'kolla_logs:/var/log/kolla/', '', '/opt/cinder-driver-dm-clone:/var/lib/kolla/venv/lib/python3.10/site-packages/cinder-driver-dm-clone'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port cinder-volume 5672'], 'timeout': '30'}}})  2025-03-23 22:05:55.981933 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'cinder-backup', 'value': {'container_name': 'cinder_backup', 'group': 'cinder-backup', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/cinder-backup:24.2.1.20241206', 'privileged': True, 'volumes': ['/etc/kolla/cinder-backup/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/lib/modules:/lib/modules:ro', '/run:/run:shared', 'cinder:/var/lib/cinder', 'iscsi_info:/etc/iscsi', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port cinder-backup 5672'], 'timeout': '30'}}})  2025-03-23 22:05:55.981944 | orchestrator | changed: [testbed-node-2] => (item={'key': 'cinder-api', 'value': {'container_name': 'cinder_api', 'group': 'cinder-api', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/cinder-api:24.2.1.20241206', 'volumes': ['/etc/kolla/cinder-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:8776'], 'timeout': '30'}, 'haproxy': {'cinder_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '8776', 'listen_port': '8776', 'tls_backend': 'no'}, 'cinder_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8776', 'listen_port': '8776', 'tls_backend': 'no'}}}}) 2025-03-23 22:05:55.981960 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'cinder-scheduler', 'value': {'container_name': 'cinder_scheduler', 'group': 'cinder-scheduler', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/cinder-scheduler:24.2.1.20241206', 'volumes': ['/etc/kolla/cinder-scheduler/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port cinder-scheduler 5672'], 'timeout': '30'}}})  2025-03-23 22:05:55.981971 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'cinder-volume', 'value': {'container_name': 'cinder_volume', 'group': 'cinder-volume', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/cinder-volume:24.2.1.20241206', 'privileged': True, 'ipc_mode': 'host', 'tmpfs': [''], 'volumes': ['/etc/kolla/cinder-volume/:/var/lib/kolla/config_files/:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/lib/modules:/lib/modules:ro', '/run:/run:shared', 'cinder:/var/lib/cinder', 'iscsi_info:/etc/iscsi', '', 'kolla_logs:/var/log/kolla/', '', '/opt/cinder-driver-dm-clone:/var/lib/kolla/venv/lib/python3.10/site-packages/cinder-driver-dm-clone'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port cinder-volume 5672'], 'timeout': '30'}}})  2025-03-23 22:05:55.981989 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'cinder-backup', 'value': {'container_name': 'cinder_backup', 'group': 'cinder-backup', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/cinder-backup:24.2.1.20241206', 'privileged': True, 'volumes': ['/etc/kolla/cinder-backup/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/lib/modules:/lib/modules:ro', '/run:/run:shared', 'cinder:/var/lib/cinder', 'iscsi_info:/etc/iscsi', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port cinder-backup 5672'], 'timeout': '30'}}})  2025-03-23 22:05:55.982005 | orchestrator | 2025-03-23 22:05:55.982755 | orchestrator | TASK [haproxy-config : Add configuration for cinder when using single external frontend] *** 2025-03-23 22:05:55.982798 | orchestrator | Sunday 23 March 2025 21:59:34 +0000 (0:00:07.037) 0:02:26.917 ********** 2025-03-23 22:05:55.982811 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'cinder-api', 'value': {'container_name': 'cinder_api', 'group': 'cinder-api', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/cinder-api:24.2.1.20241206', 'volumes': ['/etc/kolla/cinder-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:8776'], 'timeout': '30'}, 'haproxy': {'cinder_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '8776', 'listen_port': '8776', 'tls_backend': 'no'}, 'cinder_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8776', 'listen_port': '8776', 'tls_backend': 'no'}}}})  2025-03-23 22:05:55.982824 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'cinder-scheduler', 'value': {'container_name': 'cinder_scheduler', 'group': 'cinder-scheduler', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/cinder-scheduler:24.2.1.20241206', 'volumes': ['/etc/kolla/cinder-scheduler/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port cinder-scheduler 5672'], 'timeout': '30'}}})  2025-03-23 22:05:55.982897 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'cinder-volume', 'value': {'container_name': 'cinder_volume', 'group': 'cinder-volume', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/cinder-volume:24.2.1.20241206', 'privileged': True, 'ipc_mode': 'host', 'tmpfs': [''], 'volumes': ['/etc/kolla/cinder-volume/:/var/lib/kolla/config_files/:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/lib/modules:/lib/modules:ro', '/run:/run:shared', 'cinder:/var/lib/cinder', 'iscsi_info:/etc/iscsi', '', 'kolla_logs:/var/log/kolla/', '', '/opt/cinder-driver-dm-clone:/var/lib/kolla/venv/lib/python3.10/site-packages/cinder-driver-dm-clone'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port cinder-volume 5672'], 'timeout': '30'}}})  2025-03-23 22:05:55.982930 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'cinder-backup', 'value': {'container_name': 'cinder_backup', 'group': 'cinder-backup', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/cinder-backup:24.2.1.20241206', 'privileged': True, 'volumes': ['/etc/kolla/cinder-backup/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/lib/modules:/lib/modules:ro', '/run:/run:shared', 'cinder:/var/lib/cinder', 'iscsi_info:/etc/iscsi', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port cinder-backup 5672'], 'timeout': '30'}}})  2025-03-23 22:05:55.982943 | orchestrator | skipping: [testbed-node-0] 2025-03-23 22:05:55.982954 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'cinder-api', 'value': {'container_name': 'cinder_api', 'group': 'cinder-api', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/cinder-api:24.2.1.20241206', 'volumes': ['/etc/kolla/cinder-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:8776'], 'timeout': '30'}, 'haproxy': {'cinder_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '8776', 'listen_port': '8776', 'tls_backend': 'no'}, 'cinder_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8776', 'listen_port': '8776', 'tls_backend': 'no'}}}})  2025-03-23 22:05:55.982974 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'cinder-scheduler', 'value': {'container_name': 'cinder_scheduler', 'group': 'cinder-scheduler', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/cinder-scheduler:24.2.1.20241206', 'volumes': ['/etc/kolla/cinder-scheduler/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port cinder-scheduler 5672'], 'timeout': '30'}}})  2025-03-23 22:05:55.982986 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'cinder-volume', 'value': {'container_name': 'cinder_volume', 'group': 'cinder-volume', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/cinder-volume:24.2.1.20241206', 'privileged': True, 'ipc_mode': 'host', 'tmpfs': [''], 'volumes': ['/etc/kolla/cinder-volume/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/lib/modules:/lib/modules:ro', '/run:/run:shared', 'cinder:/var/lib/cinder', 'iscsi_info:/etc/iscsi', '', 'kolla_logs:/var/log/kolla/', '', 
'/opt/cinder-driver-dm-clone:/var/lib/kolla/venv/lib/python3.10/site-packages/cinder-driver-dm-clone'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port cinder-volume 5672'], 'timeout': '30'}}})  2025-03-23 22:05:55.983018 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'cinder-backup', 'value': {'container_name': 'cinder_backup', 'group': 'cinder-backup', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/cinder-backup:24.2.1.20241206', 'privileged': True, 'volumes': ['/etc/kolla/cinder-backup/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/lib/modules:/lib/modules:ro', '/run:/run:shared', 'cinder:/var/lib/cinder', 'iscsi_info:/etc/iscsi', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port cinder-backup 5672'], 'timeout': '30'}}})  2025-03-23 22:05:55.983031 | orchestrator | skipping: [testbed-node-2] 2025-03-23 22:05:55.983042 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'cinder-api', 'value': {'container_name': 'cinder_api', 'group': 'cinder-api', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/cinder-api:24.2.1.20241206', 'volumes': ['/etc/kolla/cinder-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:8776'], 'timeout': '30'}, 'haproxy': {'cinder_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '8776', 'listen_port': '8776', 'tls_backend': 'no'}, 'cinder_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8776', 'listen_port': '8776', 'tls_backend': 'no'}}}})  2025-03-23 22:05:55.983066 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'cinder-scheduler', 'value': {'container_name': 'cinder_scheduler', 'group': 'cinder-scheduler', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/cinder-scheduler:24.2.1.20241206', 'volumes': ['/etc/kolla/cinder-scheduler/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port cinder-scheduler 5672'], 'timeout': '30'}}})  2025-03-23 22:05:55.983078 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'cinder-volume', 'value': {'container_name': 'cinder_volume', 'group': 'cinder-volume', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/cinder-volume:24.2.1.20241206', 'privileged': True, 'ipc_mode': 'host', 'tmpfs': [''], 'volumes': ['/etc/kolla/cinder-volume/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/lib/modules:/lib/modules:ro', '/run:/run:shared', 'cinder:/var/lib/cinder', 'iscsi_info:/etc/iscsi', '', 'kolla_logs:/var/log/kolla/', '', '/opt/cinder-driver-dm-clone:/var/lib/kolla/venv/lib/python3.10/site-packages/cinder-driver-dm-clone'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port cinder-volume 5672'], 
'timeout': '30'}}})  2025-03-23 22:05:55.983090 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'cinder-backup', 'value': {'container_name': 'cinder_backup', 'group': 'cinder-backup', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/cinder-backup:24.2.1.20241206', 'privileged': True, 'volumes': ['/etc/kolla/cinder-backup/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/lib/modules:/lib/modules:ro', '/run:/run:shared', 'cinder:/var/lib/cinder', 'iscsi_info:/etc/iscsi', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port cinder-backup 5672'], 'timeout': '30'}}})  2025-03-23 22:05:55.983102 | orchestrator | skipping: [testbed-node-1] 2025-03-23 22:05:55.984867 | orchestrator | 2025-03-23 22:05:55.984905 | orchestrator | TASK [haproxy-config : Configuring firewall for cinder] ************************ 2025-03-23 22:05:55.984921 | orchestrator | Sunday 23 March 2025 21:59:35 +0000 (0:00:01.920) 0:02:28.838 ********** 2025-03-23 22:05:55.984931 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'cinder_api', 'value': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '8776', 'listen_port': '8776', 'tls_backend': 'no'}})  2025-03-23 22:05:55.984940 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'cinder_api_external', 'value': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8776', 'listen_port': '8776', 'tls_backend': 'no'}})  2025-03-23 22:05:55.984950 | orchestrator | skipping: [testbed-node-0] 2025-03-23 22:05:55.984959 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'cinder_api', 'value': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '8776', 'listen_port': '8776', 'tls_backend': 'no'}})  2025-03-23 22:05:55.984975 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'cinder_api', 'value': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '8776', 'listen_port': '8776', 'tls_backend': 'no'}})  2025-03-23 22:05:55.984984 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'cinder_api_external', 'value': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8776', 'listen_port': '8776', 'tls_backend': 'no'}})  2025-03-23 22:05:55.984993 | orchestrator | skipping: [testbed-node-1] 2025-03-23 22:05:55.985002 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'cinder_api_external', 'value': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8776', 'listen_port': '8776', 'tls_backend': 'no'}})  2025-03-23 22:05:55.985041 | orchestrator | skipping: [testbed-node-2] 2025-03-23 22:05:55.985051 | orchestrator | 2025-03-23 22:05:55.985060 | orchestrator | TASK [proxysql-config : Copying over cinder ProxySQL users config] ************* 2025-03-23 22:05:55.985078 | orchestrator | Sunday 23 March 2025 21:59:37 +0000 (0:00:01.442) 0:02:30.280 ********** 2025-03-23 22:05:55.985088 | orchestrator | changed: [testbed-node-0] 2025-03-23 22:05:55.985096 | orchestrator | changed: [testbed-node-1] 2025-03-23 22:05:55.985105 | orchestrator | changed: [testbed-node-2] 2025-03-23 22:05:55.985113 | orchestrator | 2025-03-23 22:05:55.985137 | orchestrator | TASK [proxysql-config : Copying over cinder ProxySQL rules config] ************* 2025-03-23 
22:05:55.985167 | orchestrator | Sunday 23 March 2025 21:59:39 +0000 (0:00:01.627) 0:02:31.908 ********** 2025-03-23 22:05:55.985177 | orchestrator | changed: [testbed-node-0] 2025-03-23 22:05:55.985186 | orchestrator | changed: [testbed-node-1] 2025-03-23 22:05:55.985194 | orchestrator | changed: [testbed-node-2] 2025-03-23 22:05:55.985203 | orchestrator | 2025-03-23 22:05:55.985211 | orchestrator | TASK [include_role : cloudkitty] *********************************************** 2025-03-23 22:05:55.985220 | orchestrator | Sunday 23 March 2025 21:59:41 +0000 (0:00:02.722) 0:02:34.630 ********** 2025-03-23 22:05:55.985229 | orchestrator | skipping: [testbed-node-0] 2025-03-23 22:05:55.985237 | orchestrator | skipping: [testbed-node-1] 2025-03-23 22:05:55.985245 | orchestrator | skipping: [testbed-node-2] 2025-03-23 22:05:55.985254 | orchestrator | 2025-03-23 22:05:55.985263 | orchestrator | TASK [include_role : cyborg] *************************************************** 2025-03-23 22:05:55.985271 | orchestrator | Sunday 23 March 2025 21:59:42 +0000 (0:00:00.507) 0:02:35.138 ********** 2025-03-23 22:05:55.985280 | orchestrator | skipping: [testbed-node-0] 2025-03-23 22:05:55.985288 | orchestrator | skipping: [testbed-node-1] 2025-03-23 22:05:55.985326 | orchestrator | skipping: [testbed-node-2] 2025-03-23 22:05:55.985344 | orchestrator | 2025-03-23 22:05:55.985354 | orchestrator | TASK [include_role : designate] ************************************************ 2025-03-23 22:05:55.985363 | orchestrator | Sunday 23 March 2025 21:59:42 +0000 (0:00:00.320) 0:02:35.458 ********** 2025-03-23 22:05:55.985378 | orchestrator | included: designate for testbed-node-0, testbed-node-1, testbed-node-2 2025-03-23 22:05:55.985387 | orchestrator | 2025-03-23 22:05:55.985395 | orchestrator | TASK [haproxy-config : Copying over designate haproxy config] ****************** 2025-03-23 22:05:55.985404 | orchestrator | Sunday 23 March 2025 21:59:43 +0000 (0:00:01.238) 0:02:36.697 ********** 2025-03-23 22:05:55.985414 | orchestrator | changed: [testbed-node-1] => (item={'key': 'designate-api', 'value': {'container_name': 'designate_api', 'group': 'designate-api', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/designate-api:18.0.1.20241206', 'volumes': ['/etc/kolla/designate-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:9001'], 'timeout': '30'}, 'haproxy': {'designate_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9001', 'listen_port': '9001'}, 'designate_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9001', 'listen_port': '9001'}}}}) 2025-03-23 22:05:55.985438 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'designate-backend-bind9', 'value': {'container_name': 'designate_backend_bind9', 'group': 'designate-backend-bind9', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/designate-backend-bind9:18.0.1.20241206', 'volumes': ['/etc/kolla/designate-backend-bind9/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'designate_backend_bind9:/var/lib/named/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': 
['CMD-SHELL', 'healthcheck_listen named 53'], 'timeout': '30'}}})  2025-03-23 22:05:55.985467 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'designate-central', 'value': {'container_name': 'designate_central', 'group': 'designate-central', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/designate-central:18.0.1.20241206', 'volumes': ['/etc/kolla/designate-central/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port designate-central 5672'], 'timeout': '30'}}})  2025-03-23 22:05:55.985500 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'designate-mdns', 'value': {'container_name': 'designate_mdns', 'group': 'designate-mdns', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/designate-mdns:18.0.1.20241206', 'volumes': ['/etc/kolla/designate-mdns/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port designate-mdns 5672'], 'timeout': '30'}}})  2025-03-23 22:05:55.985512 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'designate-producer', 'value': {'container_name': 'designate_producer', 'group': 'designate-producer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/designate-producer:18.0.1.20241206', 'volumes': ['/etc/kolla/designate-producer/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port designate-producer 5672'], 'timeout': '30'}}})  2025-03-23 22:05:55.985522 | orchestrator | changed: [testbed-node-0] => (item={'key': 'designate-api', 'value': {'container_name': 'designate_api', 'group': 'designate-api', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/designate-api:18.0.1.20241206', 'volumes': ['/etc/kolla/designate-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:9001'], 'timeout': '30'}, 'haproxy': {'designate_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9001', 'listen_port': '9001'}, 'designate_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9001', 'listen_port': '9001'}}}}) 2025-03-23 22:05:55.985532 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'designate-worker', 'value': {'container_name': 'designate_worker', 'group': 'designate-worker', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/designate-worker:18.0.1.20241206', 'volumes': ['/etc/kolla/designate-worker/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port designate-worker 5672'], 'timeout': '30'}}})  2025-03-23 
22:05:55.985548 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'designate-sink', 'value': {'container_name': 'designate_sink', 'group': 'designate-sink', 'enabled': False, 'image': 'registry.osism.tech/kolla/release/designate-sink:18.0.1.20241206', 'volumes': ['/etc/kolla/designate-sink/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port designate-sink 5672'], 'timeout': '30'}}})  2025-03-23 22:05:55.985618 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'designate-backend-bind9', 'value': {'container_name': 'designate_backend_bind9', 'group': 'designate-backend-bind9', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/designate-backend-bind9:18.0.1.20241206', 'volumes': ['/etc/kolla/designate-backend-bind9/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'designate_backend_bind9:/var/lib/named/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen named 53'], 'timeout': '30'}}})  2025-03-23 22:05:55.985629 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'designate-central', 'value': {'container_name': 'designate_central', 'group': 'designate-central', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/designate-central:18.0.1.20241206', 'volumes': ['/etc/kolla/designate-central/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port designate-central 5672'], 'timeout': '30'}}})  2025-03-23 22:05:55.985639 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'designate-mdns', 'value': {'container_name': 'designate_mdns', 'group': 'designate-mdns', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/designate-mdns:18.0.1.20241206', 'volumes': ['/etc/kolla/designate-mdns/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port designate-mdns 5672'], 'timeout': '30'}}})  2025-03-23 22:05:55.985649 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'designate-producer', 'value': {'container_name': 'designate_producer', 'group': 'designate-producer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/designate-producer:18.0.1.20241206', 'volumes': ['/etc/kolla/designate-producer/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port designate-producer 5672'], 'timeout': '30'}}})  2025-03-23 22:05:55.985659 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'designate-worker', 'value': {'container_name': 'designate_worker', 'group': 'designate-worker', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/designate-worker:18.0.1.20241206', 'volumes': 
['/etc/kolla/designate-worker/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port designate-worker 5672'], 'timeout': '30'}}})  2025-03-23 22:05:55.985676 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'designate-sink', 'value': {'container_name': 'designate_sink', 'group': 'designate-sink', 'enabled': False, 'image': 'registry.osism.tech/kolla/release/designate-sink:18.0.1.20241206', 'volumes': ['/etc/kolla/designate-sink/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port designate-sink 5672'], 'timeout': '30'}}})  2025-03-23 22:05:55.985694 | orchestrator | changed: [testbed-node-2] => (item={'key': 'designate-api', 'value': {'container_name': 'designate_api', 'group': 'designate-api', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/designate-api:18.0.1.20241206', 'volumes': ['/etc/kolla/designate-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:9001'], 'timeout': '30'}, 'haproxy': {'designate_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9001', 'listen_port': '9001'}, 'designate_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9001', 'listen_port': '9001'}}}}) 2025-03-23 22:05:55.985714 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'designate-backend-bind9', 'value': {'container_name': 'designate_backend_bind9', 'group': 'designate-backend-bind9', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/designate-backend-bind9:18.0.1.20241206', 'volumes': ['/etc/kolla/designate-backend-bind9/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'designate_backend_bind9:/var/lib/named/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen named 53'], 'timeout': '30'}}})  2025-03-23 22:05:55.985740 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'designate-central', 'value': {'container_name': 'designate_central', 'group': 'designate-central', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/designate-central:18.0.1.20241206', 'volumes': ['/etc/kolla/designate-central/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port designate-central 5672'], 'timeout': '30'}}})  2025-03-23 22:05:55.985750 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'designate-mdns', 'value': {'container_name': 'designate_mdns', 'group': 'designate-mdns', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/designate-mdns:18.0.1.20241206', 'volumes': 
['/etc/kolla/designate-mdns/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port designate-mdns 5672'], 'timeout': '30'}}})  2025-03-23 22:05:55.985759 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'designate-producer', 'value': {'container_name': 'designate_producer', 'group': 'designate-producer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/designate-producer:18.0.1.20241206', 'volumes': ['/etc/kolla/designate-producer/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port designate-producer 5672'], 'timeout': '30'}}})  2025-03-23 22:05:55.985776 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'designate-worker', 'value': {'container_name': 'designate_worker', 'group': 'designate-worker', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/designate-worker:18.0.1.20241206', 'volumes': ['/etc/kolla/designate-worker/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port designate-worker 5672'], 'timeout': '30'}}})  2025-03-23 22:05:55.985786 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'designate-sink', 'value': {'container_name': 'designate_sink', 'group': 'designate-sink', 'enabled': False, 'image': 'registry.osism.tech/kolla/release/designate-sink:18.0.1.20241206', 'volumes': ['/etc/kolla/designate-sink/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port designate-sink 5672'], 'timeout': '30'}}})  2025-03-23 22:05:55.985799 | orchestrator | 2025-03-23 22:05:55.985808 | orchestrator | TASK [haproxy-config : Add configuration for designate when using single external frontend] *** 2025-03-23 22:05:55.985817 | orchestrator | Sunday 23 March 2025 21:59:49 +0000 (0:00:05.755) 0:02:42.452 ********** 2025-03-23 22:05:55.985830 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'designate-api', 'value': {'container_name': 'designate_api', 'group': 'designate-api', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/designate-api:18.0.1.20241206', 'volumes': ['/etc/kolla/designate-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:9001'], 'timeout': '30'}, 'haproxy': {'designate_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9001', 'listen_port': '9001'}, 'designate_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9001', 'listen_port': '9001'}}}})  2025-03-23 22:05:55.985840 | orchestrator | skipping: [testbed-node-0] => (item={'key': 
'designate-backend-bind9', 'value': {'container_name': 'designate_backend_bind9', 'group': 'designate-backend-bind9', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/designate-backend-bind9:18.0.1.20241206', 'volumes': ['/etc/kolla/designate-backend-bind9/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'designate_backend_bind9:/var/lib/named/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen named 53'], 'timeout': '30'}}})  2025-03-23 22:05:55.985855 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'designate-central', 'value': {'container_name': 'designate_central', 'group': 'designate-central', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/designate-central:18.0.1.20241206', 'volumes': ['/etc/kolla/designate-central/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port designate-central 5672'], 'timeout': '30'}}})  2025-03-23 22:05:55.985864 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'designate-mdns', 'value': {'container_name': 'designate_mdns', 'group': 'designate-mdns', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/designate-mdns:18.0.1.20241206', 'volumes': ['/etc/kolla/designate-mdns/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port designate-mdns 5672'], 'timeout': '30'}}})  2025-03-23 22:05:55.985873 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'designate-producer', 'value': {'container_name': 'designate_producer', 'group': 'designate-producer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/designate-producer:18.0.1.20241206', 'volumes': ['/etc/kolla/designate-producer/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port designate-producer 5672'], 'timeout': '30'}}})  2025-03-23 22:05:55.985887 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'designate-worker', 'value': {'container_name': 'designate_worker', 'group': 'designate-worker', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/designate-worker:18.0.1.20241206', 'volumes': ['/etc/kolla/designate-worker/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port designate-worker 5672'], 'timeout': '30'}}})  2025-03-23 22:05:55.985901 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'designate-sink', 'value': {'container_name': 'designate_sink', 'group': 'designate-sink', 'enabled': False, 'image': 'registry.osism.tech/kolla/release/designate-sink:18.0.1.20241206', 'volumes': ['/etc/kolla/designate-sink/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port designate-sink 5672'], 'timeout': '30'}}})  2025-03-23 22:05:55.985910 | orchestrator | skipping: [testbed-node-0] 2025-03-23 22:05:55.985919 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'designate-api', 'value': {'container_name': 'designate_api', 'group': 'designate-api', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/designate-api:18.0.1.20241206', 'volumes': ['/etc/kolla/designate-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:9001'], 'timeout': '30'}, 'haproxy': {'designate_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9001', 'listen_port': '9001'}, 'designate_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9001', 'listen_port': '9001'}}}})  2025-03-23 22:05:55.985928 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'designate-backend-bind9', 'value': {'container_name': 'designate_backend_bind9', 'group': 'designate-backend-bind9', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/designate-backend-bind9:18.0.1.20241206', 'volumes': ['/etc/kolla/designate-backend-bind9/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'designate_backend_bind9:/var/lib/named/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen named 53'], 'timeout': '30'}}})  2025-03-23 22:05:55.985943 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'designate-central', 'value': {'container_name': 'designate_central', 'group': 'designate-central', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/designate-central:18.0.1.20241206', 'volumes': ['/etc/kolla/designate-central/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port designate-central 5672'], 'timeout': '30'}}})  2025-03-23 22:05:55.985952 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'designate-mdns', 'value': {'container_name': 'designate_mdns', 'group': 'designate-mdns', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/designate-mdns:18.0.1.20241206', 'volumes': ['/etc/kolla/designate-mdns/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port designate-mdns 5672'], 'timeout': '30'}}})  2025-03-23 22:05:55.985966 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'designate-producer', 'value': {'container_name': 'designate_producer', 'group': 'designate-producer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/designate-producer:18.0.1.20241206', 'volumes': 
['/etc/kolla/designate-producer/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port designate-producer 5672'], 'timeout': '30'}}})  2025-03-23 22:05:55.985979 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'designate-worker', 'value': {'container_name': 'designate_worker', 'group': 'designate-worker', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/designate-worker:18.0.1.20241206', 'volumes': ['/etc/kolla/designate-worker/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port designate-worker 5672'], 'timeout': '30'}}})  2025-03-23 22:05:55.985988 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'designate-sink', 'value': {'container_name': 'designate_sink', 'group': 'designate-sink', 'enabled': False, 'image': 'registry.osism.tech/kolla/release/designate-sink:18.0.1.20241206', 'volumes': ['/etc/kolla/designate-sink/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port designate-sink 5672'], 'timeout': '30'}}})  2025-03-23 22:05:55.985997 | orchestrator | skipping: [testbed-node-1] 2025-03-23 22:05:55.986012 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'designate-api', 'value': {'container_name': 'designate_api', 'group': 'designate-api', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/designate-api:18.0.1.20241206', 'volumes': ['/etc/kolla/designate-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:9001'], 'timeout': '30'}, 'haproxy': {'designate_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9001', 'listen_port': '9001'}, 'designate_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9001', 'listen_port': '9001'}}}})  2025-03-23 22:05:55.986072 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'designate-backend-bind9', 'value': {'container_name': 'designate_backend_bind9', 'group': 'designate-backend-bind9', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/designate-backend-bind9:18.0.1.20241206', 'volumes': ['/etc/kolla/designate-backend-bind9/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'designate_backend_bind9:/var/lib/named/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen named 53'], 'timeout': '30'}}})  2025-03-23 22:05:55.986116 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'designate-central', 'value': {'container_name': 'designate_central', 'group': 'designate-central', 'enabled': True, 'image': 
'registry.osism.tech/kolla/release/designate-central:18.0.1.20241206', 'volumes': ['/etc/kolla/designate-central/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port designate-central 5672'], 'timeout': '30'}}})  2025-03-23 22:05:55.986132 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'designate-mdns', 'value': {'container_name': 'designate_mdns', 'group': 'designate-mdns', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/designate-mdns:18.0.1.20241206', 'volumes': ['/etc/kolla/designate-mdns/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port designate-mdns 5672'], 'timeout': '30'}}})  2025-03-23 22:05:55.986141 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'designate-producer', 'value': {'container_name': 'designate_producer', 'group': 'designate-producer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/designate-producer:18.0.1.20241206', 'volumes': ['/etc/kolla/designate-producer/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port designate-producer 5672'], 'timeout': '30'}}})  2025-03-23 22:05:55.986155 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'designate-worker', 'value': {'container_name': 'designate_worker', 'group': 'designate-worker', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/designate-worker:18.0.1.20241206', 'volumes': ['/etc/kolla/designate-worker/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port designate-worker 5672'], 'timeout': '30'}}})  2025-03-23 22:05:55.986165 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'designate-sink', 'value': {'container_name': 'designate_sink', 'group': 'designate-sink', 'enabled': False, 'image': 'registry.osism.tech/kolla/release/designate-sink:18.0.1.20241206', 'volumes': ['/etc/kolla/designate-sink/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port designate-sink 5672'], 'timeout': '30'}}})  2025-03-23 22:05:55.986173 | orchestrator | skipping: [testbed-node-2] 2025-03-23 22:05:55.986182 | orchestrator | 2025-03-23 22:05:55.986191 | orchestrator | TASK [haproxy-config : Configuring firewall for designate] ********************* 2025-03-23 22:05:55.986200 | orchestrator | Sunday 23 March 2025 21:59:51 +0000 (0:00:01.577) 0:02:44.030 ********** 2025-03-23 22:05:55.986209 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'designate_api', 'value': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9001', 'listen_port': '9001'}})  2025-03-23 
22:05:55.986219 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'designate_api_external', 'value': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9001', 'listen_port': '9001'}})  2025-03-23 22:05:55.986229 | orchestrator | skipping: [testbed-node-0] 2025-03-23 22:05:55.986238 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'designate_api', 'value': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9001', 'listen_port': '9001'}})  2025-03-23 22:05:55.986247 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'designate_api', 'value': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9001', 'listen_port': '9001'}})  2025-03-23 22:05:55.986265 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'designate_api_external', 'value': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9001', 'listen_port': '9001'}})  2025-03-23 22:05:55.986274 | orchestrator | skipping: [testbed-node-1] 2025-03-23 22:05:55.986283 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'designate_api_external', 'value': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9001', 'listen_port': '9001'}})  2025-03-23 22:05:55.986291 | orchestrator | skipping: [testbed-node-2] 2025-03-23 22:05:55.986300 | orchestrator | 2025-03-23 22:05:55.986309 | orchestrator | TASK [proxysql-config : Copying over designate ProxySQL users config] ********** 2025-03-23 22:05:55.986317 | orchestrator | Sunday 23 March 2025 21:59:52 +0000 (0:00:01.711) 0:02:45.741 ********** 2025-03-23 22:05:55.986326 | orchestrator | changed: [testbed-node-0] 2025-03-23 22:05:55.986335 | orchestrator | changed: [testbed-node-1] 2025-03-23 22:05:55.986343 | orchestrator | changed: [testbed-node-2] 2025-03-23 22:05:55.986352 | orchestrator | 2025-03-23 22:05:55.986361 | orchestrator | TASK [proxysql-config : Copying over designate ProxySQL rules config] ********** 2025-03-23 22:05:55.986370 | orchestrator | Sunday 23 March 2025 21:59:54 +0000 (0:00:01.505) 0:02:47.247 ********** 2025-03-23 22:05:55.986378 | orchestrator | changed: [testbed-node-0] 2025-03-23 22:05:55.986387 | orchestrator | changed: [testbed-node-1] 2025-03-23 22:05:55.986396 | orchestrator | changed: [testbed-node-2] 2025-03-23 22:05:55.986404 | orchestrator | 2025-03-23 22:05:55.986413 | orchestrator | TASK [include_role : etcd] ***************************************************** 2025-03-23 22:05:55.986421 | orchestrator | Sunday 23 March 2025 21:59:57 +0000 (0:00:02.801) 0:02:50.049 ********** 2025-03-23 22:05:55.986430 | orchestrator | skipping: [testbed-node-0] 2025-03-23 22:05:55.986439 | orchestrator | skipping: [testbed-node-1] 2025-03-23 22:05:55.986447 | orchestrator | skipping: [testbed-node-2] 2025-03-23 22:05:55.986456 | orchestrator | 2025-03-23 22:05:55.986465 | orchestrator | TASK [include_role : glance] *************************************************** 2025-03-23 22:05:55.986473 | orchestrator | Sunday 23 March 2025 21:59:57 +0000 (0:00:00.435) 0:02:50.485 ********** 2025-03-23 22:05:55.986482 | orchestrator | included: glance for testbed-node-0, testbed-node-1, testbed-node-2 2025-03-23 22:05:55.986491 | orchestrator | 2025-03-23 22:05:55.986500 | orchestrator | TASK [haproxy-config : Copying over glance haproxy config] ********************* 2025-03-23 22:05:55.986508 | orchestrator | Sunday 23 March 2025 21:59:58 +0000 (0:00:01.231) 
0:02:51.716 ********** 2025-03-23 22:05:55.986530 | orchestrator | changed: [testbed-node-1] => (item={'key': 'glance-api', 'value': {'container_name': 'glance_api', 'group': 'glance-api', 'host_in_groups': True, 'enabled': True, 'image': 'registry.osism.tech/kolla/release/glance-api:28.1.1.20241206', 'environment': {'http_proxy': '', 'https_proxy': '', 'no_proxy': 'localhost,127.0.0.1,192.168.16.11,192.168.16.9'}, 'privileged': True, 'volumes': ['/etc/kolla/glance-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'glance:/var/lib/glance/', '', 'kolla_logs:/var/log/kolla/', 'iscsi_info:/etc/iscsi', '/dev:/dev'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:9292'], 'timeout': '30'}, 'haproxy': {'glance_api': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9292', 'frontend_http_extra': ['timeout client 6h'], 'backend_http_extra': ['timeout server 6h'], 'custom_member_list': ['server testbed-node-0 192.168.16.10:9292 check inter 2000 rise 2 fall 5', 'server testbed-node-1 192.168.16.11:9292 check inter 2000 rise 2 fall 5', 'server testbed-node-2 192.168.16.12:9292 check inter 2000 rise 2 fall 5', '']}, 'glance_api_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9292', 'frontend_http_extra': ['timeout client 6h'], 'backend_http_extra': ['timeout server 6h'], 'custom_member_list': ['server testbed-node-0 192.168.16.10:9292 check inter 2000 rise 2 fall 5', 'server testbed-node-1 192.168.16.11:9292 check inter 2000 rise 2 fall 5', 'server testbed-node-2 192.168.16.12:9292 check inter 2000 rise 2 fall 5', '']}}}}) 2025-03-23 22:05:55.986551 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'glance-tls-proxy', 'value': {'container_name': 'glance_tls_proxy', 'group': 'glance-api', 'host_in_groups': True, 'enabled': 'no', 'image': 'registry.osism.tech/kolla/release/glance-tls-proxy:28.1.1.20241206', 'volumes': ['/etc/kolla/glance-tls-proxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl -u openstack:password 192.168.16.11:9293'], 'timeout': '30'}, 'haproxy': {'glance_tls_proxy': {'enabled': False, 'mode': 'http', 'external': False, 'port': '9292', 'frontend_http_extra': ['timeout client 6h'], 'backend_http_extra': ['timeout server 6h'], 'custom_member_list': ['server testbed-node-0 192.168.16.10:9292 check inter 2000 rise 2 fall 5 ssl verify required ca-file ca-certificates.crt', 'server testbed-node-1 192.168.16.11:9292 check inter 2000 rise 2 fall 5 ssl verify required ca-file ca-certificates.crt', 'server testbed-node-2 192.168.16.12:9292 check inter 2000 rise 2 fall 5 ssl verify required ca-file ca-certificates.crt', ''], 'tls_backend': 'yes'}, 'glance_tls_proxy_external': {'enabled': False, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9292', 'frontend_http_extra': ['timeout client 6h'], 'backend_http_extra': ['timeout server 6h'], 'custom_member_list': ['server testbed-node-0 192.168.16.10:9292 check inter 2000 rise 2 fall 5 ssl verify required ca-file ca-certificates.crt', 'server testbed-node-1 192.168.16.11:9292 check inter 2000 rise 2 fall 5 ssl verify required ca-file 
ca-certificates.crt', 'server testbed-node-2 192.168.16.12:9292 check inter 2000 rise 2 fall 5 ssl verify required ca-file ca-certificates.crt', ''], 'tls_backend': 'yes'}}}})  2025-03-23 22:05:55.986611 | orchestrator | changed: [testbed-node-0] => (item={'key': 'glance-api', 'value': {'container_name': 'glance_api', 'group': 'glance-api', 'host_in_groups': True, 'enabled': True, 'image': 'registry.osism.tech/kolla/release/glance-api:28.1.1.20241206', 'environment': {'http_proxy': '', 'https_proxy': '', 'no_proxy': 'localhost,127.0.0.1,192.168.16.10,192.168.16.9'}, 'privileged': True, 'volumes': ['/etc/kolla/glance-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'glance:/var/lib/glance/', '', 'kolla_logs:/var/log/kolla/', 'iscsi_info:/etc/iscsi', '/dev:/dev'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:9292'], 'timeout': '30'}, 'haproxy': {'glance_api': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9292', 'frontend_http_extra': ['timeout client 6h'], 'backend_http_extra': ['timeout server 6h'], 'custom_member_list': ['server testbed-node-0 192.168.16.10:9292 check inter 2000 rise 2 fall 5', 'server testbed-node-1 192.168.16.11:9292 check inter 2000 rise 2 fall 5', 'server testbed-node-2 192.168.16.12:9292 check inter 2000 rise 2 fall 5', '']}, 'glance_api_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9292', 'frontend_http_extra': ['timeout client 6h'], 'backend_http_extra': ['timeout server 6h'], 'custom_member_list': ['server testbed-node-0 192.168.16.10:9292 check inter 2000 rise 2 fall 5', 'server testbed-node-1 192.168.16.11:9292 check inter 2000 rise 2 fall 5', 'server testbed-node-2 192.168.16.12:9292 check inter 2000 rise 2 fall 5', '']}}}}) 2025-03-23 22:05:55.986634 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'glance-tls-proxy', 'value': {'container_name': 'glance_tls_proxy', 'group': 'glance-api', 'host_in_groups': True, 'enabled': 'no', 'image': 'registry.osism.tech/kolla/release/glance-tls-proxy:28.1.1.20241206', 'volumes': ['/etc/kolla/glance-tls-proxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl -u openstack:password 192.168.16.10:9293'], 'timeout': '30'}, 'haproxy': {'glance_tls_proxy': {'enabled': False, 'mode': 'http', 'external': False, 'port': '9292', 'frontend_http_extra': ['timeout client 6h'], 'backend_http_extra': ['timeout server 6h'], 'custom_member_list': ['server testbed-node-0 192.168.16.10:9292 check inter 2000 rise 2 fall 5 ssl verify required ca-file ca-certificates.crt', 'server testbed-node-1 192.168.16.11:9292 check inter 2000 rise 2 fall 5 ssl verify required ca-file ca-certificates.crt', 'server testbed-node-2 192.168.16.12:9292 check inter 2000 rise 2 fall 5 ssl verify required ca-file ca-certificates.crt', ''], 'tls_backend': 'yes'}, 'glance_tls_proxy_external': {'enabled': False, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9292', 'frontend_http_extra': ['timeout client 6h'], 'backend_http_extra': ['timeout server 6h'], 'custom_member_list': ['server testbed-node-0 192.168.16.10:9292 check inter 2000 rise 2 fall 5 ssl verify 
required ca-file ca-certificates.crt', 'server testbed-node-1 192.168.16.11:9292 check inter 2000 rise 2 fall 5 ssl verify required ca-file ca-certificates.crt', 'server testbed-node-2 192.168.16.12:9292 check inter 2000 rise 2 fall 5 ssl verify required ca-file ca-certificates.crt', ''], 'tls_backend': 'yes'}}}})  2025-03-23 22:05:55.986648 | orchestrator | changed: [testbed-node-2] => (item={'key': 'glance-api', 'value': {'container_name': 'glance_api', 'group': 'glance-api', 'host_in_groups': True, 'enabled': True, 'image': 'registry.osism.tech/kolla/release/glance-api:28.1.1.20241206', 'environment': {'http_proxy': '', 'https_proxy': '', 'no_proxy': 'localhost,127.0.0.1,192.168.16.12,192.168.16.9'}, 'privileged': True, 'volumes': ['/etc/kolla/glance-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'glance:/var/lib/glance/', '', 'kolla_logs:/var/log/kolla/', 'iscsi_info:/etc/iscsi', '/dev:/dev'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:9292'], 'timeout': '30'}, 'haproxy': {'glance_api': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9292', 'frontend_http_extra': ['timeout client 6h'], 'backend_http_extra': ['timeout server 6h'], 'custom_member_list': ['server testbed-node-0 192.168.16.10:9292 check inter 2000 rise 2 fall 5', 'server testbed-node-1 192.168.16.11:9292 check inter 2000 rise 2 fall 5', 'server testbed-node-2 192.168.16.12:9292 check inter 2000 rise 2 fall 5', '']}, 'glance_api_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9292', 'frontend_http_extra': ['timeout client 6h'], 'backend_http_extra': ['timeout server 6h'], 'custom_member_list': ['server testbed-node-0 192.168.16.10:9292 check inter 2000 rise 2 fall 5', 'server testbed-node-1 192.168.16.11:9292 check inter 2000 rise 2 fall 5', 'server testbed-node-2 192.168.16.12:9292 check inter 2000 rise 2 fall 5', '']}}}}) 2025-03-23 22:05:55.986664 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'glance-tls-proxy', 'value': {'container_name': 'glance_tls_proxy', 'group': 'glance-api', 'host_in_groups': True, 'enabled': 'no', 'image': 'registry.osism.tech/kolla/release/glance-tls-proxy:28.1.1.20241206', 'volumes': ['/etc/kolla/glance-tls-proxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl -u openstack:password 192.168.16.12:9293'], 'timeout': '30'}, 'haproxy': {'glance_tls_proxy': {'enabled': False, 'mode': 'http', 'external': False, 'port': '9292', 'frontend_http_extra': ['timeout client 6h'], 'backend_http_extra': ['timeout server 6h'], 'custom_member_list': ['server testbed-node-0 192.168.16.10:9292 check inter 2000 rise 2 fall 5 ssl verify required ca-file ca-certificates.crt', 'server testbed-node-1 192.168.16.11:9292 check inter 2000 rise 2 fall 5 ssl verify required ca-file ca-certificates.crt', 'server testbed-node-2 192.168.16.12:9292 check inter 2000 rise 2 fall 5 ssl verify required ca-file ca-certificates.crt', ''], 'tls_backend': 'yes'}, 'glance_tls_proxy_external': {'enabled': False, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9292', 'frontend_http_extra': ['timeout client 6h'], 
'backend_http_extra': ['timeout server 6h'], 'custom_member_list': ['server testbed-node-0 192.168.16.10:9292 check inter 2000 rise 2 fall 5 ssl verify required ca-file ca-certificates.crt', 'server testbed-node-1 192.168.16.11:9292 check inter 2000 rise 2 fall 5 ssl verify required ca-file ca-certificates.crt', 'server testbed-node-2 192.168.16.12:9292 check inter 2000 rise 2 fall 5 ssl verify required ca-file ca-certificates.crt', ''], 'tls_backend': 'yes'}}}})  2025-03-23 22:05:55.986678 | orchestrator | 2025-03-23 22:05:55.986687 | orchestrator | TASK [haproxy-config : Add configuration for glance when using single external frontend] *** 2025-03-23 22:05:55.986696 | orchestrator | Sunday 23 March 2025 22:00:07 +0000 (0:00:08.305) 0:03:00.021 ********** 2025-03-23 22:05:55.986709 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'glance-api', 'value': {'container_name': 'glance_api', 'group': 'glance-api', 'host_in_groups': True, 'enabled': True, 'image': 'registry.osism.tech/kolla/release/glance-api:28.1.1.20241206', 'environment': {'http_proxy': '', 'https_proxy': '', 'no_proxy': 'localhost,127.0.0.1,192.168.16.10,192.168.16.9'}, 'privileged': True, 'volumes': ['/etc/kolla/glance-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'glance:/var/lib/glance/', '', 'kolla_logs:/var/log/kolla/', 'iscsi_info:/etc/iscsi', '/dev:/dev'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:9292'], 'timeout': '30'}, 'haproxy': {'glance_api': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9292', 'frontend_http_extra': ['timeout client 6h'], 'backend_http_extra': ['timeout server 6h'], 'custom_member_list': ['server testbed-node-0 192.168.16.10:9292 check inter 2000 rise 2 fall 5', 'server testbed-node-1 192.168.16.11:9292 check inter 2000 rise 2 fall 5', 'server testbed-node-2 192.168.16.12:9292 check inter 2000 rise 2 fall 5', '']}, 'glance_api_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9292', 'frontend_http_extra': ['timeout client 6h'], 'backend_http_extra': ['timeout server 6h'], 'custom_member_list': ['server testbed-node-0 192.168.16.10:9292 check inter 2000 rise 2 fall 5', 'server testbed-node-1 192.168.16.11:9292 check inter 2000 rise 2 fall 5', 'server testbed-node-2 192.168.16.12:9292 check inter 2000 rise 2 fall 5', '']}}}})  2025-03-23 22:05:55.986725 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'glance-tls-proxy', 'value': {'container_name': 'glance_tls_proxy', 'group': 'glance-api', 'host_in_groups': True, 'enabled': 'no', 'image': 'registry.osism.tech/kolla/release/glance-tls-proxy:28.1.1.20241206', 'volumes': ['/etc/kolla/glance-tls-proxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl -u openstack:password 192.168.16.10:9293'], 'timeout': '30'}, 'haproxy': {'glance_tls_proxy': {'enabled': False, 'mode': 'http', 'external': False, 'port': '9292', 'frontend_http_extra': ['timeout client 6h'], 'backend_http_extra': ['timeout server 6h'], 'custom_member_list': ['server testbed-node-0 192.168.16.10:9292 check inter 2000 rise 2 fall 5 ssl verify required ca-file ca-certificates.crt', 'server testbed-node-1 
192.168.16.11:9292 check inter 2000 rise 2 fall 5 ssl verify required ca-file ca-certificates.crt', 'server testbed-node-2 192.168.16.12:9292 check inter 2000 rise 2 fall 5 ssl verify required ca-file ca-certificates.crt', ''], 'tls_backend': 'yes'}, 'glance_tls_proxy_external': {'enabled': False, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9292', 'frontend_http_extra': ['timeout client 6h'], 'backend_http_extra': ['timeout server 6h'], 'custom_member_list': ['server testbed-node-0 192.168.16.10:9292 check inter 2000 rise 2 fall 5 ssl verify required ca-file ca-certificates.crt', 'server testbed-node-1 192.168.16.11:9292 check inter 2000 rise 2 fall 5 ssl verify required ca-file ca-certificates.crt', 'server testbed-node-2 192.168.16.12:9292 check inter 2000 rise 2 fall 5 ssl verify required ca-file ca-certificates.crt', ''], 'tls_backend': 'yes'}}}})  2025-03-23 22:05:55.986739 | orchestrator | skipping: [testbed-node-0] 2025-03-23 22:05:55.986748 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'glance-api', 'value': {'container_name': 'glance_api', 'group': 'glance-api', 'host_in_groups': True, 'enabled': True, 'image': 'registry.osism.tech/kolla/release/glance-api:28.1.1.20241206', 'environment': {'http_proxy': '', 'https_proxy': '', 'no_proxy': 'localhost,127.0.0.1,192.168.16.12,192.168.16.9'}, 'privileged': True, 'volumes': ['/etc/kolla/glance-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'glance:/var/lib/glance/', '', 'kolla_logs:/var/log/kolla/', 'iscsi_info:/etc/iscsi', '/dev:/dev'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:9292'], 'timeout': '30'}, 'haproxy': {'glance_api': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9292', 'frontend_http_extra': ['timeout client 6h'], 'backend_http_extra': ['timeout server 6h'], 'custom_member_list': ['server testbed-node-0 192.168.16.10:9292 check inter 2000 rise 2 fall 5', 'server testbed-node-1 192.168.16.11:9292 check inter 2000 rise 2 fall 5', 'server testbed-node-2 192.168.16.12:9292 check inter 2000 rise 2 fall 5', '']}, 'glance_api_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9292', 'frontend_http_extra': ['timeout client 6h'], 'backend_http_extra': ['timeout server 6h'], 'custom_member_list': ['server testbed-node-0 192.168.16.10:9292 check inter 2000 rise 2 fall 5', 'server testbed-node-1 192.168.16.11:9292 check inter 2000 rise 2 fall 5', 'server testbed-node-2 192.168.16.12:9292 check inter 2000 rise 2 fall 5', '']}}}})  2025-03-23 22:05:55.986768 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'glance-tls-proxy', 'value': {'container_name': 'glance_tls_proxy', 'group': 'glance-api', 'host_in_groups': True, 'enabled': 'no', 'image': 'registry.osism.tech/kolla/release/glance-tls-proxy:28.1.1.20241206', 'volumes': ['/etc/kolla/glance-tls-proxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl -u openstack:password 192.168.16.12:9293'], 'timeout': '30'}, 'haproxy': {'glance_tls_proxy': {'enabled': False, 'mode': 'http', 'external': False, 'port': '9292', 'frontend_http_extra': ['timeout client 6h'], 
'backend_http_extra': ['timeout server 6h'], 'custom_member_list': ['server testbed-node-0 192.168.16.10:9292 check inter 2000 rise 2 fall 5 ssl verify required ca-file ca-certificates.crt', 'server testbed-node-1 192.168.16.11:9292 check inter 2000 rise 2 fall 5 ssl verify required ca-file ca-certificates.crt', 'server testbed-node-2 192.168.16.12:9292 check inter 2000 rise 2 fall 5 ssl verify required ca-file ca-certificates.crt', ''], 'tls_backend': 'yes'}, 'glance_tls_proxy_external': {'enabled': False, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9292', 'frontend_http_extra': ['timeout client 6h'], 'backend_http_extra': ['timeout server 6h'], 'custom_member_list': ['server testbed-node-0 192.168.16.10:9292 check inter 2000 rise 2 fall 5 ssl verify required ca-file ca-certificates.crt', 'server testbed-node-1 192.168.16.11:9292 check inter 2000 rise 2 fall 5 ssl verify required ca-file ca-certificates.crt', 'server testbed-node-2 192.168.16.12:9292 check inter 2000 rise 2 fall 5 ssl verify required ca-file ca-certificates.crt', ''], 'tls_backend': 'yes'}}}})  2025-03-23 22:05:55.986783 | orchestrator | skipping: [testbed-node-2] 2025-03-23 22:05:55.986792 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'glance-api', 'value': {'container_name': 'glance_api', 'group': 'glance-api', 'host_in_groups': True, 'enabled': True, 'image': 'registry.osism.tech/kolla/release/glance-api:28.1.1.20241206', 'environment': {'http_proxy': '', 'https_proxy': '', 'no_proxy': 'localhost,127.0.0.1,192.168.16.11,192.168.16.9'}, 'privileged': True, 'volumes': ['/etc/kolla/glance-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'glance:/var/lib/glance/', '', 'kolla_logs:/var/log/kolla/', 'iscsi_info:/etc/iscsi', '/dev:/dev'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:9292'], 'timeout': '30'}, 'haproxy': {'glance_api': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9292', 'frontend_http_extra': ['timeout client 6h'], 'backend_http_extra': ['timeout server 6h'], 'custom_member_list': ['server testbed-node-0 192.168.16.10:9292 check inter 2000 rise 2 fall 5', 'server testbed-node-1 192.168.16.11:9292 check inter 2000 rise 2 fall 5', 'server testbed-node-2 192.168.16.12:9292 check inter 2000 rise 2 fall 5', '']}, 'glance_api_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9292', 'frontend_http_extra': ['timeout client 6h'], 'backend_http_extra': ['timeout server 6h'], 'custom_member_list': ['server testbed-node-0 192.168.16.10:9292 check inter 2000 rise 2 fall 5', 'server testbed-node-1 192.168.16.11:9292 check inter 2000 rise 2 fall 5', 'server testbed-node-2 192.168.16.12:9292 check inter 2000 rise 2 fall 5', '']}}}})  2025-03-23 22:05:55.986807 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'glance-tls-proxy', 'value': {'container_name': 'glance_tls_proxy', 'group': 'glance-api', 'host_in_groups': True, 'enabled': 'no', 'image': 'registry.osism.tech/kolla/release/glance-tls-proxy:28.1.1.20241206', 'volumes': ['/etc/kolla/glance-tls-proxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl -u 
openstack:password 192.168.16.11:9293'], 'timeout': '30'}, 'haproxy': {'glance_tls_proxy': {'enabled': False, 'mode': 'http', 'external': False, 'port': '9292', 'frontend_http_extra': ['timeout client 6h'], 'backend_http_extra': ['timeout server 6h'], 'custom_member_list': ['server testbed-node-0 192.168.16.10:9292 check inter 2000 rise 2 fall 5 ssl verify required ca-file ca-certificates.crt', 'server testbed-node-1 192.168.16.11:9292 check inter 2000 rise 2 fall 5 ssl verify required ca-file ca-certificates.crt', 'server testbed-node-2 192.168.16.12:9292 check inter 2000 rise 2 fall 5 ssl verify required ca-file ca-certificates.crt', ''], 'tls_backend': 'yes'}, 'glance_tls_proxy_external': {'enabled': False, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9292', 'frontend_http_extra': ['timeout client 6h'], 'backend_http_extra': ['timeout server 6h'], 'custom_member_list': ['server testbed-node-0 192.168.16.10:9292 check inter 2000 rise 2 fall 5 ssl verify required ca-file ca-certificates.crt', 'server testbed-node-1 192.168.16.11:9292 check inter 2000 rise 2 fall 5 ssl verify required ca-file ca-certificates.crt', 'server testbed-node-2 192.168.16.12:9292 check inter 2000 rise 2 fall 5 ssl verify required ca-file ca-certificates.crt', ''], 'tls_backend': 'yes'}}}})  2025-03-23 22:05:55.986826 | orchestrator | skipping: [testbed-node-1] 2025-03-23 22:05:55.986835 | orchestrator | 2025-03-23 22:05:55.986844 | orchestrator | TASK [haproxy-config : Configuring firewall for glance] ************************ 2025-03-23 22:05:55.986853 | orchestrator | Sunday 23 March 2025 22:00:13 +0000 (0:00:06.732) 0:03:06.754 ********** 2025-03-23 22:05:55.986862 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'glance_api', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9292', 'frontend_http_extra': ['timeout client 6h'], 'backend_http_extra': ['timeout server 6h'], 'custom_member_list': ['server testbed-node-0 192.168.16.10:9292 check inter 2000 rise 2 fall 5', 'server testbed-node-1 192.168.16.11:9292 check inter 2000 rise 2 fall 5', 'server testbed-node-2 192.168.16.12:9292 check inter 2000 rise 2 fall 5', '']}})  2025-03-23 22:05:55.986872 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'glance_api_external', 'value': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9292', 'frontend_http_extra': ['timeout client 6h'], 'backend_http_extra': ['timeout server 6h'], 'custom_member_list': ['server testbed-node-0 192.168.16.10:9292 check inter 2000 rise 2 fall 5', 'server testbed-node-1 192.168.16.11:9292 check inter 2000 rise 2 fall 5', 'server testbed-node-2 192.168.16.12:9292 check inter 2000 rise 2 fall 5', '']}})  2025-03-23 22:05:55.986881 | orchestrator | skipping: [testbed-node-0] 2025-03-23 22:05:55.986889 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'glance_api', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9292', 'frontend_http_extra': ['timeout client 6h'], 'backend_http_extra': ['timeout server 6h'], 'custom_member_list': ['server testbed-node-0 192.168.16.10:9292 check inter 2000 rise 2 fall 5', 'server testbed-node-1 192.168.16.11:9292 check inter 2000 rise 2 fall 5', 'server testbed-node-2 192.168.16.12:9292 check inter 2000 rise 2 fall 5', '']}})  2025-03-23 22:05:55.986899 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'glance_api_external', 'value': {'enabled': True, 'mode': 'http', 'external': 
True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9292', 'frontend_http_extra': ['timeout client 6h'], 'backend_http_extra': ['timeout server 6h'], 'custom_member_list': ['server testbed-node-0 192.168.16.10:9292 check inter 2000 rise 2 fall 5', 'server testbed-node-1 192.168.16.11:9292 check inter 2000 rise 2 fall 5', 'server testbed-node-2 192.168.16.12:9292 check inter 2000 rise 2 fall 5', '']}})  2025-03-23 22:05:55.986908 | orchestrator | skipping: [testbed-node-1] 2025-03-23 22:05:55.986920 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'glance_api', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9292', 'frontend_http_extra': ['timeout client 6h'], 'backend_http_extra': ['timeout server 6h'], 'custom_member_list': ['server testbed-node-0 192.168.16.10:9292 check inter 2000 rise 2 fall 5', 'server testbed-node-1 192.168.16.11:9292 check inter 2000 rise 2 fall 5', 'server testbed-node-2 192.168.16.12:9292 check inter 2000 rise 2 fall 5', '']}})  2025-03-23 22:05:55.986929 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'glance_api_external', 'value': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9292', 'frontend_http_extra': ['timeout client 6h'], 'backend_http_extra': ['timeout server 6h'], 'custom_member_list': ['server testbed-node-0 192.168.16.10:9292 check inter 2000 rise 2 fall 5', 'server testbed-node-1 192.168.16.11:9292 check inter 2000 rise 2 fall 5', 'server testbed-node-2 192.168.16.12:9292 check inter 2000 rise 2 fall 5', '']}})  2025-03-23 22:05:55.986941 | orchestrator | skipping: [testbed-node-2] 2025-03-23 22:05:55.986949 | orchestrator | 2025-03-23 22:05:55.986957 | orchestrator | TASK [proxysql-config : Copying over glance ProxySQL users config] ************* 2025-03-23 22:05:55.986971 | orchestrator | Sunday 23 March 2025 22:00:21 +0000 (0:00:07.406) 0:03:14.161 ********** 2025-03-23 22:05:55.986979 | orchestrator | changed: [testbed-node-0] 2025-03-23 22:05:55.986987 | orchestrator | changed: [testbed-node-1] 2025-03-23 22:05:55.986995 | orchestrator | changed: [testbed-node-2] 2025-03-23 22:05:55.987003 | orchestrator | 2025-03-23 22:05:55.987011 | orchestrator | TASK [proxysql-config : Copying over glance ProxySQL rules config] ************* 2025-03-23 22:05:55.987019 | orchestrator | Sunday 23 March 2025 22:00:22 +0000 (0:00:01.561) 0:03:15.723 ********** 2025-03-23 22:05:55.987027 | orchestrator | changed: [testbed-node-0] 2025-03-23 22:05:55.987035 | orchestrator | changed: [testbed-node-1] 2025-03-23 22:05:55.987043 | orchestrator | changed: [testbed-node-2] 2025-03-23 22:05:55.987051 | orchestrator | 2025-03-23 22:05:55.987059 | orchestrator | TASK [include_role : gnocchi] ************************************************** 2025-03-23 22:05:55.987067 | orchestrator | Sunday 23 March 2025 22:00:25 +0000 (0:00:02.830) 0:03:18.553 ********** 2025-03-23 22:05:55.987075 | orchestrator | skipping: [testbed-node-0] 2025-03-23 22:05:55.987110 | orchestrator | skipping: [testbed-node-1] 2025-03-23 22:05:55.987119 | orchestrator | skipping: [testbed-node-2] 2025-03-23 22:05:55.987128 | orchestrator | 2025-03-23 22:05:55.987136 | orchestrator | TASK [include_role : grafana] ************************************************** 2025-03-23 22:05:55.987144 | orchestrator | Sunday 23 March 2025 22:00:26 +0000 (0:00:00.551) 0:03:19.105 ********** 2025-03-23 22:05:55.987152 | orchestrator | included: grafana for testbed-node-0, testbed-node-1, testbed-node-2 
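The glance tasks above illustrate the pattern this deployment uses for every API service: the haproxy-config role renders an internal and an external frontend/backend pair (glance_api and glance_api_external, both on port 9292, the external one bound to api.testbed.osism.xyz), the firewall tasks are skipped on all three nodes, and the proxysql-config role copies per-service ProxySQL users and rules files. As a rough illustration of how one of the logged service entries maps onto an HAProxy backend, the minimal Python sketch below rebuilds a backend stanza from the glance_api item shown above. The render_backend helper and the exact stanza layout are assumptions made for illustration; this is not the kolla-ansible template itself.

    # Illustrative sketch only, not the kolla-ansible template: rebuilds an
    # HAProxy backend stanza from a kolla-style service dict such as the
    # glance_api item logged above. All helper names are assumptions.

    def render_backend(name, service):
        """Render one HAProxy backend from a kolla-style service dict."""
        lines = [f"backend {name}_back", f"    mode {service.get('mode', 'http')}"]
        # Extra backend options such as 'timeout server 6h' are appended verbatim.
        for extra in service.get("backend_http_extra", []):
            lines.append(f"    {extra}")
        # Member lines arrive fully rendered in custom_member_list;
        # the trailing empty string seen in the log is skipped.
        for member in service.get("custom_member_list", []):
            if member:
                lines.append(f"    {member}")
        return "\n".join(lines)

    glance_api = {
        "mode": "http",
        "port": "9292",
        "backend_http_extra": ["timeout server 6h"],
        "custom_member_list": [
            "server testbed-node-0 192.168.16.10:9292 check inter 2000 rise 2 fall 5",
            "server testbed-node-1 192.168.16.11:9292 check inter 2000 rise 2 fall 5",
            "server testbed-node-2 192.168.16.12:9292 check inter 2000 rise 2 fall 5",
            "",
        ],
    }

    print(render_backend("glance_api", glance_api))

Running the sketch prints the backend with the timeout option followed by the three testbed-node server lines taken from custom_member_list.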
2025-03-23 22:05:55.987160 | orchestrator | 2025-03-23 22:05:55.987168 | orchestrator | TASK [haproxy-config : Copying over grafana haproxy config] ******************** 2025-03-23 22:05:55.987176 | orchestrator | Sunday 23 March 2025 22:00:27 +0000 (0:00:01.315) 0:03:20.421 ********** 2025-03-23 22:05:55.987185 | orchestrator | changed: [testbed-node-1] => (item={'key': 'grafana', 'value': {'container_name': 'grafana', 'group': 'grafana', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/grafana:11.4.0.20241206', 'volumes': ['/etc/kolla/grafana/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'grafana_server': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '3000', 'listen_port': '3000'}, 'grafana_server_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '3000', 'listen_port': '3000'}}}}) 2025-03-23 22:05:55.987193 | orchestrator | changed: [testbed-node-2] => (item={'key': 'grafana', 'value': {'container_name': 'grafana', 'group': 'grafana', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/grafana:11.4.0.20241206', 'volumes': ['/etc/kolla/grafana/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'grafana_server': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '3000', 'listen_port': '3000'}, 'grafana_server_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '3000', 'listen_port': '3000'}}}}) 2025-03-23 22:05:55.987202 | orchestrator | changed: [testbed-node-0] => (item={'key': 'grafana', 'value': {'container_name': 'grafana', 'group': 'grafana', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/grafana:11.4.0.20241206', 'volumes': ['/etc/kolla/grafana/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'grafana_server': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '3000', 'listen_port': '3000'}, 'grafana_server_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '3000', 'listen_port': '3000'}}}}) 2025-03-23 22:05:55.987246 | orchestrator | 2025-03-23 22:05:55.987255 | orchestrator | TASK [haproxy-config : Add configuration for grafana when using single external frontend] *** 2025-03-23 22:05:55.987267 | orchestrator | Sunday 23 March 2025 22:00:31 +0000 (0:00:04.327) 0:03:24.748 ********** 2025-03-23 22:05:55.987282 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'grafana', 'value': {'container_name': 'grafana', 'group': 'grafana', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/grafana:11.4.0.20241206', 'volumes': ['/etc/kolla/grafana/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'grafana_server': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '3000', 'listen_port': '3000'}, 'grafana_server_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '3000', 'listen_port': '3000'}}}})  2025-03-23 22:05:55.987291 | orchestrator | skipping: [testbed-node-0] 2025-03-23 
22:05:55.987299 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'grafana', 'value': {'container_name': 'grafana', 'group': 'grafana', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/grafana:11.4.0.20241206', 'volumes': ['/etc/kolla/grafana/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'grafana_server': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '3000', 'listen_port': '3000'}, 'grafana_server_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '3000', 'listen_port': '3000'}}}})  2025-03-23 22:05:55.987308 | orchestrator | skipping: [testbed-node-1] 2025-03-23 22:05:55.987316 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'grafana', 'value': {'container_name': 'grafana', 'group': 'grafana', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/grafana:11.4.0.20241206', 'volumes': ['/etc/kolla/grafana/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'grafana_server': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '3000', 'listen_port': '3000'}, 'grafana_server_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '3000', 'listen_port': '3000'}}}})  2025-03-23 22:05:55.987332 | orchestrator | skipping: [testbed-node-2] 2025-03-23 22:05:55.987340 | orchestrator | 2025-03-23 22:05:55.987416 | orchestrator | TASK [haproxy-config : Configuring firewall for grafana] *********************** 2025-03-23 22:05:55.987426 | orchestrator | Sunday 23 March 2025 22:00:32 +0000 (0:00:00.439) 0:03:25.188 ********** 2025-03-23 22:05:55.987435 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'grafana_server', 'value': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '3000', 'listen_port': '3000'}})  2025-03-23 22:05:55.987446 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'grafana_server_external', 'value': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '3000', 'listen_port': '3000'}})  2025-03-23 22:05:55.987455 | orchestrator | skipping: [testbed-node-0] 2025-03-23 22:05:55.987463 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'grafana_server', 'value': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '3000', 'listen_port': '3000'}})  2025-03-23 22:05:55.987471 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'grafana_server_external', 'value': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '3000', 'listen_port': '3000'}})  2025-03-23 22:05:55.987483 | orchestrator | skipping: [testbed-node-1] 2025-03-23 22:05:55.987492 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'grafana_server', 'value': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '3000', 'listen_port': '3000'}})  2025-03-23 22:05:55.987500 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'grafana_server_external', 'value': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '3000', 'listen_port': '3000'}})  2025-03-23 22:05:55.987508 | orchestrator | skipping: [testbed-node-2] 2025-03-23 22:05:55.987516 | orchestrator | 2025-03-23 22:05:55.987524 
| orchestrator | TASK [proxysql-config : Copying over grafana ProxySQL users config] ************ 2025-03-23 22:05:55.987532 | orchestrator | Sunday 23 March 2025 22:00:33 +0000 (0:00:01.303) 0:03:26.491 ********** 2025-03-23 22:05:55.987540 | orchestrator | changed: [testbed-node-1] 2025-03-23 22:05:55.987548 | orchestrator | changed: [testbed-node-0] 2025-03-23 22:05:55.987570 | orchestrator | changed: [testbed-node-2] 2025-03-23 22:05:55.987578 | orchestrator | 2025-03-23 22:05:55.987587 | orchestrator | TASK [proxysql-config : Copying over grafana ProxySQL rules config] ************ 2025-03-23 22:05:55.987599 | orchestrator | Sunday 23 March 2025 22:00:34 +0000 (0:00:01.313) 0:03:27.805 ********** 2025-03-23 22:05:55.987607 | orchestrator | changed: [testbed-node-0] 2025-03-23 22:05:55.987615 | orchestrator | changed: [testbed-node-1] 2025-03-23 22:05:55.987623 | orchestrator | changed: [testbed-node-2] 2025-03-23 22:05:55.987631 | orchestrator | 2025-03-23 22:05:55.987639 | orchestrator | TASK [include_role : heat] ***************************************************** 2025-03-23 22:05:55.987647 | orchestrator | Sunday 23 March 2025 22:00:37 +0000 (0:00:02.508) 0:03:30.314 ********** 2025-03-23 22:05:55.987655 | orchestrator | included: heat for testbed-node-0, testbed-node-1, testbed-node-2 2025-03-23 22:05:55.987663 | orchestrator | 2025-03-23 22:05:55.987671 | orchestrator | TASK [haproxy-config : Copying over heat haproxy config] *********************** 2025-03-23 22:05:55.987679 | orchestrator | Sunday 23 March 2025 22:00:38 +0000 (0:00:01.269) 0:03:31.583 ********** 2025-03-23 22:05:55.987687 | orchestrator | changed: [testbed-node-0] => (item={'key': 'heat-api', 'value': {'container_name': 'heat_api', 'group': 'heat-api', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/heat-api:22.0.2.20241206', 'volumes': ['/etc/kolla/heat-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:8004'], 'timeout': '30'}, 'haproxy': {'heat_api': {'enabled': True, 'mode': 'http', 'external': False, 'port': '8004', 'listen_port': '8004', 'tls_backend': 'no'}, 'heat_api_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8004', 'listen_port': '8004', 'tls_backend': 'no'}}}}) 2025-03-23 22:05:55.987696 | orchestrator | changed: [testbed-node-1] => (item={'key': 'heat-api', 'value': {'container_name': 'heat_api', 'group': 'heat-api', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/heat-api:22.0.2.20241206', 'volumes': ['/etc/kolla/heat-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:8004'], 'timeout': '30'}, 'haproxy': {'heat_api': {'enabled': True, 'mode': 'http', 'external': False, 'port': '8004', 'listen_port': '8004', 'tls_backend': 'no'}, 'heat_api_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8004', 'listen_port': '8004', 'tls_backend': 'no'}}}}) 2025-03-23 22:05:55.987705 | orchestrator | changed: [testbed-node-2] => (item={'key': 'heat-api', 'value': 
{'container_name': 'heat_api', 'group': 'heat-api', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/heat-api:22.0.2.20241206', 'volumes': ['/etc/kolla/heat-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:8004'], 'timeout': '30'}, 'haproxy': {'heat_api': {'enabled': True, 'mode': 'http', 'external': False, 'port': '8004', 'listen_port': '8004', 'tls_backend': 'no'}, 'heat_api_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8004', 'listen_port': '8004', 'tls_backend': 'no'}}}}) 2025-03-23 22:05:55.987734 | orchestrator | changed: [testbed-node-0] => (item={'key': 'heat-api-cfn', 'value': {'container_name': 'heat_api_cfn', 'group': 'heat-api-cfn', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/heat-api-cfn:22.0.2.20241206', 'volumes': ['/etc/kolla/heat-api-cfn/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:8000'], 'timeout': '30'}, 'haproxy': {'heat_api_cfn': {'enabled': True, 'mode': 'http', 'external': False, 'port': '8000', 'listen_port': '8000', 'tls_backend': 'no'}, 'heat_api_cfn_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8000', 'listen_port': '8000', 'tls_backend': 'no'}}}}) 2025-03-23 22:05:55.987743 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'heat-engine', 'value': {'container_name': 'heat_engine', 'group': 'heat-engine', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/heat-engine:22.0.2.20241206', 'volumes': ['/etc/kolla/heat-engine/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port heat-engine 5672'], 'timeout': '30'}}})  2025-03-23 22:05:55.987751 | orchestrator | changed: [testbed-node-1] => (item={'key': 'heat-api-cfn', 'value': {'container_name': 'heat_api_cfn', 'group': 'heat-api-cfn', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/heat-api-cfn:22.0.2.20241206', 'volumes': ['/etc/kolla/heat-api-cfn/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:8000'], 'timeout': '30'}, 'haproxy': {'heat_api_cfn': {'enabled': True, 'mode': 'http', 'external': False, 'port': '8000', 'listen_port': '8000', 'tls_backend': 'no'}, 'heat_api_cfn_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8000', 'listen_port': '8000', 'tls_backend': 'no'}}}}) 2025-03-23 22:05:55.987760 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'heat-engine', 'value': {'container_name': 'heat_engine', 'group': 'heat-engine', 'enabled': True, 'image': 
'registry.osism.tech/kolla/release/heat-engine:22.0.2.20241206', 'volumes': ['/etc/kolla/heat-engine/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port heat-engine 5672'], 'timeout': '30'}}})  2025-03-23 22:05:55.987768 | orchestrator | changed: [testbed-node-2] => (item={'key': 'heat-api-cfn', 'value': {'container_name': 'heat_api_cfn', 'group': 'heat-api-cfn', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/heat-api-cfn:22.0.2.20241206', 'volumes': ['/etc/kolla/heat-api-cfn/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:8000'], 'timeout': '30'}, 'haproxy': {'heat_api_cfn': {'enabled': True, 'mode': 'http', 'external': False, 'port': '8000', 'listen_port': '8000', 'tls_backend': 'no'}, 'heat_api_cfn_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8000', 'listen_port': '8000', 'tls_backend': 'no'}}}}) 2025-03-23 22:05:55.987780 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'heat-engine', 'value': {'container_name': 'heat_engine', 'group': 'heat-engine', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/heat-engine:22.0.2.20241206', 'volumes': ['/etc/kolla/heat-engine/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port heat-engine 5672'], 'timeout': '30'}}})  2025-03-23 22:05:55.987789 | orchestrator | 2025-03-23 22:05:55.987797 | orchestrator | TASK [haproxy-config : Add configuration for heat when using single external frontend] *** 2025-03-23 22:05:55.987805 | orchestrator | Sunday 23 March 2025 22:00:46 +0000 (0:00:08.088) 0:03:39.672 ********** 2025-03-23 22:05:55.987818 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'heat-api', 'value': {'container_name': 'heat_api', 'group': 'heat-api', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/heat-api:22.0.2.20241206', 'volumes': ['/etc/kolla/heat-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:8004'], 'timeout': '30'}, 'haproxy': {'heat_api': {'enabled': True, 'mode': 'http', 'external': False, 'port': '8004', 'listen_port': '8004', 'tls_backend': 'no'}, 'heat_api_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8004', 'listen_port': '8004', 'tls_backend': 'no'}}}})  2025-03-23 22:05:55.987827 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'heat-api-cfn', 'value': {'container_name': 'heat_api_cfn', 'group': 'heat-api-cfn', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/heat-api-cfn:22.0.2.20241206', 'volumes': ['/etc/kolla/heat-api-cfn/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/timezone:/etc/timezone:ro', '', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:8000'], 'timeout': '30'}, 'haproxy': {'heat_api_cfn': {'enabled': True, 'mode': 'http', 'external': False, 'port': '8000', 'listen_port': '8000', 'tls_backend': 'no'}, 'heat_api_cfn_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8000', 'listen_port': '8000', 'tls_backend': 'no'}}}})  2025-03-23 22:05:55.987836 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'heat-engine', 'value': {'container_name': 'heat_engine', 'group': 'heat-engine', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/heat-engine:22.0.2.20241206', 'volumes': ['/etc/kolla/heat-engine/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port heat-engine 5672'], 'timeout': '30'}}})  2025-03-23 22:05:55.987850 | orchestrator | skipping: [testbed-node-0] 2025-03-23 22:05:55.987858 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'heat-api', 'value': {'container_name': 'heat_api', 'group': 'heat-api', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/heat-api:22.0.2.20241206', 'volumes': ['/etc/kolla/heat-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:8004'], 'timeout': '30'}, 'haproxy': {'heat_api': {'enabled': True, 'mode': 'http', 'external': False, 'port': '8004', 'listen_port': '8004', 'tls_backend': 'no'}, 'heat_api_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8004', 'listen_port': '8004', 'tls_backend': 'no'}}}})  2025-03-23 22:05:55.987867 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'heat-api-cfn', 'value': {'container_name': 'heat_api_cfn', 'group': 'heat-api-cfn', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/heat-api-cfn:22.0.2.20241206', 'volumes': ['/etc/kolla/heat-api-cfn/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:8000'], 'timeout': '30'}, 'haproxy': {'heat_api_cfn': {'enabled': True, 'mode': 'http', 'external': False, 'port': '8000', 'listen_port': '8000', 'tls_backend': 'no'}, 'heat_api_cfn_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8000', 'listen_port': '8000', 'tls_backend': 'no'}}}})  2025-03-23 22:05:55.987878 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'heat-engine', 'value': {'container_name': 'heat_engine', 'group': 'heat-engine', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/heat-engine:22.0.2.20241206', 'volumes': ['/etc/kolla/heat-engine/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '', 'kolla_logs:/var/log/kolla/'], 
'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port heat-engine 5672'], 'timeout': '30'}}})  2025-03-23 22:05:55.987887 | orchestrator | skipping: [testbed-node-1] 2025-03-23 22:05:55.987895 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'heat-api', 'value': {'container_name': 'heat_api', 'group': 'heat-api', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/heat-api:22.0.2.20241206', 'volumes': ['/etc/kolla/heat-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:8004'], 'timeout': '30'}, 'haproxy': {'heat_api': {'enabled': True, 'mode': 'http', 'external': False, 'port': '8004', 'listen_port': '8004', 'tls_backend': 'no'}, 'heat_api_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8004', 'listen_port': '8004', 'tls_backend': 'no'}}}})  2025-03-23 22:05:55.987904 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'heat-api-cfn', 'value': {'container_name': 'heat_api_cfn', 'group': 'heat-api-cfn', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/heat-api-cfn:22.0.2.20241206', 'volumes': ['/etc/kolla/heat-api-cfn/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:8000'], 'timeout': '30'}, 'haproxy': {'heat_api_cfn': {'enabled': True, 'mode': 'http', 'external': False, 'port': '8000', 'listen_port': '8000', 'tls_backend': 'no'}, 'heat_api_cfn_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8000', 'listen_port': '8000', 'tls_backend': 'no'}}}})  2025-03-23 22:05:55.987917 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'heat-engine', 'value': {'container_name': 'heat_engine', 'group': 'heat-engine', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/heat-engine:22.0.2.20241206', 'volumes': ['/etc/kolla/heat-engine/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port heat-engine 5672'], 'timeout': '30'}}})  2025-03-23 22:05:55.987925 | orchestrator | skipping: [testbed-node-2] 2025-03-23 22:05:55.987933 | orchestrator | 2025-03-23 22:05:55.987941 | orchestrator | TASK [haproxy-config : Configuring firewall for heat] ************************** 2025-03-23 22:05:55.987949 | orchestrator | Sunday 23 March 2025 22:00:48 +0000 (0:00:01.255) 0:03:40.928 ********** 2025-03-23 22:05:55.987958 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'heat_api', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '8004', 'listen_port': '8004', 'tls_backend': 'no'}})  2025-03-23 22:05:55.987967 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'heat_api_external', 'value': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8004', 'listen_port': '8004', 
'tls_backend': 'no'}})  2025-03-23 22:05:55.987976 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'heat_api_cfn', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '8000', 'listen_port': '8000', 'tls_backend': 'no'}})  2025-03-23 22:05:55.987984 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'heat_api_cfn_external', 'value': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8000', 'listen_port': '8000', 'tls_backend': 'no'}})  2025-03-23 22:05:55.987992 | orchestrator | skipping: [testbed-node-0] 2025-03-23 22:05:55.988000 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'heat_api', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '8004', 'listen_port': '8004', 'tls_backend': 'no'}})  2025-03-23 22:05:55.988012 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'heat_api_external', 'value': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8004', 'listen_port': '8004', 'tls_backend': 'no'}})  2025-03-23 22:05:55.988023 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'heat_api_cfn', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '8000', 'listen_port': '8000', 'tls_backend': 'no'}})  2025-03-23 22:05:55.988031 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'heat_api_cfn_external', 'value': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8000', 'listen_port': '8000', 'tls_backend': 'no'}})  2025-03-23 22:05:55.988040 | orchestrator | skipping: [testbed-node-1] 2025-03-23 22:05:55.988048 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'heat_api', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '8004', 'listen_port': '8004', 'tls_backend': 'no'}})  2025-03-23 22:05:55.988056 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'heat_api_external', 'value': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8004', 'listen_port': '8004', 'tls_backend': 'no'}})  2025-03-23 22:05:55.988064 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'heat_api_cfn', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '8000', 'listen_port': '8000', 'tls_backend': 'no'}})  2025-03-23 22:05:55.988076 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'heat_api_cfn_external', 'value': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8000', 'listen_port': '8000', 'tls_backend': 'no'}})  2025-03-23 22:05:55.988084 | orchestrator | skipping: [testbed-node-2] 2025-03-23 22:05:55.988093 | orchestrator | 2025-03-23 22:05:55.988101 | orchestrator | TASK [proxysql-config : Copying over heat ProxySQL users config] *************** 2025-03-23 22:05:55.988109 | orchestrator | Sunday 23 March 2025 22:00:49 +0000 (0:00:01.570) 0:03:42.498 ********** 2025-03-23 22:05:55.988117 | orchestrator | changed: [testbed-node-0] 2025-03-23 22:05:55.988125 | orchestrator | changed: [testbed-node-1] 2025-03-23 22:05:55.988136 | orchestrator | changed: [testbed-node-2] 2025-03-23 22:05:55.988144 | orchestrator | 2025-03-23 22:05:55.988152 | orchestrator | TASK [proxysql-config : Copying over heat ProxySQL rules config] *************** 2025-03-23 22:05:55.988160 | orchestrator | Sunday 23 March 2025 22:00:51 +0000 (0:00:01.592) 0:03:44.091 ********** 2025-03-23 
22:05:55.988168 | orchestrator | changed: [testbed-node-1] 2025-03-23 22:05:55.988176 | orchestrator | changed: [testbed-node-0] 2025-03-23 22:05:55.988184 | orchestrator | changed: [testbed-node-2] 2025-03-23 22:05:55.988192 | orchestrator | 2025-03-23 22:05:55.988200 | orchestrator | TASK [include_role : horizon] ************************************************** 2025-03-23 22:05:55.988208 | orchestrator | Sunday 23 March 2025 22:00:53 +0000 (0:00:02.641) 0:03:46.733 ********** 2025-03-23 22:05:55.988219 | orchestrator | included: horizon for testbed-node-0, testbed-node-1, testbed-node-2 2025-03-23 22:05:55.988227 | orchestrator | 2025-03-23 22:05:55.988235 | orchestrator | TASK [haproxy-config : Copying over horizon haproxy config] ******************** 2025-03-23 22:05:55.988243 | orchestrator | Sunday 23 March 2025 22:00:55 +0000 (0:00:01.385) 0:03:48.118 ********** 2025-03-23 22:05:55.988256 | orchestrator | changed: [testbed-node-0] => (item={'key': 'horizon', 'value': {'container_name': 'horizon', 'group': 'horizon', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/horizon:24.0.1.20241206', 'environment': {'ENABLE_BLAZAR': 'no', 'ENABLE_CLOUDKITTY': 'no', 'ENABLE_DESIGNATE': 'yes', 'ENABLE_FWAAS': 'no', 'ENABLE_HEAT': 'yes', 'ENABLE_IRONIC': 'no', 'ENABLE_MAGNUM': 'yes', 'ENABLE_MANILA': 'yes', 'ENABLE_MASAKARI': 'no', 'ENABLE_MISTRAL': 'no', 'ENABLE_NEUTRON_VPNAAS': 'no', 'ENABLE_OCTAVIA': 'yes', 'ENABLE_TACKER': 'no', 'ENABLE_TROVE': 'no', 'ENABLE_WATCHER': 'no', 'ENABLE_ZUN': 'no', 'FORCE_GENERATE': 'no'}, 'volumes': ['/etc/kolla/horizon/:/var/lib/kolla/config_files/:ro', '', '', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:80'], 'timeout': '30'}, 'haproxy': {'horizon': {'enabled': True, 'mode': 'http', 'external': False, 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin'], 'tls_backend': 'no'}, 'horizon_redirect': {'enabled': True, 'mode': 'redirect', 'external': False, 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}, 'horizon_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin'], 'tls_backend': 'no'}, 'horizon_external_redirect': {'enabled': True, 'mode': 'redirect', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}, 'acme_client': {'enabled': True, 'with_frontend': False, 'custom_member_list': []}}}}) 2025-03-23 22:05:55.988265 | orchestrator | changed: [testbed-node-2] => (item={'key': 'horizon', 'value': {'container_name': 'horizon', 'group': 'horizon', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/horizon:24.0.1.20241206', 'environment': {'ENABLE_BLAZAR': 'no', 'ENABLE_CLOUDKITTY': 'no', 'ENABLE_DESIGNATE': 'yes', 'ENABLE_FWAAS': 'no', 'ENABLE_HEAT': 'yes', 'ENABLE_IRONIC': 'no', 'ENABLE_MAGNUM': 'yes', 'ENABLE_MANILA': 'yes', 
'ENABLE_MASAKARI': 'no', 'ENABLE_MISTRAL': 'no', 'ENABLE_NEUTRON_VPNAAS': 'no', 'ENABLE_OCTAVIA': 'yes', 'ENABLE_TACKER': 'no', 'ENABLE_TROVE': 'no', 'ENABLE_WATCHER': 'no', 'ENABLE_ZUN': 'no', 'FORCE_GENERATE': 'no'}, 'volumes': ['/etc/kolla/horizon/:/var/lib/kolla/config_files/:ro', '', '', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:80'], 'timeout': '30'}, 'haproxy': {'horizon': {'enabled': True, 'mode': 'http', 'external': False, 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin'], 'tls_backend': 'no'}, 'horizon_redirect': {'enabled': True, 'mode': 'redirect', 'external': False, 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}, 'horizon_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin'], 'tls_backend': 'no'}, 'horizon_external_redirect': {'enabled': True, 'mode': 'redirect', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}, 'acme_client': {'enabled': True, 'with_frontend': False, 'custom_member_list': []}}}}) 2025-03-23 22:05:55.988284 | orchestrator | changed: [testbed-node-1] => (item={'key': 'horizon', 'value': {'container_name': 'horizon', 'group': 'horizon', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/horizon:24.0.1.20241206', 'environment': {'ENABLE_BLAZAR': 'no', 'ENABLE_CLOUDKITTY': 'no', 'ENABLE_DESIGNATE': 'yes', 'ENABLE_FWAAS': 'no', 'ENABLE_HEAT': 'yes', 'ENABLE_IRONIC': 'no', 'ENABLE_MAGNUM': 'yes', 'ENABLE_MANILA': 'yes', 'ENABLE_MASAKARI': 'no', 'ENABLE_MISTRAL': 'no', 'ENABLE_NEUTRON_VPNAAS': 'no', 'ENABLE_OCTAVIA': 'yes', 'ENABLE_TACKER': 'no', 'ENABLE_TROVE': 'no', 'ENABLE_WATCHER': 'no', 'ENABLE_ZUN': 'no', 'FORCE_GENERATE': 'no'}, 'volumes': ['/etc/kolla/horizon/:/var/lib/kolla/config_files/:ro', '', '', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:80'], 'timeout': '30'}, 'haproxy': {'horizon': {'enabled': True, 'mode': 'http', 'external': False, 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin'], 'tls_backend': 'no'}, 'horizon_redirect': {'enabled': True, 'mode': 'redirect', 'external': False, 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}, 'horizon_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 
'backend_http_extra': ['balance roundrobin'], 'tls_backend': 'no'}, 'horizon_external_redirect': {'enabled': True, 'mode': 'redirect', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}, 'acme_client': {'enabled': True, 'with_frontend': False, 'custom_member_list': []}}}}) 2025-03-23 22:05:55.988296 | orchestrator | 2025-03-23 22:05:55.988305 | orchestrator | TASK [haproxy-config : Add configuration for horizon when using single external frontend] *** 2025-03-23 22:05:55.988313 | orchestrator | Sunday 23 March 2025 22:01:00 +0000 (0:00:04.812) 0:03:52.930 ********** 2025-03-23 22:05:55.988321 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'horizon', 'value': {'container_name': 'horizon', 'group': 'horizon', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/horizon:24.0.1.20241206', 'environment': {'ENABLE_BLAZAR': 'no', 'ENABLE_CLOUDKITTY': 'no', 'ENABLE_DESIGNATE': 'yes', 'ENABLE_FWAAS': 'no', 'ENABLE_HEAT': 'yes', 'ENABLE_IRONIC': 'no', 'ENABLE_MAGNUM': 'yes', 'ENABLE_MANILA': 'yes', 'ENABLE_MASAKARI': 'no', 'ENABLE_MISTRAL': 'no', 'ENABLE_NEUTRON_VPNAAS': 'no', 'ENABLE_OCTAVIA': 'yes', 'ENABLE_TACKER': 'no', 'ENABLE_TROVE': 'no', 'ENABLE_WATCHER': 'no', 'ENABLE_ZUN': 'no', 'FORCE_GENERATE': 'no'}, 'volumes': ['/etc/kolla/horizon/:/var/lib/kolla/config_files/:ro', '', '', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:80'], 'timeout': '30'}, 'haproxy': {'horizon': {'enabled': True, 'mode': 'http', 'external': False, 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin'], 'tls_backend': 'no'}, 'horizon_redirect': {'enabled': True, 'mode': 'redirect', 'external': False, 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}, 'horizon_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin'], 'tls_backend': 'no'}, 'horizon_external_redirect': {'enabled': True, 'mode': 'redirect', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}, 'acme_client': {'enabled': True, 'with_frontend': False, 'custom_member_list': []}}}})  2025-03-23 22:05:55.988330 | orchestrator | skipping: [testbed-node-0] 2025-03-23 22:05:55.988343 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'horizon', 'value': {'container_name': 'horizon', 'group': 'horizon', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/horizon:24.0.1.20241206', 'environment': {'ENABLE_BLAZAR': 'no', 'ENABLE_CLOUDKITTY': 'no', 'ENABLE_DESIGNATE': 'yes', 'ENABLE_FWAAS': 'no', 'ENABLE_HEAT': 'yes', 'ENABLE_IRONIC': 'no', 'ENABLE_MAGNUM': 'yes', 'ENABLE_MANILA': 'yes', 'ENABLE_MASAKARI': 'no', 'ENABLE_MISTRAL': 'no', 'ENABLE_NEUTRON_VPNAAS': 'no', 
'ENABLE_OCTAVIA': 'yes', 'ENABLE_TACKER': 'no', 'ENABLE_TROVE': 'no', 'ENABLE_WATCHER': 'no', 'ENABLE_ZUN': 'no', 'FORCE_GENERATE': 'no'}, 'volumes': ['/etc/kolla/horizon/:/var/lib/kolla/config_files/:ro', '', '', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:80'], 'timeout': '30'}, 'haproxy': {'horizon': {'enabled': True, 'mode': 'http', 'external': False, 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin'], 'tls_backend': 'no'}, 'horizon_redirect': {'enabled': True, 'mode': 'redirect', 'external': False, 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}, 'horizon_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin'], 'tls_backend': 'no'}, 'horizon_external_redirect': {'enabled': True, 'mode': 'redirect', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}, 'acme_client': {'enabled': True, 'with_frontend': False, 'custom_member_list': []}}}})  2025-03-23 22:05:55.988356 | orchestrator | skipping: [testbed-node-1] 2025-03-23 22:05:55.988365 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'horizon', 'value': {'container_name': 'horizon', 'group': 'horizon', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/horizon:24.0.1.20241206', 'environment': {'ENABLE_BLAZAR': 'no', 'ENABLE_CLOUDKITTY': 'no', 'ENABLE_DESIGNATE': 'yes', 'ENABLE_FWAAS': 'no', 'ENABLE_HEAT': 'yes', 'ENABLE_IRONIC': 'no', 'ENABLE_MAGNUM': 'yes', 'ENABLE_MANILA': 'yes', 'ENABLE_MASAKARI': 'no', 'ENABLE_MISTRAL': 'no', 'ENABLE_NEUTRON_VPNAAS': 'no', 'ENABLE_OCTAVIA': 'yes', 'ENABLE_TACKER': 'no', 'ENABLE_TROVE': 'no', 'ENABLE_WATCHER': 'no', 'ENABLE_ZUN': 'no', 'FORCE_GENERATE': 'no'}, 'volumes': ['/etc/kolla/horizon/:/var/lib/kolla/config_files/:ro', '', '', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:80'], 'timeout': '30'}, 'haproxy': {'horizon': {'enabled': True, 'mode': 'http', 'external': False, 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin'], 'tls_backend': 'no'}, 'horizon_redirect': {'enabled': True, 'mode': 'redirect', 'external': False, 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}, 'horizon_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 
'backend_http_extra': ['balance roundrobin'], 'tls_backend': 'no'}, 'horizon_external_redirect': {'enabled': True, 'mode': 'redirect', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}, 'acme_client': {'enabled': True, 'with_frontend': False, 'custom_member_list': []}}}})  2025-03-23 22:05:55.988373 | orchestrator | skipping: [testbed-node-2] 2025-03-23 22:05:55.988381 | orchestrator | 2025-03-23 22:05:55.988389 | orchestrator | TASK [haproxy-config : Configuring firewall for horizon] *********************** 2025-03-23 22:05:55.988397 | orchestrator | Sunday 23 March 2025 22:01:01 +0000 (0:00:00.981) 0:03:53.911 ********** 2025-03-23 22:05:55.988409 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'horizon', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin'], 'tls_backend': 'no'}})  2025-03-23 22:05:55.988419 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'horizon_redirect', 'value': {'enabled': True, 'mode': 'redirect', 'external': False, 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}})  2025-03-23 22:05:55.988434 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'horizon_external', 'value': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin'], 'tls_backend': 'no'}})  2025-03-23 22:05:55.988444 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'horizon_external_redirect', 'value': {'enabled': True, 'mode': 'redirect', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}})  2025-03-23 22:05:55.988452 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'acme_client', 'value': {'enabled': True, 'with_frontend': False, 'custom_member_list': []}})  2025-03-23 22:05:55.988460 | orchestrator | skipping: [testbed-node-0] 2025-03-23 22:05:55.988472 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'horizon', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin'], 'tls_backend': 'no'}})  2025-03-23 22:05:55.988482 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'horizon_redirect', 'value': {'enabled': True, 'mode': 'redirect', 'external': False, 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}})  2025-03-23 22:05:55.988490 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'horizon_external', 'value': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg 
^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin'], 'tls_backend': 'no'}})  2025-03-23 22:05:55.988498 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'horizon', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin'], 'tls_backend': 'no'}})  2025-03-23 22:05:55.988506 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'horizon_external_redirect', 'value': {'enabled': True, 'mode': 'redirect', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}})  2025-03-23 22:05:55.988514 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'horizon_redirect', 'value': {'enabled': True, 'mode': 'redirect', 'external': False, 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}})  2025-03-23 22:05:55.988522 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'acme_client', 'value': {'enabled': True, 'with_frontend': False, 'custom_member_list': []}})  2025-03-23 22:05:55.988531 | orchestrator | skipping: [testbed-node-1] 2025-03-23 22:05:55.988539 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'horizon_external', 'value': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin'], 'tls_backend': 'no'}})  2025-03-23 22:05:55.988550 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'horizon_external_redirect', 'value': {'enabled': True, 'mode': 'redirect', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}})  2025-03-23 22:05:55.988575 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'acme_client', 'value': {'enabled': True, 'with_frontend': False, 'custom_member_list': []}})  2025-03-23 22:05:55.988584 | orchestrator | skipping: [testbed-node-2] 2025-03-23 22:05:55.988592 | orchestrator | 2025-03-23 22:05:55.988600 | orchestrator | TASK [proxysql-config : Copying over horizon ProxySQL users config] ************ 2025-03-23 22:05:55.988608 | orchestrator | Sunday 23 March 2025 22:01:02 +0000 (0:00:01.493) 0:03:55.405 ********** 2025-03-23 22:05:55.988616 | orchestrator | changed: [testbed-node-0] 2025-03-23 22:05:55.988625 | orchestrator | changed: [testbed-node-1] 2025-03-23 22:05:55.988632 | orchestrator | changed: [testbed-node-2] 2025-03-23 22:05:55.988640 | orchestrator | 2025-03-23 22:05:55.988649 | orchestrator | TASK [proxysql-config : Copying over horizon ProxySQL rules config] ************ 2025-03-23 22:05:55.988657 | orchestrator | Sunday 23 March 2025 22:01:04 +0000 (0:00:01.586) 0:03:56.992 ********** 2025-03-23 22:05:55.988665 | orchestrator | changed: [testbed-node-0] 2025-03-23 22:05:55.988673 | orchestrator | changed: [testbed-node-1] 2025-03-23 22:05:55.988681 | orchestrator | changed: [testbed-node-2] 2025-03-23 22:05:55.988689 | orchestrator | 2025-03-23 22:05:55.988697 | orchestrator | 
TASK [include_role : influxdb] ************************************************* 2025-03-23 22:05:55.988705 | orchestrator | Sunday 23 March 2025 22:01:06 +0000 (0:00:02.767) 0:03:59.759 ********** 2025-03-23 22:05:55.988713 | orchestrator | skipping: [testbed-node-0] 2025-03-23 22:05:55.988721 | orchestrator | skipping: [testbed-node-1] 2025-03-23 22:05:55.988729 | orchestrator | skipping: [testbed-node-2] 2025-03-23 22:05:55.988737 | orchestrator | 2025-03-23 22:05:55.988745 | orchestrator | TASK [include_role : ironic] *************************************************** 2025-03-23 22:05:55.988753 | orchestrator | Sunday 23 March 2025 22:01:07 +0000 (0:00:00.541) 0:04:00.301 ********** 2025-03-23 22:05:55.988761 | orchestrator | skipping: [testbed-node-0] 2025-03-23 22:05:55.988769 | orchestrator | skipping: [testbed-node-1] 2025-03-23 22:05:55.988777 | orchestrator | skipping: [testbed-node-2] 2025-03-23 22:05:55.988785 | orchestrator | 2025-03-23 22:05:55.988792 | orchestrator | TASK [include_role : keystone] ************************************************* 2025-03-23 22:05:55.988800 | orchestrator | Sunday 23 March 2025 22:01:07 +0000 (0:00:00.340) 0:04:00.642 ********** 2025-03-23 22:05:55.988808 | orchestrator | included: keystone for testbed-node-0, testbed-node-1, testbed-node-2 2025-03-23 22:05:55.988817 | orchestrator | 2025-03-23 22:05:55.988825 | orchestrator | TASK [haproxy-config : Copying over keystone haproxy config] ******************* 2025-03-23 22:05:55.988833 | orchestrator | Sunday 23 March 2025 22:01:09 +0000 (0:00:01.430) 0:04:02.073 ********** 2025-03-23 22:05:55.988841 | orchestrator | changed: [testbed-node-0] => (item={'key': 'keystone', 'value': {'container_name': 'keystone', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/keystone:25.0.1.20241206', 'volumes': ['/etc/kolla/keystone/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:5000'], 'timeout': '30'}, 'haproxy': {'keystone_internal': {'enabled': True, 'mode': 'http', 'external': False, 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance "roundrobin"']}, 'keystone_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance "roundrobin"']}}}}) 2025-03-23 22:05:55.988850 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'keystone-ssh', 'value': {'container_name': 'keystone_ssh', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/keystone-ssh:25.0.1.20241206', 'volumes': ['/etc/kolla/keystone-ssh/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen sshd 8023'], 'timeout': '30'}}})  2025-03-23 22:05:55.988863 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'keystone-fernet', 'value': {'container_name': 'keystone_fernet', 'group': 'keystone', 'enabled': True, 'image': 
'registry.osism.tech/kolla/release/keystone-fernet:25.0.1.20241206', 'volumes': ['/etc/kolla/keystone-fernet/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/fernet-healthcheck.sh'], 'timeout': '30'}}})  2025-03-23 22:05:55.988876 | orchestrator | changed: [testbed-node-1] => (item={'key': 'keystone', 'value': {'container_name': 'keystone', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/keystone:25.0.1.20241206', 'volumes': ['/etc/kolla/keystone/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:5000'], 'timeout': '30'}, 'haproxy': {'keystone_internal': {'enabled': True, 'mode': 'http', 'external': False, 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance "roundrobin"']}, 'keystone_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance "roundrobin"']}}}}) 2025-03-23 22:05:55.988885 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'keystone-ssh', 'value': {'container_name': 'keystone_ssh', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/keystone-ssh:25.0.1.20241206', 'volumes': ['/etc/kolla/keystone-ssh/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen sshd 8023'], 'timeout': '30'}}})  2025-03-23 22:05:55.988894 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'keystone-fernet', 'value': {'container_name': 'keystone_fernet', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/keystone-fernet:25.0.1.20241206', 'volumes': ['/etc/kolla/keystone-fernet/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/fernet-healthcheck.sh'], 'timeout': '30'}}})  2025-03-23 22:05:55.988903 | orchestrator | changed: [testbed-node-2] => (item={'key': 'keystone', 'value': {'container_name': 'keystone', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/keystone:25.0.1.20241206', 'volumes': ['/etc/kolla/keystone/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:5000'], 'timeout': '30'}, 'haproxy': {'keystone_internal': {'enabled': True, 
'mode': 'http', 'external': False, 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance "roundrobin"']}, 'keystone_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance "roundrobin"']}}}}) 2025-03-23 22:05:55.988916 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'keystone-ssh', 'value': {'container_name': 'keystone_ssh', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/keystone-ssh:25.0.1.20241206', 'volumes': ['/etc/kolla/keystone-ssh/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen sshd 8023'], 'timeout': '30'}}})  2025-03-23 22:05:55.988929 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'keystone-fernet', 'value': {'container_name': 'keystone_fernet', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/keystone-fernet:25.0.1.20241206', 'volumes': ['/etc/kolla/keystone-fernet/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/fernet-healthcheck.sh'], 'timeout': '30'}}})  2025-03-23 22:05:55.988937 | orchestrator | 2025-03-23 22:05:55.988945 | orchestrator | TASK [haproxy-config : Add configuration for keystone when using single external frontend] *** 2025-03-23 22:05:55.988953 | orchestrator | Sunday 23 March 2025 22:01:14 +0000 (0:00:05.252) 0:04:07.325 ********** 2025-03-23 22:05:55.988962 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'keystone', 'value': {'container_name': 'keystone', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/keystone:25.0.1.20241206', 'volumes': ['/etc/kolla/keystone/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:5000'], 'timeout': '30'}, 'haproxy': {'keystone_internal': {'enabled': True, 'mode': 'http', 'external': False, 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance "roundrobin"']}, 'keystone_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance "roundrobin"']}}}})  2025-03-23 22:05:55.988971 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'keystone', 'value': {'container_name': 'keystone', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/keystone:25.0.1.20241206', 'volumes': ['/etc/kolla/keystone/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 
'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:5000'], 'timeout': '30'}, 'haproxy': {'keystone_internal': {'enabled': True, 'mode': 'http', 'external': False, 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance "roundrobin"']}, 'keystone_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance "roundrobin"']}}}})  2025-03-23 22:05:55.988984 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'keystone-ssh', 'value': {'container_name': 'keystone_ssh', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/keystone-ssh:25.0.1.20241206', 'volumes': ['/etc/kolla/keystone-ssh/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen sshd 8023'], 'timeout': '30'}}})  2025-03-23 22:05:55.988993 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'keystone-ssh', 'value': {'container_name': 'keystone_ssh', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/keystone-ssh:25.0.1.20241206', 'volumes': ['/etc/kolla/keystone-ssh/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen sshd 8023'], 'timeout': '30'}}})  2025-03-23 22:05:55.989005 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'keystone-fernet', 'value': {'container_name': 'keystone_fernet', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/keystone-fernet:25.0.1.20241206', 'volumes': ['/etc/kolla/keystone-fernet/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/fernet-healthcheck.sh'], 'timeout': '30'}}})  2025-03-23 22:05:55.989013 | orchestrator | skipping: [testbed-node-1] 2025-03-23 22:05:55.989021 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'keystone-fernet', 'value': {'container_name': 'keystone_fernet', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/keystone-fernet:25.0.1.20241206', 'volumes': ['/etc/kolla/keystone-fernet/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/fernet-healthcheck.sh'], 'timeout': '30'}}})  2025-03-23 22:05:55.989030 | orchestrator | skipping: [testbed-node-0] 2025-03-23 22:05:55.989038 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'keystone', 'value': {'container_name': 'keystone', 'group': 'keystone', 'enabled': True, 'image': 
'registry.osism.tech/kolla/release/keystone:25.0.1.20241206', 'volumes': ['/etc/kolla/keystone/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:5000'], 'timeout': '30'}, 'haproxy': {'keystone_internal': {'enabled': True, 'mode': 'http', 'external': False, 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance "roundrobin"']}, 'keystone_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance "roundrobin"']}}}})  2025-03-23 22:05:55.989047 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'keystone-ssh', 'value': {'container_name': 'keystone_ssh', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/keystone-ssh:25.0.1.20241206', 'volumes': ['/etc/kolla/keystone-ssh/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen sshd 8023'], 'timeout': '30'}}})  2025-03-23 22:05:55.989060 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'keystone-fernet', 'value': {'container_name': 'keystone_fernet', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/keystone-fernet:25.0.1.20241206', 'volumes': ['/etc/kolla/keystone-fernet/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/fernet-healthcheck.sh'], 'timeout': '30'}}})  2025-03-23 22:05:55.989069 | orchestrator | skipping: [testbed-node-2] 2025-03-23 22:05:55.989077 | orchestrator | 2025-03-23 22:05:55.989085 | orchestrator | TASK [haproxy-config : Configuring firewall for keystone] ********************** 2025-03-23 22:05:55.989093 | orchestrator | Sunday 23 March 2025 22:01:15 +0000 (0:00:01.148) 0:04:08.473 ********** 2025-03-23 22:05:55.989101 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'keystone_internal', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance "roundrobin"']}})  2025-03-23 22:05:55.989116 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'keystone_external', 'value': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance "roundrobin"']}})  2025-03-23 22:05:55.989125 | orchestrator | skipping: [testbed-node-0] 2025-03-23 22:05:55.989133 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'keystone_internal', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance "roundrobin"']}})  2025-03-23 22:05:55.989142 | 
orchestrator | skipping: [testbed-node-1] => (item={'key': 'keystone_external', 'value': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance "roundrobin"']}})  2025-03-23 22:05:55.989150 | orchestrator | skipping: [testbed-node-1] 2025-03-23 22:05:55.989158 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'keystone_internal', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance "roundrobin"']}})  2025-03-23 22:05:55.989166 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'keystone_external', 'value': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance "roundrobin"']}})  2025-03-23 22:05:55.989174 | orchestrator | skipping: [testbed-node-2] 2025-03-23 22:05:55.989182 | orchestrator | 2025-03-23 22:05:55.989191 | orchestrator | TASK [proxysql-config : Copying over keystone ProxySQL users config] *********** 2025-03-23 22:05:55.989198 | orchestrator | Sunday 23 March 2025 22:01:16 +0000 (0:00:01.069) 0:04:09.543 ********** 2025-03-23 22:05:55.989206 | orchestrator | changed: [testbed-node-0] 2025-03-23 22:05:55.989214 | orchestrator | changed: [testbed-node-1] 2025-03-23 22:05:55.989222 | orchestrator | changed: [testbed-node-2] 2025-03-23 22:05:55.989231 | orchestrator | 2025-03-23 22:05:55.989239 | orchestrator | TASK [proxysql-config : Copying over keystone ProxySQL rules config] *********** 2025-03-23 22:05:55.989247 | orchestrator | Sunday 23 March 2025 22:01:18 +0000 (0:00:01.498) 0:04:11.042 ********** 2025-03-23 22:05:55.989260 | orchestrator | changed: [testbed-node-0] 2025-03-23 22:05:55.989268 | orchestrator | changed: [testbed-node-1] 2025-03-23 22:05:55.989276 | orchestrator | changed: [testbed-node-2] 2025-03-23 22:05:55.989284 | orchestrator | 2025-03-23 22:05:55.989292 | orchestrator | TASK [include_role : letsencrypt] ********************************************** 2025-03-23 22:05:55.989300 | orchestrator | Sunday 23 March 2025 22:01:20 +0000 (0:00:02.601) 0:04:13.643 ********** 2025-03-23 22:05:55.989308 | orchestrator | skipping: [testbed-node-0] 2025-03-23 22:05:55.989316 | orchestrator | skipping: [testbed-node-1] 2025-03-23 22:05:55.989325 | orchestrator | skipping: [testbed-node-2] 2025-03-23 22:05:55.989334 | orchestrator | 2025-03-23 22:05:55.989342 | orchestrator | TASK [include_role : magnum] *************************************************** 2025-03-23 22:05:55.989351 | orchestrator | Sunday 23 March 2025 22:01:21 +0000 (0:00:00.352) 0:04:13.996 ********** 2025-03-23 22:05:55.989359 | orchestrator | included: magnum for testbed-node-0, testbed-node-1, testbed-node-2 2025-03-23 22:05:55.989367 | orchestrator | 2025-03-23 22:05:55.989379 | orchestrator | TASK [haproxy-config : Copying over magnum haproxy config] ********************* 2025-03-23 22:05:55.989387 | orchestrator | Sunday 23 March 2025 22:01:22 +0000 (0:00:01.556) 0:04:15.552 ********** 2025-03-23 22:05:55.989396 | orchestrator | changed: [testbed-node-1] => (item={'key': 'magnum-api', 'value': {'container_name': 'magnum_api', 'group': 'magnum-api', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/magnum-api:18.0.1.20241206', 'environment': {'DUMMY_ENVIRONMENT': 'kolla_useless_env'}, 'volumes': 
['/etc/kolla/magnum-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:9511'], 'timeout': '30'}, 'haproxy': {'magnum_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9511', 'listen_port': '9511'}, 'magnum_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9511', 'listen_port': '9511'}}}}) 2025-03-23 22:05:55.989418 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'magnum-conductor', 'value': {'container_name': 'magnum_conductor', 'group': 'magnum-conductor', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/magnum-conductor:18.0.1.20241206', 'environment': {'http_proxy': '', 'https_proxy': '', 'no_proxy': 'localhost,127.0.0.1,192.168.16.11,192.168.16.9'}, 'volumes': ['/etc/kolla/magnum-conductor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'magnum:/var/lib/magnum/', '', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port magnum-conductor 5672'], 'timeout': '30'}}})  2025-03-23 22:05:55.989428 | orchestrator | changed: [testbed-node-0] => (item={'key': 'magnum-api', 'value': {'container_name': 'magnum_api', 'group': 'magnum-api', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/magnum-api:18.0.1.20241206', 'environment': {'DUMMY_ENVIRONMENT': 'kolla_useless_env'}, 'volumes': ['/etc/kolla/magnum-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:9511'], 'timeout': '30'}, 'haproxy': {'magnum_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9511', 'listen_port': '9511'}, 'magnum_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9511', 'listen_port': '9511'}}}}) 2025-03-23 22:05:55.989437 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'magnum-conductor', 'value': {'container_name': 'magnum_conductor', 'group': 'magnum-conductor', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/magnum-conductor:18.0.1.20241206', 'environment': {'http_proxy': '', 'https_proxy': '', 'no_proxy': 'localhost,127.0.0.1,192.168.16.10,192.168.16.9'}, 'volumes': ['/etc/kolla/magnum-conductor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'magnum:/var/lib/magnum/', '', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port magnum-conductor 5672'], 'timeout': '30'}}})  2025-03-23 22:05:55.989451 | orchestrator | changed: [testbed-node-2] => (item={'key': 'magnum-api', 'value': {'container_name': 'magnum_api', 'group': 'magnum-api', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/magnum-api:18.0.1.20241206', 'environment': {'DUMMY_ENVIRONMENT': 'kolla_useless_env'}, 'volumes': ['/etc/kolla/magnum-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/timezone:/etc/timezone:ro', '', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:9511'], 'timeout': '30'}, 'haproxy': {'magnum_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9511', 'listen_port': '9511'}, 'magnum_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9511', 'listen_port': '9511'}}}}) 2025-03-23 22:05:55.989460 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'magnum-conductor', 'value': {'container_name': 'magnum_conductor', 'group': 'magnum-conductor', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/magnum-conductor:18.0.1.20241206', 'environment': {'http_proxy': '', 'https_proxy': '', 'no_proxy': 'localhost,127.0.0.1,192.168.16.12,192.168.16.9'}, 'volumes': ['/etc/kolla/magnum-conductor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'magnum:/var/lib/magnum/', '', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port magnum-conductor 5672'], 'timeout': '30'}}})  2025-03-23 22:05:55.989468 | orchestrator | 2025-03-23 22:05:55.989477 | orchestrator | TASK [haproxy-config : Add configuration for magnum when using single external frontend] *** 2025-03-23 22:05:55.989485 | orchestrator | Sunday 23 March 2025 22:01:28 +0000 (0:00:06.170) 0:04:21.723 ********** 2025-03-23 22:05:55.989504 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'magnum-api', 'value': {'container_name': 'magnum_api', 'group': 'magnum-api', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/magnum-api:18.0.1.20241206', 'environment': {'DUMMY_ENVIRONMENT': 'kolla_useless_env'}, 'volumes': ['/etc/kolla/magnum-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:9511'], 'timeout': '30'}, 'haproxy': {'magnum_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9511', 'listen_port': '9511'}, 'magnum_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9511', 'listen_port': '9511'}}}})  2025-03-23 22:05:55.989513 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'magnum-conductor', 'value': {'container_name': 'magnum_conductor', 'group': 'magnum-conductor', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/magnum-conductor:18.0.1.20241206', 'environment': {'http_proxy': '', 'https_proxy': '', 'no_proxy': 'localhost,127.0.0.1,192.168.16.10,192.168.16.9'}, 'volumes': ['/etc/kolla/magnum-conductor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'magnum:/var/lib/magnum/', '', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port magnum-conductor 5672'], 'timeout': '30'}}})  2025-03-23 22:05:55.989526 | orchestrator | skipping: [testbed-node-0] 2025-03-23 22:05:55.989534 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'magnum-api', 'value': {'container_name': 'magnum_api', 
'group': 'magnum-api', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/magnum-api:18.0.1.20241206', 'environment': {'DUMMY_ENVIRONMENT': 'kolla_useless_env'}, 'volumes': ['/etc/kolla/magnum-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:9511'], 'timeout': '30'}, 'haproxy': {'magnum_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9511', 'listen_port': '9511'}, 'magnum_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9511', 'listen_port': '9511'}}}})  2025-03-23 22:05:55.989543 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'magnum-conductor', 'value': {'container_name': 'magnum_conductor', 'group': 'magnum-conductor', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/magnum-conductor:18.0.1.20241206', 'environment': {'http_proxy': '', 'https_proxy': '', 'no_proxy': 'localhost,127.0.0.1,192.168.16.11,192.168.16.9'}, 'volumes': ['/etc/kolla/magnum-conductor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'magnum:/var/lib/magnum/', '', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port magnum-conductor 5672'], 'timeout': '30'}}})  2025-03-23 22:05:55.989551 | orchestrator | skipping: [testbed-node-1] 2025-03-23 22:05:55.989571 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'magnum-api', 'value': {'container_name': 'magnum_api', 'group': 'magnum-api', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/magnum-api:18.0.1.20241206', 'environment': {'DUMMY_ENVIRONMENT': 'kolla_useless_env'}, 'volumes': ['/etc/kolla/magnum-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:9511'], 'timeout': '30'}, 'haproxy': {'magnum_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9511', 'listen_port': '9511'}, 'magnum_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9511', 'listen_port': '9511'}}}})  2025-03-23 22:05:55.989591 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'magnum-conductor', 'value': {'container_name': 'magnum_conductor', 'group': 'magnum-conductor', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/magnum-conductor:18.0.1.20241206', 'environment': {'http_proxy': '', 'https_proxy': '', 'no_proxy': 'localhost,127.0.0.1,192.168.16.12,192.168.16.9'}, 'volumes': ['/etc/kolla/magnum-conductor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'magnum:/var/lib/magnum/', '', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port magnum-conductor 5672'], 'timeout': '30'}}})  2025-03-23 22:05:55.989600 | orchestrator | skipping: [testbed-node-2] 2025-03-23 22:05:55.989609 | orchestrator | 2025-03-23 22:05:55.989617 | orchestrator | TASK 
[haproxy-config : Configuring firewall for magnum] ************************ 2025-03-23 22:05:55.989625 | orchestrator | Sunday 23 March 2025 22:01:30 +0000 (0:00:01.602) 0:04:23.326 ********** 2025-03-23 22:05:55.989637 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'magnum_api', 'value': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9511', 'listen_port': '9511'}})  2025-03-23 22:05:55.989645 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'magnum_api_external', 'value': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9511', 'listen_port': '9511'}})  2025-03-23 22:05:55.989657 | orchestrator | skipping: [testbed-node-0] 2025-03-23 22:05:55.989666 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'magnum_api', 'value': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9511', 'listen_port': '9511'}})  2025-03-23 22:05:55.989674 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'magnum_api_external', 'value': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9511', 'listen_port': '9511'}})  2025-03-23 22:05:55.989682 | orchestrator | skipping: [testbed-node-1] 2025-03-23 22:05:55.989690 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'magnum_api', 'value': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9511', 'listen_port': '9511'}})  2025-03-23 22:05:55.989698 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'magnum_api_external', 'value': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9511', 'listen_port': '9511'}})  2025-03-23 22:05:55.989706 | orchestrator | skipping: [testbed-node-2] 2025-03-23 22:05:55.989714 | orchestrator | 2025-03-23 22:05:55.989722 | orchestrator | TASK [proxysql-config : Copying over magnum ProxySQL users config] ************* 2025-03-23 22:05:55.989730 | orchestrator | Sunday 23 March 2025 22:01:31 +0000 (0:00:01.386) 0:04:24.712 ********** 2025-03-23 22:05:55.989738 | orchestrator | changed: [testbed-node-0] 2025-03-23 22:05:55.989746 | orchestrator | changed: [testbed-node-1] 2025-03-23 22:05:55.989754 | orchestrator | changed: [testbed-node-2] 2025-03-23 22:05:55.989762 | orchestrator | 2025-03-23 22:05:55.989770 | orchestrator | TASK [proxysql-config : Copying over magnum ProxySQL rules config] ************* 2025-03-23 22:05:55.989778 | orchestrator | Sunday 23 March 2025 22:01:33 +0000 (0:00:01.581) 0:04:26.294 ********** 2025-03-23 22:05:55.989786 | orchestrator | changed: [testbed-node-0] 2025-03-23 22:05:55.989794 | orchestrator | changed: [testbed-node-1] 2025-03-23 22:05:55.989802 | orchestrator | changed: [testbed-node-2] 2025-03-23 22:05:55.989810 | orchestrator | 2025-03-23 22:05:55.989818 | orchestrator | TASK [include_role : manila] *************************************************** 2025-03-23 22:05:55.989825 | orchestrator | Sunday 23 March 2025 22:01:35 +0000 (0:00:02.536) 0:04:28.831 ********** 2025-03-23 22:05:55.989833 | orchestrator | included: manila for testbed-node-0, testbed-node-1, testbed-node-2 2025-03-23 22:05:55.989841 | orchestrator | 2025-03-23 22:05:55.989849 | orchestrator | TASK [haproxy-config : Copying over manila haproxy config] ********************* 2025-03-23 22:05:55.989857 | orchestrator | Sunday 23 March 2025 22:01:37 +0000 (0:00:01.265) 0:04:30.097 ********** 2025-03-23 22:05:55.989866 | orchestrator | changed: [testbed-node-0] => 
(item={'key': 'manila-api', 'value': {'container_name': 'manila_api', 'group': 'manila-api', 'image': 'registry.osism.tech/kolla/release/manila-api:18.2.2.20241206', 'enabled': True, 'volumes': ['/etc/kolla/manila-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:8786'], 'timeout': '30'}, 'haproxy': {'manila_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '8786', 'listen_port': '8786'}, 'manila_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8786', 'listen_port': '8786'}}}}) 2025-03-23 22:05:55.989878 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'manila-scheduler', 'value': {'container_name': 'manila_scheduler', 'group': 'manila-scheduler', 'image': 'registry.osism.tech/kolla/release/manila-scheduler:18.2.2.20241206', 'enabled': True, 'volumes': ['/etc/kolla/manila-scheduler/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port manila-scheduler 5672'], 'timeout': '30'}}})  2025-03-23 22:05:55.989892 | orchestrator | changed: [testbed-node-2] => (item={'key': 'manila-api', 'value': {'container_name': 'manila_api', 'group': 'manila-api', 'image': 'registry.osism.tech/kolla/release/manila-api:18.2.2.20241206', 'enabled': True, 'volumes': ['/etc/kolla/manila-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:8786'], 'timeout': '30'}, 'haproxy': {'manila_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '8786', 'listen_port': '8786'}, 'manila_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8786', 'listen_port': '8786'}}}}) 2025-03-23 22:05:55.989900 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'manila-share', 'value': {'container_name': 'manila_share', 'group': 'manila-share', 'image': 'registry.osism.tech/kolla/release/manila-share:18.2.2.20241206', 'enabled': True, 'privileged': True, 'volumes': ['/etc/kolla/manila-share/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/run:/run:shared', 'kolla_logs:/var/log/kolla/', '/lib/modules:/lib/modules:ro', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port manila-share 5672'], 'timeout': '30'}}})  2025-03-23 22:05:55.989909 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'manila-scheduler', 'value': {'container_name': 'manila_scheduler', 'group': 'manila-scheduler', 'image': 'registry.osism.tech/kolla/release/manila-scheduler:18.2.2.20241206', 'enabled': True, 'volumes': ['/etc/kolla/manila-scheduler/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 
'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port manila-scheduler 5672'], 'timeout': '30'}}})  2025-03-23 22:05:55.989923 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'manila-data', 'value': {'container_name': 'manila_data', 'group': 'manila-data', 'image': 'registry.osism.tech/kolla/release/manila-data:18.2.2.20241206', 'enabled': True, 'privileged': True, 'volumes': ['/etc/kolla/manila-data/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run:/run:shared', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port manila-data 5672'], 'timeout': '30'}}})  2025-03-23 22:05:55.989931 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'manila-share', 'value': {'container_name': 'manila_share', 'group': 'manila-share', 'image': 'registry.osism.tech/kolla/release/manila-share:18.2.2.20241206', 'enabled': True, 'privileged': True, 'volumes': ['/etc/kolla/manila-share/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/run:/run:shared', 'kolla_logs:/var/log/kolla/', '/lib/modules:/lib/modules:ro', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port manila-share 5672'], 'timeout': '30'}}})  2025-03-23 22:05:55.989950 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'manila-data', 'value': {'container_name': 'manila_data', 'group': 'manila-data', 'image': 'registry.osism.tech/kolla/release/manila-data:18.2.2.20241206', 'enabled': True, 'privileged': True, 'volumes': ['/etc/kolla/manila-data/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run:/run:shared', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port manila-data 5672'], 'timeout': '30'}}})  2025-03-23 22:05:55.989959 | orchestrator | changed: [testbed-node-1] => (item={'key': 'manila-api', 'value': {'container_name': 'manila_api', 'group': 'manila-api', 'image': 'registry.osism.tech/kolla/release/manila-api:18.2.2.20241206', 'enabled': True, 'volumes': ['/etc/kolla/manila-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:8786'], 'timeout': '30'}, 'haproxy': {'manila_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '8786', 'listen_port': '8786'}, 'manila_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8786', 'listen_port': '8786'}}}}) 2025-03-23 22:05:55.989983 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'manila-scheduler', 'value': {'container_name': 'manila_scheduler', 'group': 'manila-scheduler', 'image': 'registry.osism.tech/kolla/release/manila-scheduler:18.2.2.20241206', 'enabled': True, 'volumes': ['/etc/kolla/manila-scheduler/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 
'test': ['CMD-SHELL', 'healthcheck_port manila-scheduler 5672'], 'timeout': '30'}}})  2025-03-23 22:05:55.989992 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'manila-share', 'value': {'container_name': 'manila_share', 'group': 'manila-share', 'image': 'registry.osism.tech/kolla/release/manila-share:18.2.2.20241206', 'enabled': True, 'privileged': True, 'volumes': ['/etc/kolla/manila-share/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/run:/run:shared', 'kolla_logs:/var/log/kolla/', '/lib/modules:/lib/modules:ro', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port manila-share 5672'], 'timeout': '30'}}})  2025-03-23 22:05:55.990006 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'manila-data', 'value': {'container_name': 'manila_data', 'group': 'manila-data', 'image': 'registry.osism.tech/kolla/release/manila-data:18.2.2.20241206', 'enabled': True, 'privileged': True, 'volumes': ['/etc/kolla/manila-data/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run:/run:shared', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port manila-data 5672'], 'timeout': '30'}}})  2025-03-23 22:05:55.990124 | orchestrator | 2025-03-23 22:05:55.990139 | orchestrator | TASK [haproxy-config : Add configuration for manila when using single external frontend] *** 2025-03-23 22:05:55.990147 | orchestrator | Sunday 23 March 2025 22:01:42 +0000 (0:00:05.094) 0:04:35.191 ********** 2025-03-23 22:05:55.990156 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'manila-api', 'value': {'container_name': 'manila_api', 'group': 'manila-api', 'image': 'registry.osism.tech/kolla/release/manila-api:18.2.2.20241206', 'enabled': True, 'volumes': ['/etc/kolla/manila-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:8786'], 'timeout': '30'}, 'haproxy': {'manila_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '8786', 'listen_port': '8786'}, 'manila_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8786', 'listen_port': '8786'}}}})  2025-03-23 22:05:55.990181 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'manila-scheduler', 'value': {'container_name': 'manila_scheduler', 'group': 'manila-scheduler', 'image': 'registry.osism.tech/kolla/release/manila-scheduler:18.2.2.20241206', 'enabled': True, 'volumes': ['/etc/kolla/manila-scheduler/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port manila-scheduler 5672'], 'timeout': '30'}}})  2025-03-23 22:05:55.990190 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'manila-share', 'value': {'container_name': 'manila_share', 'group': 'manila-share', 'image': 'registry.osism.tech/kolla/release/manila-share:18.2.2.20241206', 'enabled': True, 'privileged': True, 'volumes': 
['/etc/kolla/manila-share/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/run:/run:shared', 'kolla_logs:/var/log/kolla/', '/lib/modules:/lib/modules:ro', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port manila-share 5672'], 'timeout': '30'}}})  2025-03-23 22:05:55.990198 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'manila-data', 'value': {'container_name': 'manila_data', 'group': 'manila-data', 'image': 'registry.osism.tech/kolla/release/manila-data:18.2.2.20241206', 'enabled': True, 'privileged': True, 'volumes': ['/etc/kolla/manila-data/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run:/run:shared', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port manila-data 5672'], 'timeout': '30'}}})  2025-03-23 22:05:55.990206 | orchestrator | skipping: [testbed-node-0] 2025-03-23 22:05:55.990223 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'manila-api', 'value': {'container_name': 'manila_api', 'group': 'manila-api', 'image': 'registry.osism.tech/kolla/release/manila-api:18.2.2.20241206', 'enabled': True, 'volumes': ['/etc/kolla/manila-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:8786'], 'timeout': '30'}, 'haproxy': {'manila_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '8786', 'listen_port': '8786'}, 'manila_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8786', 'listen_port': '8786'}}}})  2025-03-23 22:05:55.990232 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'manila-scheduler', 'value': {'container_name': 'manila_scheduler', 'group': 'manila-scheduler', 'image': 'registry.osism.tech/kolla/release/manila-scheduler:18.2.2.20241206', 'enabled': True, 'volumes': ['/etc/kolla/manila-scheduler/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port manila-scheduler 5672'], 'timeout': '30'}}})  2025-03-23 22:05:55.990245 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'manila-share', 'value': {'container_name': 'manila_share', 'group': 'manila-share', 'image': 'registry.osism.tech/kolla/release/manila-share:18.2.2.20241206', 'enabled': True, 'privileged': True, 'volumes': ['/etc/kolla/manila-share/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/run:/run:shared', 'kolla_logs:/var/log/kolla/', '/lib/modules:/lib/modules:ro', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port manila-share 5672'], 'timeout': '30'}}})  2025-03-23 22:05:55.990258 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'manila-data', 'value': {'container_name': 'manila_data', 'group': 'manila-data', 'image': 
'registry.osism.tech/kolla/release/manila-data:18.2.2.20241206', 'enabled': True, 'privileged': True, 'volumes': ['/etc/kolla/manila-data/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run:/run:shared', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port manila-data 5672'], 'timeout': '30'}}})  2025-03-23 22:05:55.990267 | orchestrator | skipping: [testbed-node-1] 2025-03-23 22:05:55.990275 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'manila-api', 'value': {'container_name': 'manila_api', 'group': 'manila-api', 'image': 'registry.osism.tech/kolla/release/manila-api:18.2.2.20241206', 'enabled': True, 'volumes': ['/etc/kolla/manila-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:8786'], 'timeout': '30'}, 'haproxy': {'manila_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '8786', 'listen_port': '8786'}, 'manila_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8786', 'listen_port': '8786'}}}})  2025-03-23 22:05:55.990290 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'manila-scheduler', 'value': {'container_name': 'manila_scheduler', 'group': 'manila-scheduler', 'image': 'registry.osism.tech/kolla/release/manila-scheduler:18.2.2.20241206', 'enabled': True, 'volumes': ['/etc/kolla/manila-scheduler/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port manila-scheduler 5672'], 'timeout': '30'}}})  2025-03-23 22:05:55.990298 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'manila-share', 'value': {'container_name': 'manila_share', 'group': 'manila-share', 'image': 'registry.osism.tech/kolla/release/manila-share:18.2.2.20241206', 'enabled': True, 'privileged': True, 'volumes': ['/etc/kolla/manila-share/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/run:/run:shared', 'kolla_logs:/var/log/kolla/', '/lib/modules:/lib/modules:ro', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port manila-share 5672'], 'timeout': '30'}}})  2025-03-23 22:05:55.990307 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'manila-data', 'value': {'container_name': 'manila_data', 'group': 'manila-data', 'image': 'registry.osism.tech/kolla/release/manila-data:18.2.2.20241206', 'enabled': True, 'privileged': True, 'volumes': ['/etc/kolla/manila-data/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run:/run:shared', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port manila-data 5672'], 'timeout': '30'}}})  2025-03-23 22:05:55.990322 | orchestrator | skipping: [testbed-node-2] 2025-03-23 22:05:55.990335 | orchestrator | 2025-03-23 
22:05:55.990343 | orchestrator | TASK [haproxy-config : Configuring firewall for manila] ************************ 2025-03-23 22:05:55.990352 | orchestrator | Sunday 23 March 2025 22:01:43 +0000 (0:00:01.018) 0:04:36.209 ********** 2025-03-23 22:05:55.990360 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'manila_api', 'value': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '8786', 'listen_port': '8786'}})  2025-03-23 22:05:55.990368 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'manila_api_external', 'value': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8786', 'listen_port': '8786'}})  2025-03-23 22:05:55.990377 | orchestrator | skipping: [testbed-node-0] 2025-03-23 22:05:55.990386 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'manila_api', 'value': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '8786', 'listen_port': '8786'}})  2025-03-23 22:05:55.990397 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'manila_api_external', 'value': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8786', 'listen_port': '8786'}})  2025-03-23 22:05:55.990406 | orchestrator | skipping: [testbed-node-1] 2025-03-23 22:05:55.990414 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'manila_api', 'value': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '8786', 'listen_port': '8786'}})  2025-03-23 22:05:55.990423 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'manila_api_external', 'value': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8786', 'listen_port': '8786'}})  2025-03-23 22:05:55.990431 | orchestrator | skipping: [testbed-node-2] 2025-03-23 22:05:55.990439 | orchestrator | 2025-03-23 22:05:55.990448 | orchestrator | TASK [proxysql-config : Copying over manila ProxySQL users config] ************* 2025-03-23 22:05:55.990456 | orchestrator | Sunday 23 March 2025 22:01:44 +0000 (0:00:01.436) 0:04:37.646 ********** 2025-03-23 22:05:55.990464 | orchestrator | changed: [testbed-node-0] 2025-03-23 22:05:55.990472 | orchestrator | changed: [testbed-node-1] 2025-03-23 22:05:55.990480 | orchestrator | changed: [testbed-node-2] 2025-03-23 22:05:55.990488 | orchestrator | 2025-03-23 22:05:55.990496 | orchestrator | TASK [proxysql-config : Copying over manila ProxySQL rules config] ************* 2025-03-23 22:05:55.990504 | orchestrator | Sunday 23 March 2025 22:01:46 +0000 (0:00:01.519) 0:04:39.166 ********** 2025-03-23 22:05:55.990512 | orchestrator | changed: [testbed-node-0] 2025-03-23 22:05:55.990520 | orchestrator | changed: [testbed-node-1] 2025-03-23 22:05:55.990528 | orchestrator | changed: [testbed-node-2] 2025-03-23 22:05:55.990536 | orchestrator | 2025-03-23 22:05:55.990545 | orchestrator | TASK [include_role : mariadb] ************************************************** 2025-03-23 22:05:55.990553 | orchestrator | Sunday 23 March 2025 22:01:48 +0000 (0:00:02.484) 0:04:41.651 ********** 2025-03-23 22:05:55.990604 | orchestrator | included: mariadb for testbed-node-0, testbed-node-1, testbed-node-2 2025-03-23 22:05:55.990612 | orchestrator | 2025-03-23 22:05:55.990620 | orchestrator | TASK [mariadb : Ensure mysql monitor user exist] ******************************* 2025-03-23 22:05:55.990628 | orchestrator | Sunday 23 March 2025 22:01:50 +0000 (0:00:01.551) 0:04:43.202 ********** 2025-03-23 22:05:55.990637 | orchestrator 
| ok: [testbed-node-0] => (item=testbed-node-0) 2025-03-23 22:05:55.990645 | orchestrator | 2025-03-23 22:05:55.990653 | orchestrator | TASK [haproxy-config : Copying over mariadb haproxy config] ******************** 2025-03-23 22:05:55.990661 | orchestrator | Sunday 23 March 2025 22:01:54 +0000 (0:00:03.740) 0:04:46.943 ********** 2025-03-23 22:05:55.990669 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'mariadb', 'value': {'container_name': 'mariadb', 'group': 'mariadb_shard_0', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/mariadb-server:10.11.10.20241206', 'volumes': ['/etc/kolla/mariadb/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/hosts:/etc/hosts:ro', '/etc/timezone:/etc/timezone:ro', 'mariadb:/var/lib/mysql', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/clustercheck'], 'timeout': '30'}, 'environment': {'MYSQL_USERNAME': 'monitor', 'MYSQL_PASSWORD': 'iek7ooth9miesodoh2ongohcaachah0I', 'MYSQL_HOST': '192.168.16.11', 'AVAILABLE_WHEN_DONOR': '1'}, 'haproxy': {'mariadb': {'enabled': True, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s', 'option httpchk'], 'custom_member_list': [' server testbed-node-0 192.168.16.10:3306 check port 4569 inter 2000 rise 2 fall 5', ' server testbed-node-1 192.168.16.11:3306 check port 4569 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 192.168.16.12:3306 check port 4569 inter 2000 rise 2 fall 5 backup', '']}, 'mariadb_external_lb': {'enabled': False, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s'], 'custom_member_list': [' server testbed-node-0 testbed-node-0:3306 check port 4569 inter 2000 rise 2 fall 5', ' server testbed-node-1 testbed-node-1:3306 check port 4569 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 testbed-node-2:3306 check port 4569 inter 2000 rise 2 fall 5 backup', '']}}}})  2025-03-23 22:05:55.990695 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'mariadb-clustercheck', 'value': {'container_name': 'mariadb_clustercheck', 'group': 'mariadb_shard_0', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/mariadb-clustercheck:10.11.10.20241206', 'volumes': ['/etc/kolla/mariadb-clustercheck/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'environment': {'MYSQL_USERNAME': 'monitor', 'MYSQL_PASSWORD': 'iek7ooth9miesodoh2ongohcaachah0I', 'MYSQL_HOST': '192.168.16.11', 'AVAILABLE_WHEN_DONOR': '1'}}})  2025-03-23 22:05:55.990705 | orchestrator | skipping: [testbed-node-1] 2025-03-23 22:05:55.990713 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'mariadb', 'value': {'container_name': 'mariadb', 'group': 'mariadb_shard_0', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/mariadb-server:10.11.10.20241206', 'volumes': ['/etc/kolla/mariadb/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/hosts:/etc/hosts:ro', '/etc/timezone:/etc/timezone:ro', 'mariadb:/var/lib/mysql', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 
'/usr/bin/clustercheck'], 'timeout': '30'}, 'environment': {'MYSQL_USERNAME': 'monitor', 'MYSQL_PASSWORD': 'iek7ooth9miesodoh2ongohcaachah0I', 'MYSQL_HOST': '192.168.16.10', 'AVAILABLE_WHEN_DONOR': '1'}, 'haproxy': {'mariadb': {'enabled': True, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s', 'option httpchk'], 'custom_member_list': [' server testbed-node-0 192.168.16.10:3306 check port 4569 inter 2000 rise 2 fall 5', ' server testbed-node-1 192.168.16.11:3306 check port 4569 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 192.168.16.12:3306 check port 4569 inter 2000 rise 2 fall 5 backup', '']}, 'mariadb_external_lb': {'enabled': False, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s'], 'custom_member_list': [' server testbed-node-0 testbed-node-0:3306 check port 4569 inter 2000 rise 2 fall 5', ' server testbed-node-1 testbed-node-1:3306 check port 4569 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 testbed-node-2:3306 check port 4569 inter 2000 rise 2 fall 5 backup', '']}}}})  2025-03-23 22:05:55.990727 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'mariadb-clustercheck', 'value': {'container_name': 'mariadb_clustercheck', 'group': 'mariadb_shard_0', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/mariadb-clustercheck:10.11.10.20241206', 'volumes': ['/etc/kolla/mariadb-clustercheck/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'environment': {'MYSQL_USERNAME': 'monitor', 'MYSQL_PASSWORD': 'iek7ooth9miesodoh2ongohcaachah0I', 'MYSQL_HOST': '192.168.16.10', 'AVAILABLE_WHEN_DONOR': '1'}}})  2025-03-23 22:05:55.990735 | orchestrator | skipping: [testbed-node-0] 2025-03-23 22:05:55.990758 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'mariadb', 'value': {'container_name': 'mariadb', 'group': 'mariadb_shard_0', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/mariadb-server:10.11.10.20241206', 'volumes': ['/etc/kolla/mariadb/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/hosts:/etc/hosts:ro', '/etc/timezone:/etc/timezone:ro', 'mariadb:/var/lib/mysql', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/clustercheck'], 'timeout': '30'}, 'environment': {'MYSQL_USERNAME': 'monitor', 'MYSQL_PASSWORD': 'iek7ooth9miesodoh2ongohcaachah0I', 'MYSQL_HOST': '192.168.16.12', 'AVAILABLE_WHEN_DONOR': '1'}, 'haproxy': {'mariadb': {'enabled': True, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s', 'option httpchk'], 'custom_member_list': [' server testbed-node-0 192.168.16.10:3306 check port 4569 inter 2000 rise 2 fall 5', ' server testbed-node-1 192.168.16.11:3306 check port 4569 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 192.168.16.12:3306 check port 4569 inter 2000 rise 2 fall 5 backup', '']}, 'mariadb_external_lb': {'enabled': False, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option 
srvtcpka', 'timeout server 3600s'], 'custom_member_list': [' server testbed-node-0 testbed-node-0:3306 check port 4569 inter 2000 rise 2 fall 5', ' server testbed-node-1 testbed-node-1:3306 check port 4569 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 testbed-node-2:3306 check port 4569 inter 2000 rise 2 fall 5 backup', '']}}}})  2025-03-23 22:05:55.990769 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'mariadb-clustercheck', 'value': {'container_name': 'mariadb_clustercheck', 'group': 'mariadb_shard_0', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/mariadb-clustercheck:10.11.10.20241206', 'volumes': ['/etc/kolla/mariadb-clustercheck/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'environment': {'MYSQL_USERNAME': 'monitor', 'MYSQL_PASSWORD': 'iek7ooth9miesodoh2ongohcaachah0I', 'MYSQL_HOST': '192.168.16.12', 'AVAILABLE_WHEN_DONOR': '1'}}})  2025-03-23 22:05:55.990778 | orchestrator | skipping: [testbed-node-2] 2025-03-23 22:05:55.990787 | orchestrator | 2025-03-23 22:05:55.990796 | orchestrator | TASK [haproxy-config : Add configuration for mariadb when using single external frontend] *** 2025-03-23 22:05:55.990805 | orchestrator | Sunday 23 March 2025 22:01:57 +0000 (0:00:03.907) 0:04:50.850 ********** 2025-03-23 22:05:55.990815 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'mariadb', 'value': {'container_name': 'mariadb', 'group': 'mariadb_shard_0', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/mariadb-server:10.11.10.20241206', 'volumes': ['/etc/kolla/mariadb/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/hosts:/etc/hosts:ro', '/etc/timezone:/etc/timezone:ro', 'mariadb:/var/lib/mysql', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/clustercheck'], 'timeout': '30'}, 'environment': {'MYSQL_USERNAME': 'monitor', 'MYSQL_PASSWORD': 'iek7ooth9miesodoh2ongohcaachah0I', 'MYSQL_HOST': '192.168.16.11', 'AVAILABLE_WHEN_DONOR': '1'}, 'haproxy': {'mariadb': {'enabled': True, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s', 'option httpchk'], 'custom_member_list': [' server testbed-node-0 192.168.16.10:3306 check port 4569 inter 2000 rise 2 fall 5', ' server testbed-node-1 192.168.16.11:3306 check port 4569 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 192.168.16.12:3306 check port 4569 inter 2000 rise 2 fall 5 backup', '']}, 'mariadb_external_lb': {'enabled': False, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s'], 'custom_member_list': [' server testbed-node-0 testbed-node-0:3306 check port 4569 inter 2000 rise 2 fall 5', ' server testbed-node-1 testbed-node-1:3306 check port 4569 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 testbed-node-2:3306 check port 4569 inter 2000 rise 2 fall 5 backup', '']}}}})  2025-03-23 22:05:55.990835 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'mariadb-clustercheck', 'value': {'container_name': 'mariadb_clustercheck', 'group': 'mariadb_shard_0', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/mariadb-clustercheck:10.11.10.20241206', 
'volumes': ['/etc/kolla/mariadb-clustercheck/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'environment': {'MYSQL_USERNAME': 'monitor', 'MYSQL_PASSWORD': 'iek7ooth9miesodoh2ongohcaachah0I', 'MYSQL_HOST': '192.168.16.11', 'AVAILABLE_WHEN_DONOR': '1'}}})  2025-03-23 22:05:55.990844 | orchestrator | skipping: [testbed-node-1] 2025-03-23 22:05:55.990865 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'mariadb', 'value': {'container_name': 'mariadb', 'group': 'mariadb_shard_0', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/mariadb-server:10.11.10.20241206', 'volumes': ['/etc/kolla/mariadb/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/hosts:/etc/hosts:ro', '/etc/timezone:/etc/timezone:ro', 'mariadb:/var/lib/mysql', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/clustercheck'], 'timeout': '30'}, 'environment': {'MYSQL_USERNAME': 'monitor', 'MYSQL_PASSWORD': 'iek7ooth9miesodoh2ongohcaachah0I', 'MYSQL_HOST': '192.168.16.10', 'AVAILABLE_WHEN_DONOR': '1'}, 'haproxy': {'mariadb': {'enabled': True, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s', 'option httpchk'], 'custom_member_list': [' server testbed-node-0 192.168.16.10:3306 check port 4569 inter 2000 rise 2 fall 5', ' server testbed-node-1 192.168.16.11:3306 check port 4569 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 192.168.16.12:3306 check port 4569 inter 2000 rise 2 fall 5 backup', '']}, 'mariadb_external_lb': {'enabled': False, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s'], 'custom_member_list': [' server testbed-node-0 testbed-node-0:3306 check port 4569 inter 2000 rise 2 fall 5', ' server testbed-node-1 testbed-node-1:3306 check port 4569 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 testbed-node-2:3306 check port 4569 inter 2000 rise 2 fall 5 backup', '']}}}})  2025-03-23 22:05:55.990886 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'mariadb-clustercheck', 'value': {'container_name': 'mariadb_clustercheck', 'group': 'mariadb_shard_0', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/mariadb-clustercheck:10.11.10.20241206', 'volumes': ['/etc/kolla/mariadb-clustercheck/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'environment': {'MYSQL_USERNAME': 'monitor', 'MYSQL_PASSWORD': 'iek7ooth9miesodoh2ongohcaachah0I', 'MYSQL_HOST': '192.168.16.10', 'AVAILABLE_WHEN_DONOR': '1'}}})  2025-03-23 22:05:55.990895 | orchestrator | skipping: [testbed-node-0] 2025-03-23 22:05:55.990904 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'mariadb', 'value': {'container_name': 'mariadb', 'group': 'mariadb_shard_0', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/mariadb-server:10.11.10.20241206', 'volumes': ['/etc/kolla/mariadb/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/hosts:/etc/hosts:ro', '/etc/timezone:/etc/timezone:ro', 'mariadb:/var/lib/mysql', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 
'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/clustercheck'], 'timeout': '30'}, 'environment': {'MYSQL_USERNAME': 'monitor', 'MYSQL_PASSWORD': 'iek7ooth9miesodoh2ongohcaachah0I', 'MYSQL_HOST': '192.168.16.12', 'AVAILABLE_WHEN_DONOR': '1'}, 'haproxy': {'mariadb': {'enabled': True, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s', 'option httpchk'], 'custom_member_list': [' server testbed-node-0 192.168.16.10:3306 check port 4569 inter 2000 rise 2 fall 5', ' server testbed-node-1 192.168.16.11:3306 check port 4569 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 192.168.16.12:3306 check port 4569 inter 2000 rise 2 fall 5 backup', '']}, 'mariadb_external_lb': {'enabled': False, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s'], 'custom_member_list': [' server testbed-node-0 testbed-node-0:3306 check port 4569 inter 2000 rise 2 fall 5', ' server testbed-node-1 testbed-node-1:3306 check port 4569 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 testbed-node-2:3306 check port 4569 inter 2000 rise 2 fall 5 backup', '']}}}})  2025-03-23 22:05:55.990919 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'mariadb-clustercheck', 'value': {'container_name': 'mariadb_clustercheck', 'group': 'mariadb_shard_0', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/mariadb-clustercheck:10.11.10.20241206', 'volumes': ['/etc/kolla/mariadb-clustercheck/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'environment': {'MYSQL_USERNAME': 'monitor', 'MYSQL_PASSWORD': 'iek7ooth9miesodoh2ongohcaachah0I', 'MYSQL_HOST': '192.168.16.12', 'AVAILABLE_WHEN_DONOR': '1'}}})  2025-03-23 22:05:55.990928 | orchestrator | skipping: [testbed-node-2] 2025-03-23 22:05:55.990937 | orchestrator | 2025-03-23 22:05:55.990946 | orchestrator | TASK [haproxy-config : Configuring firewall for mariadb] *********************** 2025-03-23 22:05:55.990955 | orchestrator | Sunday 23 March 2025 22:02:01 +0000 (0:00:04.010) 0:04:54.861 ********** 2025-03-23 22:05:55.990964 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'mariadb', 'value': {'enabled': True, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s', 'option httpchk'], 'custom_member_list': [' server testbed-node-0 192.168.16.10:3306 check port 4569 inter 2000 rise 2 fall 5', ' server testbed-node-1 192.168.16.11:3306 check port 4569 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 192.168.16.12:3306 check port 4569 inter 2000 rise 2 fall 5 backup', '']}})  2025-03-23 22:05:55.990978 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'mariadb_external_lb', 'value': {'enabled': False, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s'], 'custom_member_list': [' server testbed-node-0 testbed-node-0:3306 check port 4569 inter 2000 rise 2 fall 5', ' server testbed-node-1 testbed-node-1:3306 check port 4569 inter 2000 rise 2 fall 5 
backup', ' server testbed-node-2 testbed-node-2:3306 check port 4569 inter 2000 rise 2 fall 5 backup', '']}})  2025-03-23 22:05:55.990987 | orchestrator | skipping: [testbed-node-0] 2025-03-23 22:05:55.990996 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'mariadb', 'value': {'enabled': True, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s', 'option httpchk'], 'custom_member_list': [' server testbed-node-0 192.168.16.10:3306 check port 4569 inter 2000 rise 2 fall 5', ' server testbed-node-1 192.168.16.11:3306 check port 4569 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 192.168.16.12:3306 check port 4569 inter 2000 rise 2 fall 5 backup', '']}})  2025-03-23 22:05:55.991011 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'mariadb_external_lb', 'value': {'enabled': False, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s'], 'custom_member_list': [' server testbed-node-0 testbed-node-0:3306 check port 4569 inter 2000 rise 2 fall 5', ' server testbed-node-1 testbed-node-1:3306 check port 4569 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 testbed-node-2:3306 check port 4569 inter 2000 rise 2 fall 5 backup', '']}})  2025-03-23 22:05:55.991021 | orchestrator | skipping: [testbed-node-1] 2025-03-23 22:05:55.991030 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'mariadb', 'value': {'enabled': True, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s', 'option httpchk'], 'custom_member_list': [' server testbed-node-0 192.168.16.10:3306 check port 4569 inter 2000 rise 2 fall 5', ' server testbed-node-1 192.168.16.11:3306 check port 4569 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 192.168.16.12:3306 check port 4569 inter 2000 rise 2 fall 5 backup', '']}})  2025-03-23 22:05:55.991043 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'mariadb_external_lb', 'value': {'enabled': False, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s'], 'custom_member_list': [' server testbed-node-0 testbed-node-0:3306 check port 4569 inter 2000 rise 2 fall 5', ' server testbed-node-1 testbed-node-1:3306 check port 4569 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 testbed-node-2:3306 check port 4569 inter 2000 rise 2 fall 5 backup', '']}})  2025-03-23 22:05:55.991051 | orchestrator | skipping: [testbed-node-2] 2025-03-23 22:05:55.991059 | orchestrator | 2025-03-23 22:05:55.991066 | orchestrator | TASK [proxysql-config : Copying over mariadb ProxySQL users config] ************ 2025-03-23 22:05:55.991074 | orchestrator | Sunday 23 March 2025 22:02:06 +0000 (0:00:04.663) 0:04:59.524 ********** 2025-03-23 22:05:55.991082 | orchestrator | changed: [testbed-node-0] 2025-03-23 22:05:55.991090 | orchestrator | changed: [testbed-node-1] 2025-03-23 22:05:55.991098 | orchestrator | changed: [testbed-node-2] 2025-03-23 22:05:55.991106 | orchestrator | 2025-03-23 22:05:55.991113 | orchestrator | TASK [proxysql-config : Copying over mariadb ProxySQL rules config] ************ 2025-03-23 22:05:55.991124 | 
orchestrator | Sunday 23 March 2025 22:02:09 +0000 (0:00:02.916) 0:05:02.441 ********** 2025-03-23 22:05:55.991131 | orchestrator | skipping: [testbed-node-0] 2025-03-23 22:05:55.991138 | orchestrator | skipping: [testbed-node-1] 2025-03-23 22:05:55.991145 | orchestrator | skipping: [testbed-node-2] 2025-03-23 22:05:55.991152 | orchestrator | 2025-03-23 22:05:55.991159 | orchestrator | TASK [include_role : masakari] ************************************************* 2025-03-23 22:05:55.991166 | orchestrator | Sunday 23 March 2025 22:02:11 +0000 (0:00:02.321) 0:05:04.762 ********** 2025-03-23 22:05:55.991173 | orchestrator | skipping: [testbed-node-0] 2025-03-23 22:05:55.991180 | orchestrator | skipping: [testbed-node-1] 2025-03-23 22:05:55.991187 | orchestrator | skipping: [testbed-node-2] 2025-03-23 22:05:55.991194 | orchestrator | 2025-03-23 22:05:55.991201 | orchestrator | TASK [include_role : memcached] ************************************************ 2025-03-23 22:05:55.991208 | orchestrator | Sunday 23 March 2025 22:02:12 +0000 (0:00:00.340) 0:05:05.102 ********** 2025-03-23 22:05:55.991215 | orchestrator | included: memcached for testbed-node-0, testbed-node-1, testbed-node-2 2025-03-23 22:05:55.991222 | orchestrator | 2025-03-23 22:05:55.991229 | orchestrator | TASK [haproxy-config : Copying over memcached haproxy config] ****************** 2025-03-23 22:05:55.991236 | orchestrator | Sunday 23 March 2025 22:02:13 +0000 (0:00:01.616) 0:05:06.719 ********** 2025-03-23 22:05:55.991243 | orchestrator | changed: [testbed-node-0] => (item={'key': 'memcached', 'value': {'container_name': 'memcached', 'image': 'registry.osism.tech/kolla/release/memcached:1.6.14.20241206', 'enabled': True, 'group': 'memcached', 'volumes': ['/etc/kolla/memcached/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen memcached 11211'], 'timeout': '30'}, 'haproxy': {'memcached': {'enabled': False, 'mode': 'tcp', 'port': '11211', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s'], 'active_passive': True}}}}) 2025-03-23 22:05:55.991250 | orchestrator | changed: [testbed-node-1] => (item={'key': 'memcached', 'value': {'container_name': 'memcached', 'image': 'registry.osism.tech/kolla/release/memcached:1.6.14.20241206', 'enabled': True, 'group': 'memcached', 'volumes': ['/etc/kolla/memcached/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen memcached 11211'], 'timeout': '30'}, 'haproxy': {'memcached': {'enabled': False, 'mode': 'tcp', 'port': '11211', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s'], 'active_passive': True}}}}) 2025-03-23 22:05:55.991262 | orchestrator | changed: [testbed-node-2] => (item={'key': 'memcached', 'value': {'container_name': 'memcached', 'image': 'registry.osism.tech/kolla/release/memcached:1.6.14.20241206', 'enabled': True, 'group': 'memcached', 'volumes': ['/etc/kolla/memcached/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 
'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen memcached 11211'], 'timeout': '30'}, 'haproxy': {'memcached': {'enabled': False, 'mode': 'tcp', 'port': '11211', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s'], 'active_passive': True}}}}) 2025-03-23 22:05:55.991270 | orchestrator | 2025-03-23 22:05:55.991277 | orchestrator | TASK [haproxy-config : Add configuration for memcached when using single external frontend] *** 2025-03-23 22:05:55.991288 | orchestrator | Sunday 23 March 2025 22:02:15 +0000 (0:00:02.032) 0:05:08.752 ********** 2025-03-23 22:05:55.991295 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'memcached', 'value': {'container_name': 'memcached', 'image': 'registry.osism.tech/kolla/release/memcached:1.6.14.20241206', 'enabled': True, 'group': 'memcached', 'volumes': ['/etc/kolla/memcached/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen memcached 11211'], 'timeout': '30'}, 'haproxy': {'memcached': {'enabled': False, 'mode': 'tcp', 'port': '11211', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s'], 'active_passive': True}}}})  2025-03-23 22:05:55.991306 | orchestrator | skipping: [testbed-node-0] 2025-03-23 22:05:55.991314 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'memcached', 'value': {'container_name': 'memcached', 'image': 'registry.osism.tech/kolla/release/memcached:1.6.14.20241206', 'enabled': True, 'group': 'memcached', 'volumes': ['/etc/kolla/memcached/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen memcached 11211'], 'timeout': '30'}, 'haproxy': {'memcached': {'enabled': False, 'mode': 'tcp', 'port': '11211', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s'], 'active_passive': True}}}})  2025-03-23 22:05:55.991321 | orchestrator | skipping: [testbed-node-1] 2025-03-23 22:05:55.991328 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'memcached', 'value': {'container_name': 'memcached', 'image': 'registry.osism.tech/kolla/release/memcached:1.6.14.20241206', 'enabled': True, 'group': 'memcached', 'volumes': ['/etc/kolla/memcached/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen memcached 11211'], 'timeout': '30'}, 'haproxy': {'memcached': {'enabled': False, 'mode': 'tcp', 'port': '11211', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s'], 'active_passive': True}}}})  2025-03-23 22:05:55.991335 | orchestrator | skipping: [testbed-node-2] 2025-03-23 22:05:55.991342 | orchestrator | 2025-03-23 22:05:55.991349 | orchestrator | TASK [haproxy-config : Configuring firewall for memcached] ********************* 2025-03-23 22:05:55.991357 | orchestrator | Sunday 23 March 2025 22:02:16 +0000 (0:00:00.640) 0:05:09.392 ********** 2025-03-23 
22:05:55.991364 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'memcached', 'value': {'enabled': False, 'mode': 'tcp', 'port': '11211', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s'], 'active_passive': True}})  2025-03-23 22:05:55.991371 | orchestrator | skipping: [testbed-node-0] 2025-03-23 22:05:55.991378 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'memcached', 'value': {'enabled': False, 'mode': 'tcp', 'port': '11211', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s'], 'active_passive': True}})  2025-03-23 22:05:55.991386 | orchestrator | skipping: [testbed-node-1] 2025-03-23 22:05:55.991393 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'memcached', 'value': {'enabled': False, 'mode': 'tcp', 'port': '11211', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s'], 'active_passive': True}})  2025-03-23 22:05:55.991400 | orchestrator | skipping: [testbed-node-2] 2025-03-23 22:05:55.991407 | orchestrator | 2025-03-23 22:05:55.991414 | orchestrator | TASK [proxysql-config : Copying over memcached ProxySQL users config] ********** 2025-03-23 22:05:55.991421 | orchestrator | Sunday 23 March 2025 22:02:17 +0000 (0:00:00.846) 0:05:10.238 ********** 2025-03-23 22:05:55.991432 | orchestrator | skipping: [testbed-node-0] 2025-03-23 22:05:55.991439 | orchestrator | skipping: [testbed-node-1] 2025-03-23 22:05:55.991446 | orchestrator | skipping: [testbed-node-2] 2025-03-23 22:05:55.991453 | orchestrator | 2025-03-23 22:05:55.991460 | orchestrator | TASK [proxysql-config : Copying over memcached ProxySQL rules config] ********** 2025-03-23 22:05:55.991467 | orchestrator | Sunday 23 March 2025 22:02:18 +0000 (0:00:00.774) 0:05:11.013 ********** 2025-03-23 22:05:55.991474 | orchestrator | skipping: [testbed-node-0] 2025-03-23 22:05:55.991481 | orchestrator | skipping: [testbed-node-1] 2025-03-23 22:05:55.991491 | orchestrator | skipping: [testbed-node-2] 2025-03-23 22:05:55.991498 | orchestrator | 2025-03-23 22:05:55.991505 | orchestrator | TASK [include_role : mistral] ************************************************** 2025-03-23 22:05:55.991512 | orchestrator | Sunday 23 March 2025 22:02:20 +0000 (0:00:01.951) 0:05:12.965 ********** 2025-03-23 22:05:55.991519 | orchestrator | skipping: [testbed-node-0] 2025-03-23 22:05:55.991527 | orchestrator | skipping: [testbed-node-1] 2025-03-23 22:05:55.991534 | orchestrator | skipping: [testbed-node-2] 2025-03-23 22:05:55.991541 | orchestrator | 2025-03-23 22:05:55.991548 | orchestrator | TASK [include_role : neutron] ************************************************** 2025-03-23 22:05:55.991567 | orchestrator | Sunday 23 March 2025 22:02:20 +0000 (0:00:00.335) 0:05:13.301 ********** 2025-03-23 22:05:55.991575 | orchestrator | included: neutron for testbed-node-0, testbed-node-1, testbed-node-2 2025-03-23 22:05:55.991582 | orchestrator | 2025-03-23 22:05:55.991589 | orchestrator | TASK [haproxy-config : Copying over neutron haproxy config] ******************** 2025-03-23 22:05:55.991596 | orchestrator | Sunday 23 March 2025 22:02:22 +0000 (0:00:01.660) 0:05:14.961 ********** 2025-03-23 22:05:55.991609 | orchestrator | changed: [testbed-node-0] => (item={'key': 'neutron-server', 'value': {'container_name': 'neutron_server', 'image': 
'registry.osism.tech/kolla/release/neutron-server:24.0.2.20241206', 'enabled': True, 'group': 'neutron-server', 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-server/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:9696'], 'timeout': '30'}, 'haproxy': {'neutron_server': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9696', 'listen_port': '9696'}, 'neutron_server_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9696', 'listen_port': '9696'}}}}) 2025-03-23 22:05:55.991617 | orchestrator | changed: [testbed-node-1] => (item={'key': 'neutron-server', 'value': {'container_name': 'neutron_server', 'image': 'registry.osism.tech/kolla/release/neutron-server:24.0.2.20241206', 'enabled': True, 'group': 'neutron-server', 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-server/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:9696'], 'timeout': '30'}, 'haproxy': {'neutron_server': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9696', 'listen_port': '9696'}, 'neutron_server_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9696', 'listen_port': '9696'}}}}) 2025-03-23 22:05:55.991624 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'neutron-openvswitch-agent', 'value': {'container_name': 'neutron_openvswitch_agent', 'image': 'registry.osism.tech/kolla/release/neutron-openvswitch-agent:24.0.2.20241206', 'enabled': False, 'privileged': True, 'environment': {'KOLLA_LEGACY_IPTABLES': 'false'}, 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-openvswitch-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-openvswitch-agent 5672'], 'timeout': '30'}}})  2025-03-23 22:05:55.991639 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'neutron-openvswitch-agent', 'value': {'container_name': 'neutron_openvswitch_agent', 'image': 'registry.osism.tech/kolla/release/neutron-openvswitch-agent:24.0.2.20241206', 'enabled': False, 'privileged': True, 'environment': {'KOLLA_LEGACY_IPTABLES': 'false'}, 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-openvswitch-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-openvswitch-agent 5672'], 'timeout': '30'}}})  2025-03-23 22:05:55.991647 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'neutron-linuxbridge-agent', 'value': {'container_name': 'neutron_linuxbridge_agent', 'image': 
'registry.osism.tech/kolla/release/neutron-linuxbridge-agent:24.0.2.20241206', 'privileged': True, 'enabled': False, 'environment': {'KOLLA_LEGACY_IPTABLES': 'false'}, 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-linuxbridge-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-linuxbridge-agent 5672'], 'timeout': '30'}}})  2025-03-23 22:05:55.991659 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'neutron-dhcp-agent', 'value': {'container_name': 'neutron_dhcp_agent', 'image': 'registry.osism.tech/kolla/release/neutron-dhcp-agent:24.0.2.20241206', 'privileged': True, 'enabled': False, 'group': 'neutron-dhcp-agent', 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-dhcp-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'neutron_metadata_socket:/var/lib/neutron/kolla/', '/run/netns:/run/netns:shared', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-dhcp-agent 5672'], 'timeout': '30'}}})  2025-03-23 22:05:55.991667 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'neutron-linuxbridge-agent', 'value': {'container_name': 'neutron_linuxbridge_agent', 'image': 'registry.osism.tech/kolla/release/neutron-linuxbridge-agent:24.0.2.20241206', 'privileged': True, 'enabled': False, 'environment': {'KOLLA_LEGACY_IPTABLES': 'false'}, 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-linuxbridge-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-linuxbridge-agent 5672'], 'timeout': '30'}}})  2025-03-23 22:05:55.991675 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'neutron-l3-agent', 'value': {'container_name': 'neutron_l3_agent', 'image': 'registry.osism.tech/kolla/release/neutron-l3-agent:24.0.2.20241206', 'privileged': True, 'enabled': False, 'environment': {'KOLLA_LEGACY_IPTABLES': 'false'}, 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-l3-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'neutron_metadata_socket:/var/lib/neutron/kolla/', '/run/netns:/run/netns:shared', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', "healthcheck_port 'neutron-l3-agent ' 5672"], 'timeout': '30'}}})  2025-03-23 22:05:55.991686 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'neutron-dhcp-agent', 'value': {'container_name': 'neutron_dhcp_agent', 'image': 'registry.osism.tech/kolla/release/neutron-dhcp-agent:24.0.2.20241206', 'privileged': True, 'enabled': False, 'group': 'neutron-dhcp-agent', 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-dhcp-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'neutron_metadata_socket:/var/lib/neutron/kolla/', '/run/netns:/run/netns:shared', 
'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-dhcp-agent 5672'], 'timeout': '30'}}})  2025-03-23 22:05:55.991697 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'neutron-sriov-agent', 'value': {'container_name': 'neutron_sriov_agent', 'image': 'registry.osism.tech/kolla/release/neutron-sriov-agent:24.0.2.20241206', 'privileged': True, 'enabled': False, 'host_in_groups': False, 'volumes': ['/etc/kolla/neutron-sriov-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-sriov-nic-agent 5672'], 'timeout': '30'}}})  2025-03-23 22:05:55.991709 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'neutron-l3-agent', 'value': {'container_name': 'neutron_l3_agent', 'image': 'registry.osism.tech/kolla/release/neutron-l3-agent:24.0.2.20241206', 'privileged': True, 'enabled': False, 'environment': {'KOLLA_LEGACY_IPTABLES': 'false'}, 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-l3-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'neutron_metadata_socket:/var/lib/neutron/kolla/', '/run/netns:/run/netns:shared', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', "healthcheck_port 'neutron-l3-agent ' 5672"], 'timeout': '30'}}})  2025-03-23 22:05:55.991717 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'neutron-mlnx-agent', 'value': {'container_name': 'neutron_mlnx_agent', 'image': 'registry.osism.tech/kolla/release/neutron-mlnx-agent:24.0.2.20241206', 'enabled': False, 'host_in_groups': False, 'volumes': ['/etc/kolla/neutron-mlnx-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}}})  2025-03-23 22:05:55.991725 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'neutron-sriov-agent', 'value': {'container_name': 'neutron_sriov_agent', 'image': 'registry.osism.tech/kolla/release/neutron-sriov-agent:24.0.2.20241206', 'privileged': True, 'enabled': False, 'host_in_groups': False, 'volumes': ['/etc/kolla/neutron-sriov-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-sriov-nic-agent 5672'], 'timeout': '30'}}})  2025-03-23 22:05:55.991732 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'neutron-eswitchd', 'value': {'container_name': 'neutron_eswitchd', 'image': 'registry.osism.tech/kolla/release/neutron-eswitchd:24.0.2.20241206', 'privileged': True, 'enabled': False, 'host_in_groups': False, 'volumes': ['/etc/kolla/neutron-eswitchd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/run/libvirt:/run/libvirt:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}}})  2025-03-23 22:05:55.991744 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'neutron-mlnx-agent', 'value': {'container_name': 
'neutron_mlnx_agent', 'image': 'registry.osism.tech/kolla/release/neutron-mlnx-agent:24.0.2.20241206', 'enabled': False, 'host_in_groups': False, 'volumes': ['/etc/kolla/neutron-mlnx-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}}})  2025-03-23 22:05:55.991752 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'neutron-metadata-agent', 'value': {'container_name': 'neutron_metadata_agent', 'image': 'registry.osism.tech/kolla/release/neutron-metadata-agent:24.0.2.20241206', 'privileged': True, 'enabled': False, 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-metadata-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'neutron_metadata_socket:/var/lib/neutron/kolla/', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-metadata-agent 5672'], 'timeout': '30'}}})  2025-03-23 22:05:55.991762 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'neutron-eswitchd', 'value': {'container_name': 'neutron_eswitchd', 'image': 'registry.osism.tech/kolla/release/neutron-eswitchd:24.0.2.20241206', 'privileged': True, 'enabled': False, 'host_in_groups': False, 'volumes': ['/etc/kolla/neutron-eswitchd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/run/libvirt:/run/libvirt:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}}})  2025-03-23 22:05:55.991770 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'neutron-ovn-metadata-agent', 'value': {'container_name': 'neutron_ovn_metadata_agent', 'image': 'registry.osism.tech/kolla/release/neutron-metadata-agent:24.0.2.20241206', 'privileged': True, 'enabled': True, 'host_in_groups': False, 'volumes': ['/etc/kolla/neutron-ovn-metadata-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 'neutron_metadata_socket:/var/lib/neutron/kolla/', '/run/openvswitch:/run/openvswitch:shared', '/run/netns:/run/netns:shared', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-ovn-metadata-agent 6640'], 'timeout': '30'}}})  2025-03-23 22:05:55.991784 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'neutron-metadata-agent', 'value': {'container_name': 'neutron_metadata_agent', 'image': 'registry.osism.tech/kolla/release/neutron-metadata-agent:24.0.2.20241206', 'privileged': True, 'enabled': False, 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-metadata-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'neutron_metadata_socket:/var/lib/neutron/kolla/', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-metadata-agent 5672'], 'timeout': '30'}}})  2025-03-23 22:05:55.991792 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'neutron-bgp-dragent', 'value': {'container_name': 'neutron_bgp_dragent', 'image': 'registry.osism.tech/kolla/release/neutron-bgp-dragent:24.0.2.20241206', 'privileged': True, 'enabled': False, 'group': 'neutron-bgp-dragent', 'host_in_groups': True, 'volumes': 
['/etc/kolla/neutron-bgp-dragent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-bgp-dragent 5672'], 'timeout': '30'}}})  2025-03-23 22:05:55.991799 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'neutron-ovn-metadata-agent', 'value': {'container_name': 'neutron_ovn_metadata_agent', 'image': 'registry.osism.tech/kolla/release/neutron-metadata-agent:24.0.2.20241206', 'privileged': True, 'enabled': True, 'host_in_groups': False, 'volumes': ['/etc/kolla/neutron-ovn-metadata-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 'neutron_metadata_socket:/var/lib/neutron/kolla/', '/run/openvswitch:/run/openvswitch:shared', '/run/netns:/run/netns:shared', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-ovn-metadata-agent 6640'], 'timeout': '30'}}})  2025-03-23 22:05:55.991811 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'neutron-infoblox-ipam-agent', 'value': {'container_name': 'neutron_infoblox_ipam_agent', 'image': 'registry.osism.tech/kolla/release/neutron-infoblox-ipam-agent:24.0.2.20241206', 'privileged': True, 'enabled': False, 'group': 'neutron-infoblox-ipam-agent', 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-infoblox-ipam-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2025-03-23 22:05:55.991821 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'neutron-bgp-dragent', 'value': {'container_name': 'neutron_bgp_dragent', 'image': 'registry.osism.tech/kolla/release/neutron-bgp-dragent:24.0.2.20241206', 'privileged': True, 'enabled': False, 'group': 'neutron-bgp-dragent', 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-bgp-dragent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-bgp-dragent 5672'], 'timeout': '30'}}})  2025-03-23 22:05:55.991829 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'neutron-metering-agent', 'value': {'container_name': 'neutron_metering_agent', 'image': 'registry.osism.tech/kolla/release/neutron-metering-agent:24.0.2.20241206', 'privileged': True, 'enabled': False, 'group': 'neutron-metering-agent', 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-metering-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}}})  2025-03-23 22:05:55.991836 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'neutron-infoblox-ipam-agent', 'value': {'container_name': 'neutron_infoblox_ipam_agent', 'image': 'registry.osism.tech/kolla/release/neutron-infoblox-ipam-agent:24.0.2.20241206', 'privileged': True, 'enabled': False, 'group': 'neutron-infoblox-ipam-agent', 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-infoblox-ipam-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  
2025-03-23 22:05:55.991844 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'ironic-neutron-agent', 'value': {'container_name': 'ironic_neutron_agent', 'image': 'registry.osism.tech/kolla/release/ironic-neutron-agent:24.0.2.20241206', 'privileged': False, 'enabled': False, 'group': 'ironic-neutron-agent', 'host_in_groups': True, 'volumes': ['/etc/kolla/ironic-neutron-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port ironic-neutron-agent 5672'], 'timeout': '30'}}})  2025-03-23 22:05:55.991857 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'neutron-metering-agent', 'value': {'container_name': 'neutron_metering_agent', 'image': 'registry.osism.tech/kolla/release/neutron-metering-agent:24.0.2.20241206', 'privileged': True, 'enabled': False, 'group': 'neutron-metering-agent', 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-metering-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}}})  2025-03-23 22:05:55.991868 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'neutron-tls-proxy', 'value': {'container_name': 'neutron_tls_proxy', 'group': 'neutron-server', 'host_in_groups': True, 'enabled': 'no', 'image': 'registry.osism.tech/kolla/release/neutron-tls-proxy:24.0.2.20241206', 'volumes': ['/etc/kolla/neutron-tls-proxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl -u openstack:password 192.168.16.10:9697'], 'timeout': '30'}, 'haproxy': {'neutron_tls_proxy': {'enabled': False, 'mode': 'http', 'external': False, 'port': '9696', 'listen_port': '9696', 'tls_backend': 'yes'}, 'neutron_tls_proxy_external': {'enabled': False, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9696', 'listen_port': '9696', 'tls_backend': 'yes'}}}})  2025-03-23 22:05:55.991878 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'ironic-neutron-agent', 'value': {'container_name': 'ironic_neutron_agent', 'image': 'registry.osism.tech/kolla/release/ironic-neutron-agent:24.0.2.20241206', 'privileged': False, 'enabled': False, 'group': 'ironic-neutron-agent', 'host_in_groups': True, 'volumes': ['/etc/kolla/ironic-neutron-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port ironic-neutron-agent 5672'], 'timeout': '30'}}})  2025-03-23 22:05:55.991886 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'neutron-ovn-agent', 'value': {'container_name': 'neutron_ovn_agent', 'group': 'neutron-ovn-agent', 'host_in_groups': False, 'enabled': False, 'image': 'index.docker.io/kolla/release/neutron-ovn-agent:24.0.2.20241206', 'volumes': ['/etc/kolla/neutron-ovn-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 
'test': ['CMD-SHELL', 'healthcheck_port neutron-ovn-agent 6640'], 'timeout': '30'}}})  2025-03-23 22:05:55.991898 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'neutron-tls-proxy', 'value': {'container_name': 'neutron_tls_proxy', 'group': 'neutron-server', 'host_in_groups': True, 'enabled': 'no', 'image': 'registry.osism.tech/kolla/release/neutron-tls-proxy:24.0.2.20241206', 'volumes': ['/etc/kolla/neutron-tls-proxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl -u openstack:password 192.168.16.11:9697'], 'timeout': '30'}, 'haproxy': {'neutron_tls_proxy': {'enabled': False, 'mode': 'http', 'external': False, 'port': '9696', 'listen_port': '9696', 'tls_backend': 'yes'}, 'neutron_tls_proxy_external': {'enabled': False, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9696', 'listen_port': '9696', 'tls_backend': 'yes'}}}})  2025-03-23 22:05:55.991906 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'neutron-ovn-vpn-agent', 'value': {'container_name': 'neutron_ovn_vpn_agent', 'image': 'index.docker.io/kolla/release/neutron-ovn-vpn-agent:24.0.2.20241206', 'enabled': False, 'privileged': True, 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-ovn-vpn-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/run/netns:/run/netns:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port python 6642', '&&', 'healthcheck_port neutron-ovn-vpn-agent 5672'], 'timeout': '30'}}})  2025-03-23 22:05:55.991917 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'neutron-ovn-agent', 'value': {'container_name': 'neutron_ovn_agent', 'group': 'neutron-ovn-agent', 'host_in_groups': False, 'enabled': False, 'image': 'index.docker.io/kolla/release/neutron-ovn-agent:24.0.2.20241206', 'volumes': ['/etc/kolla/neutron-ovn-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-ovn-agent 6640'], 'timeout': '30'}}})  2025-03-23 22:05:55.991924 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'neutron-ovn-vpn-agent', 'value': {'container_name': 'neutron_ovn_vpn_agent', 'image': 'index.docker.io/kolla/release/neutron-ovn-vpn-agent:24.0.2.20241206', 'enabled': False, 'privileged': True, 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-ovn-vpn-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/run/netns:/run/netns:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port python 6642', '&&', 'healthcheck_port neutron-ovn-vpn-agent 5672'], 'timeout': '30'}}})  2025-03-23 22:05:55.991935 | orchestrator | changed: [testbed-node-2] => (item={'key': 'neutron-server', 'value': {'container_name': 'neutron_server', 'image': 'registry.osism.tech/kolla/release/neutron-server:24.0.2.20241206', 'enabled': True, 'group': 'neutron-server', 'host_in_groups': True, 'volumes': 
['/etc/kolla/neutron-server/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:9696'], 'timeout': '30'}, 'haproxy': {'neutron_server': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9696', 'listen_port': '9696'}, 'neutron_server_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9696', 'listen_port': '9696'}}}}) 2025-03-23 22:05:55.992011 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'neutron-openvswitch-agent', 'value': {'container_name': 'neutron_openvswitch_agent', 'image': 'registry.osism.tech/kolla/release/neutron-openvswitch-agent:24.0.2.20241206', 'enabled': False, 'privileged': True, 'environment': {'KOLLA_LEGACY_IPTABLES': 'false'}, 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-openvswitch-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-openvswitch-agent 5672'], 'timeout': '30'}}})  2025-03-23 22:05:55.992028 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'neutron-linuxbridge-agent', 'value': {'container_name': 'neutron_linuxbridge_agent', 'image': 'registry.osism.tech/kolla/release/neutron-linuxbridge-agent:24.0.2.20241206', 'privileged': True, 'enabled': False, 'environment': {'KOLLA_LEGACY_IPTABLES': 'false'}, 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-linuxbridge-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-linuxbridge-agent 5672'], 'timeout': '30'}}})  2025-03-23 22:05:55.992054 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'neutron-dhcp-agent', 'value': {'container_name': 'neutron_dhcp_agent', 'image': 'registry.osism.tech/kolla/release/neutron-dhcp-agent:24.0.2.20241206', 'privileged': True, 'enabled': False, 'group': 'neutron-dhcp-agent', 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-dhcp-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'neutron_metadata_socket:/var/lib/neutron/kolla/', '/run/netns:/run/netns:shared', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-dhcp-agent 5672'], 'timeout': '30'}}})  2025-03-23 22:05:55.992061 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'neutron-l3-agent', 'value': {'container_name': 'neutron_l3_agent', 'image': 'registry.osism.tech/kolla/release/neutron-l3-agent:24.0.2.20241206', 'privileged': True, 'enabled': False, 'environment': {'KOLLA_LEGACY_IPTABLES': 'false'}, 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-l3-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 
'neutron_metadata_socket:/var/lib/neutron/kolla/', '/run/netns:/run/netns:shared', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', "healthcheck_port 'neutron-l3-agent ' 5672"], 'timeout': '30'}}})  2025-03-23 22:05:55.992077 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'neutron-sriov-agent', 'value': {'container_name': 'neutron_sriov_agent', 'image': 'registry.osism.tech/kolla/release/neutron-sriov-agent:24.0.2.20241206', 'privileged': True, 'enabled': False, 'host_in_groups': False, 'volumes': ['/etc/kolla/neutron-sriov-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-sriov-nic-agent 5672'], 'timeout': '30'}}})  2025-03-23 22:05:55.992085 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'neutron-mlnx-agent', 'value': {'container_name': 'neutron_mlnx_agent', 'image': 'registry.osism.tech/kolla/release/neutron-mlnx-agent:24.0.2.20241206', 'enabled': False, 'host_in_groups': False, 'volumes': ['/etc/kolla/neutron-mlnx-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}}})  2025-03-23 22:05:55.992093 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'neutron-eswitchd', 'value': {'container_name': 'neutron_eswitchd', 'image': 'registry.osism.tech/kolla/release/neutron-eswitchd:24.0.2.20241206', 'privileged': True, 'enabled': False, 'host_in_groups': False, 'volumes': ['/etc/kolla/neutron-eswitchd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/run/libvirt:/run/libvirt:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}}})  2025-03-23 22:05:55.992105 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'neutron-metadata-agent', 'value': {'container_name': 'neutron_metadata_agent', 'image': 'registry.osism.tech/kolla/release/neutron-metadata-agent:24.0.2.20241206', 'privileged': True, 'enabled': False, 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-metadata-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'neutron_metadata_socket:/var/lib/neutron/kolla/', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-metadata-agent 5672'], 'timeout': '30'}}})  2025-03-23 22:05:55.992117 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'neutron-ovn-metadata-agent', 'value': {'container_name': 'neutron_ovn_metadata_agent', 'image': 'registry.osism.tech/kolla/release/neutron-metadata-agent:24.0.2.20241206', 'privileged': True, 'enabled': True, 'host_in_groups': False, 'volumes': ['/etc/kolla/neutron-ovn-metadata-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 'neutron_metadata_socket:/var/lib/neutron/kolla/', '/run/openvswitch:/run/openvswitch:shared', '/run/netns:/run/netns:shared', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-ovn-metadata-agent 6640'], 'timeout': '30'}}})  2025-03-23 22:05:55.992125 | 
orchestrator | skipping: [testbed-node-2] => (item={'key': 'neutron-bgp-dragent', 'value': {'container_name': 'neutron_bgp_dragent', 'image': 'registry.osism.tech/kolla/release/neutron-bgp-dragent:24.0.2.20241206', 'privileged': True, 'enabled': False, 'group': 'neutron-bgp-dragent', 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-bgp-dragent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-bgp-dragent 5672'], 'timeout': '30'}}})  2025-03-23 22:05:55.992132 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'neutron-infoblox-ipam-agent', 'value': {'container_name': 'neutron_infoblox_ipam_agent', 'image': 'registry.osism.tech/kolla/release/neutron-infoblox-ipam-agent:24.0.2.20241206', 'privileged': True, 'enabled': False, 'group': 'neutron-infoblox-ipam-agent', 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-infoblox-ipam-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2025-03-23 22:05:55.992142 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'neutron-metering-agent', 'value': {'container_name': 'neutron_metering_agent', 'image': 'registry.osism.tech/kolla/release/neutron-metering-agent:24.0.2.20241206', 'privileged': True, 'enabled': False, 'group': 'neutron-metering-agent', 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-metering-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}}})  2025-03-23 22:05:55.992150 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'ironic-neutron-agent', 'value': {'container_name': 'ironic_neutron_agent', 'image': 'registry.osism.tech/kolla/release/ironic-neutron-agent:24.0.2.20241206', 'privileged': False, 'enabled': False, 'group': 'ironic-neutron-agent', 'host_in_groups': True, 'volumes': ['/etc/kolla/ironic-neutron-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port ironic-neutron-agent 5672'], 'timeout': '30'}}})  2025-03-23 22:05:55.992158 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'neutron-tls-proxy', 'value': {'container_name': 'neutron_tls_proxy', 'group': 'neutron-server', 'host_in_groups': True, 'enabled': 'no', 'image': 'registry.osism.tech/kolla/release/neutron-tls-proxy:24.0.2.20241206', 'volumes': ['/etc/kolla/neutron-tls-proxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl -u openstack:password 192.168.16.12:9697'], 'timeout': '30'}, 'haproxy': {'neutron_tls_proxy': {'enabled': False, 'mode': 'http', 'external': False, 'port': '9696', 'listen_port': '9696', 'tls_backend': 'yes'}, 'neutron_tls_proxy_external': {'enabled': False, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9696', 'listen_port': '9696', 'tls_backend': 'yes'}}}})  2025-03-23 22:05:55.992174 | 
orchestrator | skipping: [testbed-node-2] => (item={'key': 'neutron-ovn-agent', 'value': {'container_name': 'neutron_ovn_agent', 'group': 'neutron-ovn-agent', 'host_in_groups': False, 'enabled': False, 'image': 'index.docker.io/kolla/release/neutron-ovn-agent:24.0.2.20241206', 'volumes': ['/etc/kolla/neutron-ovn-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-ovn-agent 6640'], 'timeout': '30'}}})  2025-03-23 22:05:55.992182 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'neutron-ovn-vpn-agent', 'value': {'container_name': 'neutron_ovn_vpn_agent', 'image': 'index.docker.io/kolla/release/neutron-ovn-vpn-agent:24.0.2.20241206', 'enabled': False, 'privileged': True, 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-ovn-vpn-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/run/netns:/run/netns:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port python 6642', '&&', 'healthcheck_port neutron-ovn-vpn-agent 5672'], 'timeout': '30'}}})  2025-03-23 22:05:55.992189 | orchestrator | 2025-03-23 22:05:55.992196 | orchestrator | TASK [haproxy-config : Add configuration for neutron when using single external frontend] *** 2025-03-23 22:05:55.992203 | orchestrator | Sunday 23 March 2025 22:02:27 +0000 (0:00:05.889) 0:05:20.851 ********** 2025-03-23 22:05:55.992214 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'neutron-server', 'value': {'container_name': 'neutron_server', 'image': 'registry.osism.tech/kolla/release/neutron-server:24.0.2.20241206', 'enabled': True, 'group': 'neutron-server', 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-server/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:9696'], 'timeout': '30'}, 'haproxy': {'neutron_server': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9696', 'listen_port': '9696'}, 'neutron_server_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9696', 'listen_port': '9696'}}}})  2025-03-23 22:05:55.992222 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'neutron-openvswitch-agent', 'value': {'container_name': 'neutron_openvswitch_agent', 'image': 'registry.osism.tech/kolla/release/neutron-openvswitch-agent:24.0.2.20241206', 'enabled': False, 'privileged': True, 'environment': {'KOLLA_LEGACY_IPTABLES': 'false'}, 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-openvswitch-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-openvswitch-agent 5672'], 'timeout': '30'}}})  2025-03-23 22:05:55.992230 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'neutron-linuxbridge-agent', 
'value': {'container_name': 'neutron_linuxbridge_agent', 'image': 'registry.osism.tech/kolla/release/neutron-linuxbridge-agent:24.0.2.20241206', 'privileged': True, 'enabled': False, 'environment': {'KOLLA_LEGACY_IPTABLES': 'false'}, 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-linuxbridge-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-linuxbridge-agent 5672'], 'timeout': '30'}}})  2025-03-23 22:05:55.992247 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'neutron-dhcp-agent', 'value': {'container_name': 'neutron_dhcp_agent', 'image': 'registry.osism.tech/kolla/release/neutron-dhcp-agent:24.0.2.20241206', 'privileged': True, 'enabled': False, 'group': 'neutron-dhcp-agent', 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-dhcp-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'neutron_metadata_socket:/var/lib/neutron/kolla/', '/run/netns:/run/netns:shared', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-dhcp-agent 5672'], 'timeout': '30'}}})  2025-03-23 22:05:55.992255 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'neutron-l3-agent', 'value': {'container_name': 'neutron_l3_agent', 'image': 'registry.osism.tech/kolla/release/neutron-l3-agent:24.0.2.20241206', 'privileged': True, 'enabled': False, 'environment': {'KOLLA_LEGACY_IPTABLES': 'false'}, 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-l3-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'neutron_metadata_socket:/var/lib/neutron/kolla/', '/run/netns:/run/netns:shared', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', "healthcheck_port 'neutron-l3-agent ' 5672"], 'timeout': '30'}}})  2025-03-23 22:05:55.992262 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'neutron-sriov-agent', 'value': {'container_name': 'neutron_sriov_agent', 'image': 'registry.osism.tech/kolla/release/neutron-sriov-agent:24.0.2.20241206', 'privileged': True, 'enabled': False, 'host_in_groups': False, 'volumes': ['/etc/kolla/neutron-sriov-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-sriov-nic-agent 5672'], 'timeout': '30'}}})  2025-03-23 22:05:55.992272 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'neutron-mlnx-agent', 'value': {'container_name': 'neutron_mlnx_agent', 'image': 'registry.osism.tech/kolla/release/neutron-mlnx-agent:24.0.2.20241206', 'enabled': False, 'host_in_groups': False, 'volumes': ['/etc/kolla/neutron-mlnx-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}}})  2025-03-23 22:05:55.992280 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'neutron-eswitchd', 'value': 
{'container_name': 'neutron_eswitchd', 'image': 'registry.osism.tech/kolla/release/neutron-eswitchd:24.0.2.20241206', 'privileged': True, 'enabled': False, 'host_in_groups': False, 'volumes': ['/etc/kolla/neutron-eswitchd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/run/libvirt:/run/libvirt:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}}})  2025-03-23 22:05:55.992288 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'neutron-metadata-agent', 'value': {'container_name': 'neutron_metadata_agent', 'image': 'registry.osism.tech/kolla/release/neutron-metadata-agent:24.0.2.20241206', 'privileged': True, 'enabled': False, 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-metadata-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'neutron_metadata_socket:/var/lib/neutron/kolla/', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-metadata-agent 5672'], 'timeout': '30'}}})  2025-03-23 22:05:55.992305 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'neutron-ovn-metadata-agent', 'value': {'container_name': 'neutron_ovn_metadata_agent', 'image': 'registry.osism.tech/kolla/release/neutron-metadata-agent:24.0.2.20241206', 'privileged': True, 'enabled': True, 'host_in_groups': False, 'volumes': ['/etc/kolla/neutron-ovn-metadata-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 'neutron_metadata_socket:/var/lib/neutron/kolla/', '/run/openvswitch:/run/openvswitch:shared', '/run/netns:/run/netns:shared', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-ovn-metadata-agent 6640'], 'timeout': '30'}}})  2025-03-23 22:05:55.992313 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'neutron-bgp-dragent', 'value': {'container_name': 'neutron_bgp_dragent', 'image': 'registry.osism.tech/kolla/release/neutron-bgp-dragent:24.0.2.20241206', 'privileged': True, 'enabled': False, 'group': 'neutron-bgp-dragent', 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-bgp-dragent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-bgp-dragent 5672'], 'timeout': '30'}}})  2025-03-23 22:05:55.992321 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'neutron-infoblox-ipam-agent', 'value': {'container_name': 'neutron_infoblox_ipam_agent', 'image': 'registry.osism.tech/kolla/release/neutron-infoblox-ipam-agent:24.0.2.20241206', 'privileged': True, 'enabled': False, 'group': 'neutron-infoblox-ipam-agent', 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-infoblox-ipam-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2025-03-23 22:05:55.992328 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'neutron-metering-agent', 'value': {'container_name': 'neutron_metering_agent', 'image': 'registry.osism.tech/kolla/release/neutron-metering-agent:24.0.2.20241206', 'privileged': True, 'enabled': False, 'group': 'neutron-metering-agent', 
'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-metering-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}}})  2025-03-23 22:05:55.992338 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'ironic-neutron-agent', 'value': {'container_name': 'ironic_neutron_agent', 'image': 'registry.osism.tech/kolla/release/ironic-neutron-agent:24.0.2.20241206', 'privileged': False, 'enabled': False, 'group': 'ironic-neutron-agent', 'host_in_groups': True, 'volumes': ['/etc/kolla/ironic-neutron-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port ironic-neutron-agent 5672'], 'timeout': '30'}}})  2025-03-23 22:05:55.992351 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'neutron-tls-proxy', 'value': {'container_name': 'neutron_tls_proxy', 'group': 'neutron-server', 'host_in_groups': True, 'enabled': 'no', 'image': 'registry.osism.tech/kolla/release/neutron-tls-proxy:24.0.2.20241206', 'volumes': ['/etc/kolla/neutron-tls-proxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl -u openstack:password 192.168.16.10:9697'], 'timeout': '30'}, 'haproxy': {'neutron_tls_proxy': {'enabled': False, 'mode': 'http', 'external': False, 'port': '9696', 'listen_port': '9696', 'tls_backend': 'yes'}, 'neutron_tls_proxy_external': {'enabled': False, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9696', 'listen_port': '9696', 'tls_backend': 'yes'}}}})  2025-03-23 22:05:55.992362 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'neutron-ovn-agent', 'value': {'container_name': 'neutron_ovn_agent', 'group': 'neutron-ovn-agent', 'host_in_groups': False, 'enabled': False, 'image': 'index.docker.io/kolla/release/neutron-ovn-agent:24.0.2.20241206', 'volumes': ['/etc/kolla/neutron-ovn-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-ovn-agent 6640'], 'timeout': '30'}}})  2025-03-23 22:05:55.992370 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'neutron-ovn-vpn-agent', 'value': {'container_name': 'neutron_ovn_vpn_agent', 'image': 'index.docker.io/kolla/release/neutron-ovn-vpn-agent:24.0.2.20241206', 'enabled': False, 'privileged': True, 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-ovn-vpn-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/run/netns:/run/netns:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port python 6642', '&&', 'healthcheck_port neutron-ovn-vpn-agent 5672'], 'timeout': '30'}}})  2025-03-23 22:05:55.992377 | orchestrator | skipping: [testbed-node-0] 2025-03-23 22:05:55.992385 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'neutron-server', 'value': 
{'container_name': 'neutron_server', 'image': 'registry.osism.tech/kolla/release/neutron-server:24.0.2.20241206', 'enabled': True, 'group': 'neutron-server', 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-server/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:9696'], 'timeout': '30'}, 'haproxy': {'neutron_server': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9696', 'listen_port': '9696'}, 'neutron_server_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9696', 'listen_port': '9696'}}}})  2025-03-23 22:05:55.992395 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'neutron-openvswitch-agent', 'value': {'container_name': 'neutron_openvswitch_agent', 'image': 'registry.osism.tech/kolla/release/neutron-openvswitch-agent:24.0.2.20241206', 'enabled': False, 'privileged': True, 'environment': {'KOLLA_LEGACY_IPTABLES': 'false'}, 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-openvswitch-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-openvswitch-agent 5672'], 'timeout': '30'}}})  2025-03-23 22:05:55.992407 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'neutron-linuxbridge-agent', 'value': {'container_name': 'neutron_linuxbridge_agent', 'image': 'registry.osism.tech/kolla/release/neutron-linuxbridge-agent:24.0.2.20241206', 'privileged': True, 'enabled': False, 'environment': {'KOLLA_LEGACY_IPTABLES': 'false'}, 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-linuxbridge-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-linuxbridge-agent 5672'], 'timeout': '30'}}})  2025-03-23 22:05:55.992419 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'neutron-dhcp-agent', 'value': {'container_name': 'neutron_dhcp_agent', 'image': 'registry.osism.tech/kolla/release/neutron-dhcp-agent:24.0.2.20241206', 'privileged': True, 'enabled': False, 'group': 'neutron-dhcp-agent', 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-dhcp-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'neutron_metadata_socket:/var/lib/neutron/kolla/', '/run/netns:/run/netns:shared', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-dhcp-agent 5672'], 'timeout': '30'}}})  2025-03-23 22:05:55.992427 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'neutron-l3-agent', 'value': {'container_name': 'neutron_l3_agent', 'image': 'registry.osism.tech/kolla/release/neutron-l3-agent:24.0.2.20241206', 'privileged': True, 'enabled': False, 'environment': {'KOLLA_LEGACY_IPTABLES': 'false'}, 'host_in_groups': 
True, 'volumes': ['/etc/kolla/neutron-l3-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'neutron_metadata_socket:/var/lib/neutron/kolla/', '/run/netns:/run/netns:shared', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', "healthcheck_port 'neutron-l3-agent ' 5672"], 'timeout': '30'}}})  2025-03-23 22:05:55.992434 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'neutron-sriov-agent', 'value': {'container_name': 'neutron_sriov_agent', 'image': 'registry.osism.tech/kolla/release/neutron-sriov-agent:24.0.2.20241206', 'privileged': True, 'enabled': False, 'host_in_groups': False, 'volumes': ['/etc/kolla/neutron-sriov-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-sriov-nic-agent 5672'], 'timeout': '30'}}})  2025-03-23 22:05:55.992441 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'neutron-mlnx-agent', 'value': {'container_name': 'neutron_mlnx_agent', 'image': 'registry.osism.tech/kolla/release/neutron-mlnx-agent:24.0.2.20241206', 'enabled': False, 'host_in_groups': False, 'volumes': ['/etc/kolla/neutron-mlnx-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}}})  2025-03-23 22:05:55.992452 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'neutron-eswitchd', 'value': {'container_name': 'neutron_eswitchd', 'image': 'registry.osism.tech/kolla/release/neutron-eswitchd:24.0.2.20241206', 'privileged': True, 'enabled': False, 'host_in_groups': False, 'volumes': ['/etc/kolla/neutron-eswitchd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/run/libvirt:/run/libvirt:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}}})  2025-03-23 22:05:55.992459 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'neutron-metadata-agent', 'value': {'container_name': 'neutron_metadata_agent', 'image': 'registry.osism.tech/kolla/release/neutron-metadata-agent:24.0.2.20241206', 'privileged': True, 'enabled': False, 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-metadata-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'neutron_metadata_socket:/var/lib/neutron/kolla/', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-metadata-agent 5672'], 'timeout': '30'}}})  2025-03-23 22:05:55.992475 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'neutron-ovn-metadata-agent', 'value': {'container_name': 'neutron_ovn_metadata_agent', 'image': 'registry.osism.tech/kolla/release/neutron-metadata-agent:24.0.2.20241206', 'privileged': True, 'enabled': True, 'host_in_groups': False, 'volumes': ['/etc/kolla/neutron-ovn-metadata-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 'neutron_metadata_socket:/var/lib/neutron/kolla/', '/run/openvswitch:/run/openvswitch:shared', '/run/netns:/run/netns:shared', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': 
{'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-ovn-metadata-agent 6640'], 'timeout': '30'}}})  2025-03-23 22:05:55.992483 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'neutron-bgp-dragent', 'value': {'container_name': 'neutron_bgp_dragent', 'image': 'registry.osism.tech/kolla/release/neutron-bgp-dragent:24.0.2.20241206', 'privileged': True, 'enabled': False, 'group': 'neutron-bgp-dragent', 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-bgp-dragent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-bgp-dragent 5672'], 'timeout': '30'}}})  2025-03-23 22:05:55.992490 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'neutron-infoblox-ipam-agent', 'value': {'container_name': 'neutron_infoblox_ipam_agent', 'image': 'registry.osism.tech/kolla/release/neutron-infoblox-ipam-agent:24.0.2.20241206', 'privileged': True, 'enabled': False, 'group': 'neutron-infoblox-ipam-agent', 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-infoblox-ipam-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2025-03-23 22:05:55.992497 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'neutron-metering-agent', 'value': {'container_name': 'neutron_metering_agent', 'image': 'registry.osism.tech/kolla/release/neutron-metering-agent:24.0.2.20241206', 'privileged': True, 'enabled': False, 'group': 'neutron-metering-agent', 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-metering-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}}})  2025-03-23 22:05:55.992507 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'ironic-neutron-agent', 'value': {'container_name': 'ironic_neutron_agent', 'image': 'registry.osism.tech/kolla/release/ironic-neutron-agent:24.0.2.20241206', 'privileged': False, 'enabled': False, 'group': 'ironic-neutron-agent', 'host_in_groups': True, 'volumes': ['/etc/kolla/ironic-neutron-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port ironic-neutron-agent 5672'], 'timeout': '30'}}})  2025-03-23 22:05:55.992520 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'neutron-tls-proxy', 'value': {'container_name': 'neutron_tls_proxy', 'group': 'neutron-server', 'host_in_groups': True, 'enabled': 'no', 'image': 'registry.osism.tech/kolla/release/neutron-tls-proxy:24.0.2.20241206', 'volumes': ['/etc/kolla/neutron-tls-proxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl -u openstack:password 192.168.16.11:9697'], 'timeout': '30'}, 'haproxy': {'neutron_tls_proxy': {'enabled': False, 'mode': 'http', 'external': False, 'port': '9696', 'listen_port': '9696', 'tls_backend': 'yes'}, 'neutron_tls_proxy_external': {'enabled': 
False, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9696', 'listen_port': '9696', 'tls_backend': 'yes'}}}})  2025-03-23 22:05:55.992531 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'neutron-ovn-agent', 'value': {'container_name': 'neutron_ovn_agent', 'group': 'neutron-ovn-agent', 'host_in_groups': False, 'enabled': False, 'image': 'index.docker.io/kolla/release/neutron-ovn-agent:24.0.2.20241206', 'volumes': ['/etc/kolla/neutron-ovn-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-ovn-agent 6640'], 'timeout': '30'}}})  2025-03-23 22:05:55.992538 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'neutron-ovn-vpn-agent', 'value': {'container_name': 'neutron_ovn_vpn_agent', 'image': 'index.docker.io/kolla/release/neutron-ovn-vpn-agent:24.0.2.20241206', 'enabled': False, 'privileged': True, 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-ovn-vpn-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/run/netns:/run/netns:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port python 6642', '&&', 'healthcheck_port neutron-ovn-vpn-agent 5672'], 'timeout': '30'}}})  2025-03-23 22:05:55.992546 | orchestrator | skipping: [testbed-node-1] 2025-03-23 22:05:55.992553 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'neutron-server', 'value': {'container_name': 'neutron_server', 'image': 'registry.osism.tech/kolla/release/neutron-server:24.0.2.20241206', 'enabled': True, 'group': 'neutron-server', 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-server/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:9696'], 'timeout': '30'}, 'haproxy': {'neutron_server': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9696', 'listen_port': '9696'}, 'neutron_server_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9696', 'listen_port': '9696'}}}})  2025-03-23 22:05:55.992573 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'neutron-openvswitch-agent', 'value': {'container_name': 'neutron_openvswitch_agent', 'image': 'registry.osism.tech/kolla/release/neutron-openvswitch-agent:24.0.2.20241206', 'enabled': False, 'privileged': True, 'environment': {'KOLLA_LEGACY_IPTABLES': 'false'}, 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-openvswitch-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-openvswitch-agent 5672'], 'timeout': '30'}}})  2025-03-23 22:05:55.992590 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'neutron-linuxbridge-agent', 'value': {'container_name': 
'neutron_linuxbridge_agent', 'image': 'registry.osism.tech/kolla/release/neutron-linuxbridge-agent:24.0.2.20241206', 'privileged': True, 'enabled': False, 'environment': {'KOLLA_LEGACY_IPTABLES': 'false'}, 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-linuxbridge-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-linuxbridge-agent 5672'], 'timeout': '30'}}})  2025-03-23 22:05:55.992603 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'neutron-dhcp-agent', 'value': {'container_name': 'neutron_dhcp_agent', 'image': 'registry.osism.tech/kolla/release/neutron-dhcp-agent:24.0.2.20241206', 'privileged': True, 'enabled': False, 'group': 'neutron-dhcp-agent', 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-dhcp-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'neutron_metadata_socket:/var/lib/neutron/kolla/', '/run/netns:/run/netns:shared', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-dhcp-agent 5672'], 'timeout': '30'}}})  2025-03-23 22:05:55.992626 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'neutron-l3-agent', 'value': {'container_name': 'neutron_l3_agent', 'image': 'registry.osism.tech/kolla/release/neutron-l3-agent:24.0.2.20241206', 'privileged': True, 'enabled': False, 'environment': {'KOLLA_LEGACY_IPTABLES': 'false'}, 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-l3-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'neutron_metadata_socket:/var/lib/neutron/kolla/', '/run/netns:/run/netns:shared', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', "healthcheck_port 'neutron-l3-agent ' 5672"], 'timeout': '30'}}})  2025-03-23 22:05:55.992634 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'neutron-sriov-agent', 'value': {'container_name': 'neutron_sriov_agent', 'image': 'registry.osism.tech/kolla/release/neutron-sriov-agent:24.0.2.20241206', 'privileged': True, 'enabled': False, 'host_in_groups': False, 'volumes': ['/etc/kolla/neutron-sriov-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-sriov-nic-agent 5672'], 'timeout': '30'}}})  2025-03-23 22:05:55.992642 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'neutron-mlnx-agent', 'value': {'container_name': 'neutron_mlnx_agent', 'image': 'registry.osism.tech/kolla/release/neutron-mlnx-agent:24.0.2.20241206', 'enabled': False, 'host_in_groups': False, 'volumes': ['/etc/kolla/neutron-mlnx-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}}})  2025-03-23 22:05:55.992651 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'neutron-eswitchd', 'value': {'container_name': 'neutron_eswitchd', 
'image': 'registry.osism.tech/kolla/release/neutron-eswitchd:24.0.2.20241206', 'privileged': True, 'enabled': False, 'host_in_groups': False, 'volumes': ['/etc/kolla/neutron-eswitchd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/run/libvirt:/run/libvirt:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}}})  2025-03-23 22:05:55.992662 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'neutron-metadata-agent', 'value': {'container_name': 'neutron_metadata_agent', 'image': 'registry.osism.tech/kolla/release/neutron-metadata-agent:24.0.2.20241206', 'privileged': True, 'enabled': False, 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-metadata-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'neutron_metadata_socket:/var/lib/neutron/kolla/', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-metadata-agent 5672'], 'timeout': '30'}}})  2025-03-23 22:05:55.992681 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'neutron-ovn-metadata-agent', 'value': {'container_name': 'neutron_ovn_metadata_agent', 'image': 'registry.osism.tech/kolla/release/neutron-metadata-agent:24.0.2.20241206', 'privileged': True, 'enabled': True, 'host_in_groups': False, 'volumes': ['/etc/kolla/neutron-ovn-metadata-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 'neutron_metadata_socket:/var/lib/neutron/kolla/', '/run/openvswitch:/run/openvswitch:shared', '/run/netns:/run/netns:shared', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-ovn-metadata-agent 6640'], 'timeout': '30'}}})  2025-03-23 22:05:55.992690 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'neutron-bgp-dragent', 'value': {'container_name': 'neutron_bgp_dragent', 'image': 'registry.osism.tech/kolla/release/neutron-bgp-dragent:24.0.2.20241206', 'privileged': True, 'enabled': False, 'group': 'neutron-bgp-dragent', 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-bgp-dragent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-bgp-dragent 5672'], 'timeout': '30'}}})  2025-03-23 22:05:55.992698 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'neutron-infoblox-ipam-agent', 'value': {'container_name': 'neutron_infoblox_ipam_agent', 'image': 'registry.osism.tech/kolla/release/neutron-infoblox-ipam-agent:24.0.2.20241206', 'privileged': True, 'enabled': False, 'group': 'neutron-infoblox-ipam-agent', 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-infoblox-ipam-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2025-03-23 22:05:55.992706 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'neutron-metering-agent', 'value': {'container_name': 'neutron_metering_agent', 'image': 'registry.osism.tech/kolla/release/neutron-metering-agent:24.0.2.20241206', 'privileged': True, 'enabled': False, 'group': 'neutron-metering-agent', 'host_in_groups': True, 'volumes': 
['/etc/kolla/neutron-metering-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}}})  2025-03-23 22:05:55.992713 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'ironic-neutron-agent', 'value': {'container_name': 'ironic_neutron_agent', 'image': 'registry.osism.tech/kolla/release/ironic-neutron-agent:24.0.2.20241206', 'privileged': False, 'enabled': False, 'group': 'ironic-neutron-agent', 'host_in_groups': True, 'volumes': ['/etc/kolla/ironic-neutron-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port ironic-neutron-agent 5672'], 'timeout': '30'}}})  2025-03-23 22:05:55.992725 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'neutron-tls-proxy', 'value': {'container_name': 'neutron_tls_proxy', 'group': 'neutron-server', 'host_in_groups': True, 'enabled': 'no', 'image': 'registry.osism.tech/kolla/release/neutron-tls-proxy:24.0.2.20241206', 'volumes': ['/etc/kolla/neutron-tls-proxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl -u openstack:password 192.168.16.12:9697'], 'timeout': '30'}, 'haproxy': {'neutron_tls_proxy': {'enabled': False, 'mode': 'http', 'external': False, 'port': '9696', 'listen_port': '9696', 'tls_backend': 'yes'}, 'neutron_tls_proxy_external': {'enabled': False, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9696', 'listen_port': '9696', 'tls_backend': 'yes'}}}})  2025-03-23 22:05:55.992745 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'neutron-ovn-agent', 'value': {'container_name': 'neutron_ovn_agent', 'group': 'neutron-ovn-agent', 'host_in_groups': False, 'enabled': False, 'image': 'index.docker.io/kolla/release/neutron-ovn-agent:24.0.2.20241206', 'volumes': ['/etc/kolla/neutron-ovn-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-ovn-agent 6640'], 'timeout': '30'}}})  2025-03-23 22:05:55.992753 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'neutron-ovn-vpn-agent', 'value': {'container_name': 'neutron_ovn_vpn_agent', 'image': 'index.docker.io/kolla/release/neutron-ovn-vpn-agent:24.0.2.20241206', 'enabled': False, 'privileged': True, 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-ovn-vpn-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/run/netns:/run/netns:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port python 6642', '&&', 'healthcheck_port neutron-ovn-vpn-agent 5672'], 'timeout': '30'}}})  2025-03-23 22:05:55.992761 | orchestrator | skipping: [testbed-node-2] 2025-03-23 22:05:55.992769 | orchestrator | 2025-03-23 22:05:55.992777 | orchestrator | TASK [haproxy-config : Configuring firewall for neutron] 
*********************** 2025-03-23 22:05:55.992785 | orchestrator | Sunday 23 March 2025 22:02:30 +0000 (0:00:02.441) 0:05:23.293 ********** 2025-03-23 22:05:55.992792 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'neutron_server', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9696', 'listen_port': '9696'}})  2025-03-23 22:05:55.992800 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'neutron_server_external', 'value': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9696', 'listen_port': '9696'}})  2025-03-23 22:05:55.992808 | orchestrator | skipping: [testbed-node-0] 2025-03-23 22:05:55.992818 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'neutron_server', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9696', 'listen_port': '9696'}})  2025-03-23 22:05:55.992826 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'neutron_server_external', 'value': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9696', 'listen_port': '9696'}})  2025-03-23 22:05:55.992834 | orchestrator | skipping: [testbed-node-1] 2025-03-23 22:05:55.992842 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'neutron_server', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9696', 'listen_port': '9696'}})  2025-03-23 22:05:55.992850 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'neutron_server_external', 'value': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9696', 'listen_port': '9696'}})  2025-03-23 22:05:55.992858 | orchestrator | skipping: [testbed-node-2] 2025-03-23 22:05:55.992866 | orchestrator | 2025-03-23 22:05:55.992874 | orchestrator | TASK [proxysql-config : Copying over neutron ProxySQL users config] ************ 2025-03-23 22:05:55.992884 | orchestrator | Sunday 23 March 2025 22:02:32 +0000 (0:00:02.078) 0:05:25.372 ********** 2025-03-23 22:05:55.992897 | orchestrator | changed: [testbed-node-0] 2025-03-23 22:05:55.992905 | orchestrator | changed: [testbed-node-1] 2025-03-23 22:05:55.992916 | orchestrator | changed: [testbed-node-2] 2025-03-23 22:05:55.992924 | orchestrator | 2025-03-23 22:05:55.992931 | orchestrator | TASK [proxysql-config : Copying over neutron ProxySQL rules config] ************ 2025-03-23 22:05:55.992939 | orchestrator | Sunday 23 March 2025 22:02:34 +0000 (0:00:01.609) 0:05:26.981 ********** 2025-03-23 22:05:55.992947 | orchestrator | changed: [testbed-node-1] 2025-03-23 22:05:55.992955 | orchestrator | changed: [testbed-node-0] 2025-03-23 22:05:55.992962 | orchestrator | changed: [testbed-node-2] 2025-03-23 22:05:55.992970 | orchestrator | 2025-03-23 22:05:55.992978 | orchestrator | TASK [include_role : placement] ************************************************ 2025-03-23 22:05:55.992986 | orchestrator | Sunday 23 March 2025 22:02:36 +0000 (0:00:02.724) 0:05:29.705 ********** 2025-03-23 22:05:55.992996 | orchestrator | included: placement for testbed-node-0, testbed-node-1, testbed-node-2 2025-03-23 22:05:55.993003 | orchestrator | 2025-03-23 22:05:55.993010 | orchestrator | TASK [haproxy-config : Copying over placement haproxy config] ****************** 2025-03-23 22:05:55.993018 | orchestrator | Sunday 23 March 2025 22:02:38 +0000 (0:00:01.711) 0:05:31.417 ********** 2025-03-23 22:05:55.993025 | orchestrator | changed: [testbed-node-2] => (item={'key': 'placement-api', 
'value': {'container_name': 'placement_api', 'group': 'placement-api', 'image': 'registry.osism.tech/kolla/release/placement-api:11.0.0.20241206', 'enabled': True, 'volumes': ['/etc/kolla/placement-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:8780'], 'timeout': '30'}, 'haproxy': {'placement_api': {'enabled': True, 'mode': 'http', 'external': False, 'port': '8780', 'listen_port': '8780', 'tls_backend': 'no'}, 'placement_api_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8780', 'listen_port': '8780', 'tls_backend': 'no'}}}}) 2025-03-23 22:05:55.993032 | orchestrator | changed: [testbed-node-0] => (item={'key': 'placement-api', 'value': {'container_name': 'placement_api', 'group': 'placement-api', 'image': 'registry.osism.tech/kolla/release/placement-api:11.0.0.20241206', 'enabled': True, 'volumes': ['/etc/kolla/placement-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:8780'], 'timeout': '30'}, 'haproxy': {'placement_api': {'enabled': True, 'mode': 'http', 'external': False, 'port': '8780', 'listen_port': '8780', 'tls_backend': 'no'}, 'placement_api_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8780', 'listen_port': '8780', 'tls_backend': 'no'}}}}) 2025-03-23 22:05:55.993045 | orchestrator | changed: [testbed-node-1] => (item={'key': 'placement-api', 'value': {'container_name': 'placement_api', 'group': 'placement-api', 'image': 'registry.osism.tech/kolla/release/placement-api:11.0.0.20241206', 'enabled': True, 'volumes': ['/etc/kolla/placement-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:8780'], 'timeout': '30'}, 'haproxy': {'placement_api': {'enabled': True, 'mode': 'http', 'external': False, 'port': '8780', 'listen_port': '8780', 'tls_backend': 'no'}, 'placement_api_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8780', 'listen_port': '8780', 'tls_backend': 'no'}}}}) 2025-03-23 22:05:55.993056 | orchestrator | 2025-03-23 22:05:55.993063 | orchestrator | TASK [haproxy-config : Add configuration for placement when using single external frontend] *** 2025-03-23 22:05:55.993071 | orchestrator | Sunday 23 March 2025 22:02:43 +0000 (0:00:04.993) 0:05:36.410 ********** 2025-03-23 22:05:55.993078 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'placement-api', 'value': {'container_name': 'placement_api', 'group': 'placement-api', 'image': 'registry.osism.tech/kolla/release/placement-api:11.0.0.20241206', 'enabled': True, 'volumes': ['/etc/kolla/placement-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': 
'3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:8780'], 'timeout': '30'}, 'haproxy': {'placement_api': {'enabled': True, 'mode': 'http', 'external': False, 'port': '8780', 'listen_port': '8780', 'tls_backend': 'no'}, 'placement_api_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8780', 'listen_port': '8780', 'tls_backend': 'no'}}}})  2025-03-23 22:05:55.993088 | orchestrator | skipping: [testbed-node-0] 2025-03-23 22:05:55.993095 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'placement-api', 'value': {'container_name': 'placement_api', 'group': 'placement-api', 'image': 'registry.osism.tech/kolla/release/placement-api:11.0.0.20241206', 'enabled': True, 'volumes': ['/etc/kolla/placement-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:8780'], 'timeout': '30'}, 'haproxy': {'placement_api': {'enabled': True, 'mode': 'http', 'external': False, 'port': '8780', 'listen_port': '8780', 'tls_backend': 'no'}, 'placement_api_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8780', 'listen_port': '8780', 'tls_backend': 'no'}}}})  2025-03-23 22:05:55.993103 | orchestrator | skipping: [testbed-node-1] 2025-03-23 22:05:55.993110 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'placement-api', 'value': {'container_name': 'placement_api', 'group': 'placement-api', 'image': 'registry.osism.tech/kolla/release/placement-api:11.0.0.20241206', 'enabled': True, 'volumes': ['/etc/kolla/placement-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:8780'], 'timeout': '30'}, 'haproxy': {'placement_api': {'enabled': True, 'mode': 'http', 'external': False, 'port': '8780', 'listen_port': '8780', 'tls_backend': 'no'}, 'placement_api_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8780', 'listen_port': '8780', 'tls_backend': 'no'}}}})  2025-03-23 22:05:55.993117 | orchestrator | skipping: [testbed-node-2] 2025-03-23 22:05:55.993124 | orchestrator | 2025-03-23 22:05:55.993131 | orchestrator | TASK [haproxy-config : Configuring firewall for placement] ********************* 2025-03-23 22:05:55.993139 | orchestrator | Sunday 23 March 2025 22:02:44 +0000 (0:00:00.619) 0:05:37.030 ********** 2025-03-23 22:05:55.993146 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'placement_api', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '8780', 'listen_port': '8780', 'tls_backend': 'no'}})  2025-03-23 22:05:55.993153 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'placement_api_external', 'value': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8780', 'listen_port': '8780', 'tls_backend': 'no'}})  2025-03-23 22:05:55.993161 | orchestrator | skipping: [testbed-node-0] 2025-03-23 22:05:55.993172 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'placement_api', 'value': {'enabled': True, 
'mode': 'http', 'external': False, 'port': '8780', 'listen_port': '8780', 'tls_backend': 'no'}})  2025-03-23 22:05:55.993180 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'placement_api_external', 'value': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8780', 'listen_port': '8780', 'tls_backend': 'no'}})  2025-03-23 22:05:55.993188 | orchestrator | skipping: [testbed-node-1] 2025-03-23 22:05:55.993195 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'placement_api', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '8780', 'listen_port': '8780', 'tls_backend': 'no'}})  2025-03-23 22:05:55.993202 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'placement_api_external', 'value': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8780', 'listen_port': '8780', 'tls_backend': 'no'}})  2025-03-23 22:05:55.993209 | orchestrator | skipping: [testbed-node-2] 2025-03-23 22:05:55.993216 | orchestrator | 2025-03-23 22:05:55.993224 | orchestrator | TASK [proxysql-config : Copying over placement ProxySQL users config] ********** 2025-03-23 22:05:55.993231 | orchestrator | Sunday 23 March 2025 22:02:45 +0000 (0:00:01.708) 0:05:38.738 ********** 2025-03-23 22:05:55.993238 | orchestrator | changed: [testbed-node-1] 2025-03-23 22:05:55.993245 | orchestrator | changed: [testbed-node-0] 2025-03-23 22:05:55.993252 | orchestrator | changed: [testbed-node-2] 2025-03-23 22:05:55.993259 | orchestrator | 2025-03-23 22:05:55.993266 | orchestrator | TASK [proxysql-config : Copying over placement ProxySQL rules config] ********** 2025-03-23 22:05:55.993273 | orchestrator | Sunday 23 March 2025 22:02:47 +0000 (0:00:01.520) 0:05:40.258 ********** 2025-03-23 22:05:55.993280 | orchestrator | changed: [testbed-node-0] 2025-03-23 22:05:55.993287 | orchestrator | changed: [testbed-node-1] 2025-03-23 22:05:55.993296 | orchestrator | changed: [testbed-node-2] 2025-03-23 22:05:55.993303 | orchestrator | 2025-03-23 22:05:55.993310 | orchestrator | TASK [include_role : nova] ***************************************************** 2025-03-23 22:05:55.993318 | orchestrator | Sunday 23 March 2025 22:02:49 +0000 (0:00:02.386) 0:05:42.645 ********** 2025-03-23 22:05:55.993325 | orchestrator | included: nova for testbed-node-0, testbed-node-1, testbed-node-2 2025-03-23 22:05:55.993332 | orchestrator | 2025-03-23 22:05:55.993339 | orchestrator | TASK [haproxy-config : Copying over nova haproxy config] *********************** 2025-03-23 22:05:55.993346 | orchestrator | Sunday 23 March 2025 22:02:51 +0000 (0:00:01.768) 0:05:44.413 ********** 2025-03-23 22:05:55.993358 | orchestrator | changed: [testbed-node-0] => (item={'key': 'nova-api', 'value': {'container_name': 'nova_api', 'group': 'nova-api', 'image': 'registry.osism.tech/kolla/release/nova-api:29.2.1.20241206', 'enabled': True, 'privileged': True, 'volumes': ['/etc/kolla/nova-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:8774 '], 'timeout': '30'}, 'haproxy': {'nova_api': {'enabled': True, 'mode': 'http', 'external': False, 'port': '8774', 'listen_port': '8774', 'tls_backend': 'no'}, 'nova_api_external': {'enabled': True, 'mode': 'http', 
'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8774', 'listen_port': '8774', 'tls_backend': 'no'}, 'nova_metadata': {'enabled': True, 'mode': 'http', 'external': False, 'port': '8775', 'listen_port': '8775', 'tls_backend': 'no'}, 'nova_metadata_external': {'enabled': 'no', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8775', 'listen_port': '8775', 'tls_backend': 'no'}}}}) 2025-03-23 22:05:55.993366 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'nova-scheduler', 'value': {'container_name': 'nova_scheduler', 'group': 'nova-scheduler', 'image': 'registry.osism.tech/kolla/release/nova-scheduler:29.2.1.20241206', 'enabled': True, 'volumes': ['/etc/kolla/nova-scheduler/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port nova-scheduler 5672'], 'timeout': '30'}}})  2025-03-23 22:05:55.993378 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'nova-super-conductor', 'value': {'container_name': 'nova_super_conductor', 'group': 'nova-super-conductor', 'enabled': 'no', 'image': 'registry.osism.tech/kolla/release/nova-super-conductor:29.2.1.20241206', 'volumes': ['/etc/kolla/nova-super-conductor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port nova-conductor 5672'], 'timeout': '30'}}})  2025-03-23 22:05:55.993389 | orchestrator | changed: [testbed-node-2] => (item={'key': 'nova-api', 'value': {'container_name': 'nova_api', 'group': 'nova-api', 'image': 'registry.osism.tech/kolla/release/nova-api:29.2.1.20241206', 'enabled': True, 'privileged': True, 'volumes': ['/etc/kolla/nova-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:8774 '], 'timeout': '30'}, 'haproxy': {'nova_api': {'enabled': True, 'mode': 'http', 'external': False, 'port': '8774', 'listen_port': '8774', 'tls_backend': 'no'}, 'nova_api_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8774', 'listen_port': '8774', 'tls_backend': 'no'}, 'nova_metadata': {'enabled': True, 'mode': 'http', 'external': False, 'port': '8775', 'listen_port': '8775', 'tls_backend': 'no'}, 'nova_metadata_external': {'enabled': 'no', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8775', 'listen_port': '8775', 'tls_backend': 'no'}}}}) 2025-03-23 22:05:55.993402 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'nova-scheduler', 'value': {'container_name': 'nova_scheduler', 'group': 'nova-scheduler', 'image': 'registry.osism.tech/kolla/release/nova-scheduler:29.2.1.20241206', 'enabled': True, 'volumes': ['/etc/kolla/nova-scheduler/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': 
['CMD-SHELL', 'healthcheck_port nova-scheduler 5672'], 'timeout': '30'}}})  2025-03-23 22:05:55.993410 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'nova-super-conductor', 'value': {'container_name': 'nova_super_conductor', 'group': 'nova-super-conductor', 'enabled': 'no', 'image': 'registry.osism.tech/kolla/release/nova-super-conductor:29.2.1.20241206', 'volumes': ['/etc/kolla/nova-super-conductor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port nova-conductor 5672'], 'timeout': '30'}}})  2025-03-23 22:05:55.993417 | orchestrator | changed: [testbed-node-1] => (item={'key': 'nova-api', 'value': {'container_name': 'nova_api', 'group': 'nova-api', 'image': 'registry.osism.tech/kolla/release/nova-api:29.2.1.20241206', 'enabled': True, 'privileged': True, 'volumes': ['/etc/kolla/nova-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:8774 '], 'timeout': '30'}, 'haproxy': {'nova_api': {'enabled': True, 'mode': 'http', 'external': False, 'port': '8774', 'listen_port': '8774', 'tls_backend': 'no'}, 'nova_api_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8774', 'listen_port': '8774', 'tls_backend': 'no'}, 'nova_metadata': {'enabled': True, 'mode': 'http', 'external': False, 'port': '8775', 'listen_port': '8775', 'tls_backend': 'no'}, 'nova_metadata_external': {'enabled': 'no', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8775', 'listen_port': '8775', 'tls_backend': 'no'}}}}) 2025-03-23 22:05:55.993428 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'nova-scheduler', 'value': {'container_name': 'nova_scheduler', 'group': 'nova-scheduler', 'image': 'registry.osism.tech/kolla/release/nova-scheduler:29.2.1.20241206', 'enabled': True, 'volumes': ['/etc/kolla/nova-scheduler/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port nova-scheduler 5672'], 'timeout': '30'}}})  2025-03-23 22:05:55.993436 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'nova-super-conductor', 'value': {'container_name': 'nova_super_conductor', 'group': 'nova-super-conductor', 'enabled': 'no', 'image': 'registry.osism.tech/kolla/release/nova-super-conductor:29.2.1.20241206', 'volumes': ['/etc/kolla/nova-super-conductor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port nova-conductor 5672'], 'timeout': '30'}}})  2025-03-23 22:05:55.993443 | orchestrator | 2025-03-23 22:05:55.993450 | orchestrator | TASK [haproxy-config : Add configuration for nova when using single external frontend] *** 2025-03-23 22:05:55.993457 | orchestrator | Sunday 23 March 2025 22:02:57 +0000 
(0:00:06.449) 0:05:50.863 ********** 2025-03-23 22:05:55.993473 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'nova-api', 'value': {'container_name': 'nova_api', 'group': 'nova-api', 'image': 'registry.osism.tech/kolla/release/nova-api:29.2.1.20241206', 'enabled': True, 'privileged': True, 'volumes': ['/etc/kolla/nova-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:8774 '], 'timeout': '30'}, 'haproxy': {'nova_api': {'enabled': True, 'mode': 'http', 'external': False, 'port': '8774', 'listen_port': '8774', 'tls_backend': 'no'}, 'nova_api_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8774', 'listen_port': '8774', 'tls_backend': 'no'}, 'nova_metadata': {'enabled': True, 'mode': 'http', 'external': False, 'port': '8775', 'listen_port': '8775', 'tls_backend': 'no'}, 'nova_metadata_external': {'enabled': 'no', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8775', 'listen_port': '8775', 'tls_backend': 'no'}}}})  2025-03-23 22:05:55.993545 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'nova-scheduler', 'value': {'container_name': 'nova_scheduler', 'group': 'nova-scheduler', 'image': 'registry.osism.tech/kolla/release/nova-scheduler:29.2.1.20241206', 'enabled': True, 'volumes': ['/etc/kolla/nova-scheduler/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port nova-scheduler 5672'], 'timeout': '30'}}})  2025-03-23 22:05:55.993597 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'nova-super-conductor', 'value': {'container_name': 'nova_super_conductor', 'group': 'nova-super-conductor', 'enabled': 'no', 'image': 'registry.osism.tech/kolla/release/nova-super-conductor:29.2.1.20241206', 'volumes': ['/etc/kolla/nova-super-conductor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port nova-conductor 5672'], 'timeout': '30'}}})  2025-03-23 22:05:55.993606 | orchestrator | skipping: [testbed-node-0] 2025-03-23 22:05:55.993613 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'nova-api', 'value': {'container_name': 'nova_api', 'group': 'nova-api', 'image': 'registry.osism.tech/kolla/release/nova-api:29.2.1.20241206', 'enabled': True, 'privileged': True, 'volumes': ['/etc/kolla/nova-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:8774 '], 'timeout': '30'}, 'haproxy': {'nova_api': {'enabled': True, 'mode': 'http', 'external': False, 'port': '8774', 'listen_port': '8774', 'tls_backend': 'no'}, 'nova_api_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 
'api.testbed.osism.xyz', 'port': '8774', 'listen_port': '8774', 'tls_backend': 'no'}, 'nova_metadata': {'enabled': True, 'mode': 'http', 'external': False, 'port': '8775', 'listen_port': '8775', 'tls_backend': 'no'}, 'nova_metadata_external': {'enabled': 'no', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8775', 'listen_port': '8775', 'tls_backend': 'no'}}}})  2025-03-23 22:05:55.993645 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'nova-scheduler', 'value': {'container_name': 'nova_scheduler', 'group': 'nova-scheduler', 'image': 'registry.osism.tech/kolla/release/nova-scheduler:29.2.1.20241206', 'enabled': True, 'volumes': ['/etc/kolla/nova-scheduler/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port nova-scheduler 5672'], 'timeout': '30'}}})  2025-03-23 22:05:55.993655 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'nova-super-conductor', 'value': {'container_name': 'nova_super_conductor', 'group': 'nova-super-conductor', 'enabled': 'no', 'image': 'registry.osism.tech/kolla/release/nova-super-conductor:29.2.1.20241206', 'volumes': ['/etc/kolla/nova-super-conductor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port nova-conductor 5672'], 'timeout': '30'}}})  2025-03-23 22:05:55.993662 | orchestrator | skipping: [testbed-node-1] 2025-03-23 22:05:55.993673 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'nova-api', 'value': {'container_name': 'nova_api', 'group': 'nova-api', 'image': 'registry.osism.tech/kolla/release/nova-api:29.2.1.20241206', 'enabled': True, 'privileged': True, 'volumes': ['/etc/kolla/nova-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:8774 '], 'timeout': '30'}, 'haproxy': {'nova_api': {'enabled': True, 'mode': 'http', 'external': False, 'port': '8774', 'listen_port': '8774', 'tls_backend': 'no'}, 'nova_api_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8774', 'listen_port': '8774', 'tls_backend': 'no'}, 'nova_metadata': {'enabled': True, 'mode': 'http', 'external': False, 'port': '8775', 'listen_port': '8775', 'tls_backend': 'no'}, 'nova_metadata_external': {'enabled': 'no', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8775', 'listen_port': '8775', 'tls_backend': 'no'}}}})  2025-03-23 22:05:55.993685 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'nova-scheduler', 'value': {'container_name': 'nova_scheduler', 'group': 'nova-scheduler', 'image': 'registry.osism.tech/kolla/release/nova-scheduler:29.2.1.20241206', 'enabled': True, 'volumes': ['/etc/kolla/nova-scheduler/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 
'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port nova-scheduler 5672'], 'timeout': '30'}}})  2025-03-23 22:05:55.993693 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'nova-super-conductor', 'value': {'container_name': 'nova_super_conductor', 'group': 'nova-super-conductor', 'enabled': 'no', 'image': 'registry.osism.tech/kolla/release/nova-super-conductor:29.2.1.20241206', 'volumes': ['/etc/kolla/nova-super-conductor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port nova-conductor 5672'], 'timeout': '30'}}})  2025-03-23 22:05:55.993700 | orchestrator | skipping: [testbed-node-2] 2025-03-23 22:05:55.993707 | orchestrator | 2025-03-23 22:05:55.993714 | orchestrator | TASK [haproxy-config : Configuring firewall for nova] ************************** 2025-03-23 22:05:55.993721 | orchestrator | Sunday 23 March 2025 22:02:59 +0000 (0:00:01.287) 0:05:52.150 ********** 2025-03-23 22:05:55.993728 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'nova_api', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '8774', 'listen_port': '8774', 'tls_backend': 'no'}})  2025-03-23 22:05:55.993736 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'nova_api_external', 'value': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8774', 'listen_port': '8774', 'tls_backend': 'no'}})  2025-03-23 22:05:55.993743 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'nova_metadata', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '8775', 'listen_port': '8775', 'tls_backend': 'no'}})  2025-03-23 22:05:55.993766 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'nova_metadata_external', 'value': {'enabled': 'no', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8775', 'listen_port': '8775', 'tls_backend': 'no'}})  2025-03-23 22:05:55.993774 | orchestrator | skipping: [testbed-node-0] 2025-03-23 22:05:55.993782 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'nova_api', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '8774', 'listen_port': '8774', 'tls_backend': 'no'}})  2025-03-23 22:05:55.993789 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'nova_api_external', 'value': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8774', 'listen_port': '8774', 'tls_backend': 'no'}})  2025-03-23 22:05:55.993796 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'nova_metadata', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '8775', 'listen_port': '8775', 'tls_backend': 'no'}})  2025-03-23 22:05:55.993803 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'nova_metadata_external', 'value': {'enabled': 'no', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8775', 'listen_port': '8775', 'tls_backend': 'no'}})  2025-03-23 22:05:55.993817 | orchestrator | skipping: [testbed-node-1] 2025-03-23 22:05:55.993824 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'nova_api', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '8774', 'listen_port': '8774', 'tls_backend': 'no'}})  2025-03-23 22:05:55.993831 | orchestrator | 
skipping: [testbed-node-2] => (item={'key': 'nova_api_external', 'value': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8774', 'listen_port': '8774', 'tls_backend': 'no'}})  2025-03-23 22:05:55.993839 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'nova_metadata', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '8775', 'listen_port': '8775', 'tls_backend': 'no'}})  2025-03-23 22:05:55.993846 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'nova_metadata_external', 'value': {'enabled': 'no', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8775', 'listen_port': '8775', 'tls_backend': 'no'}})  2025-03-23 22:05:55.993853 | orchestrator | skipping: [testbed-node-2] 2025-03-23 22:05:55.993860 | orchestrator | 2025-03-23 22:05:55.993867 | orchestrator | TASK [proxysql-config : Copying over nova ProxySQL users config] *************** 2025-03-23 22:05:55.993874 | orchestrator | Sunday 23 March 2025 22:03:01 +0000 (0:00:01.765) 0:05:53.915 ********** 2025-03-23 22:05:55.993881 | orchestrator | changed: [testbed-node-1] 2025-03-23 22:05:55.993888 | orchestrator | changed: [testbed-node-0] 2025-03-23 22:05:55.993895 | orchestrator | changed: [testbed-node-2] 2025-03-23 22:05:55.993902 | orchestrator | 2025-03-23 22:05:55.993909 | orchestrator | TASK [proxysql-config : Copying over nova ProxySQL rules config] *************** 2025-03-23 22:05:55.993916 | orchestrator | Sunday 23 March 2025 22:03:02 +0000 (0:00:01.723) 0:05:55.639 ********** 2025-03-23 22:05:55.993923 | orchestrator | changed: [testbed-node-0] 2025-03-23 22:05:55.993930 | orchestrator | changed: [testbed-node-1] 2025-03-23 22:05:55.993937 | orchestrator | changed: [testbed-node-2] 2025-03-23 22:05:55.993944 | orchestrator | 2025-03-23 22:05:55.993951 | orchestrator | TASK [include_role : nova-cell] ************************************************ 2025-03-23 22:05:55.993958 | orchestrator | Sunday 23 March 2025 22:03:05 +0000 (0:00:02.689) 0:05:58.328 ********** 2025-03-23 22:05:55.993965 | orchestrator | included: nova-cell for testbed-node-0, testbed-node-1, testbed-node-2 2025-03-23 22:05:55.993972 | orchestrator | 2025-03-23 22:05:55.993979 | orchestrator | TASK [nova-cell : Configure loadbalancer for nova-novncproxy] ****************** 2025-03-23 22:05:55.993985 | orchestrator | Sunday 23 March 2025 22:03:07 +0000 (0:00:01.928) 0:06:00.256 ********** 2025-03-23 22:05:55.993992 | orchestrator | included: /ansible/roles/nova-cell/tasks/cell_proxy_loadbalancer.yml for testbed-node-0, testbed-node-1, testbed-node-2 => (item=nova-novncproxy) 2025-03-23 22:05:55.993998 | orchestrator | 2025-03-23 22:05:55.994007 | orchestrator | TASK [haproxy-config : Copying over nova-cell:nova-novncproxy haproxy config] *** 2025-03-23 22:05:55.994014 | orchestrator | Sunday 23 March 2025 22:03:08 +0000 (0:00:01.507) 0:06:01.764 ********** 2025-03-23 22:05:55.994039 | orchestrator | changed: [testbed-node-0] => (item={'key': 'nova-novncproxy', 'value': {'group': 'nova-novncproxy', 'enabled': True, 'haproxy': {'nova_novncproxy': {'enabled': True, 'mode': 'http', 'external': False, 'port': '6080', 'listen_port': '6080', 'backend_http_extra': ['timeout tunnel 1h']}, 'nova_novncproxy_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '6080', 'listen_port': '6080', 'backend_http_extra': ['timeout tunnel 1h']}}}}) 2025-03-23 22:05:55.994062 | orchestrator | 
changed: [testbed-node-1] => (item={'key': 'nova-novncproxy', 'value': {'group': 'nova-novncproxy', 'enabled': True, 'haproxy': {'nova_novncproxy': {'enabled': True, 'mode': 'http', 'external': False, 'port': '6080', 'listen_port': '6080', 'backend_http_extra': ['timeout tunnel 1h']}, 'nova_novncproxy_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '6080', 'listen_port': '6080', 'backend_http_extra': ['timeout tunnel 1h']}}}}) 2025-03-23 22:05:55.994079 | orchestrator | changed: [testbed-node-2] => (item={'key': 'nova-novncproxy', 'value': {'group': 'nova-novncproxy', 'enabled': True, 'haproxy': {'nova_novncproxy': {'enabled': True, 'mode': 'http', 'external': False, 'port': '6080', 'listen_port': '6080', 'backend_http_extra': ['timeout tunnel 1h']}, 'nova_novncproxy_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '6080', 'listen_port': '6080', 'backend_http_extra': ['timeout tunnel 1h']}}}}) 2025-03-23 22:05:55.994086 | orchestrator | 2025-03-23 22:05:55.994092 | orchestrator | TASK [haproxy-config : Add configuration for nova-cell:nova-novncproxy when using single external frontend] *** 2025-03-23 22:05:55.994099 | orchestrator | Sunday 23 March 2025 22:03:15 +0000 (0:00:06.391) 0:06:08.155 ********** 2025-03-23 22:05:55.994105 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'nova-novncproxy', 'value': {'group': 'nova-novncproxy', 'enabled': True, 'haproxy': {'nova_novncproxy': {'enabled': True, 'mode': 'http', 'external': False, 'port': '6080', 'listen_port': '6080', 'backend_http_extra': ['timeout tunnel 1h']}, 'nova_novncproxy_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '6080', 'listen_port': '6080', 'backend_http_extra': ['timeout tunnel 1h']}}}})  2025-03-23 22:05:55.994112 | orchestrator | skipping: [testbed-node-0] 2025-03-23 22:05:55.994118 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'nova-novncproxy', 'value': {'group': 'nova-novncproxy', 'enabled': True, 'haproxy': {'nova_novncproxy': {'enabled': True, 'mode': 'http', 'external': False, 'port': '6080', 'listen_port': '6080', 'backend_http_extra': ['timeout tunnel 1h']}, 'nova_novncproxy_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '6080', 'listen_port': '6080', 'backend_http_extra': ['timeout tunnel 1h']}}}})  2025-03-23 22:05:55.994124 | orchestrator | skipping: [testbed-node-1] 2025-03-23 22:05:55.994131 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'nova-novncproxy', 'value': {'group': 'nova-novncproxy', 'enabled': True, 'haproxy': {'nova_novncproxy': {'enabled': True, 'mode': 'http', 'external': False, 'port': '6080', 'listen_port': '6080', 'backend_http_extra': ['timeout tunnel 1h']}, 'nova_novncproxy_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '6080', 'listen_port': '6080', 'backend_http_extra': ['timeout tunnel 1h']}}}})  2025-03-23 22:05:55.994137 | orchestrator | skipping: [testbed-node-2] 2025-03-23 22:05:55.994143 | orchestrator | 2025-03-23 22:05:55.994150 | orchestrator | TASK [haproxy-config : Configuring firewall for nova-cell:nova-novncproxy] ***** 2025-03-23 22:05:55.994156 | orchestrator | Sunday 23 March 2025 22:03:17 +0000 (0:00:02.115) 0:06:10.270 ********** 2025-03-23 22:05:55.994162 | orchestrator | skipping: [testbed-node-0] => 
(item={'key': 'nova_novncproxy', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '6080', 'listen_port': '6080', 'backend_http_extra': ['timeout tunnel 1h']}})  2025-03-23 22:05:55.994169 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'nova_novncproxy_external', 'value': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '6080', 'listen_port': '6080', 'backend_http_extra': ['timeout tunnel 1h']}})  2025-03-23 22:05:55.994176 | orchestrator | skipping: [testbed-node-0] 2025-03-23 22:05:55.994183 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'nova_novncproxy', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '6080', 'listen_port': '6080', 'backend_http_extra': ['timeout tunnel 1h']}})  2025-03-23 22:05:55.994192 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'nova_novncproxy_external', 'value': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '6080', 'listen_port': '6080', 'backend_http_extra': ['timeout tunnel 1h']}})  2025-03-23 22:05:55.994204 | orchestrator | skipping: [testbed-node-1] 2025-03-23 22:05:55.994211 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'nova_novncproxy', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '6080', 'listen_port': '6080', 'backend_http_extra': ['timeout tunnel 1h']}})  2025-03-23 22:05:55.994233 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'nova_novncproxy_external', 'value': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '6080', 'listen_port': '6080', 'backend_http_extra': ['timeout tunnel 1h']}})  2025-03-23 22:05:55.994241 | orchestrator | skipping: [testbed-node-2] 2025-03-23 22:05:55.994248 | orchestrator | 2025-03-23 22:05:55.994255 | orchestrator | TASK [proxysql-config : Copying over nova-cell ProxySQL users config] ********** 2025-03-23 22:05:55.994262 | orchestrator | Sunday 23 March 2025 22:03:19 +0000 (0:00:02.205) 0:06:12.476 ********** 2025-03-23 22:05:55.994269 | orchestrator | changed: [testbed-node-1] 2025-03-23 22:05:55.994276 | orchestrator | changed: [testbed-node-0] 2025-03-23 22:05:55.994283 | orchestrator | changed: [testbed-node-2] 2025-03-23 22:05:55.994290 | orchestrator | 2025-03-23 22:05:55.994297 | orchestrator | TASK [proxysql-config : Copying over nova-cell ProxySQL rules config] ********** 2025-03-23 22:05:55.994304 | orchestrator | Sunday 23 March 2025 22:03:22 +0000 (0:00:03.277) 0:06:15.753 ********** 2025-03-23 22:05:55.994311 | orchestrator | changed: [testbed-node-0] 2025-03-23 22:05:55.994317 | orchestrator | changed: [testbed-node-2] 2025-03-23 22:05:55.994324 | orchestrator | changed: [testbed-node-1] 2025-03-23 22:05:55.994331 | orchestrator | 2025-03-23 22:05:55.994338 | orchestrator | TASK [nova-cell : Configure loadbalancer for nova-spicehtml5proxy] ************* 2025-03-23 22:05:55.994345 | orchestrator | Sunday 23 March 2025 22:03:27 +0000 (0:00:04.224) 0:06:19.978 ********** 2025-03-23 22:05:55.994352 | orchestrator | included: /ansible/roles/nova-cell/tasks/cell_proxy_loadbalancer.yml for testbed-node-0, testbed-node-1, testbed-node-2 => (item=nova-spicehtml5proxy) 2025-03-23 22:05:55.994359 | orchestrator | 2025-03-23 22:05:55.994366 | orchestrator | TASK [haproxy-config : Copying over nova-cell:nova-spicehtml5proxy haproxy config] *** 2025-03-23 22:05:55.994373 | orchestrator | Sunday 23 March 2025 22:03:28 +0000 
(0:00:01.468) 0:06:21.446 ********** 2025-03-23 22:05:55.994380 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'nova-spicehtml5proxy', 'value': {'group': 'nova-spicehtml5proxy', 'enabled': False, 'haproxy': {'nova_spicehtml5proxy': {'enabled': False, 'mode': 'http', 'external': False, 'port': '6082', 'listen_port': '6082', 'backend_http_extra': ['timeout tunnel 1h']}, 'nova_spicehtml5proxy_external': {'enabled': False, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '6082', 'listen_port': '6082', 'backend_http_extra': ['timeout tunnel 1h']}}}})  2025-03-23 22:05:55.994387 | orchestrator | skipping: [testbed-node-0] 2025-03-23 22:05:55.994394 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'nova-spicehtml5proxy', 'value': {'group': 'nova-spicehtml5proxy', 'enabled': False, 'haproxy': {'nova_spicehtml5proxy': {'enabled': False, 'mode': 'http', 'external': False, 'port': '6082', 'listen_port': '6082', 'backend_http_extra': ['timeout tunnel 1h']}, 'nova_spicehtml5proxy_external': {'enabled': False, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '6082', 'listen_port': '6082', 'backend_http_extra': ['timeout tunnel 1h']}}}})  2025-03-23 22:05:55.994401 | orchestrator | skipping: [testbed-node-1] 2025-03-23 22:05:55.994408 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'nova-spicehtml5proxy', 'value': {'group': 'nova-spicehtml5proxy', 'enabled': False, 'haproxy': {'nova_spicehtml5proxy': {'enabled': False, 'mode': 'http', 'external': False, 'port': '6082', 'listen_port': '6082', 'backend_http_extra': ['timeout tunnel 1h']}, 'nova_spicehtml5proxy_external': {'enabled': False, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '6082', 'listen_port': '6082', 'backend_http_extra': ['timeout tunnel 1h']}}}})  2025-03-23 22:05:55.994419 | orchestrator | skipping: [testbed-node-2] 2025-03-23 22:05:55.994426 | orchestrator | 2025-03-23 22:05:55.994433 | orchestrator | TASK [haproxy-config : Add configuration for nova-cell:nova-spicehtml5proxy when using single external frontend] *** 2025-03-23 22:05:55.994440 | orchestrator | Sunday 23 March 2025 22:03:30 +0000 (0:00:01.906) 0:06:23.352 ********** 2025-03-23 22:05:55.994447 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'nova-spicehtml5proxy', 'value': {'group': 'nova-spicehtml5proxy', 'enabled': False, 'haproxy': {'nova_spicehtml5proxy': {'enabled': False, 'mode': 'http', 'external': False, 'port': '6082', 'listen_port': '6082', 'backend_http_extra': ['timeout tunnel 1h']}, 'nova_spicehtml5proxy_external': {'enabled': False, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '6082', 'listen_port': '6082', 'backend_http_extra': ['timeout tunnel 1h']}}}})  2025-03-23 22:05:55.994454 | orchestrator | skipping: [testbed-node-0] 2025-03-23 22:05:55.994480 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'nova-spicehtml5proxy', 'value': {'group': 'nova-spicehtml5proxy', 'enabled': False, 'haproxy': {'nova_spicehtml5proxy': {'enabled': False, 'mode': 'http', 'external': False, 'port': '6082', 'listen_port': '6082', 'backend_http_extra': ['timeout tunnel 1h']}, 'nova_spicehtml5proxy_external': {'enabled': False, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '6082', 'listen_port': '6082', 'backend_http_extra': ['timeout tunnel 1h']}}}})  2025-03-23 22:05:55.994488 | orchestrator | skipping: [testbed-node-1] 2025-03-23 
22:05:55.994495 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'nova-spicehtml5proxy', 'value': {'group': 'nova-spicehtml5proxy', 'enabled': False, 'haproxy': {'nova_spicehtml5proxy': {'enabled': False, 'mode': 'http', 'external': False, 'port': '6082', 'listen_port': '6082', 'backend_http_extra': ['timeout tunnel 1h']}, 'nova_spicehtml5proxy_external': {'enabled': False, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '6082', 'listen_port': '6082', 'backend_http_extra': ['timeout tunnel 1h']}}}})  2025-03-23 22:05:55.994503 | orchestrator | skipping: [testbed-node-2] 2025-03-23 22:05:55.994510 | orchestrator | 2025-03-23 22:05:55.994517 | orchestrator | TASK [haproxy-config : Configuring firewall for nova-cell:nova-spicehtml5proxy] *** 2025-03-23 22:05:55.994524 | orchestrator | Sunday 23 March 2025 22:03:32 +0000 (0:00:01.930) 0:06:25.283 ********** 2025-03-23 22:05:55.994531 | orchestrator | skipping: [testbed-node-0] 2025-03-23 22:05:55.994537 | orchestrator | skipping: [testbed-node-1] 2025-03-23 22:05:55.994543 | orchestrator | skipping: [testbed-node-2] 2025-03-23 22:05:55.994549 | orchestrator | 2025-03-23 22:05:55.994566 | orchestrator | TASK [proxysql-config : Copying over nova-cell ProxySQL users config] ********** 2025-03-23 22:05:55.994572 | orchestrator | Sunday 23 March 2025 22:03:34 +0000 (0:00:02.329) 0:06:27.613 ********** 2025-03-23 22:05:55.994579 | orchestrator | ok: [testbed-node-0] 2025-03-23 22:05:55.994585 | orchestrator | ok: [testbed-node-1] 2025-03-23 22:05:55.994594 | orchestrator | ok: [testbed-node-2] 2025-03-23 22:05:55.994600 | orchestrator | 2025-03-23 22:05:55.994606 | orchestrator | TASK [proxysql-config : Copying over nova-cell ProxySQL rules config] ********** 2025-03-23 22:05:55.994613 | orchestrator | Sunday 23 March 2025 22:03:37 +0000 (0:00:02.781) 0:06:30.394 ********** 2025-03-23 22:05:55.994619 | orchestrator | ok: [testbed-node-0] 2025-03-23 22:05:55.994625 | orchestrator | ok: [testbed-node-1] 2025-03-23 22:05:55.994631 | orchestrator | ok: [testbed-node-2] 2025-03-23 22:05:55.994637 | orchestrator | 2025-03-23 22:05:55.994644 | orchestrator | TASK [nova-cell : Configure loadbalancer for nova-serialproxy] ***************** 2025-03-23 22:05:55.994650 | orchestrator | Sunday 23 March 2025 22:03:41 +0000 (0:00:04.230) 0:06:34.624 ********** 2025-03-23 22:05:55.994656 | orchestrator | included: /ansible/roles/nova-cell/tasks/cell_proxy_loadbalancer.yml for testbed-node-0, testbed-node-1, testbed-node-2 => (item=nova-serialproxy) 2025-03-23 22:05:55.994667 | orchestrator | 2025-03-23 22:05:55.994673 | orchestrator | TASK [haproxy-config : Copying over nova-cell:nova-serialproxy haproxy config] *** 2025-03-23 22:05:55.994679 | orchestrator | Sunday 23 March 2025 22:03:43 +0000 (0:00:01.637) 0:06:36.262 ********** 2025-03-23 22:05:55.994686 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'nova-serialproxy', 'value': {'group': 'nova-serialproxy', 'enabled': False, 'haproxy': {'nova_serialconsole_proxy': {'enabled': False, 'mode': 'http', 'external': False, 'port': '6083', 'listen_port': '6083', 'backend_http_extra': ['timeout tunnel 10m']}, 'nova_serialconsole_proxy_external': {'enabled': False, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '6083', 'listen_port': '6083', 'backend_http_extra': ['timeout tunnel 10m']}}}})  2025-03-23 22:05:55.994692 | orchestrator | skipping: [testbed-node-0] 2025-03-23 22:05:55.994699 | orchestrator | skipping: 
[testbed-node-1] => (item={'key': 'nova-serialproxy', 'value': {'group': 'nova-serialproxy', 'enabled': False, 'haproxy': {'nova_serialconsole_proxy': {'enabled': False, 'mode': 'http', 'external': False, 'port': '6083', 'listen_port': '6083', 'backend_http_extra': ['timeout tunnel 10m']}, 'nova_serialconsole_proxy_external': {'enabled': False, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '6083', 'listen_port': '6083', 'backend_http_extra': ['timeout tunnel 10m']}}}})  2025-03-23 22:05:55.994705 | orchestrator | skipping: [testbed-node-1] 2025-03-23 22:05:55.994712 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'nova-serialproxy', 'value': {'group': 'nova-serialproxy', 'enabled': False, 'haproxy': {'nova_serialconsole_proxy': {'enabled': False, 'mode': 'http', 'external': False, 'port': '6083', 'listen_port': '6083', 'backend_http_extra': ['timeout tunnel 10m']}, 'nova_serialconsole_proxy_external': {'enabled': False, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '6083', 'listen_port': '6083', 'backend_http_extra': ['timeout tunnel 10m']}}}})  2025-03-23 22:05:55.994718 | orchestrator | skipping: [testbed-node-2] 2025-03-23 22:05:55.994724 | orchestrator | 2025-03-23 22:05:55.994731 | orchestrator | TASK [haproxy-config : Add configuration for nova-cell:nova-serialproxy when using single external frontend] *** 2025-03-23 22:05:55.994737 | orchestrator | Sunday 23 March 2025 22:03:45 +0000 (0:00:02.380) 0:06:38.643 ********** 2025-03-23 22:05:55.994758 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'nova-serialproxy', 'value': {'group': 'nova-serialproxy', 'enabled': False, 'haproxy': {'nova_serialconsole_proxy': {'enabled': False, 'mode': 'http', 'external': False, 'port': '6083', 'listen_port': '6083', 'backend_http_extra': ['timeout tunnel 10m']}, 'nova_serialconsole_proxy_external': {'enabled': False, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '6083', 'listen_port': '6083', 'backend_http_extra': ['timeout tunnel 10m']}}}})  2025-03-23 22:05:55.994766 | orchestrator | skipping: [testbed-node-0] 2025-03-23 22:05:55.994773 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'nova-serialproxy', 'value': {'group': 'nova-serialproxy', 'enabled': False, 'haproxy': {'nova_serialconsole_proxy': {'enabled': False, 'mode': 'http', 'external': False, 'port': '6083', 'listen_port': '6083', 'backend_http_extra': ['timeout tunnel 10m']}, 'nova_serialconsole_proxy_external': {'enabled': False, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '6083', 'listen_port': '6083', 'backend_http_extra': ['timeout tunnel 10m']}}}})  2025-03-23 22:05:55.994779 | orchestrator | skipping: [testbed-node-1] 2025-03-23 22:05:55.994790 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'nova-serialproxy', 'value': {'group': 'nova-serialproxy', 'enabled': False, 'haproxy': {'nova_serialconsole_proxy': {'enabled': False, 'mode': 'http', 'external': False, 'port': '6083', 'listen_port': '6083', 'backend_http_extra': ['timeout tunnel 10m']}, 'nova_serialconsole_proxy_external': {'enabled': False, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '6083', 'listen_port': '6083', 'backend_http_extra': ['timeout tunnel 10m']}}}})  2025-03-23 22:05:55.994802 | orchestrator | skipping: [testbed-node-2] 2025-03-23 22:05:55.994808 | orchestrator | 2025-03-23 22:05:55.994815 | orchestrator | TASK 
[haproxy-config : Configuring firewall for nova-cell:nova-serialproxy] **** 2025-03-23 22:05:55.994821 | orchestrator | Sunday 23 March 2025 22:03:47 +0000 (0:00:01.546) 0:06:40.189 ********** 2025-03-23 22:05:55.994827 | orchestrator | skipping: [testbed-node-0] 2025-03-23 22:05:55.994833 | orchestrator | skipping: [testbed-node-1] 2025-03-23 22:05:55.994839 | orchestrator | skipping: [testbed-node-2] 2025-03-23 22:05:55.994846 | orchestrator | 2025-03-23 22:05:55.994852 | orchestrator | TASK [proxysql-config : Copying over nova-cell ProxySQL users config] ********** 2025-03-23 22:05:55.994858 | orchestrator | Sunday 23 March 2025 22:03:49 +0000 (0:00:02.207) 0:06:42.397 ********** 2025-03-23 22:05:55.994864 | orchestrator | ok: [testbed-node-0] 2025-03-23 22:05:55.994871 | orchestrator | ok: [testbed-node-1] 2025-03-23 22:05:55.994877 | orchestrator | ok: [testbed-node-2] 2025-03-23 22:05:55.994883 | orchestrator | 2025-03-23 22:05:55.994889 | orchestrator | TASK [proxysql-config : Copying over nova-cell ProxySQL rules config] ********** 2025-03-23 22:05:55.994896 | orchestrator | Sunday 23 March 2025 22:03:52 +0000 (0:00:02.952) 0:06:45.350 ********** 2025-03-23 22:05:55.994902 | orchestrator | ok: [testbed-node-0] 2025-03-23 22:05:55.994908 | orchestrator | ok: [testbed-node-1] 2025-03-23 22:05:55.994914 | orchestrator | ok: [testbed-node-2] 2025-03-23 22:05:55.994921 | orchestrator | 2025-03-23 22:05:55.994927 | orchestrator | TASK [include_role : octavia] ************************************************** 2025-03-23 22:05:55.994936 | orchestrator | Sunday 23 March 2025 22:03:56 +0000 (0:00:04.135) 0:06:49.485 ********** 2025-03-23 22:05:55.994942 | orchestrator | included: octavia for testbed-node-0, testbed-node-1, testbed-node-2 2025-03-23 22:05:55.994948 | orchestrator | 2025-03-23 22:05:55.994955 | orchestrator | TASK [haproxy-config : Copying over octavia haproxy config] ******************** 2025-03-23 22:05:55.994961 | orchestrator | Sunday 23 March 2025 22:03:58 +0000 (0:00:01.849) 0:06:51.335 ********** 2025-03-23 22:05:55.994967 | orchestrator | changed: [testbed-node-0] => (item={'key': 'octavia-api', 'value': {'container_name': 'octavia_api', 'group': 'octavia-api', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/octavia-api:14.0.1.20241206', 'volumes': ['/etc/kolla/octavia-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', 'octavia_driver_agent:/var/run/octavia/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:9876'], 'timeout': '30'}, 'haproxy': {'octavia_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9876', 'listen_port': '9876', 'tls_backend': 'no'}, 'octavia_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9876', 'listen_port': '9876', 'tls_backend': 'no'}}}}) 2025-03-23 22:05:55.994988 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'octavia-driver-agent', 'value': {'container_name': 'octavia_driver_agent', 'group': 'octavia-driver-agent', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/octavia-driver-agent:14.0.1.20241206', 'volumes': ['/etc/kolla/octavia-driver-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', 
'octavia_driver_agent:/var/run/octavia/'], 'dimensions': {}}})  2025-03-23 22:05:55.994995 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'octavia-health-manager', 'value': {'container_name': 'octavia_health_manager', 'group': 'octavia-health-manager', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/octavia-health-manager:14.0.1.20241206', 'volumes': ['/etc/kolla/octavia-health-manager/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port octavia-health-manager 3306'], 'timeout': '30'}}})  2025-03-23 22:05:55.995008 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'octavia-housekeeping', 'value': {'container_name': 'octavia_housekeeping', 'group': 'octavia-housekeeping', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/octavia-housekeeping:14.0.1.20241206', 'volumes': ['/etc/kolla/octavia-housekeeping/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port octavia-housekeeping 3306'], 'timeout': '30'}}})  2025-03-23 22:05:55.995019 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'octavia-worker', 'value': {'container_name': 'octavia_worker', 'group': 'octavia-worker', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/octavia-worker:14.0.1.20241206', 'volumes': ['/etc/kolla/octavia-worker/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port octavia-worker 5672'], 'timeout': '30'}}})  2025-03-23 22:05:55.995026 | orchestrator | changed: [testbed-node-1] => (item={'key': 'octavia-api', 'value': {'container_name': 'octavia_api', 'group': 'octavia-api', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/octavia-api:14.0.1.20241206', 'volumes': ['/etc/kolla/octavia-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', 'octavia_driver_agent:/var/run/octavia/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:9876'], 'timeout': '30'}, 'haproxy': {'octavia_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9876', 'listen_port': '9876', 'tls_backend': 'no'}, 'octavia_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9876', 'listen_port': '9876', 'tls_backend': 'no'}}}}) 2025-03-23 22:05:55.995033 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'octavia-driver-agent', 'value': {'container_name': 'octavia_driver_agent', 'group': 'octavia-driver-agent', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/octavia-driver-agent:14.0.1.20241206', 'volumes': ['/etc/kolla/octavia-driver-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', 'octavia_driver_agent:/var/run/octavia/'], 'dimensions': {}}})  
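For reference, each item printed by these haproxy-config tasks carries its load-balancer definition in a nested 'haproxy' mapping (frontend name, mode, internal/external flag, port, listen_port, TLS backend). The short Python sketch below, written against the octavia-api shape shown in this log, is one way to reduce such a mapping to the frontends that would end up listening; it is illustrative only and does not reproduce kolla-ansible's actual templates (the helper name enabled_frontends is hypothetical).

# Illustrative sketch only; not kolla-ansible code. Works on item dicts shaped
# like the ones printed in this log.
def enabled_frontends(service):
    """Yield (name, listen_port, external) for each enabled 'haproxy' entry."""
    for name, opts in service.get("haproxy", {}).items():
        # the log shows both booleans and 'yes'/'no' strings for 'enabled'
        if opts.get("enabled") in (True, "yes"):
            yield name, opts.get("listen_port", opts.get("port")), opts.get("external", False)

# Example item shaped like the octavia-api entry above
octavia_api = {
    "haproxy": {
        "octavia_api": {"enabled": "yes", "mode": "http", "external": False,
                        "port": "9876", "listen_port": "9876", "tls_backend": "no"},
        "octavia_api_external": {"enabled": "yes", "mode": "http", "external": True,
                                 "external_fqdn": "api.testbed.osism.xyz",
                                 "port": "9876", "listen_port": "9876", "tls_backend": "no"},
    }
}

for name, port, external in enabled_frontends(octavia_api):
    print(f"{name}: listen_port={port} external={external}")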
2025-03-23 22:05:55.995057 | orchestrator | changed: [testbed-node-2] => (item={'key': 'octavia-api', 'value': {'container_name': 'octavia_api', 'group': 'octavia-api', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/octavia-api:14.0.1.20241206', 'volumes': ['/etc/kolla/octavia-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', 'octavia_driver_agent:/var/run/octavia/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:9876'], 'timeout': '30'}, 'haproxy': {'octavia_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9876', 'listen_port': '9876', 'tls_backend': 'no'}, 'octavia_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9876', 'listen_port': '9876', 'tls_backend': 'no'}}}}) 2025-03-23 22:05:55.995071 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'octavia-health-manager', 'value': {'container_name': 'octavia_health_manager', 'group': 'octavia-health-manager', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/octavia-health-manager:14.0.1.20241206', 'volumes': ['/etc/kolla/octavia-health-manager/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port octavia-health-manager 3306'], 'timeout': '30'}}})  2025-03-23 22:05:55.995077 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'octavia-housekeeping', 'value': {'container_name': 'octavia_housekeeping', 'group': 'octavia-housekeeping', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/octavia-housekeeping:14.0.1.20241206', 'volumes': ['/etc/kolla/octavia-housekeeping/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port octavia-housekeeping 3306'], 'timeout': '30'}}})  2025-03-23 22:05:55.995084 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'octavia-driver-agent', 'value': {'container_name': 'octavia_driver_agent', 'group': 'octavia-driver-agent', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/octavia-driver-agent:14.0.1.20241206', 'volumes': ['/etc/kolla/octavia-driver-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', 'octavia_driver_agent:/var/run/octavia/'], 'dimensions': {}}})  2025-03-23 22:05:55.995090 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'octavia-worker', 'value': {'container_name': 'octavia_worker', 'group': 'octavia-worker', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/octavia-worker:14.0.1.20241206', 'volumes': ['/etc/kolla/octavia-worker/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port octavia-worker 5672'], 'timeout': '30'}}})  2025-03-23 22:05:55.995097 | orchestrator | skipping: [testbed-node-2] 
=> (item={'key': 'octavia-health-manager', 'value': {'container_name': 'octavia_health_manager', 'group': 'octavia-health-manager', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/octavia-health-manager:14.0.1.20241206', 'volumes': ['/etc/kolla/octavia-health-manager/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port octavia-health-manager 3306'], 'timeout': '30'}}})  2025-03-23 22:05:55.995104 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'octavia-housekeeping', 'value': {'container_name': 'octavia_housekeeping', 'group': 'octavia-housekeeping', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/octavia-housekeeping:14.0.1.20241206', 'volumes': ['/etc/kolla/octavia-housekeeping/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port octavia-housekeeping 3306'], 'timeout': '30'}}})  2025-03-23 22:05:55.995124 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'octavia-worker', 'value': {'container_name': 'octavia_worker', 'group': 'octavia-worker', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/octavia-worker:14.0.1.20241206', 'volumes': ['/etc/kolla/octavia-worker/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port octavia-worker 5672'], 'timeout': '30'}}})  2025-03-23 22:05:55.995136 | orchestrator | 2025-03-23 22:05:55.995142 | orchestrator | TASK [haproxy-config : Add configuration for octavia when using single external frontend] *** 2025-03-23 22:05:55.995149 | orchestrator | Sunday 23 March 2025 22:04:03 +0000 (0:00:05.004) 0:06:56.339 ********** 2025-03-23 22:05:55.995160 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'octavia-api', 'value': {'container_name': 'octavia_api', 'group': 'octavia-api', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/octavia-api:14.0.1.20241206', 'volumes': ['/etc/kolla/octavia-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', 'octavia_driver_agent:/var/run/octavia/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:9876'], 'timeout': '30'}, 'haproxy': {'octavia_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9876', 'listen_port': '9876', 'tls_backend': 'no'}, 'octavia_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9876', 'listen_port': '9876', 'tls_backend': 'no'}}}})  2025-03-23 22:05:55.995167 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'octavia-driver-agent', 'value': {'container_name': 'octavia_driver_agent', 'group': 'octavia-driver-agent', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/octavia-driver-agent:14.0.1.20241206', 'volumes': ['/etc/kolla/octavia-driver-agent/:/var/lib/kolla/config_files/:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', 'octavia_driver_agent:/var/run/octavia/'], 'dimensions': {}}})  2025-03-23 22:05:55.995174 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'octavia-health-manager', 'value': {'container_name': 'octavia_health_manager', 'group': 'octavia-health-manager', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/octavia-health-manager:14.0.1.20241206', 'volumes': ['/etc/kolla/octavia-health-manager/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port octavia-health-manager 3306'], 'timeout': '30'}}})  2025-03-23 22:05:55.995180 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'octavia-housekeeping', 'value': {'container_name': 'octavia_housekeeping', 'group': 'octavia-housekeeping', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/octavia-housekeeping:14.0.1.20241206', 'volumes': ['/etc/kolla/octavia-housekeeping/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port octavia-housekeeping 3306'], 'timeout': '30'}}})  2025-03-23 22:05:55.995187 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'octavia-worker', 'value': {'container_name': 'octavia_worker', 'group': 'octavia-worker', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/octavia-worker:14.0.1.20241206', 'volumes': ['/etc/kolla/octavia-worker/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port octavia-worker 5672'], 'timeout': '30'}}})  2025-03-23 22:05:55.995197 | orchestrator | skipping: [testbed-node-0] 2025-03-23 22:05:55.995222 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'octavia-api', 'value': {'container_name': 'octavia_api', 'group': 'octavia-api', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/octavia-api:14.0.1.20241206', 'volumes': ['/etc/kolla/octavia-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', 'octavia_driver_agent:/var/run/octavia/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:9876'], 'timeout': '30'}, 'haproxy': {'octavia_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9876', 'listen_port': '9876', 'tls_backend': 'no'}, 'octavia_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9876', 'listen_port': '9876', 'tls_backend': 'no'}}}})  2025-03-23 22:05:55.995230 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'octavia-driver-agent', 'value': {'container_name': 'octavia_driver_agent', 'group': 'octavia-driver-agent', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/octavia-driver-agent:14.0.1.20241206', 'volumes': 
['/etc/kolla/octavia-driver-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', 'octavia_driver_agent:/var/run/octavia/'], 'dimensions': {}}})  2025-03-23 22:05:55.995237 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'octavia-health-manager', 'value': {'container_name': 'octavia_health_manager', 'group': 'octavia-health-manager', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/octavia-health-manager:14.0.1.20241206', 'volumes': ['/etc/kolla/octavia-health-manager/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port octavia-health-manager 3306'], 'timeout': '30'}}})  2025-03-23 22:05:55.995243 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'octavia-housekeeping', 'value': {'container_name': 'octavia_housekeeping', 'group': 'octavia-housekeeping', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/octavia-housekeeping:14.0.1.20241206', 'volumes': ['/etc/kolla/octavia-housekeeping/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port octavia-housekeeping 3306'], 'timeout': '30'}}})  2025-03-23 22:05:55.995314 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'octavia-api', 'value': {'container_name': 'octavia_api', 'group': 'octavia-api', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/octavia-api:14.0.1.20241206', 'volumes': ['/etc/kolla/octavia-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', 'octavia_driver_agent:/var/run/octavia/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:9876'], 'timeout': '30'}, 'haproxy': {'octavia_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9876', 'listen_port': '9876', 'tls_backend': 'no'}, 'octavia_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9876', 'listen_port': '9876', 'tls_backend': 'no'}}}})  2025-03-23 22:05:55.995336 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'octavia-worker', 'value': {'container_name': 'octavia_worker', 'group': 'octavia-worker', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/octavia-worker:14.0.1.20241206', 'volumes': ['/etc/kolla/octavia-worker/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port octavia-worker 5672'], 'timeout': '30'}}})  2025-03-23 22:05:55.995347 | orchestrator | skipping: [testbed-node-1] 2025-03-23 22:05:55.995354 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'octavia-driver-agent', 'value': {'container_name': 'octavia_driver_agent', 'group': 'octavia-driver-agent', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/octavia-driver-agent:14.0.1.20241206', 
'volumes': ['/etc/kolla/octavia-driver-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', 'octavia_driver_agent:/var/run/octavia/'], 'dimensions': {}}})  2025-03-23 22:05:55.995361 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'octavia-health-manager', 'value': {'container_name': 'octavia_health_manager', 'group': 'octavia-health-manager', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/octavia-health-manager:14.0.1.20241206', 'volumes': ['/etc/kolla/octavia-health-manager/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port octavia-health-manager 3306'], 'timeout': '30'}}})  2025-03-23 22:05:55.995367 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'octavia-housekeeping', 'value': {'container_name': 'octavia_housekeeping', 'group': 'octavia-housekeeping', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/octavia-housekeeping:14.0.1.20241206', 'volumes': ['/etc/kolla/octavia-housekeeping/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port octavia-housekeeping 3306'], 'timeout': '30'}}})  2025-03-23 22:05:55.995374 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'octavia-worker', 'value': {'container_name': 'octavia_worker', 'group': 'octavia-worker', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/octavia-worker:14.0.1.20241206', 'volumes': ['/etc/kolla/octavia-worker/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port octavia-worker 5672'], 'timeout': '30'}}})  2025-03-23 22:05:55.995380 | orchestrator | skipping: [testbed-node-2] 2025-03-23 22:05:55.995387 | orchestrator | 2025-03-23 22:05:55.995393 | orchestrator | TASK [haproxy-config : Configuring firewall for octavia] *********************** 2025-03-23 22:05:55.995399 | orchestrator | Sunday 23 March 2025 22:04:04 +0000 (0:00:01.207) 0:06:57.548 ********** 2025-03-23 22:05:55.995406 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'octavia_api', 'value': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9876', 'listen_port': '9876', 'tls_backend': 'no'}})  2025-03-23 22:05:55.995412 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'octavia_api_external', 'value': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9876', 'listen_port': '9876', 'tls_backend': 'no'}})  2025-03-23 22:05:55.995419 | orchestrator | skipping: [testbed-node-0] 2025-03-23 22:05:55.995426 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'octavia_api', 'value': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9876', 'listen_port': '9876', 'tls_backend': 'no'}})  2025-03-23 22:05:55.995435 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'octavia_api_external', 'value': {'enabled': 'yes', 'mode': 'http', 'external': True, 
'external_fqdn': 'api.testbed.osism.xyz', 'port': '9876', 'listen_port': '9876', 'tls_backend': 'no'}})  2025-03-23 22:05:55.995442 | orchestrator | skipping: [testbed-node-1] 2025-03-23 22:05:55.995448 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'octavia_api', 'value': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9876', 'listen_port': '9876', 'tls_backend': 'no'}})  2025-03-23 22:05:55.995455 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'octavia_api_external', 'value': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9876', 'listen_port': '9876', 'tls_backend': 'no'}})  2025-03-23 22:05:55.995474 | orchestrator | skipping: [testbed-node-2] 2025-03-23 22:05:55.995481 | orchestrator | 2025-03-23 22:05:55.995487 | orchestrator | TASK [proxysql-config : Copying over octavia ProxySQL users config] ************ 2025-03-23 22:05:55.995494 | orchestrator | Sunday 23 March 2025 22:04:06 +0000 (0:00:01.552) 0:06:59.100 ********** 2025-03-23 22:05:55.995500 | orchestrator | changed: [testbed-node-0] 2025-03-23 22:05:55.995506 | orchestrator | changed: [testbed-node-1] 2025-03-23 22:05:55.995513 | orchestrator | changed: [testbed-node-2] 2025-03-23 22:05:55.995519 | orchestrator | 2025-03-23 22:05:55.995525 | orchestrator | TASK [proxysql-config : Copying over octavia ProxySQL rules config] ************ 2025-03-23 22:05:55.995532 | orchestrator | Sunday 23 March 2025 22:04:07 +0000 (0:00:01.680) 0:07:00.780 ********** 2025-03-23 22:05:55.995538 | orchestrator | changed: [testbed-node-0] 2025-03-23 22:05:55.995544 | orchestrator | changed: [testbed-node-1] 2025-03-23 22:05:55.995550 | orchestrator | changed: [testbed-node-2] 2025-03-23 22:05:55.995567 | orchestrator | 2025-03-23 22:05:55.995574 | orchestrator | TASK [include_role : opensearch] *********************************************** 2025-03-23 22:05:55.995581 | orchestrator | Sunday 23 March 2025 22:04:10 +0000 (0:00:02.709) 0:07:03.489 ********** 2025-03-23 22:05:55.995587 | orchestrator | included: opensearch for testbed-node-0, testbed-node-1, testbed-node-2 2025-03-23 22:05:55.995593 | orchestrator | 2025-03-23 22:05:55.995599 | orchestrator | TASK [haproxy-config : Copying over opensearch haproxy config] ***************** 2025-03-23 22:05:55.995605 | orchestrator | Sunday 23 March 2025 22:04:12 +0000 (0:00:01.939) 0:07:05.429 ********** 2025-03-23 22:05:55.995612 | orchestrator | changed: [testbed-node-0] => (item={'key': 'opensearch', 'value': {'container_name': 'opensearch', 'group': 'opensearch', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/opensearch:2.18.0.20241206', 'environment': {'OPENSEARCH_JAVA_OPTS': '-Xms1g -Xmx1g -Dlog4j2.formatMsgNoLookups=true'}, 'volumes': ['/etc/kolla/opensearch/:/var/lib/kolla/config_files/', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'opensearch:/var/lib/opensearch/data', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:9200'], 'timeout': '30'}, 'haproxy': {'opensearch': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9200', 'frontend_http_extra': ['option dontlog-normal']}}}}) 2025-03-23 22:05:55.995619 | orchestrator | changed: [testbed-node-1] => (item={'key': 'opensearch', 'value': {'container_name': 'opensearch', 'group': 'opensearch', 'enabled': True, 'image': 
'registry.osism.tech/kolla/release/opensearch:2.18.0.20241206', 'environment': {'OPENSEARCH_JAVA_OPTS': '-Xms1g -Xmx1g -Dlog4j2.formatMsgNoLookups=true'}, 'volumes': ['/etc/kolla/opensearch/:/var/lib/kolla/config_files/', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'opensearch:/var/lib/opensearch/data', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:9200'], 'timeout': '30'}, 'haproxy': {'opensearch': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9200', 'frontend_http_extra': ['option dontlog-normal']}}}}) 2025-03-23 22:05:55.995630 | orchestrator | changed: [testbed-node-2] => (item={'key': 'opensearch', 'value': {'container_name': 'opensearch', 'group': 'opensearch', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/opensearch:2.18.0.20241206', 'environment': {'OPENSEARCH_JAVA_OPTS': '-Xms1g -Xmx1g -Dlog4j2.formatMsgNoLookups=true'}, 'volumes': ['/etc/kolla/opensearch/:/var/lib/kolla/config_files/', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'opensearch:/var/lib/opensearch/data', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:9200'], 'timeout': '30'}, 'haproxy': {'opensearch': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9200', 'frontend_http_extra': ['option dontlog-normal']}}}}) 2025-03-23 22:05:55.995651 | orchestrator | changed: [testbed-node-0] => (item={'key': 'opensearch-dashboards', 'value': {'container_name': 'opensearch_dashboards', 'group': 'opensearch-dashboards', 'enabled': True, 'environment': {'OPENSEARCH_DASHBOARDS_SECURITY_PLUGIN': 'False'}, 'image': 'registry.osism.tech/kolla/release/opensearch-dashboards:2.18.0.20241206', 'volumes': ['/etc/kolla/opensearch-dashboards/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:5601'], 'timeout': '30'}, 'haproxy': {'opensearch-dashboards': {'enabled': True, 'mode': 'http', 'external': False, 'port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password'}, 'opensearch_dashboards_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '5601', 'listen_port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password'}}}}) 2025-03-23 22:05:55.995660 | orchestrator | changed: [testbed-node-1] => (item={'key': 'opensearch-dashboards', 'value': {'container_name': 'opensearch_dashboards', 'group': 'opensearch-dashboards', 'enabled': True, 'environment': {'OPENSEARCH_DASHBOARDS_SECURITY_PLUGIN': 'False'}, 'image': 'registry.osism.tech/kolla/release/opensearch-dashboards:2.18.0.20241206', 'volumes': ['/etc/kolla/opensearch-dashboards/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:5601'], 'timeout': '30'}, 'haproxy': {'opensearch-dashboards': {'enabled': True, 'mode': 'http', 'external': False, 'port': '5601', 'auth_user': 'opensearch', 'auth_pass': 
'password'}, 'opensearch_dashboards_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '5601', 'listen_port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password'}}}}) 2025-03-23 22:05:55.995667 | orchestrator | changed: [testbed-node-2] => (item={'key': 'opensearch-dashboards', 'value': {'container_name': 'opensearch_dashboards', 'group': 'opensearch-dashboards', 'enabled': True, 'environment': {'OPENSEARCH_DASHBOARDS_SECURITY_PLUGIN': 'False'}, 'image': 'registry.osism.tech/kolla/release/opensearch-dashboards:2.18.0.20241206', 'volumes': ['/etc/kolla/opensearch-dashboards/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:5601'], 'timeout': '30'}, 'haproxy': {'opensearch-dashboards': {'enabled': True, 'mode': 'http', 'external': False, 'port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password'}, 'opensearch_dashboards_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '5601', 'listen_port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password'}}}}) 2025-03-23 22:05:55.995677 | orchestrator | 2025-03-23 22:05:55.995684 | orchestrator | TASK [haproxy-config : Add configuration for opensearch when using single external frontend] *** 2025-03-23 22:05:55.995690 | orchestrator | Sunday 23 March 2025 22:04:19 +0000 (0:00:07.139) 0:07:12.568 ********** 2025-03-23 22:05:55.995696 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'opensearch', 'value': {'container_name': 'opensearch', 'group': 'opensearch', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/opensearch:2.18.0.20241206', 'environment': {'OPENSEARCH_JAVA_OPTS': '-Xms1g -Xmx1g -Dlog4j2.formatMsgNoLookups=true'}, 'volumes': ['/etc/kolla/opensearch/:/var/lib/kolla/config_files/', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'opensearch:/var/lib/opensearch/data', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:9200'], 'timeout': '30'}, 'haproxy': {'opensearch': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9200', 'frontend_http_extra': ['option dontlog-normal']}}}})  2025-03-23 22:05:55.995717 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'opensearch-dashboards', 'value': {'container_name': 'opensearch_dashboards', 'group': 'opensearch-dashboards', 'enabled': True, 'environment': {'OPENSEARCH_DASHBOARDS_SECURITY_PLUGIN': 'False'}, 'image': 'registry.osism.tech/kolla/release/opensearch-dashboards:2.18.0.20241206', 'volumes': ['/etc/kolla/opensearch-dashboards/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:5601'], 'timeout': '30'}, 'haproxy': {'opensearch-dashboards': {'enabled': True, 'mode': 'http', 'external': False, 'port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password'}, 'opensearch_dashboards_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '5601', 
'listen_port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password'}}}})  2025-03-23 22:05:55.995724 | orchestrator | skipping: [testbed-node-0] 2025-03-23 22:05:55.995731 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'opensearch', 'value': {'container_name': 'opensearch', 'group': 'opensearch', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/opensearch:2.18.0.20241206', 'environment': {'OPENSEARCH_JAVA_OPTS': '-Xms1g -Xmx1g -Dlog4j2.formatMsgNoLookups=true'}, 'volumes': ['/etc/kolla/opensearch/:/var/lib/kolla/config_files/', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'opensearch:/var/lib/opensearch/data', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:9200'], 'timeout': '30'}, 'haproxy': {'opensearch': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9200', 'frontend_http_extra': ['option dontlog-normal']}}}})  2025-03-23 22:05:55.995738 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'opensearch-dashboards', 'value': {'container_name': 'opensearch_dashboards', 'group': 'opensearch-dashboards', 'enabled': True, 'environment': {'OPENSEARCH_DASHBOARDS_SECURITY_PLUGIN': 'False'}, 'image': 'registry.osism.tech/kolla/release/opensearch-dashboards:2.18.0.20241206', 'volumes': ['/etc/kolla/opensearch-dashboards/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:5601'], 'timeout': '30'}, 'haproxy': {'opensearch-dashboards': {'enabled': True, 'mode': 'http', 'external': False, 'port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password'}, 'opensearch_dashboards_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '5601', 'listen_port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password'}}}})  2025-03-23 22:05:55.995748 | orchestrator | skipping: [testbed-node-1] 2025-03-23 22:05:55.995755 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'opensearch', 'value': {'container_name': 'opensearch', 'group': 'opensearch', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/opensearch:2.18.0.20241206', 'environment': {'OPENSEARCH_JAVA_OPTS': '-Xms1g -Xmx1g -Dlog4j2.formatMsgNoLookups=true'}, 'volumes': ['/etc/kolla/opensearch/:/var/lib/kolla/config_files/', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'opensearch:/var/lib/opensearch/data', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:9200'], 'timeout': '30'}, 'haproxy': {'opensearch': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9200', 'frontend_http_extra': ['option dontlog-normal']}}}})  2025-03-23 22:05:55.995775 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'opensearch-dashboards', 'value': {'container_name': 'opensearch_dashboards', 'group': 'opensearch-dashboards', 'enabled': True, 'environment': {'OPENSEARCH_DASHBOARDS_SECURITY_PLUGIN': 'False'}, 'image': 'registry.osism.tech/kolla/release/opensearch-dashboards:2.18.0.20241206', 'volumes': ['/etc/kolla/opensearch-dashboards/:/var/lib/kolla/config_files/:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:5601'], 'timeout': '30'}, 'haproxy': {'opensearch-dashboards': {'enabled': True, 'mode': 'http', 'external': False, 'port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password'}, 'opensearch_dashboards_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '5601', 'listen_port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password'}}}})  2025-03-23 22:05:55.995783 | orchestrator | skipping: [testbed-node-2] 2025-03-23 22:05:55.995789 | orchestrator | 2025-03-23 22:05:55.995796 | orchestrator | TASK [haproxy-config : Configuring firewall for opensearch] ******************** 2025-03-23 22:05:55.995802 | orchestrator | Sunday 23 March 2025 22:04:20 +0000 (0:00:00.960) 0:07:13.528 ********** 2025-03-23 22:05:55.995808 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'opensearch', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9200', 'frontend_http_extra': ['option dontlog-normal']}})  2025-03-23 22:05:55.995815 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'opensearch-dashboards', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password'}})  2025-03-23 22:05:55.995821 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'opensearch_dashboards_external', 'value': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '5601', 'listen_port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password'}})  2025-03-23 22:05:55.995828 | orchestrator | skipping: [testbed-node-0] 2025-03-23 22:05:55.995834 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'opensearch', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9200', 'frontend_http_extra': ['option dontlog-normal']}})  2025-03-23 22:05:55.995840 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'opensearch-dashboards', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password'}})  2025-03-23 22:05:55.995847 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'opensearch_dashboards_external', 'value': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '5601', 'listen_port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password'}})  2025-03-23 22:05:55.995857 | orchestrator | skipping: [testbed-node-1] 2025-03-23 22:05:55.995866 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'opensearch', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9200', 'frontend_http_extra': ['option dontlog-normal']}})  2025-03-23 22:05:55.995872 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'opensearch-dashboards', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password'}})  2025-03-23 22:05:55.995879 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'opensearch_dashboards_external', 'value': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '5601', 'listen_port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password'}})  
2025-03-23 22:05:55.995885 | orchestrator | skipping: [testbed-node-2] 2025-03-23 22:05:55.995892 | orchestrator | 2025-03-23 22:05:55.995898 | orchestrator | TASK [proxysql-config : Copying over opensearch ProxySQL users config] ********* 2025-03-23 22:05:55.995904 | orchestrator | Sunday 23 March 2025 22:04:22 +0000 (0:00:01.469) 0:07:14.998 ********** 2025-03-23 22:05:55.995911 | orchestrator | skipping: [testbed-node-0] 2025-03-23 22:05:55.995917 | orchestrator | skipping: [testbed-node-1] 2025-03-23 22:05:55.995923 | orchestrator | skipping: [testbed-node-2] 2025-03-23 22:05:55.995930 | orchestrator | 2025-03-23 22:05:55.995936 | orchestrator | TASK [proxysql-config : Copying over opensearch ProxySQL rules config] ********* 2025-03-23 22:05:55.995942 | orchestrator | Sunday 23 March 2025 22:04:22 +0000 (0:00:00.462) 0:07:15.460 ********** 2025-03-23 22:05:55.995948 | orchestrator | skipping: [testbed-node-0] 2025-03-23 22:05:55.995955 | orchestrator | skipping: [testbed-node-1] 2025-03-23 22:05:55.995961 | orchestrator | skipping: [testbed-node-2] 2025-03-23 22:05:55.995967 | orchestrator | 2025-03-23 22:05:55.995973 | orchestrator | TASK [include_role : prometheus] *********************************************** 2025-03-23 22:05:55.995980 | orchestrator | Sunday 23 March 2025 22:04:24 +0000 (0:00:01.928) 0:07:17.388 ********** 2025-03-23 22:05:55.995986 | orchestrator | included: prometheus for testbed-node-0, testbed-node-1, testbed-node-2 2025-03-23 22:05:55.995992 | orchestrator | 2025-03-23 22:05:55.995998 | orchestrator | TASK [haproxy-config : Copying over prometheus haproxy config] ***************** 2025-03-23 22:05:55.996005 | orchestrator | Sunday 23 March 2025 22:04:26 +0000 (0:00:02.051) 0:07:19.440 ********** 2025-03-23 22:05:55.996025 | orchestrator | changed: [testbed-node-0] => (item={'key': 'prometheus-server', 'value': {'container_name': 'prometheus_server', 'group': 'prometheus', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/prometheus-v2-server:2.50.1.20241206', 'volumes': ['/etc/kolla/prometheus-server/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'prometheus_v2:/var/lib/prometheus', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'prometheus_server': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9091', 'active_passive': True}, 'prometheus_server_external': {'enabled': False, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9091', 'listen_port': '9091', 'active_passive': True}}}}) 2025-03-23 22:05:55.996033 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/prometheus-node-exporter:1.7.0.20241206', 'pid_mode': 'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}})  2025-03-23 22:05:55.996044 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'prometheus-mysqld-exporter', 'value': {'container_name': 'prometheus_mysqld_exporter', 'group': 'prometheus-mysqld-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/prometheus-mysqld-exporter:0.15.1.20241206', 'volumes': ['/etc/kolla/prometheus-mysqld-exporter/:/var/lib/kolla/config_files/:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2025-03-23 22:05:55.996050 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'prometheus-memcached-exporter', 'value': {'container_name': 'prometheus_memcached_exporter', 'group': 'prometheus-memcached-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/prometheus-memcached-exporter:0.14.2.20241206', 'volumes': ['/etc/kolla/prometheus-memcached-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2025-03-23 22:05:55.996057 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/prometheus-cadvisor:0.49.1.20241206', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}})  2025-03-23 22:05:55.996064 | orchestrator | changed: [testbed-node-1] => (item={'key': 'prometheus-server', 'value': {'container_name': 'prometheus_server', 'group': 'prometheus', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/prometheus-v2-server:2.50.1.20241206', 'volumes': ['/etc/kolla/prometheus-server/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'prometheus_v2:/var/lib/prometheus', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'prometheus_server': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9091', 'active_passive': True}, 'prometheus_server_external': {'enabled': False, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9091', 'listen_port': '9091', 'active_passive': True}}}}) 2025-03-23 22:05:55.996070 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/prometheus-node-exporter:1.7.0.20241206', 'pid_mode': 'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}})  2025-03-23 22:05:55.996090 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'prometheus-mysqld-exporter', 'value': {'container_name': 'prometheus_mysqld_exporter', 'group': 'prometheus-mysqld-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/prometheus-mysqld-exporter:0.15.1.20241206', 'volumes': ['/etc/kolla/prometheus-mysqld-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2025-03-23 22:05:55.996098 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'prometheus-memcached-exporter', 'value': {'container_name': 'prometheus_memcached_exporter', 'group': 'prometheus-memcached-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/prometheus-memcached-exporter:0.14.2.20241206', 'volumes': 
['/etc/kolla/prometheus-memcached-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2025-03-23 22:05:55.996108 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/prometheus-cadvisor:0.49.1.20241206', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}})  2025-03-23 22:05:55.996115 | orchestrator | changed: [testbed-node-2] => (item={'key': 'prometheus-server', 'value': {'container_name': 'prometheus_server', 'group': 'prometheus', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/prometheus-v2-server:2.50.1.20241206', 'volumes': ['/etc/kolla/prometheus-server/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'prometheus_v2:/var/lib/prometheus', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'prometheus_server': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9091', 'active_passive': True}, 'prometheus_server_external': {'enabled': False, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9091', 'listen_port': '9091', 'active_passive': True}}}}) 2025-03-23 22:05:55.996121 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/prometheus-node-exporter:1.7.0.20241206', 'pid_mode': 'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}})  2025-03-23 22:05:55.996128 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'prometheus-mysqld-exporter', 'value': {'container_name': 'prometheus_mysqld_exporter', 'group': 'prometheus-mysqld-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/prometheus-mysqld-exporter:0.15.1.20241206', 'volumes': ['/etc/kolla/prometheus-mysqld-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2025-03-23 22:05:55.996134 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'prometheus-memcached-exporter', 'value': {'container_name': 'prometheus_memcached_exporter', 'group': 'prometheus-memcached-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/prometheus-memcached-exporter:0.14.2.20241206', 'volumes': ['/etc/kolla/prometheus-memcached-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2025-03-23 22:05:55.996154 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 
'registry.osism.tech/kolla/release/prometheus-cadvisor:0.49.1.20241206', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}})  2025-03-23 22:05:55.996161 | orchestrator | changed: [testbed-node-0] => (item={'key': 'prometheus-alertmanager', 'value': {'container_name': 'prometheus_alertmanager', 'group': 'prometheus-alertmanager', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/prometheus-alertmanager:0.27.0.20241206', 'volumes': ['/etc/kolla/prometheus-alertmanager/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'prometheus:/var/lib/prometheus'], 'dimensions': {}, 'haproxy': {'prometheus_alertmanager': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9093', 'auth_user': 'admin', 'auth_pass': 'BXo64rLqmF7bTbWLDOnNJlD0qJ4BSTWocNHVNKU2', 'active_passive': True}, 'prometheus_alertmanager_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9093', 'listen_port': '9093', 'auth_user': 'admin', 'auth_pass': 'BXo64rLqmF7bTbWLDOnNJlD0qJ4BSTWocNHVNKU2', 'active_passive': True}}}}) 2025-03-23 22:05:55.996173 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'prometheus-openstack-exporter', 'value': {'container_name': 'prometheus_openstack_exporter', 'group': 'prometheus-openstack-exporter', 'enabled': False, 'environment': {'OS_COMPUTE_API_VERSION': 'latest'}, 'image': 'registry.osism.tech/kolla/release/prometheus-openstack-exporter:8.1.0.20241206', 'volumes': ['/etc/kolla/prometheus-openstack-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'prometheus_openstack_exporter': {'enabled': False, 'mode': 'http', 'external': False, 'port': '9198', 'backend_http_extra': ['timeout server 45s']}, 'prometheus_openstack_exporter_external': {'enabled': False, 'mode': 'http', 'external': True, 'port': '9198', 'backend_http_extra': ['timeout server 45s']}}}})  2025-03-23 22:05:55.996180 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'prometheus-elasticsearch-exporter', 'value': {'container_name': 'prometheus_elasticsearch_exporter', 'group': 'prometheus-elasticsearch-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/prometheus-elasticsearch-exporter:1.7.0.20241206', 'volumes': ['/etc/kolla/prometheus-elasticsearch-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2025-03-23 22:05:55.996187 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'prometheus-blackbox-exporter', 'value': {'container_name': 'prometheus_blackbox_exporter', 'group': 'prometheus-blackbox-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/prometheus-blackbox-exporter:0.24.0.20241206', 'volumes': ['/etc/kolla/prometheus-blackbox-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2025-03-23 22:05:55.996193 | orchestrator | skipping: [testbed-node-0] => (item={'key': 
'prometheus-libvirt-exporter', 'value': {'container_name': 'prometheus_libvirt_exporter', 'group': 'prometheus-libvirt-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/prometheus-libvirt-exporter:8.1.0.20241206', 'volumes': ['/etc/kolla/prometheus-libvirt-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/run/libvirt:/run/libvirt:ro'], 'dimensions': {}}})  2025-03-23 22:05:55.996212 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'prometheus-msteams', 'value': {'container_name': 'prometheus_msteams', 'group': 'prometheus-msteams', 'enabled': False, 'environment': {'http_proxy': '', 'https_proxy': '', 'no_proxy': 'localhost,127.0.0.1,192.168.16.10,192.168.16.9'}, 'image': 'index.docker.io/kolla/release/prometheus-msteams:2.50.1.20241206', 'volumes': ['/etc/kolla/prometheus-msteams/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2025-03-23 22:05:55.996220 | orchestrator | changed: [testbed-node-1] => (item={'key': 'prometheus-alertmanager', 'value': {'container_name': 'prometheus_alertmanager', 'group': 'prometheus-alertmanager', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/prometheus-alertmanager:0.27.0.20241206', 'volumes': ['/etc/kolla/prometheus-alertmanager/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'prometheus:/var/lib/prometheus'], 'dimensions': {}, 'haproxy': {'prometheus_alertmanager': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9093', 'auth_user': 'admin', 'auth_pass': 'BXo64rLqmF7bTbWLDOnNJlD0qJ4BSTWocNHVNKU2', 'active_passive': True}, 'prometheus_alertmanager_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9093', 'listen_port': '9093', 'auth_user': 'admin', 'auth_pass': 'BXo64rLqmF7bTbWLDOnNJlD0qJ4BSTWocNHVNKU2', 'active_passive': True}}}}) 2025-03-23 22:05:55.996230 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'prometheus-openstack-exporter', 'value': {'container_name': 'prometheus_openstack_exporter', 'group': 'prometheus-openstack-exporter', 'enabled': False, 'environment': {'OS_COMPUTE_API_VERSION': 'latest'}, 'image': 'registry.osism.tech/kolla/release/prometheus-openstack-exporter:8.1.0.20241206', 'volumes': ['/etc/kolla/prometheus-openstack-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'prometheus_openstack_exporter': {'enabled': False, 'mode': 'http', 'external': False, 'port': '9198', 'backend_http_extra': ['timeout server 45s']}, 'prometheus_openstack_exporter_external': {'enabled': False, 'mode': 'http', 'external': True, 'port': '9198', 'backend_http_extra': ['timeout server 45s']}}}})  2025-03-23 22:05:55.996237 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'prometheus-elasticsearch-exporter', 'value': {'container_name': 'prometheus_elasticsearch_exporter', 'group': 'prometheus-elasticsearch-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/prometheus-elasticsearch-exporter:1.7.0.20241206', 'volumes': ['/etc/kolla/prometheus-elasticsearch-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': 
{}}})  2025-03-23 22:05:55.996244 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'prometheus-blackbox-exporter', 'value': {'container_name': 'prometheus_blackbox_exporter', 'group': 'prometheus-blackbox-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/prometheus-blackbox-exporter:0.24.0.20241206', 'volumes': ['/etc/kolla/prometheus-blackbox-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2025-03-23 22:05:55.996250 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'prometheus-libvirt-exporter', 'value': {'container_name': 'prometheus_libvirt_exporter', 'group': 'prometheus-libvirt-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/prometheus-libvirt-exporter:8.1.0.20241206', 'volumes': ['/etc/kolla/prometheus-libvirt-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/run/libvirt:/run/libvirt:ro'], 'dimensions': {}}})  2025-03-23 22:05:55.996259 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'prometheus-msteams', 'value': {'container_name': 'prometheus_msteams', 'group': 'prometheus-msteams', 'enabled': False, 'environment': {'http_proxy': '', 'https_proxy': '', 'no_proxy': 'localhost,127.0.0.1,192.168.16.11,192.168.16.9'}, 'image': 'index.docker.io/kolla/release/prometheus-msteams:2.50.1.20241206', 'volumes': ['/etc/kolla/prometheus-msteams/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2025-03-23 22:05:55.996266 | orchestrator | changed: [testbed-node-2] => (item={'key': 'prometheus-alertmanager', 'value': {'container_name': 'prometheus_alertmanager', 'group': 'prometheus-alertmanager', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/prometheus-alertmanager:0.27.0.20241206', 'volumes': ['/etc/kolla/prometheus-alertmanager/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'prometheus:/var/lib/prometheus'], 'dimensions': {}, 'haproxy': {'prometheus_alertmanager': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9093', 'auth_user': 'admin', 'auth_pass': 'BXo64rLqmF7bTbWLDOnNJlD0qJ4BSTWocNHVNKU2', 'active_passive': True}, 'prometheus_alertmanager_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9093', 'listen_port': '9093', 'auth_user': 'admin', 'auth_pass': 'BXo64rLqmF7bTbWLDOnNJlD0qJ4BSTWocNHVNKU2', 'active_passive': True}}}}) 2025-03-23 22:05:55.996276 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'prometheus-openstack-exporter', 'value': {'container_name': 'prometheus_openstack_exporter', 'group': 'prometheus-openstack-exporter', 'enabled': False, 'environment': {'OS_COMPUTE_API_VERSION': 'latest'}, 'image': 'registry.osism.tech/kolla/release/prometheus-openstack-exporter:8.1.0.20241206', 'volumes': ['/etc/kolla/prometheus-openstack-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'prometheus_openstack_exporter': {'enabled': False, 'mode': 'http', 'external': False, 'port': '9198', 'backend_http_extra': ['timeout server 45s']}, 'prometheus_openstack_exporter_external': {'enabled': False, 'mode': 'http', 
'external': True, 'port': '9198', 'backend_http_extra': ['timeout server 45s']}}}})  2025-03-23 22:05:55.996282 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'prometheus-elasticsearch-exporter', 'value': {'container_name': 'prometheus_elasticsearch_exporter', 'group': 'prometheus-elasticsearch-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/prometheus-elasticsearch-exporter:1.7.0.20241206', 'volumes': ['/etc/kolla/prometheus-elasticsearch-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2025-03-23 22:05:55.996289 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'prometheus-blackbox-exporter', 'value': {'container_name': 'prometheus_blackbox_exporter', 'group': 'prometheus-blackbox-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/prometheus-blackbox-exporter:0.24.0.20241206', 'volumes': ['/etc/kolla/prometheus-blackbox-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2025-03-23 22:05:55.996295 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'prometheus-libvirt-exporter', 'value': {'container_name': 'prometheus_libvirt_exporter', 'group': 'prometheus-libvirt-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/prometheus-libvirt-exporter:8.1.0.20241206', 'volumes': ['/etc/kolla/prometheus-libvirt-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/run/libvirt:/run/libvirt:ro'], 'dimensions': {}}})  2025-03-23 22:05:55.996302 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'prometheus-msteams', 'value': {'container_name': 'prometheus_msteams', 'group': 'prometheus-msteams', 'enabled': False, 'environment': {'http_proxy': '', 'https_proxy': '', 'no_proxy': 'localhost,127.0.0.1,192.168.16.12,192.168.16.9'}, 'image': 'index.docker.io/kolla/release/prometheus-msteams:2.50.1.20241206', 'volumes': ['/etc/kolla/prometheus-msteams/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2025-03-23 22:05:55.996308 | orchestrator | 2025-03-23 22:05:55.996318 | orchestrator | TASK [haproxy-config : Add configuration for prometheus when using single external frontend] *** 2025-03-23 22:05:55.996324 | orchestrator | Sunday 23 March 2025 22:04:32 +0000 (0:00:05.693) 0:07:25.133 ********** 2025-03-23 22:05:55.996334 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'prometheus-server', 'value': {'container_name': 'prometheus_server', 'group': 'prometheus', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/prometheus-v2-server:2.50.1.20241206', 'volumes': ['/etc/kolla/prometheus-server/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'prometheus_v2:/var/lib/prometheus', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'prometheus_server': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9091', 'active_passive': True}, 'prometheus_server_external': {'enabled': False, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9091', 'listen_port': '9091', 'active_passive': True}}}})  2025-03-23 22:05:55.996341 | orchestrator | skipping: [testbed-node-0] => (item={'key': 
'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/prometheus-node-exporter:1.7.0.20241206', 'pid_mode': 'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}})  2025-03-23 22:05:55.996347 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'prometheus-mysqld-exporter', 'value': {'container_name': 'prometheus_mysqld_exporter', 'group': 'prometheus-mysqld-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/prometheus-mysqld-exporter:0.15.1.20241206', 'volumes': ['/etc/kolla/prometheus-mysqld-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2025-03-23 22:05:55.996354 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'prometheus-memcached-exporter', 'value': {'container_name': 'prometheus_memcached_exporter', 'group': 'prometheus-memcached-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/prometheus-memcached-exporter:0.14.2.20241206', 'volumes': ['/etc/kolla/prometheus-memcached-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2025-03-23 22:05:55.996360 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/prometheus-cadvisor:0.49.1.20241206', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}})  2025-03-23 22:05:55.996367 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'prometheus-alertmanager', 'value': {'container_name': 'prometheus_alertmanager', 'group': 'prometheus-alertmanager', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/prometheus-alertmanager:0.27.0.20241206', 'volumes': ['/etc/kolla/prometheus-alertmanager/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'prometheus:/var/lib/prometheus'], 'dimensions': {}, 'haproxy': {'prometheus_alertmanager': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9093', 'auth_user': 'admin', 'auth_pass': 'BXo64rLqmF7bTbWLDOnNJlD0qJ4BSTWocNHVNKU2', 'active_passive': True}, 'prometheus_alertmanager_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9093', 'listen_port': '9093', 'auth_user': 'admin', 'auth_pass': 'BXo64rLqmF7bTbWLDOnNJlD0qJ4BSTWocNHVNKU2', 'active_passive': True}}}})  2025-03-23 22:05:55.996380 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'prometheus-openstack-exporter', 'value': {'container_name': 'prometheus_openstack_exporter', 'group': 'prometheus-openstack-exporter', 'enabled': False, 'environment': {'OS_COMPUTE_API_VERSION': 'latest'}, 'image': 
'registry.osism.tech/kolla/release/prometheus-openstack-exporter:8.1.0.20241206', 'volumes': ['/etc/kolla/prometheus-openstack-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'prometheus_openstack_exporter': {'enabled': False, 'mode': 'http', 'external': False, 'port': '9198', 'backend_http_extra': ['timeout server 45s']}, 'prometheus_openstack_exporter_external': {'enabled': False, 'mode': 'http', 'external': True, 'port': '9198', 'backend_http_extra': ['timeout server 45s']}}}})  2025-03-23 22:05:55.996387 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'prometheus-elasticsearch-exporter', 'value': {'container_name': 'prometheus_elasticsearch_exporter', 'group': 'prometheus-elasticsearch-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/prometheus-elasticsearch-exporter:1.7.0.20241206', 'volumes': ['/etc/kolla/prometheus-elasticsearch-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2025-03-23 22:05:55.996393 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'prometheus-blackbox-exporter', 'value': {'container_name': 'prometheus_blackbox_exporter', 'group': 'prometheus-blackbox-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/prometheus-blackbox-exporter:0.24.0.20241206', 'volumes': ['/etc/kolla/prometheus-blackbox-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2025-03-23 22:05:55.996400 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'prometheus-libvirt-exporter', 'value': {'container_name': 'prometheus_libvirt_exporter', 'group': 'prometheus-libvirt-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/prometheus-libvirt-exporter:8.1.0.20241206', 'volumes': ['/etc/kolla/prometheus-libvirt-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/run/libvirt:/run/libvirt:ro'], 'dimensions': {}}})  2025-03-23 22:05:55.996406 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'prometheus-msteams', 'value': {'container_name': 'prometheus_msteams', 'group': 'prometheus-msteams', 'enabled': False, 'environment': {'http_proxy': '', 'https_proxy': '', 'no_proxy': 'localhost,127.0.0.1,192.168.16.10,192.168.16.9'}, 'image': 'index.docker.io/kolla/release/prometheus-msteams:2.50.1.20241206', 'volumes': ['/etc/kolla/prometheus-msteams/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2025-03-23 22:05:55.996413 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'prometheus-server', 'value': {'container_name': 'prometheus_server', 'group': 'prometheus', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/prometheus-v2-server:2.50.1.20241206', 'volumes': ['/etc/kolla/prometheus-server/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'prometheus_v2:/var/lib/prometheus', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'prometheus_server': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9091', 'active_passive': True}, 'prometheus_server_external': {'enabled': False, 'mode': 
'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9091', 'listen_port': '9091', 'active_passive': True}}}})  2025-03-23 22:05:55.996419 | orchestrator | skipping: [testbed-node-0] 2025-03-23 22:05:55.996431 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/prometheus-node-exporter:1.7.0.20241206', 'pid_mode': 'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}})  2025-03-23 22:05:55.996438 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'prometheus-mysqld-exporter', 'value': {'container_name': 'prometheus_mysqld_exporter', 'group': 'prometheus-mysqld-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/prometheus-mysqld-exporter:0.15.1.20241206', 'volumes': ['/etc/kolla/prometheus-mysqld-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2025-03-23 22:05:55.996445 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'prometheus-server', 'value': {'container_name': 'prometheus_server', 'group': 'prometheus', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/prometheus-v2-server:2.50.1.20241206', 'volumes': ['/etc/kolla/prometheus-server/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'prometheus_v2:/var/lib/prometheus', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'prometheus_server': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9091', 'active_passive': True}, 'prometheus_server_external': {'enabled': False, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9091', 'listen_port': '9091', 'active_passive': True}}}})  2025-03-23 22:05:55.996451 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'prometheus-memcached-exporter', 'value': {'container_name': 'prometheus_memcached_exporter', 'group': 'prometheus-memcached-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/prometheus-memcached-exporter:0.14.2.20241206', 'volumes': ['/etc/kolla/prometheus-memcached-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2025-03-23 22:05:55.996458 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/prometheus-cadvisor:0.49.1.20241206', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}})  2025-03-23 22:05:55.996465 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 
'registry.osism.tech/kolla/release/prometheus-node-exporter:1.7.0.20241206', 'pid_mode': 'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}})  2025-03-23 22:05:55.996473 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'prometheus-alertmanager', 'value': {'container_name': 'prometheus_alertmanager', 'group': 'prometheus-alertmanager', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/prometheus-alertmanager:0.27.0.20241206', 'volumes': ['/etc/kolla/prometheus-alertmanager/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'prometheus:/var/lib/prometheus'], 'dimensions': {}, 'haproxy': {'prometheus_alertmanager': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9093', 'auth_user': 'admin', 'auth_pass': 'BXo64rLqmF7bTbWLDOnNJlD0qJ4BSTWocNHVNKU2', 'active_passive': True}, 'prometheus_alertmanager_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9093', 'listen_port': '9093', 'auth_user': 'admin', 'auth_pass': 'BXo64rLqmF7bTbWLDOnNJlD0qJ4BSTWocNHVNKU2', 'active_passive': True}}}})  2025-03-23 22:05:55.996485 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'prometheus-mysqld-exporter', 'value': {'container_name': 'prometheus_mysqld_exporter', 'group': 'prometheus-mysqld-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/prometheus-mysqld-exporter:0.15.1.20241206', 'volumes': ['/etc/kolla/prometheus-mysqld-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2025-03-23 22:05:55.996491 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'prometheus-openstack-exporter', 'value': {'container_name': 'prometheus_openstack_exporter', 'group': 'prometheus-openstack-exporter', 'enabled': False, 'environment': {'OS_COMPUTE_API_VERSION': 'latest'}, 'image': 'registry.osism.tech/kolla/release/prometheus-openstack-exporter:8.1.0.20241206', 'volumes': ['/etc/kolla/prometheus-openstack-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'prometheus_openstack_exporter': {'enabled': False, 'mode': 'http', 'external': False, 'port': '9198', 'backend_http_extra': ['timeout server 45s']}, 'prometheus_openstack_exporter_external': {'enabled': False, 'mode': 'http', 'external': True, 'port': '9198', 'backend_http_extra': ['timeout server 45s']}}}})  2025-03-23 22:05:55.996498 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'prometheus-memcached-exporter', 'value': {'container_name': 'prometheus_memcached_exporter', 'group': 'prometheus-memcached-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/prometheus-memcached-exporter:0.14.2.20241206', 'volumes': ['/etc/kolla/prometheus-memcached-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2025-03-23 22:05:55.996505 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'prometheus-elasticsearch-exporter', 'value': {'container_name': 'prometheus_elasticsearch_exporter', 'group': 
'prometheus-elasticsearch-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/prometheus-elasticsearch-exporter:1.7.0.20241206', 'volumes': ['/etc/kolla/prometheus-elasticsearch-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2025-03-23 22:05:55.996511 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/prometheus-cadvisor:0.49.1.20241206', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}})  2025-03-23 22:05:55.996518 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'prometheus-alertmanager', 'value': {'container_name': 'prometheus_alertmanager', 'group': 'prometheus-alertmanager', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/prometheus-alertmanager:0.27.0.20241206', 'volumes': ['/etc/kolla/prometheus-alertmanager/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'prometheus:/var/lib/prometheus'], 'dimensions': {}, 'haproxy': {'prometheus_alertmanager': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9093', 'auth_user': 'admin', 'auth_pass': 'BXo64rLqmF7bTbWLDOnNJlD0qJ4BSTWocNHVNKU2', 'active_passive': True}, 'prometheus_alertmanager_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9093', 'listen_port': '9093', 'auth_user': 'admin', 'auth_pass': 'BXo64rLqmF7bTbWLDOnNJlD0qJ4BSTWocNHVNKU2', 'active_passive': True}}}})  2025-03-23 22:05:55.996531 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'prometheus-blackbox-exporter', 'value': {'container_name': 'prometheus_blackbox_exporter', 'group': 'prometheus-blackbox-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/prometheus-blackbox-exporter:0.24.0.20241206', 'volumes': ['/etc/kolla/prometheus-blackbox-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2025-03-23 22:05:55.996538 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'prometheus-openstack-exporter', 'value': {'container_name': 'prometheus_openstack_exporter', 'group': 'prometheus-openstack-exporter', 'enabled': False, 'environment': {'OS_COMPUTE_API_VERSION': 'latest'}, 'image': 'registry.osism.tech/kolla/release/prometheus-openstack-exporter:8.1.0.20241206', 'volumes': ['/etc/kolla/prometheus-openstack-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'prometheus_openstack_exporter': {'enabled': False, 'mode': 'http', 'external': False, 'port': '9198', 'backend_http_extra': ['timeout server 45s']}, 'prometheus_openstack_exporter_external': {'enabled': False, 'mode': 'http', 'external': True, 'port': '9198', 'backend_http_extra': ['timeout server 45s']}}}})  2025-03-23 22:05:55.996545 | orchestrator | skipping: [testbed-node-1] => 
(item={'key': 'prometheus-libvirt-exporter', 'value': {'container_name': 'prometheus_libvirt_exporter', 'group': 'prometheus-libvirt-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/prometheus-libvirt-exporter:8.1.0.20241206', 'volumes': ['/etc/kolla/prometheus-libvirt-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/run/libvirt:/run/libvirt:ro'], 'dimensions': {}}})  2025-03-23 22:05:55.996551 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'prometheus-elasticsearch-exporter', 'value': {'container_name': 'prometheus_elasticsearch_exporter', 'group': 'prometheus-elasticsearch-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/prometheus-elasticsearch-exporter:1.7.0.20241206', 'volumes': ['/etc/kolla/prometheus-elasticsearch-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2025-03-23 22:05:55.996590 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'prometheus-msteams', 'value': {'container_name': 'prometheus_msteams', 'group': 'prometheus-msteams', 'enabled': False, 'environment': {'http_proxy': '', 'https_proxy': '', 'no_proxy': 'localhost,127.0.0.1,192.168.16.11,192.168.16.9'}, 'image': 'index.docker.io/kolla/release/prometheus-msteams:2.50.1.20241206', 'volumes': ['/etc/kolla/prometheus-msteams/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2025-03-23 22:05:55.996597 | orchestrator | skipping: [testbed-node-1] 2025-03-23 22:05:55.996604 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'prometheus-blackbox-exporter', 'value': {'container_name': 'prometheus_blackbox_exporter', 'group': 'prometheus-blackbox-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/prometheus-blackbox-exporter:0.24.0.20241206', 'volumes': ['/etc/kolla/prometheus-blackbox-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2025-03-23 22:05:55.996621 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'prometheus-libvirt-exporter', 'value': {'container_name': 'prometheus_libvirt_exporter', 'group': 'prometheus-libvirt-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/prometheus-libvirt-exporter:8.1.0.20241206', 'volumes': ['/etc/kolla/prometheus-libvirt-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/run/libvirt:/run/libvirt:ro'], 'dimensions': {}}})  2025-03-23 22:05:55.996634 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'prometheus-msteams', 'value': {'container_name': 'prometheus_msteams', 'group': 'prometheus-msteams', 'enabled': False, 'environment': {'http_proxy': '', 'https_proxy': '', 'no_proxy': 'localhost,127.0.0.1,192.168.16.12,192.168.16.9'}, 'image': 'index.docker.io/kolla/release/prometheus-msteams:2.50.1.20241206', 'volumes': ['/etc/kolla/prometheus-msteams/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2025-03-23 22:05:55.996641 | orchestrator | skipping: [testbed-node-2] 2025-03-23 22:05:55.996647 | orchestrator | 2025-03-23 22:05:55.996654 | orchestrator | TASK [haproxy-config : 
Configuring firewall for prometheus] ******************** 2025-03-23 22:05:55.996660 | orchestrator | Sunday 23 March 2025 22:04:33 +0000 (0:00:01.381) 0:07:26.515 ********** 2025-03-23 22:05:55.996667 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'prometheus_server', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9091', 'active_passive': True}})  2025-03-23 22:05:55.996673 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'prometheus_server_external', 'value': {'enabled': False, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9091', 'listen_port': '9091', 'active_passive': True}})  2025-03-23 22:05:55.996680 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'prometheus_alertmanager', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9093', 'auth_user': 'admin', 'auth_pass': 'BXo64rLqmF7bTbWLDOnNJlD0qJ4BSTWocNHVNKU2', 'active_passive': True}})  2025-03-23 22:05:55.996687 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'prometheus_alertmanager_external', 'value': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9093', 'listen_port': '9093', 'auth_user': 'admin', 'auth_pass': 'BXo64rLqmF7bTbWLDOnNJlD0qJ4BSTWocNHVNKU2', 'active_passive': True}})  2025-03-23 22:05:55.996693 | orchestrator | skipping: [testbed-node-0] 2025-03-23 22:05:55.996700 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'prometheus_server', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9091', 'active_passive': True}})  2025-03-23 22:05:55.996707 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'prometheus_server_external', 'value': {'enabled': False, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9091', 'listen_port': '9091', 'active_passive': True}})  2025-03-23 22:05:55.996714 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'prometheus_alertmanager', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9093', 'auth_user': 'admin', 'auth_pass': 'BXo64rLqmF7bTbWLDOnNJlD0qJ4BSTWocNHVNKU2', 'active_passive': True}})  2025-03-23 22:05:55.996720 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'prometheus_alertmanager_external', 'value': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9093', 'listen_port': '9093', 'auth_user': 'admin', 'auth_pass': 'BXo64rLqmF7bTbWLDOnNJlD0qJ4BSTWocNHVNKU2', 'active_passive': True}})  2025-03-23 22:05:55.996730 | orchestrator | skipping: [testbed-node-1] 2025-03-23 22:05:55.996737 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'prometheus_server', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9091', 'active_passive': True}})  2025-03-23 22:05:55.996746 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'prometheus_server_external', 'value': {'enabled': False, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9091', 'listen_port': '9091', 'active_passive': True}})  2025-03-23 22:05:55.996752 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'prometheus_alertmanager', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9093', 'auth_user': 'admin', 'auth_pass': 'BXo64rLqmF7bTbWLDOnNJlD0qJ4BSTWocNHVNKU2', 'active_passive': True}})  2025-03-23 22:05:55.996759 | orchestrator | skipping: [testbed-node-2] => (item={'key': 
'prometheus_alertmanager_external', 'value': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9093', 'listen_port': '9093', 'auth_user': 'admin', 'auth_pass': 'BXo64rLqmF7bTbWLDOnNJlD0qJ4BSTWocNHVNKU2', 'active_passive': True}})  2025-03-23 22:05:55.996766 | orchestrator | skipping: [testbed-node-2] 2025-03-23 22:05:55.996772 | orchestrator | 2025-03-23 22:05:55.996778 | orchestrator | TASK [proxysql-config : Copying over prometheus ProxySQL users config] ********* 2025-03-23 22:05:55.996785 | orchestrator | Sunday 23 March 2025 22:04:35 +0000 (0:00:02.068) 0:07:28.584 ********** 2025-03-23 22:05:55.996793 | orchestrator | skipping: [testbed-node-0] 2025-03-23 22:05:55.996800 | orchestrator | skipping: [testbed-node-1] 2025-03-23 22:05:55.996809 | orchestrator | skipping: [testbed-node-2] 2025-03-23 22:05:55.996815 | orchestrator | 2025-03-23 22:05:55.996821 | orchestrator | TASK [proxysql-config : Copying over prometheus ProxySQL rules config] ********* 2025-03-23 22:05:55.996828 | orchestrator | Sunday 23 March 2025 22:04:36 +0000 (0:00:00.821) 0:07:29.405 ********** 2025-03-23 22:05:55.996834 | orchestrator | skipping: [testbed-node-0] 2025-03-23 22:05:55.996840 | orchestrator | skipping: [testbed-node-1] 2025-03-23 22:05:55.996847 | orchestrator | skipping: [testbed-node-2] 2025-03-23 22:05:55.996853 | orchestrator | 2025-03-23 22:05:55.996859 | orchestrator | TASK [include_role : rabbitmq] ************************************************* 2025-03-23 22:05:55.996865 | orchestrator | Sunday 23 March 2025 22:04:38 +0000 (0:00:02.056) 0:07:31.461 ********** 2025-03-23 22:05:55.996872 | orchestrator | included: rabbitmq for testbed-node-0, testbed-node-1, testbed-node-2 2025-03-23 22:05:55.996878 | orchestrator | 2025-03-23 22:05:55.996884 | orchestrator | TASK [haproxy-config : Copying over rabbitmq haproxy config] ******************* 2025-03-23 22:05:55.996891 | orchestrator | Sunday 23 March 2025 22:04:40 +0000 (0:00:01.986) 0:07:33.448 ********** 2025-03-23 22:05:55.996897 | orchestrator | changed: [testbed-node-1] => (item={'key': 'rabbitmq', 'value': {'container_name': 'rabbitmq', 'group': None, 'enabled': True, 'image': 'registry.osism.tech/kolla/release/rabbitmq:3.13.7.20241206', 'bootstrap_environment': {'KOLLA_BOOTSTRAP': None, 'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': None, 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': None, 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'volumes': ['/etc/kolla/rabbitmq/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'rabbitmq:/var/lib/rabbitmq/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_rabbitmq'], 'timeout': '30'}, 'haproxy': {'rabbitmq_management': {'enabled': 'yes', 'mode': 'http', 'port': '15672', 'host_group': 'rabbitmq'}}}}) 2025-03-23 22:05:55.996904 | orchestrator | changed: [testbed-node-0] => (item={'key': 'rabbitmq', 'value': {'container_name': 'rabbitmq', 'group': None, 'enabled': True, 'image': 'registry.osism.tech/kolla/release/rabbitmq:3.13.7.20241206', 'bootstrap_environment': {'KOLLA_BOOTSTRAP': None, 'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': None, 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 
'RABBITMQ_CLUSTER_COOKIE': None, 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'volumes': ['/etc/kolla/rabbitmq/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'rabbitmq:/var/lib/rabbitmq/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_rabbitmq'], 'timeout': '30'}, 'haproxy': {'rabbitmq_management': {'enabled': 'yes', 'mode': 'http', 'port': '15672', 'host_group': 'rabbitmq'}}}}) 2025-03-23 22:05:55.996917 | orchestrator | changed: [testbed-node-2] => (item={'key': 'rabbitmq', 'value': {'container_name': 'rabbitmq', 'group': None, 'enabled': True, 'image': 'registry.osism.tech/kolla/release/rabbitmq:3.13.7.20241206', 'bootstrap_environment': {'KOLLA_BOOTSTRAP': None, 'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': None, 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': None, 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'volumes': ['/etc/kolla/rabbitmq/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'rabbitmq:/var/lib/rabbitmq/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_rabbitmq'], 'timeout': '30'}, 'haproxy': {'rabbitmq_management': {'enabled': 'yes', 'mode': 'http', 'port': '15672', 'host_group': 'rabbitmq'}}}}) 2025-03-23 22:05:55.996928 | orchestrator | 2025-03-23 22:05:55.996935 | orchestrator | TASK [haproxy-config : Add configuration for rabbitmq when using single external frontend] *** 2025-03-23 22:05:55.996941 | orchestrator | Sunday 23 March 2025 22:04:43 +0000 (0:00:03.327) 0:07:36.776 ********** 2025-03-23 22:05:55.996950 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'rabbitmq', 'value': {'container_name': 'rabbitmq', 'group': None, 'enabled': True, 'image': 'registry.osism.tech/kolla/release/rabbitmq:3.13.7.20241206', 'bootstrap_environment': {'KOLLA_BOOTSTRAP': None, 'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': None, 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': None, 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'volumes': ['/etc/kolla/rabbitmq/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'rabbitmq:/var/lib/rabbitmq/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_rabbitmq'], 'timeout': '30'}, 'haproxy': {'rabbitmq_management': {'enabled': 'yes', 'mode': 'http', 'port': '15672', 'host_group': 'rabbitmq'}}}})  2025-03-23 22:05:55.996957 | orchestrator | skipping: [testbed-node-0] 2025-03-23 22:05:55.996964 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'rabbitmq', 'value': {'container_name': 'rabbitmq', 'group': None, 'enabled': True, 'image': 'registry.osism.tech/kolla/release/rabbitmq:3.13.7.20241206', 'bootstrap_environment': {'KOLLA_BOOTSTRAP': None, 'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': None, 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': None, 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'volumes': 
['/etc/kolla/rabbitmq/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'rabbitmq:/var/lib/rabbitmq/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_rabbitmq'], 'timeout': '30'}, 'haproxy': {'rabbitmq_management': {'enabled': 'yes', 'mode': 'http', 'port': '15672', 'host_group': 'rabbitmq'}}}})  2025-03-23 22:05:55.996974 | orchestrator | skipping: [testbed-node-1] 2025-03-23 22:05:55.996980 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'rabbitmq', 'value': {'container_name': 'rabbitmq', 'group': None, 'enabled': True, 'image': 'registry.osism.tech/kolla/release/rabbitmq:3.13.7.20241206', 'bootstrap_environment': {'KOLLA_BOOTSTRAP': None, 'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': None, 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': None, 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'volumes': ['/etc/kolla/rabbitmq/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'rabbitmq:/var/lib/rabbitmq/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_rabbitmq'], 'timeout': '30'}, 'haproxy': {'rabbitmq_management': {'enabled': 'yes', 'mode': 'http', 'port': '15672', 'host_group': 'rabbitmq'}}}})  2025-03-23 22:05:55.996991 | orchestrator | skipping: [testbed-node-2] 2025-03-23 22:05:55.996998 | orchestrator | 2025-03-23 22:05:55.997004 | orchestrator | TASK [haproxy-config : Configuring firewall for rabbitmq] ********************** 2025-03-23 22:05:55.997011 | orchestrator | Sunday 23 March 2025 22:04:44 +0000 (0:00:00.508) 0:07:37.284 ********** 2025-03-23 22:05:55.997017 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'rabbitmq_management', 'value': {'enabled': 'yes', 'mode': 'http', 'port': '15672', 'host_group': 'rabbitmq'}})  2025-03-23 22:05:55.997023 | orchestrator | skipping: [testbed-node-0] 2025-03-23 22:05:55.997030 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'rabbitmq_management', 'value': {'enabled': 'yes', 'mode': 'http', 'port': '15672', 'host_group': 'rabbitmq'}})  2025-03-23 22:05:55.997036 | orchestrator | skipping: [testbed-node-1] 2025-03-23 22:05:55.997042 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'rabbitmq_management', 'value': {'enabled': 'yes', 'mode': 'http', 'port': '15672', 'host_group': 'rabbitmq'}})  2025-03-23 22:05:55.997049 | orchestrator | skipping: [testbed-node-2] 2025-03-23 22:05:55.997055 | orchestrator | 2025-03-23 22:05:55.997061 | orchestrator | TASK [proxysql-config : Copying over rabbitmq ProxySQL users config] *********** 2025-03-23 22:05:55.997068 | orchestrator | Sunday 23 March 2025 22:04:45 +0000 (0:00:01.331) 0:07:38.615 ********** 2025-03-23 22:05:55.997074 | orchestrator | skipping: [testbed-node-0] 2025-03-23 22:05:55.997080 | orchestrator | skipping: [testbed-node-1] 2025-03-23 22:05:55.997087 | orchestrator | skipping: [testbed-node-2] 2025-03-23 22:05:55.997093 | orchestrator | 2025-03-23 22:05:55.997099 | orchestrator | TASK [proxysql-config : Copying over rabbitmq ProxySQL rules config] *********** 2025-03-23 22:05:55.997107 | orchestrator | Sunday 23 March 2025 22:04:46 +0000 (0:00:00.459) 0:07:39.075 ********** 2025-03-23 22:05:55.997113 | 
orchestrator | skipping: [testbed-node-0] 2025-03-23 22:05:55.997119 | orchestrator | skipping: [testbed-node-1] 2025-03-23 22:05:55.997125 | orchestrator | skipping: [testbed-node-2] 2025-03-23 22:05:55.997131 | orchestrator | 2025-03-23 22:05:55.997137 | orchestrator | TASK [include_role : skyline] ************************************************** 2025-03-23 22:05:55.997143 | orchestrator | Sunday 23 March 2025 22:04:48 +0000 (0:00:01.915) 0:07:40.991 ********** 2025-03-23 22:05:55.997149 | orchestrator | included: skyline for testbed-node-0, testbed-node-1, testbed-node-2 2025-03-23 22:05:55.997155 | orchestrator | 2025-03-23 22:05:55.997161 | orchestrator | TASK [haproxy-config : Copying over skyline haproxy config] ******************** 2025-03-23 22:05:55.997167 | orchestrator | Sunday 23 March 2025 22:04:50 +0000 (0:00:02.133) 0:07:43.124 ********** 2025-03-23 22:05:55.997173 | orchestrator | changed: [testbed-node-0] => (item={'key': 'skyline-apiserver', 'value': {'container_name': 'skyline_apiserver', 'group': 'skyline-apiserver', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/skyline-apiserver:4.0.2.20241206', 'volumes': ['/etc/kolla/skyline-apiserver/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:9998/docs'], 'timeout': '30'}, 'haproxy': {'skyline_apiserver': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9998', 'listen_port': '9998', 'tls_backend': 'no'}, 'skyline_apiserver_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9998', 'listen_port': '9998', 'tls_backend': 'no'}}}}) 2025-03-23 22:05:55.997184 | orchestrator | changed: [testbed-node-1] => (item={'key': 'skyline-apiserver', 'value': {'container_name': 'skyline_apiserver', 'group': 'skyline-apiserver', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/skyline-apiserver:4.0.2.20241206', 'volumes': ['/etc/kolla/skyline-apiserver/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:9998/docs'], 'timeout': '30'}, 'haproxy': {'skyline_apiserver': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9998', 'listen_port': '9998', 'tls_backend': 'no'}, 'skyline_apiserver_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9998', 'listen_port': '9998', 'tls_backend': 'no'}}}}) 2025-03-23 22:05:55.997190 | orchestrator | changed: [testbed-node-2] => (item={'key': 'skyline-apiserver', 'value': {'container_name': 'skyline_apiserver', 'group': 'skyline-apiserver', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/skyline-apiserver:4.0.2.20241206', 'volumes': ['/etc/kolla/skyline-apiserver/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:9998/docs'], 'timeout': '30'}, 'haproxy': {'skyline_apiserver': {'enabled': 'yes', 'mode': 'http', 'external': 
False, 'port': '9998', 'listen_port': '9998', 'tls_backend': 'no'}, 'skyline_apiserver_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9998', 'listen_port': '9998', 'tls_backend': 'no'}}}}) 2025-03-23 22:05:55.997199 | orchestrator | changed: [testbed-node-0] => (item={'key': 'skyline-console', 'value': {'container_name': 'skyline_console', 'group': 'skyline-console', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/skyline-console:4.0.2.20241206', 'volumes': ['/etc/kolla/skyline-console/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:9999/docs'], 'timeout': '30'}, 'haproxy': {'skyline_console': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9999', 'listen_port': '9999', 'tls_backend': 'no'}, 'skyline_console_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9999', 'listen_port': '9999', 'tls_backend': 'no'}}}}) 2025-03-23 22:05:55.997206 | orchestrator | changed: [testbed-node-1] => (item={'key': 'skyline-console', 'value': {'container_name': 'skyline_console', 'group': 'skyline-console', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/skyline-console:4.0.2.20241206', 'volumes': ['/etc/kolla/skyline-console/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:9999/docs'], 'timeout': '30'}, 'haproxy': {'skyline_console': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9999', 'listen_port': '9999', 'tls_backend': 'no'}, 'skyline_console_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9999', 'listen_port': '9999', 'tls_backend': 'no'}}}}) 2025-03-23 22:05:55.997216 | orchestrator | changed: [testbed-node-2] => (item={'key': 'skyline-console', 'value': {'container_name': 'skyline_console', 'group': 'skyline-console', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/skyline-console:4.0.2.20241206', 'volumes': ['/etc/kolla/skyline-console/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:9999/docs'], 'timeout': '30'}, 'haproxy': {'skyline_console': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9999', 'listen_port': '9999', 'tls_backend': 'no'}, 'skyline_console_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9999', 'listen_port': '9999', 'tls_backend': 'no'}}}}) 2025-03-23 22:05:55.997222 | orchestrator | 2025-03-23 22:05:55.997228 | orchestrator | TASK [haproxy-config : Add configuration for skyline when using single external frontend] *** 2025-03-23 22:05:55.997234 | orchestrator | Sunday 23 March 2025 22:04:59 +0000 (0:00:08.812) 0:07:51.937 ********** 2025-03-23 22:05:55.997244 | orchestrator | skipping: [testbed-node-0] => (item={'key': 
'skyline-apiserver', 'value': {'container_name': 'skyline_apiserver', 'group': 'skyline-apiserver', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/skyline-apiserver:4.0.2.20241206', 'volumes': ['/etc/kolla/skyline-apiserver/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:9998/docs'], 'timeout': '30'}, 'haproxy': {'skyline_apiserver': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9998', 'listen_port': '9998', 'tls_backend': 'no'}, 'skyline_apiserver_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9998', 'listen_port': '9998', 'tls_backend': 'no'}}}})  2025-03-23 22:05:55.997253 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'skyline-console', 'value': {'container_name': 'skyline_console', 'group': 'skyline-console', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/skyline-console:4.0.2.20241206', 'volumes': ['/etc/kolla/skyline-console/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:9999/docs'], 'timeout': '30'}, 'haproxy': {'skyline_console': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9999', 'listen_port': '9999', 'tls_backend': 'no'}, 'skyline_console_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9999', 'listen_port': '9999', 'tls_backend': 'no'}}}})  2025-03-23 22:05:55.997259 | orchestrator | skipping: [testbed-node-0] 2025-03-23 22:05:55.997265 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'skyline-apiserver', 'value': {'container_name': 'skyline_apiserver', 'group': 'skyline-apiserver', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/skyline-apiserver:4.0.2.20241206', 'volumes': ['/etc/kolla/skyline-apiserver/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:9998/docs'], 'timeout': '30'}, 'haproxy': {'skyline_apiserver': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9998', 'listen_port': '9998', 'tls_backend': 'no'}, 'skyline_apiserver_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9998', 'listen_port': '9998', 'tls_backend': 'no'}}}})  2025-03-23 22:05:55.997275 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'skyline-console', 'value': {'container_name': 'skyline_console', 'group': 'skyline-console', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/skyline-console:4.0.2.20241206', 'volumes': ['/etc/kolla/skyline-console/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:9999/docs'], 'timeout': '30'}, 
'haproxy': {'skyline_console': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9999', 'listen_port': '9999', 'tls_backend': 'no'}, 'skyline_console_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9999', 'listen_port': '9999', 'tls_backend': 'no'}}}})  2025-03-23 22:05:55.997281 | orchestrator | skipping: [testbed-node-1] 2025-03-23 22:05:55.997291 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'skyline-apiserver', 'value': {'container_name': 'skyline_apiserver', 'group': 'skyline-apiserver', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/skyline-apiserver:4.0.2.20241206', 'volumes': ['/etc/kolla/skyline-apiserver/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:9998/docs'], 'timeout': '30'}, 'haproxy': {'skyline_apiserver': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9998', 'listen_port': '9998', 'tls_backend': 'no'}, 'skyline_apiserver_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9998', 'listen_port': '9998', 'tls_backend': 'no'}}}})  2025-03-23 22:05:55.997298 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'skyline-console', 'value': {'container_name': 'skyline_console', 'group': 'skyline-console', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/skyline-console:4.0.2.20241206', 'volumes': ['/etc/kolla/skyline-console/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:9999/docs'], 'timeout': '30'}, 'haproxy': {'skyline_console': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9999', 'listen_port': '9999', 'tls_backend': 'no'}, 'skyline_console_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9999', 'listen_port': '9999', 'tls_backend': 'no'}}}})  2025-03-23 22:05:55.997304 | orchestrator | skipping: [testbed-node-2] 2025-03-23 22:05:55.997310 | orchestrator | 2025-03-23 22:05:55.997316 | orchestrator | TASK [haproxy-config : Configuring firewall for skyline] *********************** 2025-03-23 22:05:55.997324 | orchestrator | Sunday 23 March 2025 22:05:00 +0000 (0:00:01.424) 0:07:53.362 ********** 2025-03-23 22:05:55.997330 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'skyline_apiserver', 'value': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9998', 'listen_port': '9998', 'tls_backend': 'no'}})  2025-03-23 22:05:55.997336 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'skyline_apiserver_external', 'value': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9998', 'listen_port': '9998', 'tls_backend': 'no'}})  2025-03-23 22:05:55.997346 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'skyline_console', 'value': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9999', 'listen_port': '9999', 'tls_backend': 'no'}})  2025-03-23 22:05:55.997352 | orchestrator | skipping: [testbed-node-0] => (item={'key': 
'skyline_console_external', 'value': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9999', 'listen_port': '9999', 'tls_backend': 'no'}})  2025-03-23 22:05:55.997358 | orchestrator | skipping: [testbed-node-0] 2025-03-23 22:05:55.997364 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'skyline_apiserver', 'value': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9998', 'listen_port': '9998', 'tls_backend': 'no'}})  2025-03-23 22:05:55.997370 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'skyline_apiserver_external', 'value': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9998', 'listen_port': '9998', 'tls_backend': 'no'}})  2025-03-23 22:05:55.997376 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'skyline_console', 'value': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9999', 'listen_port': '9999', 'tls_backend': 'no'}})  2025-03-23 22:05:55.997382 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'skyline_console_external', 'value': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9999', 'listen_port': '9999', 'tls_backend': 'no'}})  2025-03-23 22:05:55.997388 | orchestrator | skipping: [testbed-node-1] 2025-03-23 22:05:55.997394 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'skyline_apiserver', 'value': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9998', 'listen_port': '9998', 'tls_backend': 'no'}})  2025-03-23 22:05:55.997400 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'skyline_apiserver_external', 'value': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9998', 'listen_port': '9998', 'tls_backend': 'no'}})  2025-03-23 22:05:55.997406 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'skyline_console', 'value': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9999', 'listen_port': '9999', 'tls_backend': 'no'}})  2025-03-23 22:05:55.997412 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'skyline_console_external', 'value': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9999', 'listen_port': '9999', 'tls_backend': 'no'}})  2025-03-23 22:05:55.997418 | orchestrator | skipping: [testbed-node-2] 2025-03-23 22:05:55.997424 | orchestrator | 2025-03-23 22:05:55.997430 | orchestrator | TASK [proxysql-config : Copying over skyline ProxySQL users config] ************ 2025-03-23 22:05:55.997436 | orchestrator | Sunday 23 March 2025 22:05:02 +0000 (0:00:01.690) 0:07:55.052 ********** 2025-03-23 22:05:55.997442 | orchestrator | changed: [testbed-node-0] 2025-03-23 22:05:55.997448 | orchestrator | changed: [testbed-node-1] 2025-03-23 22:05:55.997454 | orchestrator | changed: [testbed-node-2] 2025-03-23 22:05:55.997460 | orchestrator | 2025-03-23 22:05:55.997466 | orchestrator | TASK [proxysql-config : Copying over skyline ProxySQL rules config] ************ 2025-03-23 22:05:55.997472 | orchestrator | Sunday 23 March 2025 22:05:03 +0000 (0:00:01.637) 0:07:56.690 ********** 2025-03-23 22:05:55.997478 | orchestrator | changed: [testbed-node-0] 2025-03-23 22:05:55.997484 | orchestrator | changed: [testbed-node-1] 2025-03-23 22:05:55.997490 | orchestrator | changed: [testbed-node-2] 2025-03-23 22:05:55.997495 | orchestrator | 2025-03-23 22:05:55.997501 | 
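
The (item=...) dictionaries echoed in the tasks above are kolla-ansible-style service definitions; their nested 'haproxy' maps are what the haproxy-config role turns into load-balancer frontends. What follows is a minimal, illustrative Python sketch only, not code from the deployment itself: the dict literal is trimmed from the skyline-apiserver item logged above, and the helper name summarize_frontends is hypothetical.

skyline_apiserver = {
    "container_name": "skyline_apiserver",
    "enabled": True,
    "haproxy": {
        "skyline_apiserver": {
            "enabled": "yes", "mode": "http", "external": False,
            "port": "9998", "listen_port": "9998", "tls_backend": "no",
        },
        "skyline_apiserver_external": {
            "enabled": "yes", "mode": "http", "external": True,
            "external_fqdn": "api.testbed.osism.xyz",
            "port": "9998", "listen_port": "9998", "tls_backend": "no",
        },
    },
}


def summarize_frontends(service):
    # Yield (frontend name, scope, listen port) for every enabled haproxy
    # entry in a service definition shaped like the items logged above.
    for name, frontend in service.get("haproxy", {}).items():
        if frontend.get("enabled") != "yes":
            continue
        scope = "external" if frontend.get("external") else "internal"
        yield name, scope, frontend.get("listen_port", frontend.get("port"))


for name, scope, port in summarize_frontends(skyline_apiserver):
    print(f"{name}: {scope} listener on port {port}")

For the item above this reports one internal and one external listener on port 9998, which matches the frontends written out by the "Copying over skyline haproxy config" task.
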
orchestrator | TASK [include_role : swift] **************************************************** 2025-03-23 22:05:55.997510 | orchestrator | Sunday 23 March 2025 22:05:06 +0000 (0:00:02.851) 0:07:59.541 ********** 2025-03-23 22:05:55.997519 | orchestrator | skipping: [testbed-node-0] 2025-03-23 22:05:55.997525 | orchestrator | skipping: [testbed-node-1] 2025-03-23 22:05:55.997531 | orchestrator | skipping: [testbed-node-2] 2025-03-23 22:05:55.997537 | orchestrator | 2025-03-23 22:05:55.997543 | orchestrator | TASK [include_role : tacker] *************************************************** 2025-03-23 22:05:55.997549 | orchestrator | Sunday 23 March 2025 22:05:07 +0000 (0:00:00.697) 0:08:00.238 ********** 2025-03-23 22:05:55.997566 | orchestrator | skipping: [testbed-node-0] 2025-03-23 22:05:55.997573 | orchestrator | skipping: [testbed-node-1] 2025-03-23 22:05:55.997579 | orchestrator | skipping: [testbed-node-2] 2025-03-23 22:05:55.997585 | orchestrator | 2025-03-23 22:05:55.997591 | orchestrator | TASK [include_role : trove] **************************************************** 2025-03-23 22:05:55.997599 | orchestrator | Sunday 23 March 2025 22:05:07 +0000 (0:00:00.410) 0:08:00.649 ********** 2025-03-23 22:05:55.997606 | orchestrator | skipping: [testbed-node-0] 2025-03-23 22:05:55.997611 | orchestrator | skipping: [testbed-node-1] 2025-03-23 22:05:55.997617 | orchestrator | skipping: [testbed-node-2] 2025-03-23 22:05:55.997623 | orchestrator | 2025-03-23 22:05:55.997629 | orchestrator | TASK [include_role : venus] **************************************************** 2025-03-23 22:05:55.997635 | orchestrator | Sunday 23 March 2025 22:05:08 +0000 (0:00:00.670) 0:08:01.320 ********** 2025-03-23 22:05:55.997641 | orchestrator | skipping: [testbed-node-0] 2025-03-23 22:05:55.997647 | orchestrator | skipping: [testbed-node-1] 2025-03-23 22:05:55.997653 | orchestrator | skipping: [testbed-node-2] 2025-03-23 22:05:55.997659 | orchestrator | 2025-03-23 22:05:55.997665 | orchestrator | TASK [include_role : watcher] ************************************************** 2025-03-23 22:05:55.997671 | orchestrator | Sunday 23 March 2025 22:05:09 +0000 (0:00:00.649) 0:08:01.969 ********** 2025-03-23 22:05:55.997677 | orchestrator | skipping: [testbed-node-0] 2025-03-23 22:05:55.997683 | orchestrator | skipping: [testbed-node-1] 2025-03-23 22:05:55.997689 | orchestrator | skipping: [testbed-node-2] 2025-03-23 22:05:55.997694 | orchestrator | 2025-03-23 22:05:55.997700 | orchestrator | TASK [include_role : zun] ****************************************************** 2025-03-23 22:05:55.997706 | orchestrator | Sunday 23 March 2025 22:05:09 +0000 (0:00:00.664) 0:08:02.633 ********** 2025-03-23 22:05:55.997712 | orchestrator | skipping: [testbed-node-0] 2025-03-23 22:05:55.997718 | orchestrator | skipping: [testbed-node-1] 2025-03-23 22:05:55.997724 | orchestrator | skipping: [testbed-node-2] 2025-03-23 22:05:55.997729 | orchestrator | 2025-03-23 22:05:55.997735 | orchestrator | RUNNING HANDLER [loadbalancer : Check IP addresses on the API interface] ******* 2025-03-23 22:05:55.997741 | orchestrator | Sunday 23 March 2025 22:05:10 +0000 (0:00:00.864) 0:08:03.498 ********** 2025-03-23 22:05:55.997747 | orchestrator | ok: [testbed-node-0] 2025-03-23 22:05:55.997753 | orchestrator | ok: [testbed-node-1] 2025-03-23 22:05:55.997759 | orchestrator | ok: [testbed-node-2] 2025-03-23 22:05:55.997765 | orchestrator | 2025-03-23 22:05:55.997771 | orchestrator | RUNNING HANDLER [loadbalancer : Group HA nodes by status] 
********************** 2025-03-23 22:05:55.997776 | orchestrator | Sunday 23 March 2025 22:05:11 +0000 (0:00:01.029) 0:08:04.527 ********** 2025-03-23 22:05:55.997782 | orchestrator | ok: [testbed-node-0] 2025-03-23 22:05:55.997788 | orchestrator | ok: [testbed-node-1] 2025-03-23 22:05:55.997794 | orchestrator | ok: [testbed-node-2] 2025-03-23 22:05:55.997803 | orchestrator | 2025-03-23 22:05:55.997810 | orchestrator | RUNNING HANDLER [loadbalancer : Stop backup keepalived container] ************** 2025-03-23 22:05:55.997816 | orchestrator | Sunday 23 March 2025 22:05:11 +0000 (0:00:00.345) 0:08:04.872 ********** 2025-03-23 22:05:55.997822 | orchestrator | ok: [testbed-node-0] 2025-03-23 22:05:55.997828 | orchestrator | ok: [testbed-node-1] 2025-03-23 22:05:55.997834 | orchestrator | ok: [testbed-node-2] 2025-03-23 22:05:55.997840 | orchestrator | 2025-03-23 22:05:55.997846 | orchestrator | RUNNING HANDLER [loadbalancer : Stop backup haproxy container] ***************** 2025-03-23 22:05:55.997852 | orchestrator | Sunday 23 March 2025 22:05:13 +0000 (0:00:01.412) 0:08:06.285 ********** 2025-03-23 22:05:55.997858 | orchestrator | ok: [testbed-node-0] 2025-03-23 22:05:55.997867 | orchestrator | ok: [testbed-node-1] 2025-03-23 22:05:55.997873 | orchestrator | ok: [testbed-node-2] 2025-03-23 22:05:55.997879 | orchestrator | 2025-03-23 22:05:55.997885 | orchestrator | RUNNING HANDLER [loadbalancer : Stop backup proxysql container] **************** 2025-03-23 22:05:55.997891 | orchestrator | Sunday 23 March 2025 22:05:14 +0000 (0:00:01.344) 0:08:07.630 ********** 2025-03-23 22:05:55.997896 | orchestrator | ok: [testbed-node-0] 2025-03-23 22:05:55.997902 | orchestrator | ok: [testbed-node-1] 2025-03-23 22:05:55.997908 | orchestrator | ok: [testbed-node-2] 2025-03-23 22:05:55.997914 | orchestrator | 2025-03-23 22:05:55.997920 | orchestrator | RUNNING HANDLER [loadbalancer : Start backup haproxy container] **************** 2025-03-23 22:05:55.997926 | orchestrator | Sunday 23 March 2025 22:05:16 +0000 (0:00:01.305) 0:08:08.935 ********** 2025-03-23 22:05:55.997932 | orchestrator | changed: [testbed-node-2] 2025-03-23 22:05:55.997937 | orchestrator | changed: [testbed-node-0] 2025-03-23 22:05:55.997943 | orchestrator | changed: [testbed-node-1] 2025-03-23 22:05:55.997949 | orchestrator | 2025-03-23 22:05:55.997955 | orchestrator | RUNNING HANDLER [loadbalancer : Wait for backup haproxy to start] ************** 2025-03-23 22:05:55.997961 | orchestrator | Sunday 23 March 2025 22:05:26 +0000 (0:00:10.303) 0:08:19.238 ********** 2025-03-23 22:05:55.997967 | orchestrator | ok: [testbed-node-0] 2025-03-23 22:05:55.997973 | orchestrator | ok: [testbed-node-1] 2025-03-23 22:05:55.997979 | orchestrator | ok: [testbed-node-2] 2025-03-23 22:05:55.997985 | orchestrator | 2025-03-23 22:05:55.997991 | orchestrator | RUNNING HANDLER [loadbalancer : Start backup proxysql container] *************** 2025-03-23 22:05:55.997996 | orchestrator | Sunday 23 March 2025 22:05:27 +0000 (0:00:01.132) 0:08:20.371 ********** 2025-03-23 22:05:55.998002 | orchestrator | changed: [testbed-node-0] 2025-03-23 22:05:55.998008 | orchestrator | changed: [testbed-node-1] 2025-03-23 22:05:55.998041 | orchestrator | changed: [testbed-node-2] 2025-03-23 22:05:55.998049 | orchestrator | 2025-03-23 22:05:55.998055 | orchestrator | RUNNING HANDLER [loadbalancer : Wait for backup proxysql to start] ************* 2025-03-23 22:05:55.998061 | orchestrator | Sunday 23 March 2025 22:05:34 +0000 (0:00:07.259) 0:08:27.630 ********** 2025-03-23 
22:05:55.998067 | orchestrator | ok: [testbed-node-0] 2025-03-23 22:05:55.998072 | orchestrator | ok: [testbed-node-1] 2025-03-23 22:05:55.998078 | orchestrator | ok: [testbed-node-2] 2025-03-23 22:05:55.998084 | orchestrator | 2025-03-23 22:05:55.998090 | orchestrator | RUNNING HANDLER [loadbalancer : Start backup keepalived container] ************* 2025-03-23 22:05:55.998096 | orchestrator | Sunday 23 March 2025 22:05:39 +0000 (0:00:04.394) 0:08:32.024 ********** 2025-03-23 22:05:55.998102 | orchestrator | changed: [testbed-node-1] 2025-03-23 22:05:55.998108 | orchestrator | changed: [testbed-node-2] 2025-03-23 22:05:55.998114 | orchestrator | changed: [testbed-node-0] 2025-03-23 22:05:55.998120 | orchestrator | 2025-03-23 22:05:55.998126 | orchestrator | RUNNING HANDLER [loadbalancer : Stop master haproxy container] ***************** 2025-03-23 22:05:55.998134 | orchestrator | Sunday 23 March 2025 22:05:48 +0000 (0:00:09.019) 0:08:41.044 ********** 2025-03-23 22:05:55.998140 | orchestrator | skipping: [testbed-node-0] 2025-03-23 22:05:55.998146 | orchestrator | skipping: [testbed-node-1] 2025-03-23 22:05:55.998152 | orchestrator | skipping: [testbed-node-2] 2025-03-23 22:05:55.998158 | orchestrator | 2025-03-23 22:05:55.998163 | orchestrator | RUNNING HANDLER [loadbalancer : Stop master proxysql container] **************** 2025-03-23 22:05:55.998169 | orchestrator | Sunday 23 March 2025 22:05:48 +0000 (0:00:00.702) 0:08:41.746 ********** 2025-03-23 22:05:55.998175 | orchestrator | skipping: [testbed-node-0] 2025-03-23 22:05:55.998184 | orchestrator | skipping: [testbed-node-1] 2025-03-23 22:05:55.998190 | orchestrator | skipping: [testbed-node-2] 2025-03-23 22:05:55.998196 | orchestrator | 2025-03-23 22:05:55.998202 | orchestrator | RUNNING HANDLER [loadbalancer : Stop master keepalived container] ************** 2025-03-23 22:05:55.998208 | orchestrator | Sunday 23 March 2025 22:05:49 +0000 (0:00:00.689) 0:08:42.436 ********** 2025-03-23 22:05:55.998214 | orchestrator | skipping: [testbed-node-0] 2025-03-23 22:05:55.998220 | orchestrator | skipping: [testbed-node-1] 2025-03-23 22:05:55.998226 | orchestrator | skipping: [testbed-node-2] 2025-03-23 22:05:55.998235 | orchestrator | 2025-03-23 22:05:55.998241 | orchestrator | RUNNING HANDLER [loadbalancer : Start master haproxy container] **************** 2025-03-23 22:05:55.998247 | orchestrator | Sunday 23 March 2025 22:05:50 +0000 (0:00:00.674) 0:08:43.111 ********** 2025-03-23 22:05:55.998253 | orchestrator | skipping: [testbed-node-0] 2025-03-23 22:05:55.998259 | orchestrator | skipping: [testbed-node-1] 2025-03-23 22:05:55.998265 | orchestrator | skipping: [testbed-node-2] 2025-03-23 22:05:55.998270 | orchestrator | 2025-03-23 22:05:55.998276 | orchestrator | RUNNING HANDLER [loadbalancer : Start master proxysql container] *************** 2025-03-23 22:05:55.998282 | orchestrator | Sunday 23 March 2025 22:05:50 +0000 (0:00:00.389) 0:08:43.500 ********** 2025-03-23 22:05:55.998288 | orchestrator | skipping: [testbed-node-0] 2025-03-23 22:05:55.998294 | orchestrator | skipping: [testbed-node-1] 2025-03-23 22:05:55.998300 | orchestrator | skipping: [testbed-node-2] 2025-03-23 22:05:55.998306 | orchestrator | 2025-03-23 22:05:55.998312 | orchestrator | RUNNING HANDLER [loadbalancer : Start master keepalived container] ************* 2025-03-23 22:05:55.998317 | orchestrator | Sunday 23 March 2025 22:05:51 +0000 (0:00:00.788) 0:08:44.289 ********** 2025-03-23 22:05:55.998323 | orchestrator | skipping: [testbed-node-0] 2025-03-23 
22:05:55.998333 | orchestrator | skipping: [testbed-node-1] 2025-03-23 22:05:55.998339 | orchestrator | skipping: [testbed-node-2] 2025-03-23 22:05:55.998345 | orchestrator | 2025-03-23 22:05:55.998351 | orchestrator | RUNNING HANDLER [loadbalancer : Wait for haproxy to listen on VIP] ************* 2025-03-23 22:05:55.998357 | orchestrator | Sunday 23 March 2025 22:05:52 +0000 (0:00:00.658) 0:08:44.947 ********** 2025-03-23 22:05:55.998363 | orchestrator | ok: [testbed-node-0] 2025-03-23 22:05:55.998369 | orchestrator | ok: [testbed-node-1] 2025-03-23 22:05:55.998375 | orchestrator | ok: [testbed-node-2] 2025-03-23 22:05:55.998381 | orchestrator | 2025-03-23 22:05:55.998387 | orchestrator | RUNNING HANDLER [loadbalancer : Wait for proxysql to listen on VIP] ************ 2025-03-23 22:05:55.998392 | orchestrator | Sunday 23 March 2025 22:05:53 +0000 (0:00:01.367) 0:08:46.314 ********** 2025-03-23 22:05:55.998398 | orchestrator | ok: [testbed-node-0] 2025-03-23 22:05:55.998404 | orchestrator | ok: [testbed-node-1] 2025-03-23 22:05:55.998410 | orchestrator | ok: [testbed-node-2] 2025-03-23 22:05:55.998416 | orchestrator | 2025-03-23 22:05:55.998422 | orchestrator | PLAY RECAP ********************************************************************* 2025-03-23 22:05:55.998428 | orchestrator | testbed-node-0 : ok=127  changed=79  unreachable=0 failed=0 skipped=92  rescued=0 ignored=0 2025-03-23 22:05:55.998434 | orchestrator | testbed-node-1 : ok=126  changed=79  unreachable=0 failed=0 skipped=92  rescued=0 ignored=0 2025-03-23 22:05:55.998440 | orchestrator | testbed-node-2 : ok=126  changed=79  unreachable=0 failed=0 skipped=92  rescued=0 ignored=0 2025-03-23 22:05:55.998446 | orchestrator | 2025-03-23 22:05:55.998452 | orchestrator | 2025-03-23 22:05:55.998457 | orchestrator | TASKS RECAP ******************************************************************** 2025-03-23 22:05:55.998463 | orchestrator | Sunday 23 March 2025 22:05:54 +0000 (0:00:01.371) 0:08:47.686 ********** 2025-03-23 22:05:55.998469 | orchestrator | =============================================================================== 2025-03-23 22:05:55.998475 | orchestrator | loadbalancer : Start backup haproxy container -------------------------- 10.30s 2025-03-23 22:05:55.998481 | orchestrator | loadbalancer : Start backup keepalived container ------------------------ 9.02s 2025-03-23 22:05:55.998487 | orchestrator | haproxy-config : Copying over skyline haproxy config -------------------- 8.81s 2025-03-23 22:05:55.998493 | orchestrator | haproxy-config : Copying over glance haproxy config --------------------- 8.31s 2025-03-23 22:05:55.998499 | orchestrator | haproxy-config : Copying over heat haproxy config ----------------------- 8.09s 2025-03-23 22:05:55.998505 | orchestrator | loadbalancer : Ensuring proxysql service config subdirectories exist ---- 7.53s 2025-03-23 22:05:55.998510 | orchestrator | haproxy-config : Configuring firewall for glance ------------------------ 7.41s 2025-03-23 22:05:55.998520 | orchestrator | loadbalancer : Start backup proxysql container -------------------------- 7.26s 2025-03-23 22:05:55.998526 | orchestrator | haproxy-config : Copying over opensearch haproxy config ----------------- 7.14s 2025-03-23 22:05:55.998532 | orchestrator | haproxy-config : Copying over cinder haproxy config --------------------- 7.04s 2025-03-23 22:05:55.998538 | orchestrator | haproxy-config : Add configuration for glance when using single external frontend --- 6.73s 2025-03-23 22:05:55.998544 | orchestrator | 
haproxy-config : Copying over nova haproxy config ----------------------- 6.45s 2025-03-23 22:05:55.998550 | orchestrator | haproxy-config : Copying over aodh haproxy config ----------------------- 6.41s 2025-03-23 22:05:55.998570 | orchestrator | haproxy-config : Copying over nova-cell:nova-novncproxy haproxy config --- 6.39s 2025-03-23 22:05:55.998577 | orchestrator | haproxy-config : Copying over magnum haproxy config --------------------- 6.17s 2025-03-23 22:05:55.998583 | orchestrator | loadbalancer : Copying over proxysql config ----------------------------- 6.05s 2025-03-23 22:05:55.998589 | orchestrator | haproxy-config : Copying over neutron haproxy config -------------------- 5.89s 2025-03-23 22:05:55.998594 | orchestrator | haproxy-config : Copying over designate haproxy config ------------------ 5.76s 2025-03-23 22:05:55.998600 | orchestrator | haproxy-config : Copying over prometheus haproxy config ----------------- 5.69s 2025-03-23 22:05:55.998606 | orchestrator | haproxy-config : Copying over barbican haproxy config ------------------- 5.67s 2025-03-23 22:05:55.998614 | orchestrator | 2025-03-23 22:05:55 | INFO  | Wait 1 second(s) until the next check 2025-03-23 22:05:59.039831 | orchestrator | 2025-03-23 22:05:59 | INFO  | Task f712b796-38d0-4a95-8f80-410bcda57c10 is in state STARTED 2025-03-23 22:05:59.040627 | orchestrator | 2025-03-23 22:05:59 | INFO  | Task f3cd4eb5-5489-4d5d-9d13-5fb3ddbc8b2b is in state STARTED 2025-03-23 22:05:59.041546 | orchestrator | 2025-03-23 22:05:59 | INFO  | Task a2183393-70d7-4659-8915-2453d061a5ea is in state STARTED 2025-03-23 22:05:59.042723 | orchestrator | 2025-03-23 22:05:59 | INFO  | Task 85c31866-9d33-48b1-86c6-afdecf46b015 is in state STARTED 2025-03-23 22:06:02.106231 | orchestrator | 2025-03-23 22:05:59 | INFO  | Wait 1 second(s) until the next check 2025-03-23 22:06:02.106523 | orchestrator | 2025-03-23 22:06:02 | INFO  | Task f712b796-38d0-4a95-8f80-410bcda57c10 is in state STARTED 2025-03-23 22:06:02.106623 | orchestrator | 2025-03-23 22:06:02 | INFO  | Task f3cd4eb5-5489-4d5d-9d13-5fb3ddbc8b2b is in state STARTED 2025-03-23 22:06:02.110081 | orchestrator | 2025-03-23 22:06:02 | INFO  | Task a2183393-70d7-4659-8915-2453d061a5ea is in state STARTED 2025-03-23 22:06:02.112380 | orchestrator | 2025-03-23 22:06:02 | INFO  | Task 85c31866-9d33-48b1-86c6-afdecf46b015 is in state STARTED 2025-03-23 22:06:05.167847 | orchestrator | 2025-03-23 22:06:02 | INFO  | Wait 1 second(s) until the next check 2025-03-23 22:06:05.168077 | orchestrator | 2025-03-23 22:06:05 | INFO  | Task f712b796-38d0-4a95-8f80-410bcda57c10 is in state STARTED 2025-03-23 22:06:05.168856 | orchestrator | 2025-03-23 22:06:05 | INFO  | Task f3cd4eb5-5489-4d5d-9d13-5fb3ddbc8b2b is in state STARTED 2025-03-23 22:06:05.168886 | orchestrator | 2025-03-23 22:06:05 | INFO  | Task a2183393-70d7-4659-8915-2453d061a5ea is in state STARTED 2025-03-23 22:06:05.169492 | orchestrator | 2025-03-23 22:06:05 | INFO  | Task 85c31866-9d33-48b1-86c6-afdecf46b015 is in state STARTED 2025-03-23 22:06:05.169645 | orchestrator | 2025-03-23 22:06:05 | INFO  | Wait 1 second(s) until the next check 2025-03-23 22:06:08.231800 | orchestrator | 2025-03-23 22:06:08 | INFO  | Task f712b796-38d0-4a95-8f80-410bcda57c10 is in state STARTED 2025-03-23 22:06:08.233273 | orchestrator | 2025-03-23 22:06:08 | INFO  | Task f3cd4eb5-5489-4d5d-9d13-5fb3ddbc8b2b is in state STARTED 2025-03-23 22:06:08.233316 | orchestrator | 2025-03-23 22:06:08 | INFO  | Task a2183393-70d7-4659-8915-2453d061a5ea is in state 
STARTED 2025-03-23 22:06:08.234337 | orchestrator | 2025-03-23 22:06:08 | INFO  | Task 85c31866-9d33-48b1-86c6-afdecf46b015 is in state STARTED 2025-03-23 22:06:08.235974 | orchestrator | 2025-03-23 22:06:08 | INFO  | Wait 1 second(s) until the next check 2025-03-23 22:06:11.289290 | orchestrator | 2025-03-23 22:06:11 | INFO  | Task f712b796-38d0-4a95-8f80-410bcda57c10 is in state STARTED 2025-03-23 22:06:11.289968 | orchestrator | 2025-03-23 22:06:11 | INFO  | Task f3cd4eb5-5489-4d5d-9d13-5fb3ddbc8b2b is in state STARTED 2025-03-23 22:06:11.290013 | orchestrator | 2025-03-23 22:06:11 | INFO  | Task a2183393-70d7-4659-8915-2453d061a5ea is in state STARTED 2025-03-23 22:06:11.291303 | orchestrator | 2025-03-23 22:06:11 | INFO  | Task 85c31866-9d33-48b1-86c6-afdecf46b015 is in state STARTED 2025-03-23 22:06:14.336628 | orchestrator | 2025-03-23 22:06:11 | INFO  | Wait 1 second(s) until the next check 2025-03-23 22:06:14.336768 | orchestrator | 2025-03-23 22:06:14 | INFO  | Task f712b796-38d0-4a95-8f80-410bcda57c10 is in state STARTED 2025-03-23 22:06:14.343437 | orchestrator | 2025-03-23 22:06:14 | INFO  | Task f3cd4eb5-5489-4d5d-9d13-5fb3ddbc8b2b is in state STARTED 2025-03-23 22:06:14.351440 | orchestrator | 2025-03-23 22:06:14 | INFO  | Task a2183393-70d7-4659-8915-2453d061a5ea is in state STARTED 2025-03-23 22:06:14.359499 | orchestrator | 2025-03-23 22:06:14 | INFO  | Task 85c31866-9d33-48b1-86c6-afdecf46b015 is in state STARTED 2025-03-23 22:06:17.394081 | orchestrator | 2025-03-23 22:06:14 | INFO  | Wait 1 second(s) until the next check 2025-03-23 22:06:17.394209 | orchestrator | 2025-03-23 22:06:17 | INFO  | Task f712b796-38d0-4a95-8f80-410bcda57c10 is in state STARTED 2025-03-23 22:06:17.397872 | orchestrator | 2025-03-23 22:06:17 | INFO  | Task f3cd4eb5-5489-4d5d-9d13-5fb3ddbc8b2b is in state STARTED 2025-03-23 22:06:17.397905 | orchestrator | 2025-03-23 22:06:17 | INFO  | Task a2183393-70d7-4659-8915-2453d061a5ea is in state STARTED 2025-03-23 22:06:17.398370 | orchestrator | 2025-03-23 22:06:17 | INFO  | Task 85c31866-9d33-48b1-86c6-afdecf46b015 is in state STARTED 2025-03-23 22:06:20.452790 | orchestrator | 2025-03-23 22:06:17 | INFO  | Wait 1 second(s) until the next check 2025-03-23 22:06:20.452921 | orchestrator | 2025-03-23 22:06:20 | INFO  | Task f712b796-38d0-4a95-8f80-410bcda57c10 is in state STARTED 2025-03-23 22:06:20.454115 | orchestrator | 2025-03-23 22:06:20 | INFO  | Task f3cd4eb5-5489-4d5d-9d13-5fb3ddbc8b2b is in state STARTED 2025-03-23 22:06:20.455793 | orchestrator | 2025-03-23 22:06:20 | INFO  | Task a2183393-70d7-4659-8915-2453d061a5ea is in state STARTED 2025-03-23 22:06:20.461744 | orchestrator | 2025-03-23 22:06:20 | INFO  | Task 85c31866-9d33-48b1-86c6-afdecf46b015 is in state STARTED 2025-03-23 22:06:23.506195 | orchestrator | 2025-03-23 22:06:20 | INFO  | Wait 1 second(s) until the next check 2025-03-23 22:06:23.506349 | orchestrator | 2025-03-23 22:06:23 | INFO  | Task f712b796-38d0-4a95-8f80-410bcda57c10 is in state STARTED 2025-03-23 22:06:23.507415 | orchestrator | 2025-03-23 22:06:23 | INFO  | Task f3cd4eb5-5489-4d5d-9d13-5fb3ddbc8b2b is in state STARTED 2025-03-23 22:06:23.507449 | orchestrator | 2025-03-23 22:06:23 | INFO  | Task a2183393-70d7-4659-8915-2453d061a5ea is in state STARTED 2025-03-23 22:06:23.508290 | orchestrator | 2025-03-23 22:06:23 | INFO  | Task 85c31866-9d33-48b1-86c6-afdecf46b015 is in state STARTED 2025-03-23 22:06:26.570294 | orchestrator | 2025-03-23 22:06:23 | INFO  | Wait 1 second(s) until the next check 2025-03-23 
22:06:26.570430 | orchestrator | 2025-03-23 22:06:26 | INFO  | Task f712b796-38d0-4a95-8f80-410bcda57c10 is in state STARTED 2025-03-23 22:06:26.574053 | orchestrator | 2025-03-23 22:06:26 | INFO  | Task f3cd4eb5-5489-4d5d-9d13-5fb3ddbc8b2b is in state STARTED 2025-03-23 22:06:26.574092 | orchestrator | 2025-03-23 22:06:26 | INFO  | Task a2183393-70d7-4659-8915-2453d061a5ea is in state STARTED 2025-03-23 22:06:29.623948 | orchestrator | 2025-03-23 22:06:26 | INFO  | Task 85c31866-9d33-48b1-86c6-afdecf46b015 is in state STARTED 2025-03-23 22:06:29.624058 | orchestrator | 2025-03-23 22:06:26 | INFO  | Wait 1 second(s) until the next check 2025-03-23 22:06:29.624093 | orchestrator | 2025-03-23 22:06:29 | INFO  | Task f712b796-38d0-4a95-8f80-410bcda57c10 is in state STARTED 2025-03-23 22:06:29.627231 | orchestrator | 2025-03-23 22:06:29 | INFO  | Task f3cd4eb5-5489-4d5d-9d13-5fb3ddbc8b2b is in state STARTED 2025-03-23 22:06:29.628063 | orchestrator | 2025-03-23 22:06:29 | INFO  | Task a2183393-70d7-4659-8915-2453d061a5ea is in state STARTED 2025-03-23 22:06:29.629450 | orchestrator | 2025-03-23 22:06:29 | INFO  | Task 85c31866-9d33-48b1-86c6-afdecf46b015 is in state STARTED 2025-03-23 22:06:32.679026 | orchestrator | 2025-03-23 22:06:29 | INFO  | Wait 1 second(s) until the next check 2025-03-23 22:06:32.679190 | orchestrator | 2025-03-23 22:06:32 | INFO  | Task f712b796-38d0-4a95-8f80-410bcda57c10 is in state STARTED 2025-03-23 22:06:32.679793 | orchestrator | 2025-03-23 22:06:32 | INFO  | Task f3cd4eb5-5489-4d5d-9d13-5fb3ddbc8b2b is in state STARTED 2025-03-23 22:06:32.681466 | orchestrator | 2025-03-23 22:06:32 | INFO  | Task a2183393-70d7-4659-8915-2453d061a5ea is in state STARTED 2025-03-23 22:06:32.682499 | orchestrator | 2025-03-23 22:06:32 | INFO  | Task 85c31866-9d33-48b1-86c6-afdecf46b015 is in state STARTED 2025-03-23 22:06:32.682638 | orchestrator | 2025-03-23 22:06:32 | INFO  | Wait 1 second(s) until the next check 2025-03-23 22:06:35.769621 | orchestrator | 2025-03-23 22:06:35 | INFO  | Task f712b796-38d0-4a95-8f80-410bcda57c10 is in state STARTED 2025-03-23 22:06:35.774696 | orchestrator | 2025-03-23 22:06:35 | INFO  | Task f3cd4eb5-5489-4d5d-9d13-5fb3ddbc8b2b is in state STARTED 2025-03-23 22:06:35.776666 | orchestrator | 2025-03-23 22:06:35 | INFO  | Task a2183393-70d7-4659-8915-2453d061a5ea is in state STARTED 2025-03-23 22:06:35.780779 | orchestrator | 2025-03-23 22:06:35 | INFO  | Task 85c31866-9d33-48b1-86c6-afdecf46b015 is in state STARTED 2025-03-23 22:06:38.829186 | orchestrator | 2025-03-23 22:06:35 | INFO  | Wait 1 second(s) until the next check 2025-03-23 22:06:38.829323 | orchestrator | 2025-03-23 22:06:38 | INFO  | Task f712b796-38d0-4a95-8f80-410bcda57c10 is in state STARTED 2025-03-23 22:06:38.830388 | orchestrator | 2025-03-23 22:06:38 | INFO  | Task f3cd4eb5-5489-4d5d-9d13-5fb3ddbc8b2b is in state STARTED 2025-03-23 22:06:38.832285 | orchestrator | 2025-03-23 22:06:38 | INFO  | Task a2183393-70d7-4659-8915-2453d061a5ea is in state STARTED 2025-03-23 22:06:38.834072 | orchestrator | 2025-03-23 22:06:38 | INFO  | Task 85c31866-9d33-48b1-86c6-afdecf46b015 is in state STARTED 2025-03-23 22:06:41.883877 | orchestrator | 2025-03-23 22:06:38 | INFO  | Wait 1 second(s) until the next check 2025-03-23 22:06:41.884009 | orchestrator | 2025-03-23 22:06:41 | INFO  | Task f712b796-38d0-4a95-8f80-410bcda57c10 is in state STARTED 2025-03-23 22:06:41.886724 | orchestrator | 2025-03-23 22:06:41 | INFO  | Task f3cd4eb5-5489-4d5d-9d13-5fb3ddbc8b2b is in state STARTED 2025-03-23 
22:06:41.888033 | orchestrator | 2025-03-23 22:06:41 | INFO  | Task a2183393-70d7-4659-8915-2453d061a5ea is in state STARTED
2025-03-23 22:06:41.889103 | orchestrator | 2025-03-23 22:06:41 | INFO  | Task 85c31866-9d33-48b1-86c6-afdecf46b015 is in state STARTED
2025-03-23 22:06:41.889590 | orchestrator | 2025-03-23 22:06:41 | INFO  | Wait 1 second(s) until the next check
2025-03-23 22:06:44.927423 | orchestrator | 2025-03-23 22:06:44 | INFO  | Task f712b796-38d0-4a95-8f80-410bcda57c10 is in state STARTED
2025-03-23 22:06:44.929297 | orchestrator | 2025-03-23 22:06:44 | INFO  | Task f3cd4eb5-5489-4d5d-9d13-5fb3ddbc8b2b is in state STARTED
2025-03-23 22:06:44.931460 | orchestrator | 2025-03-23 22:06:44 | INFO  | Task a2183393-70d7-4659-8915-2453d061a5ea is in state STARTED
2025-03-23 22:06:44.934376 | orchestrator | 2025-03-23 22:06:44 | INFO  | Task 85c31866-9d33-48b1-86c6-afdecf46b015 is in state STARTED
2025-03-23 22:06:44.934688 | orchestrator | 2025-03-23 22:06:44 | INFO  | Wait 1 second(s) until the next check
[... the same four tasks (f712b796-38d0-4a95-8f80-410bcda57c10, f3cd4eb5-5489-4d5d-9d13-5fb3ddbc8b2b, a2183393-70d7-4659-8915-2453d061a5ea, 85c31866-9d33-48b1-86c6-afdecf46b015) remain in state STARTED and are re-checked roughly every three seconds until 22:08:10, immediately before the run ends with RESULT_TIMED_OUT ...]
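The wait loop above is a plain poll-and-sleep watcher: query the state of each outstanding task, log it, sleep, and repeat until everything finishes or the job timeout is hit. A minimal, hypothetical sketch of that pattern follows; get_task_state() and the state names stand in for whatever backend actually reports task status and are not the real OSISM client.

    # Minimal poll-and-sleep watcher, as suggested by the log output above.
    # Hypothetical sketch: get_task_state() is an illustrative callback, not
    # the actual OSISM implementation.
    import logging
    import time

    def wait_for_tasks(task_ids, get_task_state, interval=1, timeout=7200):
        """Poll task states until all finish or the timeout is exceeded."""
        deadline = time.monotonic() + timeout
        pending = set(task_ids)
        while pending and time.monotonic() < deadline:
            for task_id in sorted(pending):
                state = get_task_state(task_id)
                logging.info("Task %s is in state %s", task_id, state)
                if state in ("SUCCESS", "FAILURE"):
                    pending.discard(task_id)
            if pending:
                logging.info("Wait %d second(s) until the next check", interval)
                time.sleep(interval)
        return not pending  # True only if everything finished in time

If the loop returns False, the caller can surface a timeout, which is what the RESULT_TIMED_OUT below corresponds to at the job level.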
2025-03-23 22:08:11.248782 | RUN END RESULT_TIMED_OUT: [untrusted : github.com/osism/testbed/playbooks/deploy.yml@main]
2025-03-23 22:08:11.254643 | POST-RUN START: [untrusted : github.com/osism/testbed/playbooks/post.yml@main]
2025-03-23 22:08:11.986328 |
2025-03-23 22:08:11.986497 | PLAY [Post output play]
2025-03-23 22:08:12.016725 |
2025-03-23 22:08:12.016866 | LOOP [stage-output : Register sources]
2025-03-23 22:08:12.102100 |
2025-03-23 22:08:12.102443 | TASK [stage-output : Check sudo]
2025-03-23 22:08:12.810349 | orchestrator | sudo: a password is required
2025-03-23 22:08:13.144625 | orchestrator | ok: Runtime: 0:00:00.012400
2025-03-23 22:08:13.157776 |
2025-03-23 22:08:13.157910 | LOOP [stage-output : Set source and destination for files and folders]
2025-03-23 22:08:13.195465 |
2025-03-23 22:08:13.195675 | TASK [stage-output : Build a list of source, dest dictionaries]
2025-03-23 22:08:13.288659 | orchestrator | ok
2025-03-23 22:08:13.298496 |
2025-03-23 22:08:13.298604 | LOOP [stage-output : Ensure target folders exist]
2025-03-23 22:08:13.746320 | orchestrator | ok: "docs"
2025-03-23 22:08:13.746677 |
2025-03-23 22:08:13.985960 | orchestrator | ok: "artifacts"
2025-03-23 22:08:14.216378 | orchestrator | ok: "logs"
2025-03-23 22:08:14.226786 |
2025-03-23 22:08:14.226904 | LOOP [stage-output : Copy files and folders to staging folder]
2025-03-23 22:08:14.275005 |
2025-03-23 22:08:14.275484 | TASK [stage-output : Make all log files readable]
2025-03-23 22:08:14.550005 | orchestrator | ok
2025-03-23 22:08:14.559273 |
2025-03-23 22:08:14.559379 | TASK [stage-output : Rename log files that match extensions_to_txt]
2025-03-23 22:08:14.604470 | orchestrator | skipping: Conditional result was False
2025-03-23 22:08:14.621587 |
2025-03-23 22:08:14.621744 | TASK [stage-output : Discover log files for compression]
2025-03-23 22:08:14.657710 | orchestrator | skipping: Conditional result was False
2025-03-23 22:08:14.675346 |
2025-03-23 22:08:14.675476 | LOOP [stage-output : Archive everything from logs]
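The stage-output tasks above boil down to three things: build a list of source-to-destination pairs, make sure the docs/artifacts/logs target folders exist, and make the staged files readable so they can be collected afterwards. A minimal sketch of that staging step, assuming a simple list of sources and an illustrative ~/zuul-output layout (not the role's actual variables):

    # Hypothetical staging step: ensure target folders exist, copy sources,
    # and make the staged log files world-readable for later collection.
    import os
    import shutil
    import stat

    STAGE_DIRS = {"docs": "~/zuul-output/docs",
                  "artifacts": "~/zuul-output/artifacts",
                  "logs": "~/zuul-output/logs"}  # illustrative layout

    def stage(sources):
        """sources: list of {'src': path, 'dest': 'logs'|'artifacts'|'docs'}."""
        for target in STAGE_DIRS.values():
            os.makedirs(os.path.expanduser(target), exist_ok=True)
        for item in sources:
            dest_dir = os.path.expanduser(STAGE_DIRS[item["dest"]])
            dest = os.path.join(dest_dir, os.path.basename(item["src"]))
            if os.path.isdir(item["src"]):
                shutil.copytree(item["src"], dest, dirs_exist_ok=True)
            else:
                shutil.copy2(item["src"], dest)
        # "Make all log files readable"
        for root, _, files in os.walk(os.path.expanduser(STAGE_DIRS["logs"])):
            for name in files:
                path = os.path.join(root, name)
                mode = os.stat(path).st_mode
                os.chmod(path, mode | stat.S_IRUSR | stat.S_IRGRP | stat.S_IROTH)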
2025-03-23 22:08:14.760776 |
2025-03-23 22:08:14.760931 | PLAY [Post cleanup play]
2025-03-23 22:08:14.785792 |
2025-03-23 22:08:14.785907 | TASK [Set cloud fact (Zuul deployment)]
2025-03-23 22:08:14.853297 | orchestrator | ok
2025-03-23 22:08:14.863815 |
2025-03-23 22:08:14.863915 | TASK [Set cloud fact (local deployment)]
2025-03-23 22:08:14.897910 | orchestrator | skipping: Conditional result was False
2025-03-23 22:08:14.910776 |
2025-03-23 22:08:14.910895 | TASK [Clean the cloud environment]
2025-03-23 22:08:15.519462 | orchestrator | 2025-03-23 22:08:15 - clean up servers
2025-03-23 22:08:16.348992 | orchestrator | 2025-03-23 22:08:16 - testbed-manager
2025-03-23 22:08:16.442008 | orchestrator | 2025-03-23 22:08:16 - testbed-node-1
2025-03-23 22:08:16.533002 | orchestrator | 2025-03-23 22:08:16 - testbed-node-0
2025-03-23 22:08:16.636353 | orchestrator | 2025-03-23 22:08:16 - testbed-node-4
2025-03-23 22:08:16.734326 | orchestrator | 2025-03-23 22:08:16 - testbed-node-3
2025-03-23 22:08:16.834204 | orchestrator | 2025-03-23 22:08:16 - testbed-node-2
2025-03-23 22:08:16.930518 | orchestrator | 2025-03-23 22:08:16 - testbed-node-5
2025-03-23 22:08:17.030391 | orchestrator | 2025-03-23 22:08:17 - clean up keypairs
2025-03-23 22:08:17.051445 | orchestrator | 2025-03-23 22:08:17 - testbed
2025-03-23 22:08:17.080177 | orchestrator | 2025-03-23 22:08:17 - wait for servers to be gone
2025-03-23 22:08:30.969717 | orchestrator | 2025-03-23 22:08:30 - clean up ports
2025-03-23 22:08:31.183657 | orchestrator | 2025-03-23 22:08:31 - 0342b2ff-3137-484e-b076-3972428420ea
2025-03-23 22:08:31.460135 | orchestrator | 2025-03-23 22:08:31 - 41c27fe4-dee6-403b-9d84-7c9a1140fe25
2025-03-23 22:08:31.728951 | orchestrator | 2025-03-23 22:08:31 - 7abb6e45-c7c7-4ee1-b3e9-af7be1538613
2025-03-23 22:08:32.011555 | orchestrator | 2025-03-23 22:08:32 - 917c662b-ba88-4d93-8d2d-3b4e084a7faa
2025-03-23 22:08:32.365830 | orchestrator | 2025-03-23 22:08:32 - 9cbb8f06-6f3c-4982-89ba-6c1f49243bb7
2025-03-23 22:08:32.571157 | orchestrator | 2025-03-23 22:08:32 - bde1bfc6-e8bf-4a2c-a7f6-d75d46467f8b
2025-03-23 22:08:32.858487 | orchestrator | 2025-03-23 22:08:32 - e61436fa-3f64-4b99-a28e-47af35fdbc09
2025-03-23 22:08:33.152222 | orchestrator | 2025-03-23 22:08:33 - clean up volumes
2025-03-23 22:08:33.281293 | orchestrator | 2025-03-23 22:08:33 - testbed-volume-3-node-base
2025-03-23 22:08:33.327708 | orchestrator | 2025-03-23 22:08:33 - testbed-volume-1-node-base
2025-03-23 22:08:33.368607 | orchestrator | 2025-03-23 22:08:33 - testbed-volume-0-node-base
2025-03-23 22:08:33.405819 | orchestrator | 2025-03-23 22:08:33 - testbed-volume-4-node-base
2025-03-23 22:08:33.445541 | orchestrator | 2025-03-23 22:08:33 - testbed-volume-5-node-base
2025-03-23 22:08:33.490173 | orchestrator | 2025-03-23 22:08:33 - testbed-volume-2-node-base
2025-03-23 22:08:33.531755 | orchestrator | 2025-03-23 22:08:33 - testbed-volume-manager-base
2025-03-23 22:08:33.571147 | orchestrator | 2025-03-23 22:08:33 - testbed-volume-12-node-0
2025-03-23 22:08:33.616352 | orchestrator | 2025-03-23 22:08:33 - testbed-volume-6-node-0
2025-03-23 22:08:33.658721 | orchestrator | 2025-03-23 22:08:33 - testbed-volume-7-node-1
2025-03-23 22:08:33.699372 | orchestrator | 2025-03-23 22:08:33 - testbed-volume-0-node-0
2025-03-23 22:08:33.742079 | orchestrator | 2025-03-23 22:08:33 - testbed-volume-8-node-2
2025-03-23 22:08:33.787933 | orchestrator | 2025-03-23 22:08:33 - testbed-volume-11-node-5
2025-03-23 22:08:33.829060 | orchestrator | 2025-03-23 22:08:33 - testbed-volume-17-node-5
2025-03-23 22:08:33.875259 | orchestrator | 2025-03-23 22:08:33 - testbed-volume-3-node-3
2025-03-23 22:08:33.918932 | orchestrator | 2025-03-23 22:08:33 - testbed-volume-4-node-4
2025-03-23 22:08:33.959122 | orchestrator | 2025-03-23 22:08:33 - testbed-volume-16-node-4
2025-03-23 22:08:33.999072 | orchestrator | 2025-03-23 22:08:33 - testbed-volume-9-node-3
2025-03-23 22:08:34.043084 | orchestrator | 2025-03-23 22:08:34 - testbed-volume-10-node-4
2025-03-23 22:08:34.088531 | orchestrator | 2025-03-23 22:08:34 - testbed-volume-1-node-1
2025-03-23 22:08:34.135223 | orchestrator | 2025-03-23 22:08:34 - testbed-volume-13-node-1
2025-03-23 22:08:34.182189 | orchestrator | 2025-03-23 22:08:34 - testbed-volume-15-node-3
2025-03-23 22:08:34.228038 | orchestrator | 2025-03-23 22:08:34 - testbed-volume-2-node-2
2025-03-23 22:08:34.272839 | orchestrator | 2025-03-23 22:08:34 - testbed-volume-14-node-2
2025-03-23 22:08:34.327221 | orchestrator | 2025-03-23 22:08:34 - testbed-volume-5-node-5
2025-03-23 22:08:34.373019 | orchestrator | 2025-03-23 22:08:34 - disconnect routers
2025-03-23 22:08:34.443964 | orchestrator | 2025-03-23 22:08:34 - testbed
2025-03-23 22:08:35.159630 | orchestrator | 2025-03-23 22:08:35 - clean up subnets
2025-03-23 22:08:35.376772 | orchestrator | 2025-03-23 22:08:35 - subnet-testbed-management
2025-03-23 22:08:35.509205 | orchestrator | 2025-03-23 22:08:35 - clean up networks
2025-03-23 22:08:35.704071 | orchestrator | 2025-03-23 22:08:35 - net-testbed-management
2025-03-23 22:08:36.706187 | orchestrator | 2025-03-23 22:08:36 - clean up security groups
2025-03-23 22:08:36.746245 | orchestrator | 2025-03-23 22:08:36 - testbed-management
2025-03-23 22:08:36.842205 | orchestrator | 2025-03-23 22:08:36 - testbed-node
2025-03-23 22:08:36.958240 | orchestrator | 2025-03-23 22:08:36 - clean up floating ips
2025-03-23 22:08:36.994157 | orchestrator | 2025-03-23 22:08:36 - 81.163.193.116
2025-03-23 22:08:37.454262 | orchestrator | 2025-03-23 22:08:37 - clean up routers
2025-03-23 22:08:37.501831 | orchestrator | 2025-03-23 22:08:37 - testbed
2025-03-23 22:08:38.467136 | orchestrator | changed
2025-03-23 22:08:38.511088 |
2025-03-23 22:08:38.511215 | PLAY RECAP
2025-03-23 22:08:38.511270 | orchestrator | ok: 6 changed: 2 unreachable: 0 failed: 0 skipped: 7 rescued: 0 ignored: 0
2025-03-23 22:08:38.511296 |
2025-03-23 22:08:38.615655 | POST-RUN END RESULT_NORMAL: [untrusted : github.com/osism/testbed/playbooks/post.yml@main]
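The "Clean the cloud environment" task above tears resources down in dependency order: servers and keypairs first, then ports and volumes, then router interfaces, subnets, networks, security groups, floating IPs, and finally the router itself. A minimal openstacksdk sketch of that order follows; the "testbed" cloud name, the name prefix filters, and the floating-IP selection are assumptions for illustration, not the actual OSISM cleanup script, and error handling plus the "wait for servers to be gone" polling are omitted.

    # Hypothetical openstacksdk sketch of the teardown order shown above.
    import openstack

    conn = openstack.connect(cloud="testbed")  # assumed clouds.yaml entry
    prefix = "testbed"

    for server in conn.compute.servers():
        if server.name.startswith(prefix):
            conn.compute.delete_server(server)           # clean up servers
    for keypair in conn.compute.keypairs():
        if keypair.name.startswith(prefix):
            conn.compute.delete_keypair(keypair)         # clean up keypairs
    # ... wait for servers to be gone ...
    net = conn.network.find_network(f"net-{prefix}-management")
    if net:
        for port in conn.network.ports(network_id=net.id):
            conn.network.delete_port(port)               # clean up ports
    for volume in conn.block_storage.volumes():
        if (volume.name or "").startswith(prefix):
            conn.block_storage.delete_volume(volume)     # clean up volumes
    for router in conn.network.routers():                # disconnect routers
        if router.name.startswith(prefix):
            for subnet in conn.network.subnets():
                if subnet.name.startswith(f"subnet-{prefix}"):
                    conn.network.remove_interface_from_router(router, subnet_id=subnet.id)
    for subnet in conn.network.subnets():
        if subnet.name.startswith(f"subnet-{prefix}"):
            conn.network.delete_subnet(subnet)           # clean up subnets
    for network in conn.network.networks():
        if network.name.startswith(f"net-{prefix}"):
            conn.network.delete_network(network)         # clean up networks
    for group in conn.network.security_groups():
        if group.name.startswith(prefix):
            conn.network.delete_security_group(group)    # clean up security groups
    for ip in conn.network.ips():                        # clean up floating ips
        if ip.status == "DOWN":                          # assumption: drop unattached IPs
            conn.network.delete_ip(ip)
    for router in conn.network.routers():
        if router.name.startswith(prefix):
            conn.network.delete_router(router)           # clean up routers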
2025-03-23 22:08:38.623854 | POST-RUN START: [trusted : github.com/osism/zuul-config/playbooks/base/post-fetch.yaml@main]
2025-03-23 22:08:39.370753 |
2025-03-23 22:08:39.370899 | PLAY [Base post-fetch]
2025-03-23 22:08:39.400122 |
2025-03-23 22:08:39.400285 | TASK [fetch-output : Set log path for multiple nodes]
2025-03-23 22:08:39.467326 | orchestrator | skipping: Conditional result was False
2025-03-23 22:08:39.485285 |
2025-03-23 22:08:39.485471 | TASK [fetch-output : Set log path for single node]
2025-03-23 22:08:39.549740 | orchestrator | ok
2025-03-23 22:08:39.558847 |
2025-03-23 22:08:39.558959 | LOOP [fetch-output : Ensure local output dirs]
2025-03-23 22:08:40.020518 | orchestrator -> localhost | ok: "/var/lib/zuul/builds/86516234068b4cbdb5f6e1ee4173655a/work/logs"
2025-03-23 22:08:40.316300 | orchestrator -> localhost | changed: "/var/lib/zuul/builds/86516234068b4cbdb5f6e1ee4173655a/work/artifacts"
2025-03-23 22:08:40.598378 | orchestrator -> localhost | changed: "/var/lib/zuul/builds/86516234068b4cbdb5f6e1ee4173655a/work/docs"
2025-03-23 22:08:40.621557 |
2025-03-23 22:08:40.621702 | LOOP [fetch-output : Collect logs, artifacts and docs]
2025-03-23 22:08:41.436035 | orchestrator | changed: .d..t...... ./
2025-03-23 22:08:41.436341 | orchestrator | changed: All items complete
2025-03-23 22:08:41.436451 |
2025-03-23 22:08:42.028512 | orchestrator | changed: .d..t...... ./
2025-03-23 22:08:42.645110 | orchestrator | changed: .d..t...... ./
2025-03-23 22:08:42.673635 |
2025-03-23 22:08:42.673758 | LOOP [merge-output-to-logs : Move artifacts and docs to logs dir]
2025-03-23 22:08:42.719036 | orchestrator | skipping: Conditional result was False
2025-03-23 22:08:42.725940 | orchestrator | skipping: Conditional result was False
2025-03-23 22:08:42.779993 |
2025-03-23 22:08:42.780098 | PLAY RECAP
2025-03-23 22:08:42.780174 | orchestrator | ok: 3 changed: 2 unreachable: 0 failed: 0 skipped: 2 rescued: 0 ignored: 0
2025-03-23 22:08:42.780207 |
2025-03-23 22:08:42.895672 | POST-RUN END RESULT_NORMAL: [trusted : github.com/osism/zuul-config/playbooks/base/post-fetch.yaml@main]
2025-03-23 22:08:42.898943 | POST-RUN START: [trusted : github.com/osism/zuul-config/playbooks/base/post.yaml@main]
2025-03-23 22:08:43.580766 |
2025-03-23 22:08:43.580932 | PLAY [Base post]
2025-03-23 22:08:43.609361 |
2025-03-23 22:08:43.609494 | TASK [remove-build-sshkey : Remove the build SSH key from all nodes]
2025-03-23 22:08:44.423354 | orchestrator | changed
2025-03-23 22:08:44.460227 |
2025-03-23 22:08:44.460335 | PLAY RECAP
2025-03-23 22:08:44.460419 | orchestrator | ok: 1 changed: 1 unreachable: 0 failed: 0 skipped: 0 rescued: 0 ignored: 0
2025-03-23 22:08:44.460486 |
2025-03-23 22:08:44.565765 | POST-RUN END RESULT_NORMAL: [trusted : github.com/osism/zuul-config/playbooks/base/post.yaml@main]
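The remove-build-sshkey step above revokes the temporary per-build key on every node. Conceptually this is just filtering the key's entry out of the user's authorized_keys; a minimal, hypothetical sketch follows (the role itself does this through Ansible rather than a script, and the key comment is illustrative).

    # Hypothetical sketch: drop a build key's entry from ~/.ssh/authorized_keys.
    import os

    def remove_build_key(comment="build-sshkey"):  # illustrative key comment
        path = os.path.expanduser("~/.ssh/authorized_keys")
        with open(path) as handle:
            lines = handle.readlines()
        kept = [line for line in lines if comment not in line]
        if kept != lines:
            with open(path, "w") as handle:
                handle.writelines(kept)
        return len(lines) - len(kept)  # number of entries removed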
2025-03-23 22:08:44.573674 | POST-RUN START: [trusted : github.com/osism/zuul-config/playbooks/base/post-logs.yaml@main]
2025-03-23 22:08:45.329698 |
2025-03-23 22:08:45.329848 | PLAY [Base post-logs]
2025-03-23 22:08:45.345752 |
2025-03-23 22:08:45.345872 | TASK [generate-zuul-manifest : Generate Zuul manifest]
2025-03-23 22:08:45.798237 | localhost | changed
2025-03-23 22:08:45.805365 |
2025-03-23 22:08:45.805564 | TASK [generate-zuul-manifest : Return Zuul manifest URL to Zuul]
2025-03-23 22:08:45.848583 | localhost | ok
2025-03-23 22:08:45.859275 |
2025-03-23 22:08:45.859421 | TASK [Set zuul-log-path fact]
2025-03-23 22:08:45.893035 | localhost | ok
2025-03-23 22:08:45.911740 |
2025-03-23 22:08:45.911871 | TASK [set-zuul-log-path-fact : Set log path for a build]
2025-03-23 22:08:45.950676 | localhost | ok
2025-03-23 22:08:45.958067 |
2025-03-23 22:08:45.958208 | TASK [upload-logs : Create log directories]
2025-03-23 22:08:46.463441 | localhost | changed
2025-03-23 22:08:46.471512 |
2025-03-23 22:08:46.471681 | TASK [upload-logs : Ensure logs are readable before uploading]
2025-03-23 22:08:46.969288 | localhost -> localhost | ok: Runtime: 0:00:00.006887
2025-03-23 22:08:46.980569 |
2025-03-23 22:08:46.980736 | TASK [upload-logs : Upload logs to log server]
2025-03-23 22:08:47.545963 | localhost | Output suppressed because no_log was given
2025-03-23 22:08:47.552335 |
2025-03-23 22:08:47.552550 | LOOP [upload-logs : Compress console log and json output]
2025-03-23 22:08:47.626844 | localhost | skipping: Conditional result was False
2025-03-23 22:08:47.643625 | localhost | skipping: Conditional result was False
2025-03-23 22:08:47.655794 |
2025-03-23 22:08:47.655947 | LOOP [upload-logs : Upload compressed console log and json output]
2025-03-23 22:08:47.725639 | localhost | skipping: Conditional result was False
2025-03-23 22:08:47.726358 |
2025-03-23 22:08:47.737948 | localhost | skipping: Conditional result was False
2025-03-23 22:08:47.752867 |
2025-03-23 22:08:47.753065 | LOOP [upload-logs : Upload console log and json output]
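The compression and compressed-upload loops above were skipped in this run. When enabled, that step amounts to gzipping the rendered console log and its JSON counterpart before they are pushed to the log server. A minimal sketch, with file names assumed for illustration rather than taken from the role:

    # Hypothetical compression step for the console output before upload.
    import gzip
    import shutil

    def compress(path):
        """Write path.gz next to the original and return the new file name."""
        target = path + ".gz"
        with open(path, "rb") as src, gzip.open(target, "wb") as dst:
            shutil.copyfileobj(src, dst)
        return target

    for name in ("job-output.txt", "job-output.json"):  # assumed file names
        compress(name)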