Describe the bug
I am creating a new instance and attaching a block storage device to the instance at the same time.
I get this error:

```
Error: error attaching block storage (9e616948-a562-47d4-8876-3fe205d4fb3d): {"error":"unable to attach: Server is currently locked","status":400}

  with vultr_block_storage.my_block_storage,
  on vultr_instance.tf line 43, in resource "vultr_block_storage" "my_block_storage":
  43: resource "vultr_block_storage" "my_block_storage" {
```
To Reproduce
I believe the only relevant factor is creating an instance and attaching a block storage device to it simultaneously.
The attach sometimes succeeds, however.
Expected behavior
The attach operation should wait until the server is ready (not locked)
Desktop (please complete the following information where applicable):
- OS: Ubuntu
- Terraform: 1.8
- vultr provider: 2.19.0
Also seeing this on OpenTofu v1.8.1 with registry.terraform.io/vultr/vultr v2.21.0. I tried adding the `time_sleep` stanzas shown below as a workaround, which makes it somewhat more reliable, but we often still need to re-apply after a short delay. I suspect the workaround only acts as a delay and sometimes isn't long enough.
EDIT - After adding another sleep, our issue now appears to be that the server is locked when attaching multiple block storage volumes in one shot.
```hcl
# Pause for 120s to allow all servers to become unlocked
resource "time_sleep" "wait_120_seconds" {
  create_duration  = "120s"
  destroy_duration = "120s"
}

# Provision and attach blockstorage
# Blockstorage for k8s-internal ceph cluster on control plane nodes
resource "vultr_block_storage" "control_plane_instance" {
  depends_on = [time_sleep.wait_120_seconds, vultr_instance.control_plane_instance]
  count      = length(vultr_instance.control_plane_instance) * var.CONTROL_PLANE_CEPH_BLOCK_COUNT
  label      = "${vultr_instance.control_plane_instance[floor(count.index / var.CONTROL_PLANE_CEPH_BLOCK_COUNT)].label}"
  size_gb    = var.CONTROL_PLANE_CEPH_BLOCK_SIZE
  region     = var.REGION
  attached_to_instance = vultr_instance.control_plane_instance[floor(count.index / var.CONTROL_PLANE_CEPH_BLOCK_COUNT)].id
  block_type = var.BLOCK_TYPE
  live       = true
}
```
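The `floor(count.index / var.CONTROL_PLANE_CEPH_BLOCK_COUNT)` expression above maps each flat volume index onto its owning instance, so N instances each get `CONTROL_PLANE_CEPH_BLOCK_COUNT` volumes. A small Go sketch of the same arithmetic (the instance labels and block count here are hypothetical stand-ins):

```go
package main

import "fmt"

// instanceIndex maps a flat volume index to the owning instance index,
// mirroring floor(count.index / var.CONTROL_PLANE_CEPH_BLOCK_COUNT) in HCL.
func instanceIndex(volumeIdx, blocksPerInstance int) int {
	// Go integer division truncates, which equals floor() for non-negative values.
	return volumeIdx / blocksPerInstance
}

func main() {
	const blocksPerInstance = 3 // stand-in for var.CONTROL_PLANE_CEPH_BLOCK_COUNT
	instances := []string{"cp-0", "cp-1"}
	for i := 0; i < len(instances)*blocksPerInstance; i++ {
		// volumes 0-2 -> cp-0, volumes 3-5 -> cp-1
		fmt.Printf("volume %d -> %s\n", i, instances[instanceIndex(i, blocksPerInstance)])
	}
}
```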
Thus I think this is a more general issue centered on server lock status, not a simple race condition during instance creation.