
nfs-server module bug: "auto_delete_disk" option not working #3566

Open
scott-nag opened this issue Jan 20, 2025 · 0 comments
Labels
bug Something isn't working

Comments

@scott-nag
Collaborator

Describe the bug

Relevant documentation

Docs state: "NOTE: auto_delete_disk is set to true in this example, which means that running terraform destroy also deletes the disk. To retain the disk after terraform destroy either set this to false or don't include the settings so it defaults to false. Note that with auto_delete_disk: false, you will need to manually delete the disk after destroying a deployment group with nfs-server."

However, the shared volume is deleted even though auto_delete_disk: false is set in the blueprint.

I do note that the docs also state: "WARNING: This module has only been tested against the HPC centos7 OS disk image (the default). Using other images may work, but have not been verified." I have additionally tested with an older CentOS 7 image, with the same result (the disk is deleted).
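For context, here is a minimal sketch of the pattern in which this behavior can arise. This is a hypothetical illustration, not the nfs-server module's actual source: if the share disk is created as its own google_compute_disk resource in the same Terraform state as the rest of the deployment, terraform destroy removes it along with everything else in state, regardless of any instance-level auto-delete flag (which only governs what happens when the instance itself is deleted).

```hcl
# Hypothetical sketch -- NOT the nfs-server module's actual code.
# A disk declared as its own resource lives in Terraform state, so
# `terraform destroy` deletes it no matter how it is attached.
resource "google_compute_disk" "nfs_share" {
  name = "nfs-share-disk" # assumed name, for illustration only
  size = 100
  zone = var.zone
}

resource "google_compute_instance" "nfs_server" {
  name         = "nfs-server"
  machine_type = "n2-standard-2"
  zone         = var.zone

  boot_disk {
    # auto_delete here only controls deletion when the *instance* goes away,
    # e.g. outside of Terraform; it does not protect the disk from `destroy`.
    auto_delete = true
    initialize_params {
      image = "projects/cloud-hpc-image-public/global/images/family/hpc-centos-7"
    }
  }

  attached_disk {
    source = google_compute_disk.nfs_share.self_link
  }

  network_interface {
    network = "default"
  }
}
```

If auto_delete_disk: false is intended to preserve the disk across terraform destroy, the module would presumably need something stronger than an attach-time flag, e.g. a lifecycle { prevent_destroy = true } block on the disk resource, or managing the disk outside the deployment group's state. That is speculation on my part, offered only to help narrow down where the flag may be getting lost.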

Steps to reproduce

Steps to reproduce the behavior:

  1. Create a new cluster with a share created by the nfs-server module (example below)
  2. Delete the cluster
  3. Once the cluster is deleted, the volume is gone too, despite 'auto_delete_disk' being explicitly set to false
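The steps above can be verified from the CLI. A sketch, assuming the usual Cluster Toolkit workflow (the exact gcluster invocations, blueprint filename, and disk naming are assumptions, not taken from my actual session):

```shell
# 1. Create and deploy the cluster from the blueprint (filename assumed)
./gcluster deploy hpc-slurm.yaml

# 2. Confirm the nfs-server share disk exists before teardown
gcloud compute disks list --zones=us-central1-a

# 3. Destroy the deployment
./gcluster destroy hpc-slurm

# 4. List disks again: the share disk no longer appears, even though
#    auto_delete_disk was explicitly set to false in the blueprint
gcloud compute disks list --zones=us-central1-a
```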

Expected behavior

The volume should remain for future use.

Actual behavior

The volume is deleted.

Version (gcluster --version)

Built from 'develop' branch.
Commit info: v1.45.0-25-gf5349cac
Terraform version: 1.9.8

Blueprint

If applicable, attach or paste the blueprint YAML used to produce the bug.

---

blueprint_name: hpc-slurm

vars:
  project_id: your-project-here
  deployment_name: hpc-slurm
  region: us-central1
  zone: us-central1-a

deployment_groups:
- group: primary
  modules:
  - id: network
    source: modules/network/vpc

  - id: homefs
    source: community/modules/file-system/nfs-server
    use: [network]
    settings:
      auto_delete_disk: false
      local_mounts: [/home]
      # instance_image:
      #   family: schedmd-slurm-21-08-8-hpc-centos-7
      #   project: schedmd-slurm-public

  - id: debug_nodeset
    source: community/modules/compute/schedmd-slurm-gcp-v6-nodeset
    use: [network]
    settings:
      node_count_dynamic_max: 4
      machine_type: n2-standard-2
      enable_placement: false
      allow_automatic_updates: false
      instance_image_custom: false
      instance_image:
        project: schedmd-slurm-public
        family: slurm-gcp-6-8-hpc-rocky-linux-8

  - id: debug_partition
    source: community/modules/compute/schedmd-slurm-gcp-v6-partition
    use:
    - debug_nodeset
    settings:
      partition_name: debug
      exclusive: false
      is_default: true

  - id: slurm_login
    source: community/modules/scheduler/schedmd-slurm-gcp-v6-login
    use: [network]
    settings:
      machine_type: n2-standard-4
      enable_login_public_ips: true

  - id: slurm_controller
    source: community/modules/scheduler/schedmd-slurm-gcp-v6-controller
    use:
    - network
    - debug_partition
    - homefs
    - slurm_login
    settings:
      enable_controller_public_ips: true

Output and logs

N/A

Screenshots

N/A

Execution environment

  • OS: macOS
  • Shell: zsh
  • go version: go version go1.23.4 darwin/arm64

If there are any questions, clarifications, or further testing I can provide, please let me know. Many thanks.

@scott-nag scott-nag added the bug Something isn't working label Jan 20, 2025
@abbas1902 abbas1902 self-assigned this Feb 6, 2025
@abbas1902 abbas1902 removed their assignment Feb 13, 2025