From: "Shannon Sterz"
To: "Lorne Guse", "Proxmox VE development discussion"
Date: Thu, 02 Oct 2025 11:36:16 +0200
Subject: Re: [pve-devel] busy dataset when trying the migrate iscsi disk

Hi Lorne,

this is the development discussion list. If you need assistance with your configuration, please take a look at the user forum [1]. If you want to report a bug, please open a report in our Bugzilla [2].

Best regards,
Shannon

[1]: https://forum.proxmox.com/
[2]: https://bugzilla.proxmox.com/

On Mon Sep 15, 2025 at 5:34 AM CEST, Lorne Guse wrote:
> I'm working on TrueNAS over iSCSI for Proxmox and have run into an issue migrating disks. When trying to delete the old storage, which has just been successfully transferred, the iscsidirect connection must still be open, because we are getting:
>
> cannot destroy 'slow/vm-188-disk-0': dataset is busy
>
> Is there a way to ensure the iscsidirect connection is closed before trying to delete the underlying ZFS dataset?
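One workaround sketch for the transient "dataset is busy" failure (my own assumption, not part of any Proxmox or TrueNAS plugin — the function name and the retry/sleep values are made up for illustration) is to retry the destroy for a short while, giving the target time to release the handle after the iscsidirect connection closes:

```shell
#!/bin/sh
# Hypothetical retry wrapper: run a command up to N times, sleeping between
# attempts, so a transiently busy dataset can still be destroyed once the
# initiator-side handle has been released. Function name and retry counts
# are illustrative assumptions, not from the original thread.
run_with_retry() {
    attempts=$1; shift
    i=0
    while [ "$i" -lt "$attempts" ]; do
        if "$@"; then
            return 0            # command succeeded
        fi
        i=$((i + 1))
        sleep 1                 # give the target time to drop the open handle
    done
    return 1                    # still failing after all attempts
}

# Example using the paths from the task log below:
# run_with_retry 5 ssh -o BatchMode=yes -i /etc/pve/priv/zfs/vm-storage_id_rsa \
#     root@vm-storage zfs destroy -r slow/vm-188-disk-0
```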
>
> TrueNAS [INFO] : zvol/slow/vm-188-disk-0 with key 'path' found : /dev/zvol/slow/vm-188-disk-0
> TrueNAS [INFO] : zvol/slow/vm-188-disk-0 with key 'lunid' found : 3
> create full clone of drive scsi0 (iSCSI-vm-storage-HDD:vm-188-disk-0)
> TrueNAS [INFO] : /dev/zvol/fast/vm-188-disk-0
> TrueNAS [INFO] : Created LUN: /dev/zvol/fast/vm-188-disk-0 : T2:E236:L4
> TrueNAS [INFO] : zvol/fast/vm-188-disk-0 with key 'path' found : /dev/zvol/fast/vm-188-disk-0
> TrueNAS [INFO] : zvol/fast/vm-188-disk-0 with key 'lunid' found : 4
> drive mirror is starting for drive-scsi0
> mirror-scsi0: transferred 924.0 MiB of 32.0 GiB (2.82%) in 1s
> mirror-scsi0: transferred 1.8 GiB of 32.0 GiB (5.70%) in 2s
> mirror-scsi0: transferred 2.7 GiB of 32.0 GiB (8.56%) in 3s
> mirror-scsi0: transferred 3.7 GiB of 32.0 GiB (11.68%) in 4s
> mirror-scsi0: transferred 4.7 GiB of 32.0 GiB (14.54%) in 5s
> mirror-scsi0: transferred 5.5 GiB of 32.0 GiB (17.22%) in 6s
> mirror-scsi0: transferred 6.3 GiB of 32.0 GiB (19.71%) in 7s
> mirror-scsi0: transferred 7.1 GiB of 32.0 GiB (22.29%) in 8s
> mirror-scsi0: transferred 8.0 GiB of 32.0 GiB (24.97%) in 9s
> mirror-scsi0: transferred 8.8 GiB of 32.0 GiB (27.55%) in 10s
> mirror-scsi0: transferred 9.6 GiB of 32.0 GiB (30.06%) in 11s
> mirror-scsi0: transferred 10.6 GiB of 32.0 GiB (33.00%) in 12s
> mirror-scsi0: transferred 11.5 GiB of 32.0 GiB (36.04%) in 13s
> mirror-scsi0: transferred 12.5 GiB of 32.0 GiB (38.97%) in 14s
> mirror-scsi0: transferred 13.4 GiB of 32.0 GiB (41.72%) in 15s
> mirror-scsi0: transferred 14.2 GiB of 32.0 GiB (44.43%) in 16s
> mirror-scsi0: transferred 15.1 GiB of 32.0 GiB (47.25%) in 17s
> mirror-scsi0: transferred 16.1 GiB of 32.0 GiB (50.19%) in 18s
> mirror-scsi0: transferred 17.0 GiB of 32.0 GiB (53.15%) in 19s
> mirror-scsi0: transferred 18.0 GiB of 32.0 GiB (56.16%) in 20s
> mirror-scsi0: transferred 18.8 GiB of 32.0 GiB (58.88%) in 21s
> mirror-scsi0: transferred 19.8 GiB of 32.0 GiB (61.81%) in 22s
> mirror-scsi0: transferred 20.6 GiB of 32.0 GiB (64.42%) in 23s
> mirror-scsi0: transferred 21.5 GiB of 32.0 GiB (67.07%) in 24s
> mirror-scsi0: transferred 22.4 GiB of 32.0 GiB (70.02%) in 25s
> mirror-scsi0: transferred 23.3 GiB of 32.0 GiB (72.92%) in 26s
> mirror-scsi0: transferred 24.3 GiB of 32.0 GiB (75.80%) in 27s
> mirror-scsi0: transferred 25.1 GiB of 32.0 GiB (78.58%) in 28s
> mirror-scsi0: transferred 26.1 GiB of 32.0 GiB (81.62%) in 29s
> mirror-scsi0: transferred 27.1 GiB of 32.0 GiB (84.76%) in 30s
> mirror-scsi0: transferred 28.1 GiB of 32.0 GiB (87.73%) in 31s
> mirror-scsi0: transferred 29.1 GiB of 32.0 GiB (90.79%) in 32s
> mirror-scsi0: transferred 30.0 GiB of 32.0 GiB (93.73%) in 33s
> mirror-scsi0: transferred 30.9 GiB of 32.0 GiB (96.55%) in 34s
> mirror-scsi0: transferred 31.8 GiB of 32.0 GiB (99.44%) in 35s
> mirror-scsi0: transferred 32.0 GiB of 32.0 GiB (100.00%) in 36s, ready
> all 'mirror' jobs are ready
> mirror-scsi0: Completing block job...
> mirror-scsi0: Completed successfully.
> mirror-scsi0: mirror-job finished
> TrueNAS [INFO] : Ping
> TrueNAS [INFO] : Pong
> TrueNAS [INFO] : zvol/slow/vm-188-disk-0 with key 'path' found : /dev/zvol/slow/vm-188-disk-0
> TrueNAS [INFO] : Deleted LUN: zvol/slow/vm-188-disk-0
> cannot destroy 'slow/vm-188-disk-0': dataset is busy
> TrueNAS [INFO] : /dev/zvol/slow/vm-188-disk-0
> TrueNAS [INFO] : Created LUN: /dev/zvol/slow/vm-188-disk-0 : T5:E237:L3
> command '/usr/bin/ssh -o 'BatchMode=yes' -i /etc/pve/priv/zfs/vm-storage_id_rsa root@vm-storage zfs destroy -r slow/vm-188-disk-0' failed: exit code 1
> TASK OK

_______________________________________________
pve-devel mailing list
pve-devel@lists.proxmox.com
https://lists.proxmox.com/cgi-bin/mailman/listinfo/pve-devel