From mboxrd@z Thu Jan  1 00:00:00 1970
Return-Path: <f.ebner@proxmox.com>
Received: from firstgate.proxmox.com (firstgate.proxmox.com [212.224.123.68])
 (using TLSv1.3 with cipher TLS_AES_256_GCM_SHA384 (256/256 bits)
 key-exchange X25519 server-signature RSA-PSS (2048 bits))
 (No client certificate requested)
 by lists.proxmox.com (Postfix) with ESMTPS id B139FBA09A
 for <pve-devel@lists.proxmox.com>; Wed, 13 Dec 2023 15:18:21 +0100 (CET)
Received: from firstgate.proxmox.com (localhost [127.0.0.1])
 by firstgate.proxmox.com (Proxmox) with ESMTP id 92B42B1B0
 for <pve-devel@lists.proxmox.com>; Wed, 13 Dec 2023 15:17:51 +0100 (CET)
Received: from proxmox-new.maurer-it.com (proxmox-new.maurer-it.com
 [94.136.29.106])
 (using TLSv1.3 with cipher TLS_AES_256_GCM_SHA384 (256/256 bits)
 key-exchange X25519 server-signature RSA-PSS (2048 bits))
 (No client certificate requested)
 by firstgate.proxmox.com (Proxmox) with ESMTPS
 for <pve-devel@lists.proxmox.com>; Wed, 13 Dec 2023 15:17:50 +0100 (CET)
Received: from proxmox-new.maurer-it.com (localhost.localdomain [127.0.0.1])
 by proxmox-new.maurer-it.com (Proxmox) with ESMTP id 4A38E47126
 for <pve-devel@lists.proxmox.com>; Wed, 13 Dec 2023 15:17:50 +0100 (CET)
From: Fiona Ebner <f.ebner@proxmox.com>
To: pve-devel@lists.proxmox.com
Date: Wed, 13 Dec 2023 15:17:46 +0100
Message-Id: <20231213141747.679613-3-f.ebner@proxmox.com>
X-Mailer: git-send-email 2.39.2
In-Reply-To: <20231213141747.679613-1-f.ebner@proxmox.com>
References: <20231213141747.679613-1-f.ebner@proxmox.com>
MIME-Version: 1.0
Content-Transfer-Encoding: 8bit
X-SPAM-LEVEL: Spam detection results:  0
 AWL -0.077 Adjusted score from AWL reputation of From: address
 BAYES_00                 -1.9 Bayes spam probability is 0 to 1%
 DMARC_MISSING             0.1 Missing DMARC policy
 KAM_DMARC_STATUS 0.01 Test Rule for DKIM or SPF Failure with Strict Alignment
 SPF_HELO_NONE           0.001 SPF: HELO does not publish an SPF Record
 SPF_PASS               -0.001 SPF: sender matches SPF record
 T_SCC_BODY_TEXT_LINE    -0.01 -
 URIBL_BLOCKED 0.001 ADMINISTRATOR NOTICE: The query to URIBL was blocked. See
 http://wiki.apache.org/spamassassin/DnsBlocklists#dnsbl-block for more
 information. [replication.pm]
Subject: [pve-devel] [PATCH v2 guest-common 2/3] replication: find common
 base: improve error when no common base snapshot exists
X-BeenThere: pve-devel@lists.proxmox.com
X-Mailman-Version: 2.1.29
Precedence: list
List-Id: Proxmox VE development discussion <pve-devel.lists.proxmox.com>
List-Unsubscribe: <https://lists.proxmox.com/cgi-bin/mailman/options/pve-devel>, 
 <mailto:pve-devel-request@lists.proxmox.com?subject=unsubscribe>
List-Archive: <http://lists.proxmox.com/pipermail/pve-devel/>
List-Post: <mailto:pve-devel@lists.proxmox.com>
List-Help: <mailto:pve-devel-request@lists.proxmox.com?subject=help>
List-Subscribe: <https://lists.proxmox.com/cgi-bin/mailman/listinfo/pve-devel>, 
 <mailto:pve-devel-request@lists.proxmox.com?subject=subscribe>
X-List-Received-Date: Wed, 13 Dec 2023 14:18:21 -0000

Suggest an alternative solution: removing the problematic volumes
from the replication target rather than deleting and re-creating the
whole job.

This is helpful when there are multiple replicated volumes, because it
often avoids the need to fully re-sync all of them.

Signed-off-by: Fiona Ebner <f.ebner@proxmox.com>
---

No changes in v2.
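
For illustration, with the patch applied the new error for a job with
two affected volumes would look roughly as follows (the volume IDs and
job ID below are made up for the example):

    No common base snapshot on volume(s) local-zfs:vm-100-disk-0,local-zfs:vm-100-disk-1
    Please remove the problematic volume(s) from the replication target or delete and re-create the whole job 100-0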

 src/PVE/Replication.pm | 15 ++++++++++-----
 1 file changed, 10 insertions(+), 5 deletions(-)

diff --git a/src/PVE/Replication.pm b/src/PVE/Replication.pm
index 05c2632..984ea34 100644
--- a/src/PVE/Replication.pm
+++ b/src/PVE/Replication.pm
@@ -53,6 +53,7 @@ sub find_common_replication_snapshot {
     );
 
     my $base_snapshots = {};
+    my @missing_snapshots = ();
 
     foreach my $volid (@$volumes) {
 	my $local_info = $local_snapshots->{$volid};
@@ -91,15 +92,19 @@ sub find_common_replication_snapshot {
 		    next;
 		}
 
-		# The volume exists on the remote side, so trying a full sync won't work.
-		# Die early with a clean error.
-		die "No common base to restore the job state\n".
-		    "please delete jobid: $jobid and create the job again\n"
-		    if !defined($base_snapshots->{$volid});
+		push @missing_snapshots, $volid if !defined($base_snapshots->{$volid});
 	    }
 	}
     }
 
+    if (scalar(@missing_snapshots) > 0) {
+	# There exist volumes without a common base snapshot on the remote side.
+	# Neither an incremental nor a full sync will work, so die early with a clean error.
+	my $volume_string = join(',', @missing_snapshots);
+	die "No common base snapshot on volume(s) $volume_string\nPlease remove the problematic " .
+	    "volume(s) from the replication target or delete and re-create the whole job $jobid\n";
+    }
+
     return ($base_snapshots, $local_snapshots, $last_sync_snapname);
 }
 
-- 
2.39.2