From mboxrd@z Thu Jan  1 00:00:00 1970
Return-Path: <pve-devel-bounces@lists.proxmox.com>
Received: from firstgate.proxmox.com (firstgate.proxmox.com [212.224.123.68])
	by lore.proxmox.com (Postfix) with ESMTPS id F146D1FF163
	for <inbox@lore.proxmox.com>; Thu, 12 Sep 2024 17:07:12 +0200 (CEST)
Received: from firstgate.proxmox.com (localhost [127.0.0.1])
	by firstgate.proxmox.com (Proxmox) with ESMTP id 12FB335424;
	Thu, 12 Sep 2024 17:07:13 +0200 (CEST)
Date: Thu, 12 Sep 2024 20:55:17 +0900
To: pve-devel@lists.proxmox.com
In-Reply-To: <20240912115047.1252907-1-jing@jing.rocks>
References: <20240912115047.1252907-1-jing@jing.rocks>
X-Mailman-Approved-At: Thu, 12 Sep 2024 17:07:10 +0200
MIME-Version: 1.0
Message-ID: <mailman.230.1726153631.414.pve-devel@lists.proxmox.com>
List-Id: Proxmox VE development discussion <pve-devel.lists.proxmox.com>
List-Post: <mailto:pve-devel@lists.proxmox.com>
From: Jing Luo via pve-devel <pve-devel@lists.proxmox.com>
Precedence: list
Cc: Jing Luo <jing@jing.rocks>
X-Mailman-Version: 2.1.29
X-BeenThere: pve-devel@lists.proxmox.com
List-Subscribe: <https://lists.proxmox.com/cgi-bin/mailman/listinfo/pve-devel>, 
 <mailto:pve-devel-request@lists.proxmox.com?subject=subscribe>
List-Unsubscribe: <https://lists.proxmox.com/cgi-bin/mailman/options/pve-devel>, 
 <mailto:pve-devel-request@lists.proxmox.com?subject=unsubscribe>
List-Archive: <http://lists.proxmox.com/pipermail/pve-devel/>
Reply-To: Proxmox VE development discussion <pve-devel@lists.proxmox.com>
List-Help: <mailto:pve-devel-request@lists.proxmox.com?subject=help>
Subject: Re: [pve-devel] [PATCH pve-manager] test: remove logs and add a
 .gitignore file
Content-Type: multipart/mixed; boundary="===============5462954458956954161=="
Errors-To: pve-devel-bounces@lists.proxmox.com
Sender: "pve-devel" <pve-devel-bounces@lists.proxmox.com>

This is an OpenPGP/MIME signed message (RFC 4880 and 3156)

--===============5462954458956954161==
Content-Type: message/rfc822
Content-Disposition: inline

Return-Path: <jing@jing.rocks>
X-Original-To: pve-devel@lists.proxmox.com
Delivered-To: pve-devel@lists.proxmox.com
Received: from firstgate.proxmox.com (firstgate.proxmox.com [212.224.123.68])
	(using TLSv1.3 with cipher TLS_AES_256_GCM_SHA384 (256/256 bits)
	 key-exchange X25519 server-signature RSA-PSS (2048 bits) server-digest SHA256)
	(No client certificate requested)
	by lists.proxmox.com (Postfix) with ESMTPS id 6CC93C255D
	for <pve-devel@lists.proxmox.com>; Thu, 12 Sep 2024 13:55:57 +0200 (CEST)
Received: from firstgate.proxmox.com (localhost [127.0.0.1])
	by firstgate.proxmox.com (Proxmox) with ESMTP id 4A5151FFF4
	for <pve-devel@lists.proxmox.com>; Thu, 12 Sep 2024 13:55:27 +0200 (CEST)
Received: from mail-gw3.jing.rocks (mail-gw3.jing.rocks [219.117.250.209])
	(using TLSv1.3 with cipher TLS_AES_256_GCM_SHA384 (256/256 bits)
	 key-exchange X25519 server-signature RSA-PSS (2048 bits))
	(No client certificate requested)
	by firstgate.proxmox.com (Proxmox) with ESMTPS
	for <pve-devel@lists.proxmox.com>; Thu, 12 Sep 2024 13:55:23 +0200 (CEST)
Received: from mail-gw3.jing.rocks (localhost [127.0.0.1])
	by mail-gw3.jing.rocks (Proxmox) with ESMTP id 59DCC27606
	for <pve-devel@lists.proxmox.com>; Thu, 12 Sep 2024 20:55:20 +0900 (JST)
DKIM-Signature: v=1; a=rsa-sha256; c=relaxed/relaxed; d=jing.rocks; h=cc
	:content-type:content-type:date:from:from:in-reply-to:message-id
	:mime-version:references:reply-to:subject:subject:to:to; s=
	mail-gw; bh=n2B6KPGehubL8XysG40XR76n77aBE4J13HwY1oiZ504=; b=KQ1G
	VySggUSo3IgwsxUkYPT9Zf9LIZ7fLGmVELNfhDNWQauAsNhL5SXcSdXClYJJAbQt
	lv9xslZH/TwqZoaUfoRe0X12sN4S2E2kcJsRMejvJ+oROMiVdimSegbRoI+R/9RO
	C7Ugtp02s/uUj8b0VJkeWiU+BQzZ/LJARTmsVT0mXCGx9X7qlEfXXvmgFPikovXN
	vv8NyNQyLDvwC9bQJVst0XRfwS90B0RdTe71IWIZEoSweDA92sL+solnMC306Z/R
	U5DUdML+HB8p+NmoKOET9WidIoTIlxwlwH4B6X00JDgNfVfz6CIBltjbDk7hbA1p
	EmJIviBZaWXYiNXa4A==
Received: from mail.jing.rocks (mail.jing.rocks [IPv6:240b:10:f00:1b00::222])
	(using TLSv1.3 with cipher TLS_AES_256_GCM_SHA384 (256/256 bits)
	 key-exchange X25519 server-signature RSA-PSS (4096 bits) server-digest SHA256)
	(No client certificate requested)
	by mail-gw3.jing.rocks (Proxmox) with ESMTPS id 1ACB127477
	for <pve-devel@lists.proxmox.com>; Thu, 12 Sep 2024 20:55:18 +0900 (JST)
DKIM-Signature: v=1; a=rsa-sha256; c=relaxed/simple; d=jing.rocks;
	s=default; t=1726142118;
	bh=PA5GsvsJTTIxmvjv/ilIudfZJdGLNJjvKLa9YuC8vNw=;
	h=Date:From:To:Subject:In-Reply-To:References:From;
	b=QRDwvYIeboNqOezJWcJThyG8H1xHLd/G/f00luyzGLIvB+pbCj6Up83AcuTzIfnK4
	 dcLuAbgbcC87EbrDym2Atdo0E41auKXxFs14WbT8NmF5lM9vxSAoco66EbB1XbIh3S
	 dtqxIi5lRZ6Y9dx1vZ6E/T3b5i4Qw0WmszbiQOp1JeLvesTo/fooXyKkIIrk7x97C4
	 SjLnTX22rpm1nCXW22M82a0krifGkr7sPDfOjm1oLAo9zwG6K7dcoe2L2DtQgfQ0e7
	 x8oomw3l/rrvOgKdwt42r+woPEtJoXM88VynlYn+XnIkw10kbgslIKduP3mjp/ONY1
	 XxAtivycTjtUw==
Received: from mail.jing.rocks (localhost [127.0.0.1])
	(Authenticated sender: jing@jing.rocks)
	by mail.jing.rocks (Postfix) with ESMTPSA id 0266A540DF
	for <pve-devel@lists.proxmox.com>; Thu, 12 Sep 2024 20:55:17 +0900 (JST)
MIME-Version: 1.0
Date: Thu, 12 Sep 2024 20:55:17 +0900
From: Jing Luo <jing@jing.rocks>
To: pve-devel@lists.proxmox.com
Subject: Re: [PATCH pve-manager] test: remove logs and add a .gitignore file
In-Reply-To: <20240912115047.1252907-1-jing@jing.rocks>
References: <20240912115047.1252907-1-jing@jing.rocks>
Message-ID: <5169c930387cfb176ed46149baf8b3f9@jing.rocks>
X-Sender: jing@jing.rocks
Content-Type: multipart/signed;
 protocol="application/pgp-signature";
 boundary="=_e1414c1e4b8865511c6f4f1690d2d7e1";
 micalg=pgp-sha512
X-SPAM-LEVEL: Spam detection results:  0
	AWL                    -0.007 Adjusted score from AWL reputation of From: address
	BAYES_00                 -1.9 Bayes spam probability is 0 to 1%
	DKIM_SIGNED               0.1 Message has a DKIM or DK signature, not necessarily valid
	DKIM_VALID               -0.1 Message has at least one valid DKIM or DK signature
	DKIM_VALID_AU            -0.1 Message has a valid DKIM or DK signature from author's domain
	DKIM_VALID_EF            -0.1 Message has a valid DKIM or DK signature from envelope-from domain
	DMARC_PASS               -0.1 DMARC pass policy
	KAM_INFOUSMEBIZ          0.75 Prevalent use of .info|.us|.me|.me.uk|.biz|xyz|id|rocks|life domains in spam/malware
	KAM_OTHER_BAD_TLD        0.75 Other untrustworthy TLDs
	RCVD_IN_VALIDITY_CERTIFIED_BLOCKED  0.001 ADMINISTRATOR NOTICE: The query to Validity was blocked.  See https://knowledge.validity.com/hc/en-us/articles/20961730681243 for more information.
	RCVD_IN_VALIDITY_RPBL_BLOCKED  0.001 ADMINISTRATOR NOTICE: The query to Validity was blocked.  See https://knowledge.validity.com/hc/en-us/articles/20961730681243 for more information.
	RCVD_IN_VALIDITY_SAFE_BLOCKED  0.001 ADMINISTRATOR NOTICE: The query to Validity was blocked.  See https://knowledge.validity.com/hc/en-us/articles/20961730681243 for more information.
	SPF_HELO_NONE           0.001 SPF: HELO does not publish an SPF Record
	SPF_PASS               -0.001 SPF: sender matches SPF record
X-Mailman-Approved-At: Thu, 12 Sep 2024 17:07:10 +0200

This is an OpenPGP/MIME signed message (RFC 4880 and 3156)

--=_e1414c1e4b8865511c6f4f1690d2d7e1
Content-Transfer-Encoding: 7bit
Content-Type: text/plain; charset=US-ASCII;
 format=flowed

Oops, sorry. This is for pve-manager. (A sketch of the git commands behind this cleanup follows the quoted diff below.)

On 2024-09-12 20:49, Jing Luo wrote:
> Throughout the years, 3 log files have been committed to the git repo.
> Let's remove those and add a .gitignore file.
> 
> Signed-off-by: Jing Luo <jing@jing.rocks>
> ---
>  test/.gitignore            |  1 +
>  test/replication_test4.log | 25 ---------------
>  test/replication_test5.log | 64 --------------------------------------
>  test/replication_test6.log |  8 -----
>  4 files changed, 1 insertion(+), 97 deletions(-)
>  create mode 100644 test/.gitignore
>  delete mode 100644 test/replication_test4.log
>  delete mode 100644 test/replication_test5.log
>  delete mode 100644 test/replication_test6.log
> 
> diff --git a/test/.gitignore b/test/.gitignore
> new file mode 100644
> index 00000000..397b4a76
> --- /dev/null
> +++ b/test/.gitignore
> @@ -0,0 +1 @@
> +*.log
> diff --git a/test/replication_test4.log b/test/replication_test4.log
> deleted file mode 100644
> index caefa0de..00000000
> --- a/test/replication_test4.log
> +++ /dev/null
> @@ -1,25 +0,0 @@
> -1000 job_900_to_node2: new job next_sync => 900
> -1000 job_900_to_node2: start replication job
> -1000 job_900_to_node2: end replication job with error: faked replication error
> -1000 job_900_to_node2: changed config next_sync => 1300
> -1000 job_900_to_node2: changed state last_node => node1, last_try => 1000, fail_count => 1, error => faked replication error
> -1300 job_900_to_node2: start replication job
> -1300 job_900_to_node2: end replication job with error: faked replication error
> -1300 job_900_to_node2: changed config next_sync => 1900
> -1300 job_900_to_node2: changed state last_try => 1300, fail_count => 2
> -1900 job_900_to_node2: start replication job
> -1900 job_900_to_node2: end replication job with error: faked replication error
> -1900 job_900_to_node2: changed config next_sync => 2800
> -1900 job_900_to_node2: changed state last_try => 1900, fail_count => 3
> -2800 job_900_to_node2: start replication job
> -2800 job_900_to_node2: end replication job with error: faked replication error
> -2800 job_900_to_node2: changed config next_sync => 4600
> -2800 job_900_to_node2: changed state last_try => 2800, fail_count => 4
> -4600 job_900_to_node2: start replication job
> -4600 job_900_to_node2: end replication job with error: faked replication error
> -4600 job_900_to_node2: changed config next_sync => 6400
> -4600 job_900_to_node2: changed state last_try => 4600, fail_count => 5
> -6400 job_900_to_node2: start replication job
> -6400 job_900_to_node2: end replication job with error: faked replication error
> -6400 job_900_to_node2: changed config next_sync => 8200
> -6400 job_900_to_node2: changed state last_try => 6400, fail_count => 6
> diff --git a/test/replication_test5.log b/test/replication_test5.log
> deleted file mode 100644
> index 928feca3..00000000
> --- a/test/replication_test5.log
> +++ /dev/null
> @@ -1,64 +0,0 @@
> -1000 job_900_to_node2: new job next_sync => 900
> -1000 job_900_to_node2: start replication job
> -1000 job_900_to_node2: guest => VM 900, running => 0
> -1000 job_900_to_node2: volumes => local-zfs:vm-900-disk-1
> -1000 job_900_to_node2: create snapshot '__replicate_job_900_to_node2_1000__' on local-zfs:vm-900-disk-1
> -1000 job_900_to_node2: using secure transmission, rate limit: none
> -1000 job_900_to_node2: full sync 'local-zfs:vm-900-disk-1' (__replicate_job_900_to_node2_1000__)
> -1000 job_900_to_node2: end replication job
> -1000 job_900_to_node2: changed config next_sync => 1800
> -1000 job_900_to_node2: changed state last_node => node1, last_try => 1000, last_sync => 1000
> -1000 job_900_to_node2: changed storeid list local-zfs
> -1840 job_900_to_node2: start replication job
> -1840 job_900_to_node2: guest => VM 900, running => 0
> -1840 job_900_to_node2: volumes => local-zfs:vm-900-disk-1
> -1840 job_900_to_node2: create snapshot '__replicate_job_900_to_node2_1840__' on local-zfs:vm-900-disk-1
> -1840 job_900_to_node2: using secure transmission, rate limit: none
> -1840 job_900_to_node2: incremental sync 'local-zfs:vm-900-disk-1' (__replicate_job_900_to_node2_1000__ => __replicate_job_900_to_node2_1840__)
> -1840 job_900_to_node2: delete previous replication snapshot '__replicate_job_900_to_node2_1000__' on local-zfs:vm-900-disk-1
> -1840 job_900_to_node2: end replication job
> -1840 job_900_to_node2: changed config next_sync => 2700
> -1840 job_900_to_node2: changed state last_try => 1840, last_sync => 1840
> -2740 job_900_to_node2: start replication job
> -2740 job_900_to_node2: guest => VM 900, running => 0
> -2740 job_900_to_node2: volumes => local-zfs:vm-900-disk-1,local-zfs:vm-900-disk-2
> -2740 job_900_to_node2: create snapshot '__replicate_job_900_to_node2_2740__' on local-zfs:vm-900-disk-1
> -2740 job_900_to_node2: create snapshot '__replicate_job_900_to_node2_2740__' on local-zfs:vm-900-disk-2
> -2740 job_900_to_node2: delete previous replication snapshot '__replicate_job_900_to_node2_2740__' on local-zfs:vm-900-disk-1
> -2740 job_900_to_node2: end replication job with error: no such volid 'local-zfs:vm-900-disk-2'
> -2740 job_900_to_node2: changed config next_sync => 3040
> -2740 job_900_to_node2: changed state last_try => 2740, fail_count => 1, error => no such volid 'local-zfs:vm-900-disk-2'
> -3040 job_900_to_node2: start replication job
> -3040 job_900_to_node2: guest => VM 900, running => 0
> -3040 job_900_to_node2: volumes => local-zfs:vm-900-disk-1,local-zfs:vm-900-disk-2
> -3040 job_900_to_node2: create snapshot '__replicate_job_900_to_node2_3040__' on local-zfs:vm-900-disk-1
> -3040 job_900_to_node2: create snapshot '__replicate_job_900_to_node2_3040__' on local-zfs:vm-900-disk-2
> -3040 job_900_to_node2: using secure transmission, rate limit: none
> -3040 job_900_to_node2: incremental sync 'local-zfs:vm-900-disk-1' (__replicate_job_900_to_node2_1840__ => __replicate_job_900_to_node2_3040__)
> -3040 job_900_to_node2: full sync 'local-zfs:vm-900-disk-2' (__replicate_job_900_to_node2_3040__)
> -3040 job_900_to_node2: delete previous replication snapshot '__replicate_job_900_to_node2_1840__' on local-zfs:vm-900-disk-1
> -3040 job_900_to_node2: end replication job
> -3040 job_900_to_node2: changed config next_sync => 3600
> -3040 job_900_to_node2: changed state last_try => 3040, last_sync => 3040, fail_count => 0, error =>
> -3640 job_900_to_node2: start replication job
> -3640 job_900_to_node2: guest => VM 900, running => 0
> -3640 job_900_to_node2: volumes => local-zfs:vm-900-disk-1,local-zfs:vm-900-disk-2
> -3640 job_900_to_node2: create snapshot '__replicate_job_900_to_node2_3640__' on local-zfs:vm-900-disk-1
> -3640 job_900_to_node2: create snapshot '__replicate_job_900_to_node2_3640__' on local-zfs:vm-900-disk-2
> -3640 job_900_to_node2: using secure transmission, rate limit: none
> -3640 job_900_to_node2: incremental sync 'local-zfs:vm-900-disk-1' (__replicate_job_900_to_node2_3040__ => __replicate_job_900_to_node2_3640__)
> -3640 job_900_to_node2: incremental sync 'local-zfs:vm-900-disk-2' (__replicate_job_900_to_node2_3040__ => __replicate_job_900_to_node2_3640__)
> -3640 job_900_to_node2: delete previous replication snapshot '__replicate_job_900_to_node2_3040__' on local-zfs:vm-900-disk-1
> -3640 job_900_to_node2: delete previous replication snapshot '__replicate_job_900_to_node2_3040__' on local-zfs:vm-900-disk-2
> -3640 job_900_to_node2: end replication job
> -3640 job_900_to_node2: changed config next_sync => 4500
> -3640 job_900_to_node2: changed state last_try => 3640, last_sync => 3640
> -3700 job_900_to_node2: start replication job
> -3700 job_900_to_node2: guest => VM 900, running => 0
> -3700 job_900_to_node2: volumes => local-zfs:vm-900-disk-1,local-zfs:vm-900-disk-2
> -3700 job_900_to_node2: start job removal - mode 'full'
> -3700 job_900_to_node2: delete stale replication snapshot '__replicate_job_900_to_node2_3640__' on local-zfs:vm-900-disk-1
> -3700 job_900_to_node2: delete stale replication snapshot '__replicate_job_900_to_node2_3640__' on local-zfs:vm-900-disk-2
> -3700 job_900_to_node2: job removed
> -3700 job_900_to_node2: end replication job
> -3700 job_900_to_node2: vanished job
> diff --git a/test/replication_test6.log b/test/replication_test6.log
> deleted file mode 100644
> index 91754544..00000000
> --- a/test/replication_test6.log
> +++ /dev/null
> @@ -1,8 +0,0 @@
> -1000 job_900_to_node1: new job next_sync => 1
> -1000 job_900_to_node1: start replication job
> -1000 job_900_to_node1: guest => VM 900, running => 0
> -1000 job_900_to_node1: volumes => local-zfs:vm-900-disk-1
> -1000 job_900_to_node1: start job removal - mode 'full'
> -1000 job_900_to_node1: job removed
> -1000 job_900_to_node1: end replication job
> -1000 job_900_to_node1: vanished job
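
For reference, the cleanup in the quoted diff boils down to a handful of plain
git commands. The following is only an illustrative sketch against a
pve-manager checkout; the exact invocation is not part of the patch itself:

  # drop the three committed replication test logs from the index and working tree
  git rm test/replication_test4.log test/replication_test5.log test/replication_test6.log
  # ignore *.log files under test/ from now on
  printf '*.log\n' > test/.gitignore
  git add test/.gitignore
  git commit -s -m "test: remove logs and add a .gitignore file"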

-- 
Jing Luo
About me: https://jing.rocks/about/
GPG Fingerprint: 4E09 8D19 00AA 3F72 1899 2614 09B3 316E 13A1 1EFC

--=_e1414c1e4b8865511c6f4f1690d2d7e1
Content-Type: application/pgp-signature;
 name=signature.asc
Content-Disposition: attachment;
 filename=signature.asc;
 size=228
Content-Description: OpenPGP digital signature

-----BEGIN PGP SIGNATURE-----

iHUEARYKAB0WIQQUNK5y7dM5LGmlOjiPRdGe/wwPKwUCZuLWpQAKCRCPRdGe/wwP
K3kKAP9EMFBtKSKqkLyKN8Jz5EztaU65Tk5UE1zfMILqVI8WWAEA1QeirHtPywpN
gCF8XP66zXA7oCTSP8X3ahfkkU27PAc=
=wwv4
-----END PGP SIGNATURE-----

--=_e1414c1e4b8865511c6f4f1690d2d7e1--



--===============5462954458956954161==
Content-Type: text/plain; charset="us-ascii"
MIME-Version: 1.0
Content-Transfer-Encoding: 7bit
Content-Disposition: inline

_______________________________________________
pve-devel mailing list
pve-devel@lists.proxmox.com
https://lists.proxmox.com/cgi-bin/mailman/listinfo/pve-devel

--===============5462954458956954161==--