From mboxrd@z Thu Jan 1 00:00:00 1970
Return-Path: <pve-devel-bounces@lists.proxmox.com>
To: pve-devel@lists.proxmox.com
Date: Thu, 12 Sep 2024 20:49:59 +0900
MIME-Version: 1.0
Message-ID: <mailman.233.1726153631.414.pve-devel@lists.proxmox.com>
List-Id: Proxmox VE development discussion <pve-devel.lists.proxmox.com>
List-Archive: <http://lists.proxmox.com/pipermail/pve-devel/>
From: Jing Luo via pve-devel <pve-devel@lists.proxmox.com>
Cc: Jing Luo <jing@jing.rocks>
Reply-To: Proxmox VE development discussion <pve-devel@lists.proxmox.com>
Subject: [pve-devel] [PATCH] test: remove logs and add a .gitignore file
Content-Type: multipart/mixed; boundary="===============0026061162758263294=="
Sender: "pve-devel" <pve-devel-bounces@lists.proxmox.com>

--===============0026061162758263294==
Content-Type: message/rfc822
Content-Disposition: inline

From: Jing Luo <jing@jing.rocks>
To: pve-devel@lists.proxmox.com
Cc: Jing Luo <jing@jing.rocks>
Subject: [PATCH] test: remove logs and add a .gitignore file
Date: Thu, 12 Sep 2024 20:49:59 +0900
Message-ID: <20240912115047.1252907-1-jing@jing.rocks>
X-Mailer: git-send-email 2.46.0
MIME-Version: 1.0
Content-Transfer-Encoding: 8bit

Throughout the years, three log files were committed to the git repo.
Let's remove those and add a .gitignore file.

Signed-off-by: Jing Luo <jing@jing.rocks>
---
 test/.gitignore            |  1 +
 test/replication_test4.log | 25 ---------------
 test/replication_test5.log | 64 --------------------------------------
 test/replication_test6.log |  8 -----
 4 files changed, 1 insertion(+), 97 deletions(-)
 create mode 100644 test/.gitignore
 delete mode 100644 test/replication_test4.log
 delete mode 100644 test/replication_test5.log
 delete mode 100644 test/replication_test6.log

diff --git a/test/.gitignore b/test/.gitignore
new file mode 100644
index 00000000..397b4a76
--- /dev/null
+++ b/test/.gitignore
@@ -0,0 +1 @@
+*.log
diff --git a/test/replication_test4.log b/test/replication_test4.log
deleted file mode 100644
index caefa0de..00000000
--- a/test/replication_test4.log
+++ /dev/null
@@ -1,25 +0,0 @@
-1000 job_900_to_node2: new job next_sync => 900
-1000 job_900_to_node2: start replication job
-1000 job_900_to_node2: end replication job with error: faked replication error
-1000 job_900_to_node2: changed config next_sync => 1300
-1000 job_900_to_node2: changed state last_node => node1, last_try => 1000, fail_count => 1, error => faked replication error
-1300 job_900_to_node2: start replication job
-1300 job_900_to_node2: end replication job with error: faked replication error
-1300 job_900_to_node2: changed config next_sync => 1900
-1300 job_900_to_node2: changed state last_try => 1300, fail_count => 2
-1900 job_900_to_node2: start replication job
-1900 job_900_to_node2: end replication job with error: faked replication error
-1900 job_900_to_node2: changed config next_sync => 2800
-1900 job_900_to_node2: changed state last_try => 1900, fail_count => 3
-2800 job_900_to_node2: start replication job
-2800 job_900_to_node2: end replication job with error: faked replication error
-2800 job_900_to_node2: changed config next_sync => 4600
-2800 job_900_to_node2: changed state last_try => 2800, fail_count => 4
-4600 job_900_to_node2: start replication job
-4600 job_900_to_node2: end replication job with error: faked replication error
-4600 job_900_to_node2: changed config next_sync => 6400
-4600 job_900_to_node2: changed state last_try => 4600, fail_count => 5
-6400 job_900_to_node2: start replication job
-6400 job_900_to_node2: end replication job with error: faked replication error
-6400 job_900_to_node2: changed config next_sync => 8200
-6400 job_900_to_node2: changed state last_try => 6400, fail_count => 6
diff --git a/test/replication_test5.log b/test/replication_test5.log
deleted file mode 100644
index 928feca3..00000000
--- a/test/replication_test5.log
+++ /dev/null
@@ -1,64 +0,0 @@
-1000 job_900_to_node2: new job next_sync => 900
-1000 job_900_to_node2: start replication job
-1000 job_900_to_node2: guest => VM 900, running => 0
-1000 job_900_to_node2: volumes => local-zfs:vm-900-disk-1
-1000 job_900_to_node2: create snapshot '__replicate_job_900_to_node2_1000__' on local-zfs:vm-900-disk-1
-1000 job_900_to_node2: using secure transmission, rate limit: none
-1000 job_900_to_node2: full sync 'local-zfs:vm-900-disk-1' (__replicate_job_900_to_node2_1000__)
-1000 job_900_to_node2: end replication job
-1000 job_900_to_node2: changed config next_sync => 1800
-1000 job_900_to_node2: changed state last_node => node1, last_try => 1000, last_sync => 1000
-1000 job_900_to_node2: changed storeid list local-zfs
-1840 job_900_to_node2: start replication job
-1840 job_900_to_node2: guest => VM 900, running => 0
-1840 job_900_to_node2: volumes => local-zfs:vm-900-disk-1
-1840 job_900_to_node2: create snapshot '__replicate_job_900_to_node2_1840__' on local-zfs:vm-900-disk-1
-1840 job_900_to_node2: using secure transmission, rate limit: none
-1840 job_900_to_node2: incremental sync 'local-zfs:vm-900-disk-1' (__replicate_job_900_to_node2_1000__ => __replicate_job_900_to_node2_1840__)
-1840 job_900_to_node2: delete previous replication snapshot '__replicate_job_900_to_node2_1000__' on local-zfs:vm-900-disk-1
-1840 job_900_to_node2: end replication job
-1840 job_900_to_node2: changed config next_sync => 2700
-1840 job_900_to_node2: changed state last_try => 1840, last_sync => 1840
-2740 job_900_to_node2: start replication job
-2740 job_900_to_node2: guest => VM 900, running => 0
-2740 job_900_to_node2: volumes => local-zfs:vm-900-disk-1,local-zfs:vm-900-disk-2
-2740 job_900_to_node2: create snapshot '__replicate_job_900_to_node2_2740__' on local-zfs:vm-900-disk-1
-2740 job_900_to_node2: create snapshot '__replicate_job_900_to_node2_2740__' on local-zfs:vm-900-disk-2
-2740 job_900_to_node2: delete previous replication snapshot '__replicate_job_900_to_node2_2740__' on local-zfs:vm-900-disk-1
-2740 job_900_to_node2: end replication job with error: no such volid 'local-zfs:vm-900-disk-2'
-2740 job_900_to_node2: changed config next_sync => 3040
-2740 job_900_to_node2: changed state last_try => 2740, fail_count => 1, error => no such volid 'local-zfs:vm-900-disk-2'
-3040 job_900_to_node2: start replication job
-3040 job_900_to_node2: guest => VM 900, running => 0
-3040 job_900_to_node2: volumes => local-zfs:vm-900-disk-1,local-zfs:vm-900-disk-2
-3040 job_900_to_node2: create snapshot '__replicate_job_900_to_node2_3040__' on local-zfs:vm-900-disk-1
-3040 job_900_to_node2: create snapshot '__replicate_job_900_to_node2_3040__' on local-zfs:vm-900-disk-2
-3040 job_900_to_node2: using secure transmission, rate limit: none
-3040 job_900_to_node2: incremental sync 'local-zfs:vm-900-disk-1' (__replicate_job_900_to_node2_1840__ => __replicate_job_900_to_node2_3040__)
-3040 job_900_to_node2: full sync 'local-zfs:vm-900-disk-2' (__replicate_job_900_to_node2_3040__)
-3040 job_900_to_node2: delete previous replication snapshot '__replicate_job_900_to_node2_1840__' on local-zfs:vm-900-disk-1
-3040 job_900_to_node2: end replication job
-3040 job_900_to_node2: changed config next_sync => 3600
-3040 job_900_to_node2: changed state last_try => 3040, last_sync => 3040, fail_count => 0, error =>
-3640 job_900_to_node2: start replication job
-3640 job_900_to_node2: guest => VM 900, running => 0
-3640 job_900_to_node2: volumes => local-zfs:vm-900-disk-1,local-zfs:vm-900-disk-2
-3640 job_900_to_node2: create snapshot '__replicate_job_900_to_node2_3640__' on local-zfs:vm-900-disk-1
-3640 job_900_to_node2: create snapshot '__replicate_job_900_to_node2_3640__' on local-zfs:vm-900-disk-2
-3640 job_900_to_node2: using secure transmission, rate limit: none
-3640 job_900_to_node2: incremental sync 'local-zfs:vm-900-disk-1' (__replicate_job_900_to_node2_3040__ => __replicate_job_900_to_node2_3640__)
-3640 job_900_to_node2: incremental sync 'local-zfs:vm-900-disk-2' (__replicate_job_900_to_node2_3040__ => __replicate_job_900_to_node2_3640__)
-3640 job_900_to_node2: delete previous replication snapshot '__replicate_job_900_to_node2_3040__' on local-zfs:vm-900-disk-1
-3640 job_900_to_node2: delete previous replication snapshot '__replicate_job_900_to_node2_3040__' on local-zfs:vm-900-disk-2
-3640 job_900_to_node2: end replication job
-3640 job_900_to_node2: changed config next_sync => 4500
-3640 job_900_to_node2: changed state last_try => 3640, last_sync => 3640
-3700 job_900_to_node2: start replication job
-3700 job_900_to_node2: guest => VM 900, running => 0
-3700 job_900_to_node2: volumes => local-zfs:vm-900-disk-1,local-zfs:vm-900-disk-2
-3700 job_900_to_node2: start job removal - mode 'full'
-3700 job_900_to_node2: delete stale replication snapshot '__replicate_job_900_to_node2_3640__' on local-zfs:vm-900-disk-1
-3700 job_900_to_node2: delete stale replication snapshot '__replicate_job_900_to_node2_3640__' on local-zfs:vm-900-disk-2
-3700 job_900_to_node2: job removed
-3700 job_900_to_node2: end replication job
-3700 job_900_to_node2: vanished job
diff --git a/test/replication_test6.log b/test/replication_test6.log
deleted file mode 100644
index 91754544..00000000
--- a/test/replication_test6.log
+++ /dev/null
@@ -1,8 +0,0 @@
-1000 job_900_to_node1: new job next_sync => 1
-1000 job_900_to_node1: start replication job
-1000 job_900_to_node1: guest => VM 900, running => 0
-1000 job_900_to_node1: volumes => local-zfs:vm-900-disk-1
-1000 job_900_to_node1: start job removal - mode 'full'
-1000 job_900_to_node1: job removed
-1000 job_900_to_node1: end replication job
-1000 job_900_to_node1: vanished job
--
2.46.0

--===============0026061162758263294==
Content-Type: text/plain; charset="us-ascii"
MIME-Version: 1.0
Content-Transfer-Encoding: 7bit
Content-Disposition: inline

_______________________________________________
pve-devel mailing list
pve-devel@lists.proxmox.com
https://lists.proxmox.com/cgi-bin/mailman/listinfo/pve-devel

--===============0026061162758263294==--