* [pbs-devel] [RFC PATCH proxmox-backup 1/5] api2/admin/datastore: refactor list_dir_content in catalog_reader
From: Dominik Csapak @ 2020-12-21 11:25 UTC
  To: pbs-devel

we will reuse this later in the client, so it needs to live somewhere
the client can use it too
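
a minimal sketch of the intended reuse, assuming the same reader setup
the api handler already does (the actual client call site only comes
in patch 5/5):

    // any CatalogReader over a downloaded catalog works, server- or
    // client-side; the json listing logic now lives on the reader:
    let mut catalog_reader = CatalogReader::new(reader);
    let entries = catalog_reader.list_dir_content(b"/etc")?;
    for entry in &entries {
        println!("{} ({})", entry["text"], entry["type"]);
    }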

Signed-off-by: Dominik Csapak <d.csapak@proxmox.com>
---
 src/api2/admin/datastore.rs | 52 ++++-------------------------
 src/backup/catalog.rs       | 65 +++++++++++++++++++++++++++++++++++++
 2 files changed, 72 insertions(+), 45 deletions(-)

diff --git a/src/api2/admin/datastore.rs b/src/api2/admin/datastore.rs
index 16fee943..5f06a2bf 100644
--- a/src/api2/admin/datastore.rs
+++ b/src/api2/admin/datastore.rs
@@ -1302,7 +1302,7 @@ fn catalog(
     _param: Value,
     _info: &ApiMethod,
     rpcenv: &mut dyn RpcEnvironment,
-) -> Result<Value, Error> {
+) -> Result<Vec<Value>, Error> {
     let datastore = DataStore::lookup_datastore(&store)?;
 
     let auth_id: Authid = rpcenv.get_auth_id().unwrap().parse()?;
@@ -1334,52 +1334,14 @@ fn catalog(
     let reader = BufferedDynamicReader::new(index, chunk_reader);
 
     let mut catalog_reader = CatalogReader::new(reader);
-    let mut current = catalog_reader.root()?;
-    let mut components = vec![];
-
 
-    if filepath != "root" {
-        components = base64::decode(filepath)?;
-        if components.len() > 0 && components[0] == '/' as u8 {
-            components.remove(0);
-        }
-        for component in components.split(|c| *c == '/' as u8) {
-            if let Some(entry) = catalog_reader.lookup(&current, component)? {
-                current = entry;
-            } else {
-                bail!("path {:?} not found in catalog", &String::from_utf8_lossy(&components));
-            }
-        }
-    }
-
-    let mut res = Vec::new();
-
-    for direntry in catalog_reader.read_dir(&current)? {
-        let mut components = components.clone();
-        components.push('/' as u8);
-        components.extend(&direntry.name);
-        let path = base64::encode(components);
-        let text = String::from_utf8_lossy(&direntry.name);
-        let mut entry = json!({
-            "filepath": path,
-            "text": text,
-            "type": CatalogEntryType::from(&direntry.attr).to_string(),
-            "leaf": true,
-        });
-        match direntry.attr {
-            DirEntryAttribute::Directory { start: _ } => {
-                entry["leaf"] = false.into();
-            },
-            DirEntryAttribute::File { size, mtime } => {
-                entry["size"] = size.into();
-                entry["mtime"] = mtime.into();
-            },
-            _ => {},
-        }
-        res.push(entry);
-    }
+    let path = if filepath != "root" {
+        base64::decode(filepath)?
+    } else {
+        vec![b'/']
+    };
 
-    Ok(res.into())
+    catalog_reader.list_dir_content(&path)
 }
 
 fn recurse_files<'a, T, W>(
diff --git a/src/backup/catalog.rs b/src/backup/catalog.rs
index b500fb93..5f8e85a6 100644
--- a/src/backup/catalog.rs
+++ b/src/backup/catalog.rs
@@ -5,6 +5,7 @@ use std::io::{Read, Write, Seek, SeekFrom};
 use std::os::unix::ffi::OsStrExt;
 
 use anyhow::{bail, format_err, Error};
+use serde_json::{json, Value};
 
 use pathpatterns::{MatchList, MatchType};
 use proxmox::tools::io::ReadExt;
@@ -474,6 +475,32 @@ impl <R: Read + Seek> CatalogReader<R> {
         Ok(entry_list)
     }
 
+    /// Lookup a DirEntry from an absolute path
+    pub fn lookup_recursive(
+        &mut self,
+        path: &[u8],
+    ) -> Result<DirEntry, Error> {
+        let mut current = self.root()?;
+        if path == b"/" {
+            return Ok(current);
+        }
+
+        let components = if path.len() > 0 && path[0] == b'/' {
+            &path[1..]
+        } else {
+            path
+        }.split(|c| *c == b'/');
+
+        for comp in components {
+            if let Some(entry) = self.lookup(&current, comp)? {
+                current = entry;
+            } else {
+                bail!("path {:?} not found in catalog", String::from_utf8_lossy(&path));
+            }
+        }
+        Ok(current)
+    }
+
     /// Lookup a DirEntry inside a parent directory
     pub fn lookup(
         &mut self,
@@ -554,6 +581,44 @@ impl <R: Read + Seek> CatalogReader<R> {
         })
     }
 
+    /// Returns the list of content of the given path as json
+    pub fn list_dir_content(&mut self, path: &[u8]) -> Result<Vec<Value>, Error> {
+        let dir = self.lookup_recursive(path)?;
+        let mut res = vec![];
+        let mut path = path.to_vec();
+        if path.len() > 0 && path[0] == b'/' {
+            path.remove(0);
+        }
+
+        for direntry in self.read_dir(&dir)? {
+            let mut components = path.clone();
+            components.push(b'/');
+            components.extend(&direntry.name);
+            let path = base64::encode(&components);
+            let text = String::from_utf8_lossy(&direntry.name);
+            let mut entry = json!({
+                "filepath": path,
+                "text": text,
+                "name": direntry.name.clone(),
+                "type": CatalogEntryType::from(&direntry.attr).to_string(),
+                "leaf": true,
+            });
+            match direntry.attr {
+                DirEntryAttribute::Directory { start: _ } => {
+                    entry["leaf"] = false.into();
+                },
+                DirEntryAttribute::File { size, mtime } => {
+                    entry["size"] = size.into();
+                    entry["mtime"] = mtime.into();
+                },
+                _ => {},
+            }
+            res.push(entry);
+        }
+
+        Ok(res)
+    }
+
     /// Finds all entries matching the given match patterns and calls the
     /// provided callback on them.
     pub fn find(
-- 
2.20.1

* [pbs-devel] [RFC PATCH proxmox-backup 2/5] api2/admin/datastore: accept "/" as path for root
From: Dominik Csapak @ 2020-12-21 11:25 UTC
  To: pbs-devel

makes more sense than sending "root"
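
for illustration, both of these now select the catalog root (the store
and the other required parameters are placeholders here):

    GET .../admin/datastore/<store>/catalog?filepath=root&...
    GET .../admin/datastore/<store>/catalog?filepath=/&...

any other filepath still has to be base64 encoded.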

Signed-off-by: Dominik Csapak <d.csapak@proxmox.com>
---
 src/api2/admin/datastore.rs | 2 +-
 www/window/FileBrowser.js   | 1 +
 2 files changed, 2 insertions(+), 1 deletion(-)

diff --git a/src/api2/admin/datastore.rs b/src/api2/admin/datastore.rs
index 5f06a2bf..ad66336c 100644
--- a/src/api2/admin/datastore.rs
+++ b/src/api2/admin/datastore.rs
@@ -1335,7 +1335,7 @@ fn catalog(
 
     let mut catalog_reader = CatalogReader::new(reader);
 
-    let path = if filepath != "root" {
+    let path = if filepath != "root" && filepath != "/" {
         base64::decode(filepath)?
     } else {
         vec![b'/']
diff --git a/www/window/FileBrowser.js b/www/window/FileBrowser.js
index 01b5d79b..724e1791 100644
--- a/www/window/FileBrowser.js
+++ b/www/window/FileBrowser.js
@@ -185,6 +185,7 @@ Ext.define("PBS.window.FileBrowser", {
 	    store: {
 		autoLoad: false,
 		model: 'pbs-file-tree',
+		defaultRootId: '/',
 		nodeParam: 'filepath',
 		sorters: 'text',
 		proxy: {
-- 
2.20.1

* [pbs-devel] [RFC PATCH proxmox-backup 3/5] api2/admin/datastore: refactor create_zip into pxar/extract
From: Dominik Csapak @ 2020-12-21 11:25 UTC
  To: pbs-devel

we will reuse that code in the client, so we need to move it somewhere
the client can access it
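
a rough sketch of how the moved function is used, mirroring the api
handler below and the cli usage coming in patch 5/5 (the stdout call
is just an example target):

    // stream a zip of the pxar (sub)directory at `path` into any
    // AsyncWrite, e.g. the channel writer of the http response:
    crate::server::spawn_internal_task(
        create_zip(channelwriter, decoder, path.clone(), false)
    );
    // or, client-side, directly to stdout:
    create_zip(tokio::io::stdout(), decoder, path, verbose).await?;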

Signed-off-by: Dominik Csapak <d.csapak@proxmox.com>
---
 src/api2/admin/datastore.rs |  99 +++--------------------------
 src/pxar/extract.rs         | 120 +++++++++++++++++++++++++++++++++++-
 src/pxar/mod.rs             |   5 +-
 3 files changed, 133 insertions(+), 91 deletions(-)

diff --git a/src/api2/admin/datastore.rs b/src/api2/admin/datastore.rs
index ad66336c..84f5417a 100644
--- a/src/api2/admin/datastore.rs
+++ b/src/api2/admin/datastore.rs
@@ -2,8 +2,6 @@ use std::collections::HashSet;
 use std::ffi::OsStr;
 use std::os::unix::ffi::OsStrExt;
 use std::sync::{Arc, Mutex};
-use std::path::{Path, PathBuf};
-use std::pin::Pin;
 
 use anyhow::{bail, format_err, Error};
 use futures::*;
@@ -20,7 +18,7 @@ use proxmox::api::schema::*;
 use proxmox::tools::fs::{replace_file, CreateOptions};
 use proxmox::{http_err, identity, list_subdirs_api_method, sortable};
 
-use pxar::accessor::aio::{Accessor, FileContents, FileEntry};
+use pxar::accessor::aio::Accessor;
 use pxar::EntryKind;
 
 use crate::api2::types::*;
@@ -28,11 +26,11 @@ use crate::api2::node::rrd::create_value_from_rrd;
 use crate::backup::*;
 use crate::config::datastore;
 use crate::config::cached_user_info::CachedUserInfo;
+use crate::pxar::create_zip;
 
 use crate::server::{jobstate::Job, WorkerTask};
 use crate::tools::{
     self,
-    zip::{ZipEncoder, ZipEntry},
     AsyncChannelWriter, AsyncReaderStream, WrappedReaderStream,
 };
 
@@ -1344,66 +1342,6 @@ fn catalog(
     catalog_reader.list_dir_content(&path)
 }
 
-fn recurse_files<'a, T, W>(
-    zip: &'a mut ZipEncoder<W>,
-    decoder: &'a mut Accessor<T>,
-    prefix: &'a Path,
-    file: FileEntry<T>,
-) -> Pin<Box<dyn Future<Output = Result<(), Error>> + Send + 'a>>
-where
-    T: Clone + pxar::accessor::ReadAt + Unpin + Send + Sync + 'static,
-    W: tokio::io::AsyncWrite + Unpin + Send + 'static,
-{
-    Box::pin(async move {
-        let metadata = file.entry().metadata();
-        let path = file.entry().path().strip_prefix(&prefix)?.to_path_buf();
-
-        match file.kind() {
-            EntryKind::File { .. } => {
-                let entry = ZipEntry::new(
-                    path,
-                    metadata.stat.mtime.secs,
-                    metadata.stat.mode as u16,
-                    true,
-                );
-                zip.add_entry(entry, Some(file.contents().await?))
-                   .await
-                   .map_err(|err| format_err!("could not send file entry: {}", err))?;
-            }
-            EntryKind::Hardlink(_) => {
-                let realfile = decoder.follow_hardlink(&file).await?;
-                let entry = ZipEntry::new(
-                    path,
-                    metadata.stat.mtime.secs,
-                    metadata.stat.mode as u16,
-                    true,
-                );
-                zip.add_entry(entry, Some(realfile.contents().await?))
-                   .await
-                   .map_err(|err| format_err!("could not send file entry: {}", err))?;
-            }
-            EntryKind::Directory => {
-                let dir = file.enter_directory().await?;
-                let mut readdir = dir.read_dir();
-                let entry = ZipEntry::new(
-                    path,
-                    metadata.stat.mtime.secs,
-                    metadata.stat.mode as u16,
-                    false,
-                );
-                zip.add_entry::<FileContents<T>>(entry, None).await?;
-                while let Some(entry) = readdir.next().await {
-                    let entry = entry?.decode_entry().await?;
-                    recurse_files(zip, decoder, prefix, entry).await?;
-                }
-            }
-            _ => {} // ignore all else
-        };
-
-        Ok(())
-    })
-}
-
 #[sortable]
 pub const API_METHOD_PXAR_FILE_DOWNLOAD: ApiMethod = ApiMethod::new(
     &ApiHandler::AsyncHttp(&pxar_file_download),
@@ -1479,9 +1417,10 @@ fn pxar_file_download(
 
         let decoder = Accessor::new(reader, archive_size).await?;
         let root = decoder.open_root().await?;
+        let path = OsStr::from_bytes(file_path).to_os_string();
         let file = root
-            .lookup(OsStr::from_bytes(file_path)).await?
-            .ok_or(format_err!("error opening '{:?}'", file_path))?;
+            .lookup(&path).await?
+            .ok_or(format_err!("error opening '{:?}'", path))?;
 
         let body = match file.kind() {
             EntryKind::File { .. } => Body::wrap_stream(
@@ -1495,37 +1434,19 @@ fn pxar_file_download(
                     .map_err(move |err| {
                         eprintln!(
                             "error during streaming of hardlink '{:?}' - {}",
-                            filepath, err
+                            path, err
                         );
                         err
                     }),
             ),
             EntryKind::Directory => {
                 let (sender, receiver) = tokio::sync::mpsc::channel(100);
-                let mut prefix = PathBuf::new();
-                let mut components = file.entry().path().components();
-                components.next_back(); // discar last
-                for comp in components {
-                    prefix.push(comp);
-                }
-
                 let channelwriter = AsyncChannelWriter::new(sender, 1024 * 1024);
-
-                crate::server::spawn_internal_task(async move {
-                    let mut zipencoder = ZipEncoder::new(channelwriter);
-                    let mut decoder = decoder;
-                    recurse_files(&mut zipencoder, &mut decoder, &prefix, file)
-                        .await
-                        .map_err(|err| eprintln!("error during creating of zip: {}", err))?;
-
-                    zipencoder
-                        .finish()
-                        .await
-                        .map_err(|err| eprintln!("error during finishing of zip: {}", err))
-                });
-
+                crate::server::spawn_internal_task(
+                    create_zip(channelwriter, decoder, path.clone(), false)
+                );
                 Body::wrap_stream(receiver.map_err(move |err| {
-                    eprintln!("error during streaming of zip '{:?}' - {}", filepath, err);
+                    eprintln!("error during streaming of zip '{:?}' - {}", path, err);
                     err
                 }))
             }
diff --git a/src/pxar/extract.rs b/src/pxar/extract.rs
index ed238a2c..77472f56 100644
--- a/src/pxar/extract.rs
+++ b/src/pxar/extract.rs
@@ -5,9 +5,11 @@ use std::ffi::{CStr, CString, OsStr, OsString};
 use std::io;
 use std::os::unix::ffi::OsStrExt;
 use std::os::unix::io::{AsRawFd, FromRawFd, RawFd};
-use std::path::Path;
+use std::path::{Path, PathBuf};
 use std::sync::{Arc, Mutex};
+use std::pin::Pin;
 
+use futures::future::Future;
 use anyhow::{bail, format_err, Error};
 use nix::dir::Dir;
 use nix::fcntl::OFlag;
@@ -16,6 +18,7 @@ use nix::sys::stat::Mode;
 use pathpatterns::{MatchEntry, MatchList, MatchType};
 use pxar::format::Device;
 use pxar::Metadata;
+use pxar::accessor::aio::{Accessor, FileContents, FileEntry};
 
 use proxmox::c_result;
 use proxmox::tools::fs::{create_path, CreateOptions};
@@ -24,6 +27,8 @@ use crate::pxar::dir_stack::PxarDirStack;
 use crate::pxar::metadata;
 use crate::pxar::Flags;
 
+use crate::tools::zip::{ZipEncoder, ZipEntry};
+
 pub fn extract_archive<T, F>(
     mut decoder: pxar::decoder::Decoder<T>,
     destination: &Path,
@@ -457,3 +462,116 @@ impl Extractor {
         )
     }
 }
+
+pub async fn create_zip<T, W, P>(
+    output: W,
+    decoder: Accessor<T>,
+    path: P,
+    verbose: bool,
+) -> Result<(), Error>
+where
+    T: Clone + pxar::accessor::ReadAt + Unpin + Send + Sync + 'static,
+    W: tokio::io::AsyncWrite + Unpin + Send + 'static,
+    P: AsRef<Path>,
+{
+    let root = decoder.open_root().await?;
+    let file = root
+        .lookup(&path).await?
+        .ok_or(format_err!("error opening '{:?}'", path.as_ref()))?;
+
+    let mut prefix = PathBuf::new();
+    let mut components = file.entry().path().components();
+    components.next_back(); // discard the last component
+    for comp in components {
+        prefix.push(comp);
+    }
+
+    let mut zipencoder = ZipEncoder::new(output);
+    let mut decoder = decoder;
+    recurse_files_zip(&mut zipencoder, &mut decoder, &prefix, file, verbose)
+        .await
+        .map_err(|err| {
+            eprintln!("error during creating of zip: {}", err);
+            err
+        })?;
+
+    zipencoder
+        .finish()
+        .await
+        .map_err(|err| {
+            eprintln!("error during finishing of zip: {}", err);
+            err
+        })
+}
+
+fn recurse_files_zip<'a, T, W>(
+    zip: &'a mut ZipEncoder<W>,
+    decoder: &'a mut Accessor<T>,
+    prefix: &'a Path,
+    file: FileEntry<T>,
+    verbose: bool,
+) -> Pin<Box<dyn Future<Output = Result<(), Error>> + Send + 'a>>
+where
+    T: Clone + pxar::accessor::ReadAt + Unpin + Send + Sync + 'static,
+    W: tokio::io::AsyncWrite + Unpin + Send + 'static,
+{
+    use pxar::EntryKind;
+    Box::pin(async move {
+        let metadata = file.entry().metadata();
+        let path = file.entry().path().strip_prefix(&prefix)?.to_path_buf();
+
+        match file.kind() {
+            EntryKind::File { .. } => {
+                if verbose {
+                    eprintln!("adding '{}' to zip", path.display());
+                }
+                let entry = ZipEntry::new(
+                    path,
+                    metadata.stat.mtime.secs,
+                    metadata.stat.mode as u16,
+                    true,
+                );
+                zip.add_entry(entry, Some(file.contents().await?))
+                   .await
+                   .map_err(|err| format_err!("could not send file entry: {}", err))?;
+            }
+            EntryKind::Hardlink(_) => {
+                let realfile = decoder.follow_hardlink(&file).await?;
+                if verbose {
+                    eprintln!("adding '{}' to zip", path.display());
+                }
+                let entry = ZipEntry::new(
+                    path,
+                    metadata.stat.mtime.secs,
+                    metadata.stat.mode as u16,
+                    true,
+                );
+                zip.add_entry(entry, Some(realfile.contents().await?))
+                   .await
+                   .map_err(|err| format_err!("could not send file entry: {}", err))?;
+            }
+            EntryKind::Directory => {
+                let dir = file.enter_directory().await?;
+                let mut readdir = dir.read_dir();
+                if verbose {
+                    eprintln!("adding '{}' to zip", path.display());
+                }
+                let entry = ZipEntry::new(
+                    path,
+                    metadata.stat.mtime.secs,
+                    metadata.stat.mode as u16,
+                    false,
+                );
+                zip.add_entry::<FileContents<T>>(entry, None).await?;
+                while let Some(entry) = readdir.next().await {
+                    let entry = entry?.decode_entry().await?;
+                    recurse_files_zip(zip, decoder, prefix, entry, verbose).await?;
+                }
+            }
+            _ => {} // ignore all else
+        };
+
+        Ok(())
+    })
+}
+
diff --git a/src/pxar/mod.rs b/src/pxar/mod.rs
index 6e910667..ba47e220 100644
--- a/src/pxar/mod.rs
+++ b/src/pxar/mod.rs
@@ -59,7 +59,10 @@ mod flags;
 pub use flags::Flags;
 
 pub use create::create_archive;
-pub use extract::extract_archive;
+pub use extract::{
+    extract_archive,
+    create_zip,
+};
 
 /// The format requires to build sorted directory lookup tables in
 /// memory, so we restrict the number of allowed entries to limit
-- 
2.20.1

* [pbs-devel] [RFC PATCH proxmox-backup 4/5] pxar/extract: add extract_sub_dir
From: Dominik Csapak @ 2020-12-21 11:25 UTC
  To: pbs-devel

extract a subdirectory of a pxar archive into a given target directory;
this will be used in the client
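
a minimal usage sketch with made-up destination and sub-path values
(the signature is the one introduced below):

    // extract etc/network from the archive into /tmp/restore; the
    // target directory is created with mode 0700 if missing:
    extract_sub_dir("/tmp/restore", decoder, "etc/network", verbose).await?;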

Signed-off-by: Dominik Csapak <d.csapak@proxmox.com>
---
the code looks *very* similar to what we do in 'extract_archive' or
in the fuse extract method, but not quite... maybe there is a good
way to refactor that which i am not seeing?

 src/pxar/extract.rs | 122 ++++++++++++++++++++++++++++++++++++++++++++
 src/pxar/mod.rs     |   1 +
 2 files changed, 123 insertions(+)

diff --git a/src/pxar/extract.rs b/src/pxar/extract.rs
index 77472f56..1f88c6e8 100644
--- a/src/pxar/extract.rs
+++ b/src/pxar/extract.rs
@@ -575,3 +575,125 @@ where
     })
 }
 
+
+pub async fn extract_sub_dir<T, DEST, PATH>(
+    destination: DEST,
+    mut decoder: Accessor<T>,
+    path: PATH,
+    verbose: bool,
+) -> Result<(), Error>
+where
+    T: Clone + pxar::accessor::ReadAt + Unpin + Send + Sync + 'static,
+    DEST: AsRef<Path>,
+    PATH: AsRef<Path>,
+{
+    let root = decoder.open_root().await?;
+
+    create_path(
+        &destination,
+        None,
+        Some(CreateOptions::new().perm(Mode::from_bits_truncate(0o700))),
+    )
+    .map_err(|err| format_err!("error creating directory {:?}: {}", destination.as_ref(), err))?;
+
+    let dir = Dir::open(
+        destination.as_ref(),
+        OFlag::O_DIRECTORY | OFlag::O_CLOEXEC,
+        Mode::empty(),
+    )
+    .map_err(|err| format_err!("unable to open target directory {:?}: {}", destination.as_ref(), err,))?;
+
+    let mut extractor = Extractor::new(
+        dir,
+        root.lookup_self().await?.entry().metadata().clone(),
+        false,
+        Flags::DEFAULT,
+    );
+
+    let file = root
+        .lookup(&path).await?
+        .ok_or(format_err!("error opening '{:?}'", path.as_ref()))?;
+
+    recurse_files_extractor(&mut extractor, &mut decoder, file, verbose).await
+}
+
+fn recurse_files_extractor<'a, T>(
+    extractor: &'a mut Extractor,
+    decoder: &'a mut Accessor<T>,
+    file: FileEntry<T>,
+    verbose: bool,
+) -> Pin<Box<dyn Future<Output = Result<(), Error>> + Send + 'a>>
+where
+    T: Clone + pxar::accessor::ReadAt + Unpin + Send + Sync + 'static,
+{
+    use pxar::EntryKind;
+    Box::pin(async move {
+        let metadata = file.entry().metadata();
+        let file_name_os = file.file_name();
+
+        // safety check: a file entry in an archive must never contain slashes:
+        if file_name_os.as_bytes().contains(&b'/') {
+            bail!("archive file entry contains slashes, which is invalid and a security concern");
+        }
+
+        let file_name = CString::new(file_name_os.as_bytes())
+            .map_err(|_| format_err!("encountered file name with null-bytes"))?;
+
+        if verbose {
+            eprintln!("extracting: {}", file.path().display());
+        }
+
+        match file.kind() {
+            EntryKind::Directory => {
+                extractor
+                    .enter_directory(file_name_os.to_owned(), metadata.clone(), true)
+                    .map_err(|err| format_err!("error at entry {:?}: {}", file_name_os, err))?;
+
+                let dir = file.enter_directory().await?;
+                let mut readdir = dir.read_dir();
+                while let Some(entry) = readdir.next().await {
+                    let entry = entry?.decode_entry().await?;
+                    let filename = entry.path().to_path_buf();
+
+                    // log errors and continue
+                    if let Err(err) = recurse_files_extractor(extractor, decoder, entry, verbose).await {
+                        eprintln!("error extracting {:?}: {}", filename.display(), err);
+                    }
+                }
+                extractor.leave_directory()?;
+            }
+            EntryKind::Symlink(link) => {
+                extractor.extract_symlink(&file_name, metadata, link.as_ref())?;
+            }
+            EntryKind::Hardlink(link) => {
+                extractor.extract_hardlink(&file_name, link.as_os_str())?;
+            }
+            EntryKind::Device(dev) => {
+                if extractor.contains_flags(Flags::WITH_DEVICE_NODES) {
+                    extractor.extract_device(&file_name, metadata, dev)?;
+                }
+            }
+            EntryKind::Fifo => {
+                if extractor.contains_flags(Flags::WITH_FIFOS) {
+                    extractor.extract_special(&file_name, metadata, 0)?;
+                }
+            }
+            EntryKind::Socket => {
+                if extractor.contains_flags(Flags::WITH_SOCKETS) {
+                    extractor.extract_special(&file_name, metadata, 0)?;
+                }
+            }
+            EntryKind::File { size, .. } => extractor.async_extract_file(
+                &file_name,
+                metadata,
+                *size,
+                &mut file.contents().await.map_err(|_| {
+                    format_err!("found regular file entry without contents in archive")
+                })?,
+            ).await?,
+            EntryKind::GoodbyeTable => {}, // ignore
+        }
+        Ok(())
+    })
+}
+
diff --git a/src/pxar/mod.rs b/src/pxar/mod.rs
index ba47e220..d3c1ca16 100644
--- a/src/pxar/mod.rs
+++ b/src/pxar/mod.rs
@@ -62,6 +62,7 @@ pub use create::create_archive;
 pub use extract::{
     extract_archive,
     create_zip,
+    extract_sub_dir,
 };
 
 /// The format requires to build sorted directory lookup tables in
-- 
2.20.1

* [pbs-devel] [RFC PATCH 5/5] proxmox-backup-client: add file-restore commands
From: Dominik Csapak @ 2020-12-21 11:25 UTC
  To: pbs-devel

for now we only have 'list' and 'extract', and they are only supported
for 'pxar.didx' files

this should be the foundation for a general file-restore interface
that is shared with a file-restore from block-level backups
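
hypothetical invocations, with example snapshot and path values:

    # list the archives of a snapshot, then a directory inside one:
    proxmox-backup-client file-restore list host/myhost/2020-12-21T11:25:00Z /
    proxmox-backup-client file-restore list host/myhost/2020-12-21T11:25:00Z /root.pxar.didx/etc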

Signed-off-by: Dominik Csapak <d.csapak@proxmox.com>
---
this patch is mostly for @Stefan so that we can coordinate the interface
for file-restoring

i am not completely sure about how i handle the zip/non-zip case here
(it seems a bit too automagic), but having an explicit 'zip' parameter
does not make much sense, since printing a directory to stdout does not
work any other way (we cannot print partial pxar files)
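
to illustrate that with example values: with '-' as target, a single
file is streamed as its raw bytes, a directory as a zip:

    proxmox-backup-client file-restore extract <snapshot> /root.pxar.didx/etc/hostname - > hostname
    proxmox-backup-client file-restore extract <snapshot> /root.pxar.didx/etc - > etc.zip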

 Cargo.toml                                    |   2 +-
 src/bin/proxmox-backup-client.rs              |   1 +
 src/bin/proxmox_backup_client/file_restore.rs | 329 ++++++++++++++++++
 src/bin/proxmox_backup_client/mod.rs          |   2 +
 4 files changed, 333 insertions(+), 1 deletion(-)
 create mode 100644 src/bin/proxmox_backup_client/file_restore.rs

diff --git a/Cargo.toml b/Cargo.toml
index bfe39e75..66f536f6 100644
--- a/Cargo.toml
+++ b/Cargo.toml
@@ -60,7 +60,7 @@ serde = { version = "1.0", features = ["derive"] }
 serde_json = "1.0"
 siphasher = "0.3"
 syslog = "4.0"
-tokio = { version = "0.2.9", features = [ "blocking", "fs", "dns", "io-util", "macros", "process", "rt-threaded", "signal", "stream", "tcp", "time", "uds" ] }
+tokio = { version = "0.2.9", features = [ "blocking", "fs", "dns", "io-util", "io-std", "macros", "process", "rt-threaded", "signal", "stream", "tcp", "time", "uds" ] }
 tokio-openssl = "0.4.0"
 tokio-util = { version = "0.3", features = [ "codec" ] }
 tower-service = "0.3.0"
diff --git a/src/bin/proxmox-backup-client.rs b/src/bin/proxmox-backup-client.rs
index 6cf81952..8585d24f 100644
--- a/src/bin/proxmox-backup-client.rs
+++ b/src/bin/proxmox-backup-client.rs
@@ -1869,6 +1869,7 @@ fn main() {
         .insert("version", version_cmd_def)
         .insert("benchmark", benchmark_cmd_def)
         .insert("change-owner", change_owner_cmd_def)
+        .insert("file-restore", file_restore_mgmt_cli())
 
         .alias(&["files"], &["snapshot", "files"])
         .alias(&["forget"], &["snapshot", "forget"])
diff --git a/src/bin/proxmox_backup_client/file_restore.rs b/src/bin/proxmox_backup_client/file_restore.rs
new file mode 100644
index 00000000..0cb30117
--- /dev/null
+++ b/src/bin/proxmox_backup_client/file_restore.rs
@@ -0,0 +1,329 @@
+use std::sync::Arc;
+use std::path::PathBuf;
+use std::ffi::OsStr;
+use std::os::unix::ffi::OsStrExt;
+
+use anyhow::{bail, format_err, Error};
+use serde_json::{json, Value};
+
+use proxmox::api::{
+    api,
+    cli::{
+        CliCommandMap,
+        CliCommand,
+    },
+};
+use pxar::accessor::aio::Accessor;
+
+use proxmox_backup::pxar::{create_zip, extract_sub_dir};
+use proxmox_backup::tools;
+use proxmox_backup::backup::CryptMode;
+use proxmox_backup::backup::LocalDynamicReadAt;
+use proxmox_backup::client::{
+    BackupReader,
+    RemoteChunkReader,
+};
+use crate::{
+    CryptConfig,
+    keyfile_parameters,
+    BackupDir,
+    CATALOG_NAME,
+    decrypt_key,
+    complete_repository,
+    KEYFD_SCHEMA,
+    BufferedDynamicReader,
+    CatalogReader,
+    connect,
+    extract_repository_from_value,
+    KEYFILE_SCHEMA,
+    REPO_URL_SCHEMA,
+    complete_group_or_snapshot,
+    key,
+    IndexFile,
+};
+
+enum ExtractPath {
+    ListArchives,
+    Pxar(String, Vec<u8>),
+}
+
+fn parse_path(path: String, base64: bool) -> Result<ExtractPath, Error> {
+    let mut bytes = if base64 {
+        base64::decode(path)?
+    } else {
+        path.into_bytes()
+    };
+
+    if bytes == b"/" {
+        return Ok(ExtractPath::ListArchives);
+    }
+
+    while bytes.len() > 0 && bytes[0] == b'/' {
+        bytes.remove(0);
+    }
+
+    let (file, path) = {
+        let slash_pos = bytes.iter().position(|c| *c == b'/').unwrap_or(bytes.len());
+        let path = bytes.split_off(slash_pos);
+        let file = String::from_utf8(bytes)?;
+        (file, path)
+    };
+
+    if file.ends_with(".pxar.didx") {
+        Ok(ExtractPath::Pxar(file, path))
+    } else {
+        bail!("'{}' is not supported for file-restore", file);
+    }
+}
+
+#[api(
+   input: {
+       properties: {
+           repository: {
+               schema: REPO_URL_SCHEMA,
+               optional: true,
+           },
+           snapshot: {
+               type: String,
+               description: "Group/Snapshot path.",
+           },
+           "path": {
+               description: "Path to restore. Directories will be restored as .zip files.",
+               type: String,
+           },
+           "base64": {
+               type: Boolean,
+               description: "If set, 'path' will be interpreted as base64 encoded.",
+               optional: true,
+               default: false,
+           },
+           keyfile: {
+               schema: KEYFILE_SCHEMA,
+               optional: true,
+           },
+           "keyfd": {
+               schema: KEYFD_SCHEMA,
+               optional: true,
+           },
+           "crypt-mode": {
+               type: CryptMode,
+               optional: true,
+           },
+       }
+   }
+)]
+/// List a directory from a backup snapshot.
+async fn list(param: Value) -> Result<Vec<Value>, Error> {
+    let repo = extract_repository_from_value(&param)?;
+    let base64 = param["base64"].as_bool().unwrap_or(false);
+    let path = parse_path(tools::required_string_param(&param, "path")?.to_string(), base64)?;
+    let snapshot: BackupDir = tools::required_string_param(&param, "snapshot")?.parse()?;
+
+    let (keydata, _crypt_mode) = keyfile_parameters(&param)?;
+    let crypt_config = match keydata {
+        None => None,
+        Some(key) => {
+            let (key, _, fingerprint) = decrypt_key(&key, &key::get_encryption_key_password)?;
+            eprintln!("Encryption key fingerprint: '{}'", fingerprint);
+            Some(Arc::new(CryptConfig::new(key)?))
+        }
+    };
+
+    let client = connect(&repo)?;
+    let client = BackupReader::start(
+        client,
+        crypt_config.clone(),
+        repo.store(),
+        &snapshot.group().backup_type(),
+        &snapshot.group().backup_id(),
+        snapshot.backup_time(),
+        true,
+    ).await?;
+
+    let (manifest, _) = client.download_manifest().await?;
+    manifest.check_fingerprint(crypt_config.as_ref().map(Arc::as_ref))?;
+
+    match path {
+        ExtractPath::ListArchives => {
+            let mut entries = vec![];
+            let mut has_fidx = false;
+            for file in manifest.files() {
+                match file.filename.rsplitn(2, '.').next().unwrap() {
+                    "didx" => {},
+                    "fidx" => {
+                        has_fidx = true;
+                        continue;
+                    }
+                    _ => continue, // ignore all non fidx/didx
+                }
+                let path = format!("/{}", file.filename);
+                entries.push(json!({
+                    "path": path.clone(),
+                    "base64": base64::encode(path.into_bytes()),
+                    "leaf": false,
+                }))
+            }
+            if has_fidx {
+                entries.push(json!({
+                    "path": "/block",
+                    "base64": base64::encode(b"/block"),
+                    "leaf": false,
+                }));
+            }
+
+            Ok(entries.into())
+        },
+        ExtractPath::Pxar(file, mut path) => {
+            let index = client.download_dynamic_index(&manifest, CATALOG_NAME).await?;
+            let most_used = index.find_most_used_chunks(8);
+            let file_info = manifest.lookup_file_info(&CATALOG_NAME)?;
+            let chunk_reader = RemoteChunkReader::new(client.clone(), crypt_config, file_info.chunk_crypt_mode(), most_used);
+            let reader = BufferedDynamicReader::new(index, chunk_reader);
+            let mut catalog_reader = CatalogReader::new(reader);
+
+            let mut fullpath = file.into_bytes();
+            fullpath.append(&mut path);
+
+            catalog_reader.list_dir_content(&fullpath)
+        },
+    }
+}
+
+#[api(
+   input: {
+       properties: {
+           repository: {
+               schema: REPO_URL_SCHEMA,
+               optional: true,
+           },
+           snapshot: {
+               type: String,
+               description: "Group/Snapshot path.",
+           },
+           "path": {
+               description: "Path to restore. Directories will be restored as .zip files if extracted to stdout.",
+               type: String,
+           },
+           "base64": {
+               type: Boolean,
+               description: "If set, 'path' will be interpreted as base64 encoded.",
+               optional: true,
+               default: false,
+           },
+           target: {
+               type: String,
+               optional: true,
+               description: "Target directory path. Use '-' to write to standard output.",
+           },
+           keyfile: {
+               schema: KEYFILE_SCHEMA,
+               optional: true,
+           },
+           "keyfd": {
+               schema: KEYFD_SCHEMA,
+               optional: true,
+           },
+           "crypt-mode": {
+               type: CryptMode,
+               optional: true,
+           },
+           verbose: {
+               type: Boolean,
+               description: "Print verbose information",
+               optional: true,
+               default: false,
+           }
+       }
+   }
+)]
+/// Restore files from a backup snapshot.
+async fn extract(param: Value) -> Result<Value, Error> {
+    let repo = extract_repository_from_value(&param)?;
+    let verbose = param["verbose"].as_bool().unwrap_or(false);
+    let base64 = param["base64"].as_bool().unwrap_or(false);
+    let orig_path = tools::required_string_param(&param, "path")?.to_string();
+    let path = parse_path(orig_path.clone(), base64)?;
+
+    let target = match param["target"].as_str() {
+        Some(target) if target == "-" => None,
+        Some(target) => Some(PathBuf::from(target)),
+        None => Some(std::env::current_dir()?),
+    };
+
+    let snapshot: BackupDir = tools::required_string_param(&param, "snapshot")?.parse()?;
+
+    let (keydata, _crypt_mode) = keyfile_parameters(&param)?;
+    let crypt_config = match keydata {
+        None => None,
+        Some(key) => {
+            let (key, _, fingerprint) = decrypt_key(&key, &key::get_encryption_key_password)?;
+            eprintln!("Encryption key fingerprint: '{}'", fingerprint);
+            Some(Arc::new(CryptConfig::new(key)?))
+        }
+    };
+
+    match path {
+        ExtractPath::Pxar(archive_name, path) => {
+            let client = connect(&repo)?;
+            let client = BackupReader::start(
+                client,
+                crypt_config.clone(),
+                repo.store(),
+                &snapshot.group().backup_type(),
+                &snapshot.group().backup_id(),
+                snapshot.backup_time(),
+                true,
+            ).await?;
+            let (manifest, _) = client.download_manifest().await?;
+            let file_info = manifest.lookup_file_info(&archive_name)?;
+            let index = client.download_dynamic_index(&manifest, &archive_name).await?;
+            let most_used = index.find_most_used_chunks(8);
+            let chunk_reader = RemoteChunkReader::new(client.clone(), crypt_config, file_info.chunk_crypt_mode(), most_used);
+            let reader = BufferedDynamicReader::new(index, chunk_reader);
+
+            let archive_size = reader.archive_size();
+            let reader = LocalDynamicReadAt::new(reader);
+            let decoder = Accessor::new(reader, archive_size).await?;
+
+            let root = decoder.open_root().await?;
+            let file = root
+                .lookup(OsStr::from_bytes(&path)).await?
+                .ok_or(format_err!("error opening '{:?}'", path))?;
+
+            if let Some(target) = target {
+                extract_sub_dir(target, decoder, OsStr::from_bytes(&path), verbose).await?;
+            } else {
+                match file.kind() {
+                    pxar::EntryKind::File { .. } => {
+                        tokio::io::copy(&mut file.contents().await?, &mut tokio::io::stdout()).await?;
+                    }
+                    _ => {
+                        create_zip(tokio::io::stdout(), decoder, OsStr::from_bytes(&path), verbose).await?;
+                    }
+                }
+            }
+        },
+        _ => {
+            bail!("cannot extract '{}'", orig_path);
+        }
+    }
+
+    Ok(Value::Null)
+}
+
+pub fn file_restore_mgmt_cli() -> CliCommandMap {
+    let list_cmd_def = CliCommand::new(&API_METHOD_LIST)
+        .arg_param(&["snapshot", "path"])
+        .completion_cb("repository", complete_repository)
+        .completion_cb("snapshot", complete_group_or_snapshot);
+
+    let restore_cmd_def = CliCommand::new(&API_METHOD_EXTRACT)
+        .arg_param(&["snapshot", "path", "target"])
+        .completion_cb("repository", complete_repository)
+        .completion_cb("snapshot", complete_group_or_snapshot)
+        .completion_cb("target", tools::complete_file_name);
+
+    CliCommandMap::new()
+        .insert("list", list_cmd_def)
+        .insert("extract", restore_cmd_def)
+}
diff --git a/src/bin/proxmox_backup_client/mod.rs b/src/bin/proxmox_backup_client/mod.rs
index a14b0dc1..7787e91a 100644
--- a/src/bin/proxmox_backup_client/mod.rs
+++ b/src/bin/proxmox_backup_client/mod.rs
@@ -10,6 +10,8 @@ mod catalog;
 pub use catalog::*;
 mod snapshot;
 pub use snapshot::*;
+mod file_restore;
+pub use file_restore::*;
 
 pub mod key;
 
-- 
2.20.1

* Re: [pbs-devel] [RFC PATCH 5/5] proxmox-backup-client: add file-restore commands
From: Dominik Csapak @ 2020-12-21 11:43 UTC
  To: pbs-devel

this is ofc also for 'proxmox-backup'

just fyi, i did not run rustfmt or clippy on the code, but i'll do that
once the design/interface issues are resolved

* Re: [pbs-devel] [RFC PATCH proxmox-backup 1/5] api2/admin/datastore: refactor list_dir_content in catalog_reader
From: Dietmar Maurer @ 2020-12-22  5:49 UTC
  To: Proxmox Backup Server development discussion, Dominik Csapak

comments inline

> On 12/21/2020 12:25 PM Dominik Csapak <d.csapak@proxmox.com> wrote:
> 
>  
> we will reuse this later in the client, so it needs to live somewhere
> the client can use it too
> 
> Signed-off-by: Dominik Csapak <d.csapak@proxmox.com>
> ---
>  src/api2/admin/datastore.rs | 52 ++++-------------------------
>  src/backup/catalog.rs       | 65 +++++++++++++++++++++++++++++++++++++
>  2 files changed, 72 insertions(+), 45 deletions(-)
> 
> diff --git a/src/api2/admin/datastore.rs b/src/api2/admin/datastore.rs
> index 16fee943..5f06a2bf 100644
> --- a/src/api2/admin/datastore.rs
> +++ b/src/api2/admin/datastore.rs
> @@ -1302,7 +1302,7 @@ fn catalog(
>      _param: Value,
>      _info: &ApiMethod,
>      rpcenv: &mut dyn RpcEnvironment,
> -) -> Result<Value, Error> {
> +) -> Result<Vec<Value>, Error> {
>      let datastore = DataStore::lookup_datastore(&store)?;
>  
>      let auth_id: Authid = rpcenv.get_auth_id().unwrap().parse()?;
> @@ -1334,52 +1334,14 @@ fn catalog(
>      let reader = BufferedDynamicReader::new(index, chunk_reader);
>  
>      let mut catalog_reader = CatalogReader::new(reader);
> -    let mut current = catalog_reader.root()?;
> -    let mut components = vec![];
> -
>  
> -    if filepath != "root" {
> -        components = base64::decode(filepath)?;
> -        if components.len() > 0 && components[0] == '/' as u8 {
> -            components.remove(0);
> -        }
> -        for component in components.split(|c| *c == '/' as u8) {
> -            if let Some(entry) = catalog_reader.lookup(&current, component)? {
> -                current = entry;
> -            } else {
> -                bail!("path {:?} not found in catalog", &String::from_utf8_lossy(&components));
> -            }
> -        }
> -    }
> -
> -    let mut res = Vec::new();
> -
> -    for direntry in catalog_reader.read_dir(&current)? {
> -        let mut components = components.clone();
> -        components.push('/' as u8);
> -        components.extend(&direntry.name);
> -        let path = base64::encode(components);
> -        let text = String::from_utf8_lossy(&direntry.name);
> -        let mut entry = json!({
> -            "filepath": path,
> -            "text": text,
> -            "type": CatalogEntryType::from(&direntry.attr).to_string(),
> -            "leaf": true,
> -        });
> -        match direntry.attr {
> -            DirEntryAttribute::Directory { start: _ } => {
> -                entry["leaf"] = false.into();
> -            },
> -            DirEntryAttribute::File { size, mtime } => {
> -                entry["size"] = size.into();
> -                entry["mtime"] = mtime.into();
> -            },
> -            _ => {},
> -        }
> -        res.push(entry);
> -    }
> +    let path = if filepath != "root" {
> +        base64::decode(filepath)?
> +    } else {
> +        vec![b'/']
> +    };
>  
> -    Ok(res.into())
> +    catalog_reader.list_dir_content(&path)
>  }
>  
>  fn recurse_files<'a, T, W>(
> diff --git a/src/backup/catalog.rs b/src/backup/catalog.rs
> index b500fb93..5f8e85a6 100644
> --- a/src/backup/catalog.rs
> +++ b/src/backup/catalog.rs
> @@ -5,6 +5,7 @@ use std::io::{Read, Write, Seek, SeekFrom};
>  use std::os::unix::ffi::OsStrExt;
>  
>  use anyhow::{bail, format_err, Error};
> +use serde_json::{json, Value};
>  
>  use pathpatterns::{MatchList, MatchType};
>  use proxmox::tools::io::ReadExt;
> @@ -474,6 +475,32 @@ impl <R: Read + Seek> CatalogReader<R> {
>          Ok(entry_list)
>      }
>  
> +    /// Lookup a DirEntry from an absolute path
> +    pub fn lookup_recursive(
> +        &mut self,
> +        path: &[u8],
> +    ) -> Result<DirEntry, Error> {
> +        let mut current = self.root()?;
> +        if path == b"/" {
> +            return Ok(current);
> +        }
> +
> +        let components = if path.len() > 0 && path[0] == b'/' {
> +            &path[1..]
> +        } else {
> +            path
> +        }.split(|c| *c == b'/');
> +
> +        for comp in components {
> +            if let Some(entry) = self.lookup(&current, comp)? {
> +                current = entry;
> +            } else {
> +                bail!("path {:?} not found in catalog", String::from_utf8_lossy(&path));
> +            }
> +        }
> +        Ok(current)
> +    }
> +

This is OK for me

>      /// Lookup a DirEntry inside a parent directory
>      pub fn lookup(
>          &mut self,
> @@ -554,6 +581,44 @@ impl <R: Read + Seek> CatalogReader<R> {
>          })
>      }
>  
> +    /// Returns the list of content of the given path as json
> +    pub fn list_dir_content(&mut self, path: &[u8]) -> Result<Vec<Value>, Error> {
> +        let dir = self.lookup_recursive(path)?;
> +        let mut res = vec![];
> +        let mut path = path.to_vec();
> +        if path.len() > 0 && path[0] == b'/' {
> +            path.remove(0);
> +        }
> +
> +        for direntry in self.read_dir(&dir)? {
> +            let mut components = path.clone();
> +            components.push(b'/');
> +            components.extend(&direntry.name);
> +            let path = base64::encode(&components);
> +            let text = String::from_utf8_lossy(&direntry.name);
> +            let mut entry = json!({
> +                "filepath": path,
> +                "text": text,
> +                "name": direntry.name.clone(),
> +                "type": CatalogEntryType::from(&direntry.attr).to_string(),
> +                "leaf": true,
> +            });
> +            match direntry.attr {
> +                DirEntryAttribute::Directory { start: _ } => {
> +                    entry["leaf"] = false.into();
> +                },
> +                DirEntryAttribute::File { size, mtime } => {
> +                    entry["size"] = size.into();
> +                    entry["mtime"] = mtime.into();
> +                },
> +                _ => {},
> +            }
> +            res.push(entry);
> +        }

But this is API code, so we should find another place for this.

> +
> +        Ok(res)
> +    }
> +
>      /// Finds all entries matching the given match patterns and calls the
>      /// provided callback on them.
>      pub fn find(
> -- 
> 2.20.1

* Re: [pbs-devel] [RFC PATCH proxmox-backup 1/5] api2/admin/datastore: refactor list_dir_content in catalog_reader
From: Dominik Csapak @ 2020-12-22  7:52 UTC
  To: Dietmar Maurer, Proxmox Backup Server development discussion

[snip]

> 
> But this is API code, so we should find another place for this.

Sounds reasonable to me, any suggestions?
The reason i put it there was that i want to use it from the api as
well as from the client.
Or is it ok to use code from src/api2/ in the client? (i thought not)

> 
>> +
>> +        Ok(res)
>> +    }
>> +
>>       /// Finds all entries matching the given match patterns and calls the
>>       /// provided callback on them.
>>       pub fn find(
>> -- 
>> 2.20.1
