Fixes some crashes when permissions are denied

This commit is contained in:
Rafał Mikrut 2020-09-07 09:06:12 +02:00
parent a47ce77bc6
commit bbb13d79ce
4 changed files with 75 additions and 19 deletions

@@ -1,13 +1,13 @@
# Czkawka
Czkawka is a simple and easy to use alternative to FSlint, written in Rust.
Czkawka is a simple, fast and easy to use alternative to FSlint, written in Rust.
It is in very early development, so most of the functions aren't implemented yet and don't work.
This is my first ever project in Rust, so a lot of things are probably not written in the most optimal way.
## Done
- Rich instructions with examples - CLI (`cargo run --bin czkawka_cli`)
- Duplicated file finding - CLI
- Including and excluding directories(absolute pathes)
- Option to remove files in different ways
- Including and excluding directories (absolute paths)
- Options to remove all duplicates except the newest or oldest, or to remove only the single oldest or newest file
- Fast (by size) or accurate (by hash) file checking
- Empty folders finding - CLI
- Advanced empty folder finding (finds and removes folders which contain only empty folders)
@@ -21,19 +21,24 @@ This is my first ever project in Rust so probably a lot of things are written in
- Duplicated file finding - CLI
  - Saving results to a file
  - Support for * (wildcard) when excluding files and folders
- GUI(GTK)
- Alternative GUI with orbtk
- GUI with GTK
- Alternative GUI with Orbtk
- Finding files with debug symbols
- Support for showing only duplicates with a specific extension or name (regex support needed)
- Maybe Windows support, but this will need some refactoring in the code
- Translation support
## Usage
## Usage and requirements
Rustc 1.46 works fine (not sure about the minimal supported version)
GTK 3.18 - for the GTK backend
For now, only Linux (and probably also macOS) is supported
- Install requirements for GTK(minimum 3.16)
- Install requirements for GTK
```
apt install -y libgtk-3-dev
```
- Download source
```
git clone https://github.com/qarmin/czkawka.git
@@ -54,20 +59,37 @@ cargo run --bin czkawka_cli
## How it works
### Duplicate Finder
The only required parameter for checking duplicates is the list of included folders `-i`. This parameter validates the provided folders - each must have an absolute path (without ~ and other similar symbols at the beginning), must not contain a * (wildcard), must be a directory (not a file or symlink) and must exist. Later, the same checks are applied to the excluded folders `-e`.
Next, the included and excluded folders are optimized, taking advantage of the tree structure of the file system:
- Folders which contain other folders are combined (separately for included and excluded) - `/home/pulpet` and `/home/pulpet/a` are combined into `/home/pulpet`
- Inlcuded folders which are located inside excluded ones are delete - Included folder `/etc/tomcat/` is deleted because excluded folder is `/etc/`
- Non existed directories are removed
- Included folders which are located inside excluded ones are deleted - included folder `/etc/tomcat/` is deleted because the excluded folder is `/etc/`
- Non-existent directories are removed
- Excluded paths which lie outside every included path are deleted - excluded path `/etc/` is removed if the included path is `/home/`
If no included folders remain after this optimization, the program ends with a non-zero exit code (TODO: this should be handled by returning a value).
Next, using the minimal file size provided by the user with `-s`, the program recursively scans the included folders (TODO: there should be an option to turn off recursion) and puts files with the same size into the same boxes.
Boxes which contain only one element are then removed, because the files inside them cannot be duplicates.
Finally, by default, hashes are also checked to make sure that files with equal sizes are really identical.
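A minimal sketch of this size-then-hash approach (a simplified, hypothetical helper - the real code streams each file through `blake3::Hasher` in 16 KB chunks rather than reading it whole):
```
use std::collections::HashMap;
use std::fs;
use std::path::PathBuf;

// Groups candidate (path, size) pairs by size, then confirms real
// duplicates by comparing BLAKE3 hashes of the file contents.
fn find_duplicates(files: Vec<(PathBuf, u64)>) -> Vec<Vec<PathBuf>> {
    // Fast check: files with the same size land in the same box.
    let mut by_size: HashMap<u64, Vec<PathBuf>> = HashMap::new();
    for (path, size) in files {
        by_size.entry(size).or_insert_with(Vec::new).push(path);
    }
    let mut duplicates: Vec<Vec<PathBuf>> = Vec::new();
    for (_size, candidates) in by_size {
        // A box with one element cannot contain duplicates.
        if candidates.len() < 2 {
            continue;
        }
        // Accurate check: hash the contents of the remaining candidates.
        let mut by_hash: HashMap<String, Vec<PathBuf>> = HashMap::new();
        for path in candidates {
            let data = match fs::read(&path) {
                Ok(t) => t,
                Err(_) => continue, // unreadable file, e.g. permission denied
            };
            let hash = blake3::hash(&data).to_hex().to_string();
            by_hash.entry(hash).or_insert_with(Vec::new).push(path);
        }
        for (_hash, group) in by_hash {
            if group.len() > 1 {
                duplicates.push(group);
            }
        }
    }
    duplicates
}
```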
## Speed
Since Czkawka is written in Rust and aims to be a faster alternative to FSlint, which is written in Python, we need to compare the speed of these two tools.
I checked my home directory without any folder exceptions (I removed all directories from the FSlint advanced tab), which contained 379359 files and 42445 folders, with 50301 duplicated files in 29723 groups taking up 450.4 MB.
The first run reads the file entries and saves them to a cache, so this step is mostly limited by disk performance; on the second run the cache helps, so searching is a lot faster.
| App | Execution time |
|:----------:|:-------------:|
| FSlint (First Run) | 140s |
| FSlint (Second Run) | 23s |
| Czkawka CLI Debug (First Run) | 136s |
| Czkawka CLI Debug (Second Run) | 14s |
| Czkawka CLI Release (First Run) | 128s |
| Czkawka CLI Release (Second Run) | 8s |
## License
Code is distributed under the MIT license.

@@ -280,8 +280,15 @@ impl DuplicateFinder {
};
for entry in read_dir {
let entry_data = entry.unwrap();
let metadata: Metadata = entry_data.metadata().unwrap();
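// Skip entries whose metadata cannot be read (usually a permission problem) instead of panicking.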
let metadata: Metadata = match entry_data.metadata() {
Ok(t) => t,
Err(_) => continue, //Permissions denied
};
if metadata.is_dir() {
if entry_data.file_name().into_string().is_err() {
continue; // Not a valid UTF-8 file name
}
let mut is_excluded_dir = false;
next_folder = "".to_owned() + &current_folder + &entry_data.file_name().into_string().unwrap() + "/";
for ed in &self.excluded_directories {
@@ -319,8 +326,14 @@ impl DuplicateFinder {
let fe: FileEntry = FileEntry {
path: current_file_name,
size: metadata.len(),
created_date: metadata.created().unwrap(),
modified_date: metadata.modified().unwrap(),
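// Not every filesystem provides these timestamps; fall back to the current time.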
created_date: match metadata.created() {
Ok(t) => t,
Err(_) => SystemTime::now(),
},
modified_date: match metadata.modified() {
Ok(t) => t,
Err(_) => SystemTime::now(),
},
};
// // self.files_with_identical_size.entry from below should be faster according to clippy
// if !self.files_with_identical_size.contains_key(&metadata.len()) {
@@ -385,18 +398,28 @@ impl DuplicateFinder {
}
};
let mut error_reading_file: bool = false;
let mut hasher: blake3::Hasher = blake3::Hasher::new();
let mut buffer = [0u8; 16384];
loop {
let n = file_handler.read(&mut buffer).unwrap();
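// A read failure (e.g. permission denied partway through the file) aborts hashing of this file.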
let n = match file_handler.read(&mut buffer) {
Ok(t) => t,
Err(_) => {
error_reading_file = true;
break;
}
};
if n == 0 {
break;
}
hasher.update(&buffer[..n]);
}
let hash_string: String = hasher.finalize().to_hex().to_string();
hashmap_with_hash.entry(hash_string.to_string()).or_insert_with(Vec::new);
hashmap_with_hash.get_mut(&*hash_string).unwrap().push(file_entry.1.to_owned());
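// Record the hash only if the whole file was read successfully, so partial hashes never enter the map.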
if !error_reading_file {
let hash_string: String = hasher.finalize().to_hex().to_string();
hashmap_with_hash.entry(hash_string.to_string()).or_insert_with(Vec::new);
hashmap_with_hash.get_mut(&*hash_string).unwrap().push(file_entry.1.to_owned());
}
}
for hash_entry in hashmap_with_hash {
if hash_entry.1.len() > 1 {
@@ -648,6 +671,12 @@ impl DuplicateFinder {
Common::print_time(start_time, SystemTime::now(), "delete_files".to_string());
}
}
impl Default for DuplicateFinder {
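// Delegating to new() satisfies Clippy's new_without_default lint
// and lets callers write DuplicateFinder::default().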
fn default() -> Self {
Self::new()
}
}
fn delete_files(vector: &[FileEntry], delete_method: &DeleteMethod, errors: &mut Vec<String>) {
assert!(vector.len() > 1, "Vector length must be bigger than 1 (this should be ensured in previous steps).");
let mut q_index: usize = 0;
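
The recurring fix in this commit is visible above: each `.unwrap()` that could crash on an unreadable file or directory is replaced with a `match` that skips the offending entry. The same pattern in isolation (a hypothetical, self-contained sketch, not the project's code):
```
use std::fs;

// Prints the size of every readable file in a directory,
// silently skipping entries that fail (e.g. permission denied).
fn print_sizes(dir: &str) {
    let read_dir = match fs::read_dir(dir) {
        Ok(t) => t,
        Err(_) => return, // the directory itself cannot be opened
    };
    for entry in read_dir {
        let entry = match entry {
            Ok(t) => t,
            Err(_) => continue, // unreadable directory entry
        };
        let metadata = match entry.metadata() {
            Ok(t) => t,
            Err(_) => continue, // permission denied
        };
        println!("{}: {} bytes", entry.path().display(), metadata.len());
    }
}
```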

@@ -463,3 +463,8 @@ impl EmptyFolder {
//Common::print_time(start_time, SystemTime::now(), "set_exclude_directory".to_string());
}
}
impl Default for EmptyFolder {
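// As with DuplicateFinder, Default simply delegates to new().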
fn default() -> Self {
Self::new()
}
}

@@ -58,7 +58,7 @@ fn main() {
)
.child(
TextBlock::new()
.text("Info:")
.text("Info:\n\n rr")
.v_align("center")
.h_align("start")
.attach(Grid::column(0))