
Merge github.com:qarmin/czkawka

TheEvilSkeleton 2021-01-15 13:53:59 -05:00
commit 15fdd6be55
40 changed files with 2795 additions and 225 deletions

View file

@ -10,7 +10,7 @@ jobs:
matrix:
toolchain: [ stable ]
type: [ release ]
runs-on: ubuntu-latest
runs-on: ubuntu-18.04
steps:
- uses: actions/checkout@v2
@ -28,7 +28,7 @@ jobs:
linux-cli-${{github.ref}}-${{github.sha}}
- name: Install basic libraries
run: sudo apt-get update; sudo apt install libgtk-3-dev -y
run: sudo apt-get update; sudo apt install libgtk-3-dev libasound2-dev -y
- name: Build CLI Debug
run: cargo build --bin czkawka_cli
@ -128,7 +128,7 @@ jobs:
matrix:
toolchain: [ stable ]
type: [ release ]
runs-on: ubuntu-latest
runs-on: ubuntu-18.04
steps:
- uses: actions/checkout@v2
@ -146,7 +146,7 @@ jobs:
linux-gui-${{github.ref}}-${{github.sha}}
- name: Install Gtk, Mingw, unzip, zip and wget
run: sudo apt-get update; sudo apt install libgtk-3-dev -y
run: sudo apt-get update; sudo apt install libgtk-3-dev libasound2-dev -y
- name: Build GUI Debug
run: cargo build --bin czkawka_gui
@ -192,7 +192,7 @@ jobs:
linux-appimage-gui-${{github.ref}}-${{github.sha}}
- name: Install Gtk,
run: sudo apt-get update; sudo apt install libgtk-3-dev librsvg2-dev wget -y
run: sudo apt-get update; sudo apt install libgtk-3-dev libasound2-dev librsvg2-dev wget -y
- name: Build GUI Release
run: cargo build --release --bin czkawka_gui

View file

@ -10,7 +10,7 @@ jobs:
matrix:
toolchain: [ stable ]
type: [ release ]
runs-on: macos-latest
runs-on: macos-10.15
steps:
- uses: actions/checkout@v2
@ -53,7 +53,7 @@ jobs:
matrix:
toolchain: [ stable ]
type: [ release ]
runs-on: macos-latest
runs-on: macos-10.15
steps:
- uses: actions/checkout@v2

View file

@ -25,7 +25,7 @@ jobs:
override: true
- name: Install Gtk
run: sudo apt install -y libgtk-3-dev
run: sudo apt install -y libgtk-3-dev libasound2-dev
- name: Check the format
run: cargo fmt --all -- --check

View file

@ -10,7 +10,7 @@ jobs:
matrix:
toolchain: [ stable ]
type: [ release ]
runs-on: windows-latest
runs-on: windows-2019
steps:
- uses: actions/checkout@v2
@ -149,7 +149,7 @@ jobs:
matrix:
toolchain: [ stable ]
type: [ release ]
runs-on: ubuntu-latest
runs-on: ubuntu-20.04
steps:
- uses: actions/checkout@v2

Cargo.lock (generated): 1330 changed lines. File diff suppressed because it is too large.

View file

@ -1,3 +1,15 @@
## Version 2.3.0 - 15.01.2021r
- Add cache for duplicate finder - [#205](https://github.com/qarmin/czkawka/pull/205)
- Add cache for broken files - [#204](https://github.com/qarmin/czkawka/pull/204)
- Decrease ram usage - [#212](https://github.com/qarmin/czkawka/pull/212)
- Add support for finding broken zip and audio files - [#210](https://github.com/qarmin/czkawka/pull/210)
- Sort Results by path where it is possible - [#211](https://github.com/qarmin/czkawka/pull/211)
- Add missing popover info for invalid symlinks - [#209](https://github.com/qarmin/czkawka/pull/209)
- Use the oldest available OS in Linux and Mac CI and the newest on Windows - [#206](https://github.com/qarmin/czkawka/pull/206)
- Add broken files support - [#202](https://github.com/qarmin/czkawka/pull/202)
- Remove save workaround and fix crashes when loading/saving cache - [#200](https://github.com/qarmin/czkawka/pull/200)
- Fix error when closing dialog progress by X - [#199](https://github.com/qarmin/czkawka/pull/199)
## Version 2.2.0 - 11.01.2021r
- Adds Mac GUI - [#160](https://github.com/qarmin/czkawka/pull/160)
- Use master gtk plugin again - [#179](https://github.com/qarmin/czkawka/pull/179)

View file

@ -6,13 +6,11 @@
- Written in memory safe Rust
- Amazingly fast - due to using more or less advanced algorithms and multithreading support
- Free, Open Source without ads
- Works on Linux, Windows and macOS
- Multiplatform - works on Linux, Windows and macOS
- Cache support - second and subsequent scans should be a lot faster than the first
- CLI frontend - very fast, handy for automating tasks
- GUI GTK frontend - uses modern GTK 3 and looks similar to FSlint
- Light/Dark themes match the appearance of the system (Linux only)
- Saving results to a file - allows reading entries found by the tool easily
- GUI frontend - uses modern GTK 3 and looks similar to FSlint
- Rich search options - allow setting absolute included and excluded directories, a set of allowed file extensions, or excluded items with the * wildcard
- Image previews to get a quick view of the compared photos
- Multiple tools to use:
- Duplicates - Finds duplicates based on file name, size, hash, or hash of the first 1 MB
- Empty Folders - Finds empty folders with the help of an advanced algorithm
@ -23,8 +21,9 @@
- Zeroed Files - Finds files which are filled with zeros (usually corrupted)
- Same Music - Searches for music with the same artist, album, etc.
- Invalid Symbolic Links - Shows symbolic links which point to non-existent files/directories
- Broken Files - Finds files that are corrupted or have an invalid extension
![Czkawka](https://user-images.githubusercontent.com/41945903/100857797-69809680-348d-11eb-8382-acdec05fd3b8.gif)
![Czkawka](https://user-images.githubusercontent.com/41945903/104711404-9cbb7400-5721-11eb-904d-9677c189f7ab.gif)
## Instruction
You can find instructions on how to use Czkawka [here](instructions/Instruction.md)
@ -49,7 +48,7 @@ If the app does not run when clicking at a launcher, run it through a terminal.
You don't need any additional libraries for the Czkawka CLI
#### GUI Requirements
##### Linux
For Czkawka GUI you need to have at least GTK 3.22.
For the Czkawka GUI you need at least GTK 3.22 and also ALSA installed (used for finding broken music files).
Both should be installed by default on all the most popular distros.
##### Windows
`czkawka_gui.exe` extracted from the zip file `windows_czkawka_gui.zip` needs to keep all the other files from the archive next to it, because it uses them.
@ -123,12 +122,12 @@ If you want to compile CLI frontend, then just skip lines which contains `gtk` w
```shell
sudo apt install -y curl # Needed by Rust update tool
curl --proto '=https' --tlsv1.2 -sSf https://sh.rustup.rs | sh # Download the latest stable Rust
sudo apt install -y libgtk-3-dev
sudo apt install -y libgtk-3-dev libasound2-dev
```
#### Fedora/CentOS/Rocky Linux
```shell
curl --proto '=https' --tlsv1.2 -sSf https://sh.rustup.rs | sh # Download the latest stable Rust
sudo yum install gtk3-devel glib2-devel
sudo yum install gtk3-devel glib2-devel alsa-lib-devel
```
#### MacOS
You need to install Homebrew and GTK Libraries
@ -219,6 +218,7 @@ So still is a big room for improvements.
| Zeroed Files| X | | |
| Music duplicates(tags) | X | | X |
| Invalid symlinks | X | X | |
| Broken Files | X | | |
| Installed packages | | X | |
| Invalid names | | X | |
| Names conflict | | X | |
@ -226,6 +226,7 @@ So still is a big room for improvements.
| Non stripped binaries | | X | |
| Redundant whitespace | | X | |
| Multiple languages(po) | | X | X |
| Cache support | X | | X |
| Project Activity | High | Very Low | High |
## Contributions

View file

@ -1,6 +1,6 @@
[package]
name = "czkawka_cli"
version = "2.2.0"
version = "2.3.0"
authors = ["Rafał Mikrut <mikrutrafal@protonmail.com>"]
edition = "2018"
description = "CLI frontend of Czkawka"

View file

@ -164,6 +164,23 @@ pub enum Commands {
#[structopt(flatten)]
not_recursive: NotRecursive,
},
#[structopt(name = "broken", about = "Finds broken files", help_message = HELP_MESSAGE, after_help = "EXAMPLE:\n czkawka broken -d /home/kicikici/ /home/szczek -e /home/kicikici/jestempsem -x jpg -f results.txt")]
BrokenFiles {
#[structopt(flatten)]
directories: Directories,
#[structopt(flatten)]
excluded_directories: ExcludedDirectories,
#[structopt(flatten)]
excluded_items: ExcludedItems,
#[structopt(flatten)]
allowed_extensions: AllowedExtensions,
#[structopt(short = "D", long, help = "Delete found files")]
delete_files: bool,
#[structopt(flatten)]
file_to_save: FileToSave,
#[structopt(flatten)]
not_recursive: NotRecursive,
},
}
#[derive(Debug, StructOpt)]
@ -190,7 +207,7 @@ pub struct AllowedExtensions {
short = "x",
long,
help = "Allowed file extension(s)",
long_help = "List of checked files with provided extension(s). There are also helpful macros which allow to easy use a typical extensions like:\nIMAGE(\"jpg,kra,gif,png,bmp,tiff,webp,hdr,svg\"),\nTEXT(\"txt,doc,docx,odt,rtf\"),\nVIDEO(\"mp4,flv,mkv,webm,vob,ogv,gifv,avi,mov,wmv,mpg,m4v,m4p,mpeg,3gp\") or\nMUSIC(\"mp3,flac,ogg,tta,wma,webm\")\n "
long_help = "List of checked files with provided extension(s). There are also helpful macros which allow to easy use a typical extensions like:\nIMAGE(\"jpg,kra,gif,png,bmp,tiff,hdr,svg\"),\nTEXT(\"txt,doc,docx,odt,rtf\"),\nVIDEO(\"mp4,flv,mkv,webm,vob,ogv,gifv,avi,mov,wmv,mpg,m4v,m4p,mpeg,3gp\") or\nMUSIC(\"mp3,flac,ogg,tta,wma,webm\")\n "
)]
pub allowed_extensions: Vec<String>,
}
@ -320,4 +337,5 @@ EXAMPLES:
{bin} image -d /home/rafal -e /home/rafal/Pulpit -f results.txt
{bin} zeroed -d /home/rafal -e /home/krzak -f results.txt"
{bin} music -d /home/rafal -e /home/rafal/Pulpit -z "artist,year, ARTISTALBUM, ALBUM___tiTlE" -f results.txt
{bin} symlinks -d /home/kicikici/ /home/szczek -e /home/kicikici/jestempsem -x jpg -f results.txt"#;
{bin} symlinks -d /home/kicikici/ /home/szczek -e /home/kicikici/jestempsem -x jpg -f results.txt
{bin} broken -d /home/mikrut/ -e /home/mikrut/trakt -f results.txt"#;
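For context, here is a minimal standalone sketch (not part of this commit) of how `#[structopt(flatten)]` composes a shared option group like `Directories` into the new `broken` subcommand. The struct and field definitions below are simplified, illustrative stand-ins for the real ones in czkawka_cli.

```rust
use std::path::PathBuf;
use structopt::StructOpt;

// Simplified stand-in for the shared Directories option group.
#[derive(Debug, StructOpt)]
struct Directories {
    #[structopt(short = "d", long, parse(from_os_str), help = "Directories to search")]
    directories: Vec<PathBuf>,
}

#[derive(Debug, StructOpt)]
enum Commands {
    #[structopt(name = "broken", about = "Finds broken files")]
    BrokenFiles {
        // Flattening pulls the -d/--directories flag of Directories into this subcommand.
        #[structopt(flatten)]
        directories: Directories,
        #[structopt(short = "D", long, help = "Delete found files")]
        delete_files: bool,
    },
}

fn main() {
    // Invoking the binary with `broken -d /home/user -D` parses into
    // Commands::BrokenFiles { .. }, with -d coming from the flattened group.
    let command = Commands::from_args();
    println!("{:?}", command);
}
```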

View file

@ -7,6 +7,7 @@ use czkawka_core::common_traits::*;
use czkawka_core::{
big_file::{self, BigFile},
broken_files::{self, BrokenFiles},
duplicate::DuplicateFinder,
empty_files::{self, EmptyFiles},
empty_folder::EmptyFolder,
@ -326,5 +327,39 @@ fn main() {
ifs.print_results();
ifs.get_text_messages().print_messages();
}
Commands::BrokenFiles {
directories,
excluded_directories,
excluded_items,
allowed_extensions,
delete_files,
file_to_save,
not_recursive,
} => {
let mut br = BrokenFiles::new();
br.set_included_directory(directories.directories);
br.set_excluded_directory(excluded_directories.excluded_directories);
br.set_excluded_items(excluded_items.excluded_items);
br.set_allowed_extensions(allowed_extensions.allowed_extensions.join(","));
br.set_recursive_search(!not_recursive.not_recursive);
if delete_files {
br.set_delete_method(broken_files::DeleteMethod::Delete);
}
br.find_broken_files(None, None);
if let Some(file_name) = file_to_save.file_name() {
if !br.save_results_to_file(file_name) {
br.get_text_messages().print_messages();
process::exit(1);
}
}
#[cfg(not(debug_assertions))] // This will show too much probably unnecessary data to debug, comment line only if needed
br.print_results();
br.get_text_messages().print_messages();
}
}
}

View file

@ -1,6 +1,6 @@
[package]
name = "czkawka_core"
version = "2.2.0"
version = "2.3.0"
authors = ["Rafał Mikrut <mikrutrafal@protonmail.com>"]
edition = "2018"
description = "Core of Czkawka app"
@ -29,4 +29,8 @@ bitflags = "1.2.1"
audiotags = "0.2.7182"
# Futures - needed by async progress sender
futures = "0.3.8"
futures = "0.3.9"
# Needed by broken files
zip = "0.5.9"
rodio = "0.13.0"

View file

@ -0,0 +1,678 @@
use std::fs::{File, Metadata, OpenOptions};
use std::io::prelude::*;
use std::path::{Path, PathBuf};
use std::time::{Duration, SystemTime, UNIX_EPOCH};
use std::{fs, thread};
use crate::common::Common;
use crate::common_directory::Directories;
use crate::common_extensions::Extensions;
use crate::common_items::ExcludedItems;
use crate::common_messages::Messages;
use crate::common_traits::*;
use crossbeam_channel::Receiver;
use directories_next::ProjectDirs;
use rayon::prelude::*;
use std::collections::HashMap;
use std::io::{BufReader, BufWriter};
use std::sync::atomic::{AtomicBool, AtomicUsize, Ordering};
use std::sync::Arc;
use std::thread::sleep;
const CACHE_FILE_NAME: &str = "cache_broken_files.txt";
#[derive(Debug)]
pub struct ProgressData {
pub current_stage: u8,
pub max_stage: u8,
pub files_checked: usize,
pub files_to_check: usize,
}
#[derive(Eq, PartialEq, Clone, Debug)]
pub enum DeleteMethod {
None,
Delete,
}
#[derive(Clone)]
pub struct FileEntry {
pub path: PathBuf,
pub modified_date: u64,
pub size: u64,
pub type_of_file: TypeOfFile,
pub error_string: String,
}
#[derive(Copy, Clone, PartialEq, Eq)]
pub enum TypeOfFile {
Unknown = -1,
Image = 0,
ArchiveZIP,
Audio,
}
/// Info struct with helpful information about the results
#[derive(Default)]
pub struct Info {
pub number_of_broken_files: usize,
pub number_of_removed_files: usize,
pub number_of_failed_to_remove_files: usize,
}
impl Info {
pub fn new() -> Self {
Default::default()
}
}
/// Struct with the information required to work
pub struct BrokenFiles {
text_messages: Messages,
information: Info,
files_to_check: HashMap<String, FileEntry>,
broken_files: Vec<FileEntry>,
directories: Directories,
allowed_extensions: Extensions,
excluded_items: ExcludedItems,
recursive_search: bool,
delete_method: DeleteMethod,
stopped_search: bool,
}
impl BrokenFiles {
pub fn new() -> Self {
Self {
text_messages: Messages::new(),
information: Info::new(),
recursive_search: true,
allowed_extensions: Extensions::new(),
directories: Directories::new(),
excluded_items: ExcludedItems::new(),
files_to_check: Default::default(),
delete_method: DeleteMethod::None,
stopped_search: false,
broken_files: Default::default(),
}
}
pub fn find_broken_files(&mut self, stop_receiver: Option<&Receiver<()>>, progress_sender: Option<&futures::channel::mpsc::Sender<ProgressData>>) {
self.directories.optimize_directories(self.recursive_search, &mut self.text_messages);
if !self.check_files(stop_receiver, progress_sender) {
self.stopped_search = true;
return;
}
if !self.look_for_broken_files(stop_receiver, progress_sender) {
self.stopped_search = true;
return;
}
self.delete_files();
self.debug_print();
}
pub fn get_stopped_search(&self) -> bool {
self.stopped_search
}
pub const fn get_broken_files(&self) -> &Vec<FileEntry> {
&self.broken_files
}
pub const fn get_text_messages(&self) -> &Messages {
&self.text_messages
}
pub const fn get_information(&self) -> &Info {
&self.information
}
pub fn set_delete_method(&mut self, delete_method: DeleteMethod) {
self.delete_method = delete_method;
}
pub fn set_recursive_search(&mut self, recursive_search: bool) {
self.recursive_search = recursive_search;
}
pub fn set_included_directory(&mut self, included_directory: Vec<PathBuf>) -> bool {
self.directories.set_included_directory(included_directory, &mut self.text_messages)
}
pub fn set_excluded_directory(&mut self, excluded_directory: Vec<PathBuf>) {
self.directories.set_excluded_directory(excluded_directory, &mut self.text_messages);
}
pub fn set_allowed_extensions(&mut self, allowed_extensions: String) {
self.allowed_extensions.set_allowed_extensions(allowed_extensions, &mut self.text_messages);
}
pub fn set_excluded_items(&mut self, excluded_items: Vec<String>) {
self.excluded_items.set_excluded_items(excluded_items, &mut self.text_messages);
}
fn check_files(&mut self, stop_receiver: Option<&Receiver<()>>, progress_sender: Option<&futures::channel::mpsc::Sender<ProgressData>>) -> bool {
let start_time: SystemTime = SystemTime::now();
let mut folders_to_check: Vec<PathBuf> = Vec::with_capacity(1024 * 2); // This should be small enough not to make a noticeable difference and big enough to store most paths without resizing the vector
// Add root folders for finding
for id in &self.directories.included_directories {
folders_to_check.push(id.clone());
}
//// PROGRESS THREAD START
const LOOP_DURATION: u32 = 200; //in ms
let progress_thread_run = Arc::new(AtomicBool::new(true));
let atomic_file_counter = Arc::new(AtomicUsize::new(0));
let progress_thread_handle;
if let Some(progress_sender) = progress_sender {
let mut progress_send = progress_sender.clone();
let progress_thread_run = progress_thread_run.clone();
let atomic_file_counter = atomic_file_counter.clone();
progress_thread_handle = thread::spawn(move || loop {
progress_send
.try_send(ProgressData {
current_stage: 0,
max_stage: 1,
files_checked: atomic_file_counter.load(Ordering::Relaxed) as usize,
files_to_check: 0,
})
.unwrap();
if !progress_thread_run.load(Ordering::Relaxed) {
break;
}
sleep(Duration::from_millis(LOOP_DURATION as u64));
});
} else {
progress_thread_handle = thread::spawn(|| {});
}
//// PROGRESS THREAD END
while !folders_to_check.is_empty() {
if stop_receiver.is_some() && stop_receiver.unwrap().try_recv().is_ok() {
// End thread which send info to gui
progress_thread_run.store(false, Ordering::Relaxed);
progress_thread_handle.join().unwrap();
return false;
}
let current_folder = folders_to_check.pop().unwrap();
// Read current dir, if permission are denied just go to next
let read_dir = match fs::read_dir(&current_folder) {
Ok(t) => t,
Err(_) => {
self.text_messages.warnings.push(format!("Cannot open dir {}", current_folder.display()));
continue;
} // Permissions denied
};
// Check every sub folder/file/link etc.
'dir: for entry in read_dir {
let entry_data = match entry {
Ok(t) => t,
Err(_) => {
self.text_messages.warnings.push(format!("Cannot read entry in dir {}", current_folder.display()));
continue;
} //Permissions denied
};
let metadata: Metadata = match entry_data.metadata() {
Ok(t) => t,
Err(_) => {
self.text_messages.warnings.push(format!("Cannot read metadata in dir {}", current_folder.display()));
continue;
} //Permissions denied
};
if metadata.is_dir() {
if !self.recursive_search {
continue;
}
let next_folder = current_folder.join(entry_data.file_name());
if self.directories.is_excluded(&next_folder) || self.excluded_items.is_excluded(&next_folder) {
continue 'dir;
}
folders_to_check.push(next_folder);
} else if metadata.is_file() {
atomic_file_counter.fetch_add(1, Ordering::Relaxed);
let file_name_lowercase: String = match entry_data.file_name().into_string() {
Ok(t) => t,
Err(_) => continue,
}
.to_lowercase();
let type_of_file = check_extension_avaibility(&file_name_lowercase);
if type_of_file == TypeOfFile::Unknown {
continue 'dir;
}
// Checking allowed extensions
if !self.allowed_extensions.file_extensions.is_empty() {
let allowed = self.allowed_extensions.file_extensions.iter().any(|e| file_name_lowercase.ends_with((".".to_string() + e.to_lowercase().as_str()).as_str()));
if !allowed {
// Not an allowed extension, ignore it.
continue 'dir;
}
}
// Checking files
let current_file_name = current_folder.join(entry_data.file_name());
if self.excluded_items.is_excluded(&current_file_name) {
continue 'dir;
}
// Creating new file entry
let fe: FileEntry = FileEntry {
path: current_file_name.clone(),
modified_date: match metadata.modified() {
Ok(t) => match t.duration_since(UNIX_EPOCH) {
Ok(d) => d.as_secs(),
Err(_) => {
self.text_messages.warnings.push(format!("File {} seems to be modified before Unix Epoch.", current_file_name.display()));
0
}
},
Err(_) => {
self.text_messages.warnings.push(format!("Unable to get modification date from file {}", current_file_name.display()));
continue;
} // Permissions Denied
},
size: metadata.len(),
type_of_file,
error_string: "".to_string(),
};
// Adding files to Vector
self.files_to_check.insert(fe.path.to_string_lossy().to_string(), fe);
}
}
}
// End thread which send info to gui
progress_thread_run.store(false, Ordering::Relaxed);
progress_thread_handle.join().unwrap();
Common::print_time(start_time, SystemTime::now(), "check_files".to_string());
true
}
fn look_for_broken_files(&mut self, stop_receiver: Option<&Receiver<()>>, progress_sender: Option<&futures::channel::mpsc::Sender<ProgressData>>) -> bool {
let system_time = SystemTime::now();
let loaded_hash_map = match load_cache_from_file(&mut self.text_messages) {
Some(t) => t,
None => Default::default(),
};
let mut records_already_cached: HashMap<String, FileEntry> = Default::default();
let mut non_cached_files_to_check: HashMap<String, FileEntry> = Default::default();
for (name, file_entry) in &self.files_to_check {
#[allow(clippy::collapsible_if)]
if !loaded_hash_map.contains_key(name) {
// If the loaded data doesn't contain info about the current file
non_cached_files_to_check.insert(name.clone(), file_entry.clone());
} else {
if file_entry.size != loaded_hash_map.get(name).unwrap().size || file_entry.modified_date != loaded_hash_map.get(name).unwrap().modified_date {
// When the size or modification date of the file changed, it is clearly a different file
non_cached_files_to_check.insert(name.clone(), file_entry.clone());
} else {
// Checking may be omitted when there is already an entry with the same size and modification date
records_already_cached.insert(name.clone(), loaded_hash_map.get(name).unwrap().clone());
}
}
}
let check_was_breaked = AtomicBool::new(false); // Used for breaking from GUI and ending check thread
//// PROGRESS THREAD START
const LOOP_DURATION: u32 = 200; //in ms
let progress_thread_run = Arc::new(AtomicBool::new(true));
let atomic_file_counter = Arc::new(AtomicUsize::new(0));
let progress_thread_handle;
if let Some(progress_sender) = progress_sender {
let mut progress_send = progress_sender.clone();
let progress_thread_run = progress_thread_run.clone();
let atomic_file_counter = atomic_file_counter.clone();
let files_to_check = non_cached_files_to_check.len();
progress_thread_handle = thread::spawn(move || loop {
progress_send
.try_send(ProgressData {
current_stage: 1,
max_stage: 1,
files_checked: atomic_file_counter.load(Ordering::Relaxed) as usize,
files_to_check,
})
.unwrap();
if !progress_thread_run.load(Ordering::Relaxed) {
break;
}
sleep(Duration::from_millis(LOOP_DURATION as u64));
});
} else {
progress_thread_handle = thread::spawn(|| {});
}
//// PROGRESS THREAD END
let mut vec_file_entry: Vec<FileEntry> = non_cached_files_to_check
.par_iter()
.map(|file_entry| {
atomic_file_counter.fetch_add(1, Ordering::Relaxed);
if stop_receiver.is_some() && stop_receiver.unwrap().try_recv().is_ok() {
check_was_breaked.store(true, Ordering::Relaxed);
return None;
}
let file_entry = file_entry.1;
match file_entry.type_of_file {
TypeOfFile::Image => {
match image::open(&file_entry.path) {
Ok(_) => Some(None),
Err(t) => {
let error_string = t.to_string();
// This error is a problem with image library, remove check when https://github.com/image-rs/jpeg-decoder/issues/130 will be fixed
if !error_string.contains("spectral selection is not allowed in non-progressive scan") {
let mut file_entry = file_entry.clone();
file_entry.error_string = error_string;
Some(Some(file_entry))
} else {
Some(None)
}
} // Something is wrong with image
}
}
TypeOfFile::ArchiveZIP => match fs::File::open(&file_entry.path) {
Ok(file) => match zip::ZipArchive::new(file) {
Ok(_) => Some(None),
Err(e) => {
// TODO Maybe filter out unnecessary types of errors
let error_string = e.to_string();
let mut file_entry = file_entry.clone();
file_entry.error_string = error_string;
Some(Some(file_entry))
}
},
Err(_) => Some(None),
},
TypeOfFile::Audio => match fs::File::open(&file_entry.path) {
Ok(file) => match rodio::Decoder::new(BufReader::new(file)) {
Ok(_) => Some(None),
Err(e) => {
let error_string = e.to_string();
let mut file_entry = file_entry.clone();
file_entry.error_string = error_string;
Some(Some(file_entry))
}
},
Err(_) => Some(None),
},
// This means the cache contained an invalid value, probably because it was written by a different czkawka version
TypeOfFile::Unknown => Some(None),
}
})
.while_some()
.filter(|file_entry| file_entry.is_some())
.map(|file_entry| file_entry.unwrap())
.collect::<Vec<FileEntry>>();
// End thread which send info to gui
progress_thread_run.store(false, Ordering::Relaxed);
progress_thread_handle.join().unwrap();
// Break if stop was clicked
if check_was_breaked.load(Ordering::Relaxed) {
return false;
}
// Just connect loaded results with already calculated
for (_name, file_entry) in records_already_cached {
vec_file_entry.push(file_entry.clone());
}
self.broken_files = vec_file_entry.iter().filter_map(|f| if f.error_string.is_empty() { None } else { Some(f.clone()) }).collect();
// Must save all results to file, old loaded from file with all currently counted results
let mut all_results: HashMap<String, FileEntry> = self.files_to_check.clone();
for file_entry in vec_file_entry {
all_results.insert(file_entry.path.to_string_lossy().to_string(), file_entry);
}
for (_name, file_entry) in loaded_hash_map {
all_results.insert(file_entry.path.to_string_lossy().to_string(), file_entry);
}
save_cache_to_file(&all_results, &mut self.text_messages);
self.information.number_of_broken_files = self.broken_files.len();
Common::print_time(system_time, SystemTime::now(), "sort_images - reading data from files in parallel".to_string());
// Clean unused data
self.files_to_check = Default::default();
true
}
/// Function to delete the files stored in the broken_files vector
fn delete_files(&mut self) {
let start_time: SystemTime = SystemTime::now();
match self.delete_method {
DeleteMethod::Delete => {
for file_entry in self.broken_files.iter() {
if fs::remove_file(&file_entry.path).is_err() {
self.text_messages.warnings.push(file_entry.path.display().to_string());
}
}
}
DeleteMethod::None => {
//Just do nothing
}
}
Common::print_time(start_time, SystemTime::now(), "delete_files".to_string());
}
}
impl Default for BrokenFiles {
fn default() -> Self {
Self::new()
}
}
impl DebugPrint for BrokenFiles {
#[allow(dead_code)]
#[allow(unreachable_code)]
/// Debugging printing - only available on debug build
fn debug_print(&self) {
#[cfg(not(debug_assertions))]
{
return;
}
println!("---------------DEBUG PRINT---------------");
println!("### Information's");
println!("Errors size - {}", self.text_messages.errors.len());
println!("Warnings size - {}", self.text_messages.warnings.len());
println!("Messages size - {}", self.text_messages.messages.len());
println!("Number of removed files - {}", self.information.number_of_removed_files);
println!("Number of failed to remove files - {}", self.information.number_of_failed_to_remove_files);
println!("### Other");
println!("Allowed extensions - {:?}", self.allowed_extensions.file_extensions);
println!("Excluded items - {:?}", self.excluded_items.items);
println!("Included directories - {:?}", self.directories.included_directories);
println!("Excluded directories - {:?}", self.directories.excluded_directories);
println!("Recursive search - {}", self.recursive_search.to_string());
println!("Delete Method - {:?}", self.delete_method);
println!("-----------------------------------------");
}
}
impl SaveResults for BrokenFiles {
fn save_results_to_file(&mut self, file_name: &str) -> bool {
let start_time: SystemTime = SystemTime::now();
let file_name: String = match file_name {
"" => "results.txt".to_string(),
k => k.to_string(),
};
let file_handler = match File::create(&file_name) {
Ok(t) => t,
Err(_) => {
self.text_messages.errors.push(format!("Failed to create file {}", file_name));
return false;
}
};
let mut writer = BufWriter::new(file_handler);
if writeln!(
writer,
"Results of searching {:?} with excluded directories {:?} and excluded items {:?}",
self.directories.included_directories, self.directories.excluded_directories, self.excluded_items.items
)
.is_err()
{
self.text_messages.errors.push(format!("Failed to save results to file {}", file_name));
return false;
}
if !self.broken_files.is_empty() {
writeln!(writer, "Found {} broken files.", self.information.number_of_broken_files).unwrap();
for file_entry in self.broken_files.iter() {
writeln!(writer, "{} - {}", file_entry.path.display(), file_entry.error_string).unwrap();
}
} else {
write!(writer, "Not found any broken files.").unwrap();
}
Common::print_time(start_time, SystemTime::now(), "save_results_to_file".to_string());
true
}
}
impl PrintResults for BrokenFiles {
/// Prints information about broken file entries
/// Only needed for CLI
fn print_results(&self) {
let start_time: SystemTime = SystemTime::now();
println!("Found {} broken files.\n", self.information.number_of_broken_files);
for file_entry in self.broken_files.iter() {
println!("{} - {}", file_entry.path.display(), file_entry.error_string);
}
Common::print_time(start_time, SystemTime::now(), "print_entries".to_string());
}
}
fn save_cache_to_file(hashmap_file_entry: &HashMap<String, FileEntry>, text_messages: &mut Messages) {
if let Some(proj_dirs) = ProjectDirs::from("pl", "Qarmin", "Czkawka") {
// Lin: /home/username/.cache/czkawka
// Win: C:\Users\Username\AppData\Local\Qarmin\Czkawka\cache
// Mac: /Users/Username/Library/Caches/pl.Qarmin.Czkawka
let cache_dir = PathBuf::from(proj_dirs.cache_dir());
if cache_dir.exists() {
if !cache_dir.is_dir() {
text_messages.messages.push(format!("Config dir {} is a file!", cache_dir.display()));
return;
}
} else if fs::create_dir_all(&cache_dir).is_err() {
text_messages.messages.push(format!("Cannot create config dir {}", cache_dir.display()));
return;
}
let cache_file = cache_dir.join(CACHE_FILE_NAME);
let file_handler = match OpenOptions::new().truncate(true).write(true).create(true).open(&cache_file) {
Ok(t) => t,
Err(_) => {
text_messages.messages.push(format!("Cannot create or open cache file {}", cache_file.display()));
return;
}
};
let mut writer = BufWriter::new(file_handler);
for file_entry in hashmap_file_entry.values() {
// Only save to cache files which have more than 1KB
if file_entry.size > 1024 {
let string: String = format!("{}//{}//{}//{}", file_entry.path.display(), file_entry.size, file_entry.modified_date, file_entry.error_string);
if writeln!(writer, "{}", string).is_err() {
text_messages.messages.push(format!("Failed to save some data to cache file {}", cache_file.display()));
return;
};
}
}
}
}
fn load_cache_from_file(text_messages: &mut Messages) -> Option<HashMap<String, FileEntry>> {
if let Some(proj_dirs) = ProjectDirs::from("pl", "Qarmin", "Czkawka") {
let cache_dir = PathBuf::from(proj_dirs.cache_dir());
let cache_file = cache_dir.join(CACHE_FILE_NAME);
let file_handler = match OpenOptions::new().read(true).open(&cache_file) {
Ok(t) => t,
Err(_) => {
// text_messages.messages.push(format!("Cannot find or open cache file {}", cache_file.display())); // This shouldn't be write to output
return None;
}
};
let reader = BufReader::new(file_handler);
let mut hashmap_loaded_entries: HashMap<String, FileEntry> = Default::default();
// Read the file line by line using the lines() iterator from std::io::BufRead.
for (index, line) in reader.lines().enumerate() {
let line = match line {
Ok(t) => t,
Err(_) => {
text_messages.warnings.push(format!("Failed to load line number {} from cache file {}", index + 1, cache_file.display()));
return None;
}
};
let uuu = line.split("//").collect::<Vec<&str>>();
if uuu.len() != 4 {
text_messages.warnings.push(format!("Found invalid data in line {} - ({}) in cache file {}", index + 1, line, cache_file.display()));
continue;
}
// Don't load cache data if destination file not exists
if Path::new(uuu[0]).exists() {
hashmap_loaded_entries.insert(
uuu[0].to_string(),
FileEntry {
path: PathBuf::from(uuu[0]),
size: match uuu[1].parse::<u64>() {
Ok(t) => t,
Err(_) => {
text_messages.warnings.push(format!("Found invalid size value in line {} - ({}) in cache file {}", index + 1, line, cache_file.display()));
continue;
}
},
modified_date: match uuu[2].parse::<u64>() {
Ok(t) => t,
Err(_) => {
text_messages.warnings.push(format!("Found invalid modified date value in line {} - ({}) in cache file {}", index + 1, line, cache_file.display()));
continue;
}
},
type_of_file: check_extension_avaibility(&uuu[0].to_lowercase()),
error_string: uuu[3].to_string(),
},
);
}
}
return Some(hashmap_loaded_entries);
}
text_messages.messages.push("Cannot find or open system config dir to save cache file".to_string());
None
}
fn check_extension_avaibility(file_name_lowercase: &str) -> TypeOfFile {
// Checking allowed image extensions
let allowed_image_extensions = [".jpg", ".jpeg", ".png", ".bmp", ".ico", ".tiff", ".pnm", ".tga", ".ff", ".gif"];
let allowed_archive_zip_extensions = [".zip"]; // Formats like ".xz" and ".bz2" should probably also work, but they didn't in my tests
let allowed_audio_extensions = [".mp3", ".flac", ".wav", ".ogg"];
if allowed_image_extensions.iter().any(|e| file_name_lowercase.ends_with(e)) {
TypeOfFile::Image
} else if allowed_archive_zip_extensions.iter().any(|e| file_name_lowercase.ends_with(e)) {
TypeOfFile::ArchiveZIP
} else if allowed_audio_extensions.iter().any(|e| file_name_lowercase.ends_with(e)) {
TypeOfFile::Audio
} else {
TypeOfFile::Unknown
}
}
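As an aside (not part of the commit): the cache written by `save_cache_to_file` above stores one entry per line in the form `path//size//modified_date//error_string`. Below is a minimal sketch of parsing such a line back, mirroring the field-count and number checks that `load_cache_from_file` performs; `parse_cache_line` is a hypothetical helper name used only for illustration.

```rust
// Standalone sketch: parse one line of the broken-files cache, whose format is
// `path//size//modified_date//error_string` as written by save_cache_to_file above.
// `parse_cache_line` is an illustrative name, not a function from the commit.
fn parse_cache_line(line: &str) -> Option<(String, u64, u64, String)> {
    let parts: Vec<&str> = line.split("//").collect();
    if parts.len() != 4 {
        return None; // malformed line, skipped just like load_cache_from_file does
    }
    let size = parts[1].parse::<u64>().ok()?;
    let modified_date = parts[2].parse::<u64>().ok()?;
    Some((parts[0].to_string(), size, modified_date, parts[3].to_string()))
}

fn main() {
    let line = "/home/user/song.mp3//2048//1610712839//failed to decode audio";
    assert_eq!(
        parse_cache_line(line),
        Some(("/home/user/song.mp3".to_string(), 2048, 1610712839, "failed to decode audio".to_string()))
    );
}
```

Like the real format, this sketch breaks if the path or error string itself contains `//`.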

View file

@ -18,7 +18,7 @@ impl Extensions {
if allowed_extensions.is_empty() {
return;
}
allowed_extensions = allowed_extensions.replace("IMAGE", "jpg,kra,gif,png,bmp,tiff,webp,hdr,svg");
allowed_extensions = allowed_extensions.replace("IMAGE", "jpg,kra,gif,png,bmp,tiff,hdr,svg");
allowed_extensions = allowed_extensions.replace("VIDEO", "mp4,flv,mkv,webm,vob,ogv,gifv,avi,mov,wmv,mpg,m4v,m4p,mpeg,3gp");
allowed_extensions = allowed_extensions.replace("MUSIC", "mp3,flac,ogg,tta,wma,webm");
allowed_extensions = allowed_extensions.replace("TEXT", "txt,doc,docx,odt,rtf");

View file

@ -1,9 +1,9 @@
use crossbeam_channel::Receiver;
use humansize::{file_size_opts as options, FileSize};
use std::collections::{BTreeMap, HashMap};
use std::fs::{File, Metadata};
use std::fs::{File, Metadata, OpenOptions};
use std::io::prelude::*;
use std::path::PathBuf;
use std::path::{Path, PathBuf};
use std::time::{Duration, SystemTime, UNIX_EPOCH};
use std::{fs, thread};
@ -13,14 +13,17 @@ use crate::common_extensions::Extensions;
use crate::common_items::ExcludedItems;
use crate::common_messages::Messages;
use crate::common_traits::*;
use directories_next::ProjectDirs;
use rayon::prelude::*;
use std::io::BufWriter;
use std::io::{BufReader, BufWriter};
use std::sync::atomic::{AtomicBool, AtomicUsize, Ordering};
use std::sync::Arc;
use std::thread::sleep;
const HASH_MB_LIMIT_BYTES: u64 = 1024 * 1024; // 1MB
const CACHE_FILE_NAME: &str = "cache_duplicates.txt";
#[derive(Debug)]
pub struct ProgressData {
pub checking_method: CheckingMethod,
@ -39,7 +42,7 @@ pub enum CheckingMethod {
HashMB,
}
#[derive(PartialEq, Eq, Clone, Debug)]
#[derive(PartialEq, Eq, Clone, Debug, Copy)]
pub enum HashType {
Blake3,
}
@ -58,6 +61,7 @@ pub struct FileEntry {
pub path: PathBuf,
pub size: u64,
pub modified_date: u64,
pub hash: String,
}
/// Info struct with helpful information about the results
@ -349,6 +353,7 @@ impl DuplicateFinder {
continue 'dir;
} // Permissions Denied
},
hash: "".to_string(),
};
// Adding files to BTreeMap
@ -520,6 +525,7 @@ impl DuplicateFinder {
continue 'dir;
} // Permissions Denied
},
hash: "".to_string(),
};
// Adding files to BTreeMap
@ -631,8 +637,8 @@ impl DuplicateFinder {
hasher.update(&buffer[..n]);
let hash_string: String = hasher.finalize().to_hex().to_string();
hashmap_with_hash.entry(hash_string.to_string()).or_insert_with(Vec::new);
hashmap_with_hash.get_mut(hash_string.as_str()).unwrap().push(file_entry.to_owned());
hashmap_with_hash.entry(hash_string.clone()).or_insert_with(Vec::new);
hashmap_with_hash.get_mut(hash_string.as_str()).unwrap().push(file_entry.clone());
}
Some((*size, hashmap_with_hash, errors, bytes_read))
})
@ -700,60 +706,191 @@ impl DuplicateFinder {
//// PROGRESS THREAD END
#[allow(clippy::type_complexity)]
let full_hash_results: Vec<(u64, HashMap<String, Vec<FileEntry>>, Vec<String>, u64)> = pre_checked_map
.par_iter()
.map(|(size, vec_file_entry)| {
let mut hashmap_with_hash: HashMap<String, Vec<FileEntry>> = Default::default();
let mut errors: Vec<String> = Vec::new();
let mut file_handler: File;
let mut bytes_read: u64 = 0;
atomic_file_counter.fetch_add(vec_file_entry.len(), Ordering::Relaxed);
'fe: for file_entry in vec_file_entry {
if stop_receiver.is_some() && stop_receiver.unwrap().try_recv().is_ok() {
check_was_breaked.store(true, Ordering::Relaxed);
return None;
}
file_handler = match File::open(&file_entry.path) {
Ok(t) => t,
Err(_) => {
errors.push(format!("Unable to check hash of file {}", file_entry.path.display()));
continue 'fe;
}
};
let mut full_hash_results: Vec<(u64, HashMap<String, Vec<FileEntry>>, Vec<String>, u64)>;
let mut hasher: blake3::Hasher = blake3::Hasher::new();
let mut buffer = [0u8; 1024 * 32];
let mut current_file_read_bytes: u64 = 0;
loop {
let n = match file_handler.read(&mut buffer) {
Ok(t) => t,
Err(_) => {
errors.push(format!("Error happened when checking hash of file {}", file_entry.path.display()));
continue 'fe;
match self.check_method {
CheckingMethod::HashMB => {
full_hash_results = pre_checked_map
.par_iter()
.map(|(size, vec_file_entry)| {
let mut hashmap_with_hash: HashMap<String, Vec<FileEntry>> = Default::default();
let mut errors: Vec<String> = Vec::new();
let mut file_handler: File;
let mut bytes_read: u64 = 0;
atomic_file_counter.fetch_add(vec_file_entry.len(), Ordering::Relaxed);
'fe: for file_entry in vec_file_entry {
if stop_receiver.is_some() && stop_receiver.unwrap().try_recv().is_ok() {
check_was_breaked.store(true, Ordering::Relaxed);
return None;
}
};
if n == 0 {
break;
file_handler = match File::open(&file_entry.path) {
Ok(t) => t,
Err(_) => {
errors.push(format!("Unable to check hash of file {}", file_entry.path.display()));
continue 'fe;
}
};
let mut hasher: blake3::Hasher = blake3::Hasher::new();
let mut buffer = [0u8; 1024 * 128];
let mut current_file_read_bytes: u64 = 0;
loop {
let n = match file_handler.read(&mut buffer) {
Ok(t) => t,
Err(_) => {
errors.push(format!("Error happened when checking hash of file {}", file_entry.path.display()));
continue 'fe;
}
};
if n == 0 {
break;
}
current_file_read_bytes += n as u64;
bytes_read += n as u64;
hasher.update(&buffer[..n]);
if current_file_read_bytes >= HASH_MB_LIMIT_BYTES {
break;
}
}
let hash_string: String = hasher.finalize().to_hex().to_string();
hashmap_with_hash.entry(hash_string.to_string()).or_insert_with(Vec::new);
hashmap_with_hash.get_mut(hash_string.as_str()).unwrap().push(file_entry.to_owned());
}
Some((*size, hashmap_with_hash, errors, bytes_read))
})
.while_some()
.collect();
}
CheckingMethod::Hash => {
let loaded_hash_map = match load_hashes_from_file(&mut self.text_messages, &self.hash_type) {
Some(t) => t,
None => Default::default(),
};
current_file_read_bytes += n as u64;
bytes_read += n as u64;
hasher.update(&buffer[..n]);
let mut records_already_cached: HashMap<u64, Vec<FileEntry>> = Default::default();
let mut non_cached_files_to_check: HashMap<u64, Vec<FileEntry>> = Default::default();
for (size, vec_file_entry) in pre_checked_map {
#[allow(clippy::collapsible_if)]
if !loaded_hash_map.contains_key(&size) {
// If loaded data doesn't contains current info
non_cached_files_to_check.insert(size, vec_file_entry);
} else {
let loaded_vec_file_entry = loaded_hash_map.get(&size).unwrap();
if self.check_method == CheckingMethod::HashMB && current_file_read_bytes >= HASH_MB_LIMIT_BYTES {
break;
for file_entry in vec_file_entry {
let mut found: bool = false;
for loaded_file_entry in loaded_vec_file_entry {
if file_entry.path == loaded_file_entry.path && file_entry.modified_date == loaded_file_entry.modified_date {
records_already_cached.entry(file_entry.size).or_insert_with(Vec::new);
records_already_cached.get_mut(&file_entry.size).unwrap().push(loaded_file_entry.clone());
found = true;
break;
}
}
if !found {
non_cached_files_to_check.entry(file_entry.size).or_insert_with(Vec::new);
non_cached_files_to_check.get_mut(&file_entry.size).unwrap().push(file_entry);
}
}
}
let hash_string: String = hasher.finalize().to_hex().to_string();
hashmap_with_hash.entry(hash_string.to_string()).or_insert_with(Vec::new);
hashmap_with_hash.get_mut(hash_string.as_str()).unwrap().push(file_entry.to_owned());
}
Some((*size, hashmap_with_hash, errors, bytes_read))
})
.while_some()
.collect();
full_hash_results = non_cached_files_to_check
.par_iter()
.map(|(size, vec_file_entry)| {
let mut hashmap_with_hash: HashMap<String, Vec<FileEntry>> = Default::default();
let mut errors: Vec<String> = Vec::new();
let mut file_handler: File;
let mut bytes_read: u64 = 0;
atomic_file_counter.fetch_add(vec_file_entry.len(), Ordering::Relaxed);
'fe: for file_entry in vec_file_entry {
if stop_receiver.is_some() && stop_receiver.unwrap().try_recv().is_ok() {
check_was_breaked.store(true, Ordering::Relaxed);
return None;
}
file_handler = match File::open(&file_entry.path) {
Ok(t) => t,
Err(_) => {
errors.push(format!("Unable to check hash of file {}", file_entry.path.display()));
continue 'fe;
}
};
let mut hasher: blake3::Hasher = blake3::Hasher::new();
let mut buffer = [0u8; 1024 * 128];
loop {
let n = match file_handler.read(&mut buffer) {
Ok(t) => t,
Err(_) => {
errors.push(format!("Error happened when checking hash of file {}", file_entry.path.display()));
continue 'fe;
}
};
if n == 0 {
break;
}
bytes_read += n as u64;
hasher.update(&buffer[..n]);
}
let hash_string: String = hasher.finalize().to_hex().to_string();
let mut file_entry = file_entry.clone();
file_entry.hash = hash_string.clone();
hashmap_with_hash.entry(hash_string.clone()).or_insert_with(Vec::new);
hashmap_with_hash.get_mut(hash_string.as_str()).unwrap().push(file_entry);
}
Some((*size, hashmap_with_hash, errors, bytes_read))
})
.while_some()
.collect();
// Size, Vec
'main: for (size, vec_file_entry) in records_already_cached {
// Check if size already exists, if exists we must to change it outside because cannot have mut and non mut reference to full_hash_results
for (full_size, full_hashmap, _errors, _bytes_read) in &mut full_hash_results {
if size == *full_size {
for file_entry in vec_file_entry {
full_hashmap.entry(file_entry.hash.clone()).or_insert_with(Vec::new);
full_hashmap.get_mut(&file_entry.hash).unwrap().push(file_entry);
}
continue 'main;
}
}
// Size doesn't exists add results to files
let mut temp_hashmap: HashMap<String, Vec<FileEntry>> = Default::default();
for file_entry in vec_file_entry {
temp_hashmap.entry(file_entry.hash.clone()).or_insert_with(Vec::new);
temp_hashmap.get_mut(&file_entry.hash).unwrap().push(file_entry);
}
full_hash_results.push((size, temp_hashmap, Vec::new(), 0));
}
// Must save all results to file, old loaded from file with all currently counted results
let mut all_results: HashMap<String, FileEntry> = Default::default();
for (_size, vec_file_entry) in loaded_hash_map {
for file_entry in vec_file_entry {
all_results.insert(file_entry.path.to_string_lossy().to_string(), file_entry);
}
}
for (_size, hashmap, _errors, _bytes_read) in &full_hash_results {
for vec_file_entry in hashmap.values() {
for file_entry in vec_file_entry {
all_results.insert(file_entry.path.to_string_lossy().to_string(), file_entry.clone());
}
}
}
save_hashes_to_file(&all_results, &mut self.text_messages, &self.hash_type);
}
_ => panic!("What"),
}
// End thread which send info to gui
progress_thread_run.store(false, Ordering::Relaxed);
@ -786,6 +923,10 @@ impl DuplicateFinder {
}
Common::print_time(start_time, SystemTime::now(), "check_files_hash - full hash".to_string());
// Clean unused data
self.files_with_identical_size = Default::default();
true
}
@ -1169,3 +1310,104 @@ fn delete_files(vector: &[FileEntry], delete_method: &DeleteMethod, warnings: &m
};
(gained_space, removed_files, failed_to_remove_files)
}
fn save_hashes_to_file(hashmap: &HashMap<String, FileEntry>, text_messages: &mut Messages, type_of_hash: &HashType) {
println!("Trying to save {} files", hashmap.len());
if let Some(proj_dirs) = ProjectDirs::from("pl", "Qarmin", "Czkawka") {
let cache_dir = PathBuf::from(proj_dirs.cache_dir());
if cache_dir.exists() {
if !cache_dir.is_dir() {
text_messages.messages.push(format!("Config dir {} is a file!", cache_dir.display()));
return;
}
} else if fs::create_dir_all(&cache_dir).is_err() {
text_messages.messages.push(format!("Cannot create config dir {}", cache_dir.display()));
return;
}
let cache_file = cache_dir.join(CACHE_FILE_NAME.replace(".", format!("_{:?}.", type_of_hash).as_str()));
let file_handler = match OpenOptions::new().truncate(true).write(true).create(true).open(&cache_file) {
Ok(t) => t,
Err(_) => {
text_messages.messages.push(format!("Cannot create or open cache file {}", cache_file.display()));
return;
}
};
let mut writer = BufWriter::new(file_handler);
for file_entry in hashmap.values() {
// Only cache bigger than 5MB files
if file_entry.size > 5 * 1024 * 1024 {
let string: String = format!("{}//{}//{}//{}", file_entry.path.display(), file_entry.size, file_entry.modified_date, file_entry.hash);
if writeln!(writer, "{}", string).is_err() {
text_messages.messages.push(format!("Failed to save some data to cache file {}", cache_file.display()));
return;
};
}
}
}
}
fn load_hashes_from_file(text_messages: &mut Messages, type_of_hash: &HashType) -> Option<BTreeMap<u64, Vec<FileEntry>>> {
if let Some(proj_dirs) = ProjectDirs::from("pl", "Qarmin", "Czkawka") {
let cache_dir = PathBuf::from(proj_dirs.cache_dir());
let cache_file = cache_dir.join(CACHE_FILE_NAME.replace(".", format!("_{:?}.", type_of_hash).as_str()));
let file_handler = match OpenOptions::new().read(true).open(&cache_file) {
Ok(t) => t,
Err(_) => {
// text_messages.messages.push(format!("Cannot find or open cache file {}", cache_file.display())); // This shouldn't be write to output
return None;
}
};
let reader = BufReader::new(file_handler);
let mut hashmap_loaded_entries: BTreeMap<u64, Vec<FileEntry>> = Default::default();
// Read the file line by line using the lines() iterator from std::io::BufRead.
for (index, line) in reader.lines().enumerate() {
let line = match line {
Ok(t) => t,
Err(_) => {
text_messages.warnings.push(format!("Failed to load line number {} from cache file {}", index + 1, cache_file.display()));
return None;
}
};
let uuu = line.split("//").collect::<Vec<&str>>();
if uuu.len() != 4 {
text_messages
.warnings
.push(format!("Found invalid data(too much or too low amount of data) in line {} - ({}) in cache file {}", index + 1, line, cache_file.display()));
continue;
}
// Don't load cache data if destination file not exists
if Path::new(uuu[0]).exists() {
let file_entry = FileEntry {
path: PathBuf::from(uuu[0]),
size: match uuu[1].parse::<u64>() {
Ok(t) => t,
Err(_) => {
text_messages.warnings.push(format!("Found invalid size value in line {} - ({}) in cache file {}", index + 1, line, cache_file.display()));
continue;
}
},
modified_date: match uuu[2].parse::<u64>() {
Ok(t) => t,
Err(_) => {
text_messages.warnings.push(format!("Found invalid modified date value in line {} - ({}) in cache file {}", index + 1, line, cache_file.display()));
continue;
}
},
hash: uuu[3].to_string(),
};
hashmap_loaded_entries.entry(file_entry.size).or_insert_with(Vec::new);
hashmap_loaded_entries.get_mut(&file_entry.size).unwrap().push(file_entry);
}
}
return Some(hashmap_loaded_entries);
}
text_messages.messages.push("Cannot find or open system config dir to save cache file".to_string());
None
}
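A simplified sketch (not from the commit) of the cache-hit rule used in `check_files_hash` above: a cached blake3 hash is reused only when the path and modification date still match the cached entry, otherwise the file is hashed again. For brevity the cache here is keyed by path, whereas the real code groups entries by file size first.

```rust
use std::collections::HashMap;

// Simplified stand-in for the cached FileEntry fields that matter for the lookup.
struct CachedEntry {
    modified_date: u64,
    hash: String,
}

// Return the cached hash only if the modification date still matches.
fn cached_hash<'a>(cache: &'a HashMap<String, CachedEntry>, path: &str, modified_date: u64) -> Option<&'a str> {
    cache
        .get(path)
        .filter(|entry| entry.modified_date == modified_date)
        .map(|entry| entry.hash.as_str())
}

fn main() {
    let mut cache = HashMap::new();
    cache.insert(
        "/home/user/movie.mkv".to_string(),
        CachedEntry { modified_date: 1_610_712_839, hash: "abc123".to_string() },
    );
    // Unchanged file: reuse the cached hash instead of reading the whole file again.
    assert_eq!(cached_hash(&cache, "/home/user/movie.mkv", 1_610_712_839), Some("abc123"));
    // Modified file: no cache hit, so it must be hashed again.
    assert_eq!(cached_hash(&cache, "/home/user/movie.mkv", 1_611_000_000), None);
}
```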

View file

@ -2,10 +2,15 @@
extern crate bitflags;
pub mod big_file;
pub mod broken_files;
pub mod duplicate;
pub mod empty_files;
pub mod empty_folder;
pub mod invalid_symlinks;
pub mod same_music;
pub mod similar_images;
pub mod temporary;
pub mod zeroed;
pub mod common;
pub mod common_directory;
@ -13,9 +18,5 @@ pub mod common_extensions;
pub mod common_items;
pub mod common_messages;
pub mod common_traits;
pub mod invalid_symlinks;
pub mod same_music;
pub mod similar_images;
pub mod zeroed;
pub const CZKAWKA_VERSION: &str = env!("CARGO_PKG_VERSION");

View file

@ -402,6 +402,10 @@ impl SameMusic {
self.music_entries = vec_file_entry;
Common::print_time(start_time, SystemTime::now(), "check_records_multithreaded".to_string());
// Clean for duplicate files
self.music_to_check.clear();
true
}
fn check_for_duplicates(&mut self, stop_receiver: Option<&Receiver<()>>, progress_sender: Option<&futures::channel::mpsc::Sender<ProgressData>>) -> bool {
@ -589,6 +593,10 @@ impl SameMusic {
progress_thread_handle.join().unwrap();
Common::print_time(start_time, SystemTime::now(), "check_for_duplicates".to_string());
// Clear unused data
self.music_entries.clear();
true
}

View file

@ -265,8 +265,8 @@ impl SimilarImages {
.to_lowercase();
// Checking allowed image extensions
let allowed_image_extensions = ["jpg", "png", "bmp", "ico", "webp", "tiff", "dds"];
if !allowed_image_extensions.iter().any(|e| file_name_lowercase.ends_with(e)) {
let allowed_image_extensions = ["jpg", "jpeg", "png", "bmp", "ico", "tiff", "pnm", "tga", "ff", "gif"];
if !allowed_image_extensions.iter().any(|e| file_name_lowercase.ends_with(format!(".{}", e).as_str())) {
continue 'dir;
}
@ -326,20 +326,20 @@ impl SimilarImages {
None => Default::default(),
};
let mut hashes_already_counted: HashMap<String, FileEntry> = Default::default();
let mut hashes_to_check: HashMap<String, FileEntry> = Default::default();
let mut records_already_cached: HashMap<String, FileEntry> = Default::default();
let mut non_cached_files_to_check: HashMap<String, FileEntry> = Default::default();
for (name, file_entry) in &self.images_to_check {
#[allow(clippy::collapsible_if)]
if !loaded_hash_map.contains_key(name) {
// If loaded data doesn't contains current image info
hashes_to_check.insert(name.clone(), file_entry.clone());
non_cached_files_to_check.insert(name.clone(), file_entry.clone());
} else {
if file_entry.size != loaded_hash_map.get(name).unwrap().size || file_entry.modified_date != loaded_hash_map.get(name).unwrap().modified_date {
// When size or modification date of image changed, then it is clear that is different image
hashes_to_check.insert(name.clone(), file_entry.clone());
non_cached_files_to_check.insert(name.clone(), file_entry.clone());
} else {
// Checking may be omitted when already there is entry with same size and modification date
hashes_already_counted.insert(name.clone(), loaded_hash_map.get(name).unwrap().clone());
records_already_cached.insert(name.clone(), loaded_hash_map.get(name).unwrap().clone());
}
}
}
@ -358,7 +358,7 @@ impl SimilarImages {
let mut progress_send = progress_sender.clone();
let progress_thread_run = progress_thread_run.clone();
let atomic_file_counter = atomic_file_counter.clone();
let images_to_check = hashes_to_check.len();
let images_to_check = non_cached_files_to_check.len();
progress_thread_handle = thread::spawn(move || loop {
progress_send
.try_send(ProgressData {
@ -377,7 +377,7 @@ impl SimilarImages {
progress_thread_handle = thread::spawn(|| {});
}
//// PROGRESS THREAD END
let mut vec_file_entry: Vec<(FileEntry, Node)> = hashes_to_check
let mut vec_file_entry: Vec<(FileEntry, Node)> = non_cached_files_to_check
.par_iter()
.map(|file_entry| {
atomic_file_counter.fetch_add(1, Ordering::Relaxed);
@ -416,7 +416,7 @@ impl SimilarImages {
let hash_map_modification = SystemTime::now();
// Just connect loaded results with already calculated hashes
for (_name, file_entry) in hashes_already_counted {
for (_name, file_entry) in records_already_cached {
vec_file_entry.push((file_entry.clone(), file_entry.hash));
}
@ -457,15 +457,15 @@ impl SimilarImages {
// Maybe also add here progress report
let mut new_vector: Vec<Vec<FileEntry>> = Vec::new();
let mut hashes_to_check = self.image_hashes.clone();
let mut non_cached_files_to_check = self.image_hashes.clone();
for (hash, vec_file_entry) in &self.image_hashes {
if stop_receiver.is_some() && stop_receiver.unwrap().try_recv().is_ok() {
return false;
}
if !hashes_to_check.contains_key(hash) {
if !non_cached_files_to_check.contains_key(hash) {
continue;
}
hashes_to_check.remove(hash);
non_cached_files_to_check.remove(hash);
let vector_with_found_similar_hashes = self.bktree.find(hash, similarity).collect::<Vec<_>>();
if vector_with_found_similar_hashes.len() == 1 && vec_file_entry.len() == 1 {
@ -493,7 +493,7 @@ impl SimilarImages {
panic!("I'm not sure if same hash can have distance > 0");
}
if let Some(vec_file_entry) = hashes_to_check.get(*similar_hash) {
if let Some(vec_file_entry) = non_cached_files_to_check.get(*similar_hash) {
vector_of_similar_images.append(
&mut (vec_file_entry
.iter()
@ -515,7 +515,7 @@ impl SimilarImages {
})
.collect::<Vec<_>>()),
);
hashes_to_check.remove(*similar_hash);
non_cached_files_to_check.remove(*similar_hash);
}
}
if vector_of_similar_images.len() > 1 {
@ -527,6 +527,12 @@ impl SimilarImages {
self.similar_vectors = new_vector;
Common::print_time(hash_map_modification, SystemTime::now(), "sort_images - selecting data from BtreeMap".to_string());
// Clean unused data
self.image_hashes = Default::default();
self.images_to_check = Default::default();
self.bktree = BKTree::new(Hamming);
true
}
@ -659,11 +665,11 @@ fn save_hashes_to_file(hashmap: &HashMap<String, FileEntry>, text_messages: &mut
text_messages.messages.push(format!("Cannot create config dir {}", cache_dir.display()));
return;
}
let config_file = cache_dir.join(CACHE_FILE_NAME);
let file_handler = match OpenOptions::new().truncate(true).write(true).create(true).open(&config_file) {
let cache_file = cache_dir.join(CACHE_FILE_NAME);
let file_handler = match OpenOptions::new().truncate(true).write(true).create(true).open(&cache_file) {
Ok(t) => t,
Err(_) => {
text_messages.messages.push(format!("Cannot create or open cache file {}", config_file.display()));
text_messages.messages.push(format!("Cannot create or open cache file {}", cache_file.display()));
return;
}
};
@ -680,7 +686,7 @@ fn save_hashes_to_file(hashmap: &HashMap<String, FileEntry>, text_messages: &mut
string += file_entry.hash[file_entry.hash.len() - 1].to_string().as_str();
if writeln!(writer, "{}", string).is_err() {
text_messages.messages.push(format!("Failed to save some data to cache file {}", config_file.display()));
text_messages.messages.push(format!("Failed to save some data to cache file {}", cache_file.display()));
return;
};
}
@ -688,26 +694,13 @@ fn save_hashes_to_file(hashmap: &HashMap<String, FileEntry>, text_messages: &mut
}
fn load_hashes_from_file(text_messages: &mut Messages) -> Option<HashMap<String, FileEntry>> {
if let Some(proj_dirs) = ProjectDirs::from("pl", "Qarmin", "Czkawka") {
let mut cache_dir = PathBuf::from(proj_dirs.cache_dir());
let mut config_file = cache_dir.join(CACHE_FILE_NAME);
let file_handler = match OpenOptions::new().read(true).open(&config_file) {
let cache_dir = PathBuf::from(proj_dirs.cache_dir());
let cache_file = cache_dir.join(CACHE_FILE_NAME);
let file_handler = match OpenOptions::new().read(true).open(&cache_file) {
Ok(t) => t,
Err(_) => {
text_messages.messages.push(format!("Cannot find or open cache file {}", config_file.display()));
// return None; // Enable when removing compatibility section
// Compatibility for upgrading project from 2.1 to 2.2
{
cache_dir = PathBuf::from(proj_dirs.config_dir());
config_file = cache_dir.join(CACHE_FILE_NAME);
match OpenOptions::new().read(true).open(&config_file) {
Ok(t) => t,
Err(_) => {
text_messages.messages.push(format!("Cannot find or open cache file {}", config_file.display()));
return None;
}
}
}
// End of compatibility section to remove after release 2.2 version
// text_messages.messages.push(format!("Cannot find or open cache file {}", cache_file.display())); // This shouldn't be write to output
return None;
}
};
@ -720,20 +713,26 @@ fn load_hashes_from_file(text_messages: &mut Messages) -> Option<HashMap<String,
let line = match line {
Ok(t) => t,
Err(_) => {
text_messages.warnings.push(format!("Failed to load line number {} from cache file {}", index + 1, config_file.display()));
text_messages.warnings.push(format!("Failed to load line number {} from cache file {}", index + 1, cache_file.display()));
return None;
}
};
let uuu = line.split("//").collect::<Vec<&str>>();
if uuu.len() != 12 {
text_messages.warnings.push(format!("Found invalid data in line {} - ({}) in cache file {}", index + 1, line, config_file.display()));
return None;
text_messages.warnings.push(format!("Found invalid data in line {} - ({}) in cache file {}", index + 1, line, cache_file.display()));
continue;
}
// Don't load cache data if destination file not exists
if Path::new(uuu[0]).exists() {
let mut hash: Node = [0u8; 8];
for i in 0..hash.len() {
hash[i] = uuu[4 + i].parse::<u8>().unwrap();
hash[i] = match uuu[4 + i].parse::<u8>() {
Ok(t) => t,
Err(_) => {
text_messages.warnings.push(format!("Found invalid hash value in line {} - ({}) in cache file {}", index + 1, line, cache_file.display()));
continue;
}
};
}
#[cfg(debug_assertions)]
@ -756,9 +755,21 @@ fn load_hashes_from_file(text_messages: &mut Messages) -> Option<HashMap<String,
uuu[0].to_string(),
FileEntry {
path: PathBuf::from(uuu[0]),
size: uuu[1].parse::<u64>().unwrap(),
size: match uuu[1].parse::<u64>() {
Ok(t) => t,
Err(_) => {
text_messages.warnings.push(format!("Found invalid size value in line {} - ({}) in cache file {}", index + 1, line, cache_file.display()));
continue;
}
},
dimensions: uuu[2].to_string(),
modified_date: uuu[3].parse::<u64>().unwrap(),
modified_date: match uuu[3].parse::<u64>() {
Ok(t) => t,
Err(_) => {
text_messages.warnings.push(format!("Found invalid modified date value in line {} - ({}) in cache file {}", index + 1, line, cache_file.display()));
continue;
}
},
hash,
similarity: Similarity::None,
},
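// A hedged, self-contained sketch (not the exact code above) of the 12-field cache line
// format that load_hashes_from_file() parses; the field order is inferred from the indices
// used above: path // size // dimensions // modified_date // eight hash bytes, "//"-separated.
fn parse_cache_line(line: &str) -> Option<(String, u64, String, u64, [u8; 8])> {
    let fields: Vec<&str> = line.split("//").collect();
    if fields.len() != 12 {
        return None; // malformed line - the loader above skips such lines
    }
    let mut hash = [0u8; 8];
    for (i, byte) in hash.iter_mut().enumerate() {
        *byte = fields[4 + i].parse().ok()?; // hash bytes are stored as decimal text
    }
    Some((
        fields[0].to_string(),   // path
        fields[1].parse().ok()?, // size in bytes
        fields[2].to_string(),   // dimensions
        fields[3].parse().ok()?, // modification date in seconds
        hash,
    ))
}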

View file

@ -385,6 +385,10 @@ impl ZeroedFiles {
self.information.number_of_zeroed_files = self.zeroed_files.len();
Common::print_time(start_time, SystemTime::now(), "search for zeroed_files".to_string());
//Clean unused data
self.files_to_check.clear();
true
}

View file

@ -1,6 +1,6 @@
[package]
name = "czkawka_gui"
version = "2.2.0"
version = "2.3.0"
authors = ["Rafał Mikrut <mikrutrafal@protonmail.com>"]
edition = "2018"
description = "GTK frontend of Czkawka"

View file

@ -2031,6 +2031,30 @@ Author: Rafał Mikrut
<property name="tab_fill">False</property>
</packing>
</child>
<child>
<object class="GtkScrolledWindow" id="scrolled_window_broken_files">
<property name="visible">True</property>
<property name="can_focus">True</property>
<property name="shadow_type">in</property>
<child>
<placeholder/>
</child>
</object>
<packing>
<property name="position">9</property>
</packing>
</child>
<child type="tab">
<object class="GtkLabel">
<property name="visible">True</property>
<property name="can_focus">False</property>
<property name="label" translatable="yes">Broken Files</property>
</object>
<packing>
<property name="position">9</property>
<property name="tab_fill">False</property>
</packing>
</child>
</object>
<packing>
<property name="resize">True</property>
@ -2387,7 +2411,7 @@ Author: Rafał Mikrut
<property name="can_focus">True</property>
<property name="editable">False</property>
<property name="has_frame">False</property>
<property name="text" translatable="yes">Czkawka 2.2.0</property>
<property name="text" translatable="yes">Czkawka 2.3.0</property>
<property name="xalign">1</property>
<property name="shadow_type">none</property>
<property name="caps_lock_warning">False</property>

View file

@ -23,6 +23,7 @@ pub fn connect_button_delete(gui_data: &GuiData) {
let tree_view_zeroed_files_finder = gui_data.main_notebook.tree_view_zeroed_files_finder.clone();
let tree_view_same_music_finder = gui_data.main_notebook.tree_view_same_music_finder.clone();
let tree_view_invalid_symlinks = gui_data.main_notebook.tree_view_invalid_symlinks.clone();
let tree_view_broken_files = gui_data.main_notebook.tree_view_broken_files.clone();
let check_button_settings_confirm_deletion = gui_data.upper_notebook.check_button_settings_confirm_deletion.clone();
let image_preview_similar_images = gui_data.main_notebook.image_preview_similar_images.clone();
@ -88,6 +89,9 @@ pub fn connect_button_delete(gui_data: &GuiData) {
NotebookMainEnum::Symlinks => {
basic_remove(&tree_view_invalid_symlinks.clone(), ColumnsInvalidSymlinks::Name as i32, ColumnsInvalidSymlinks::Path as i32, &gui_data);
}
NotebookMainEnum::BrokenFiles => {
basic_remove(&tree_view_broken_files.clone(), ColumnsBrokenFiles::Name as i32, ColumnsBrokenFiles::Path as i32, &gui_data);
}
}
});
}

View file

@ -16,6 +16,7 @@ pub fn connect_button_save(gui_data: &GuiData) {
let shared_same_music_state = gui_data.shared_same_music_state.clone();
let shared_zeroed_files_state = gui_data.shared_zeroed_files_state.clone();
let shared_same_invalid_symlinks = gui_data.shared_same_invalid_symlinks.clone();
let shared_broken_files_state = gui_data.shared_broken_files_state.clone();
let notebook_main = gui_data.main_notebook.notebook_main.clone();
buttons_save.connect_clicked(move |_| {
let file_name;
@ -66,6 +67,11 @@ pub fn connect_button_save(gui_data: &GuiData) {
shared_same_invalid_symlinks.borrow_mut().save_results_to_file(file_name);
}
NotebookMainEnum::BrokenFiles => {
file_name = "results_broken_files.txt";
shared_broken_files_state.borrow_mut().save_results_to_file(file_name);
}
}
post_save_things(file_name, &to_notebook_main_enum(notebook_main.get_current_page().unwrap()), &gui_data);
});

View file

@ -5,6 +5,7 @@ use crate::gui_data::GuiData;
use crate::help_functions::*;
use crate::notebook_enums::*;
use czkawka_core::big_file::BigFile;
use czkawka_core::broken_files::BrokenFiles;
use czkawka_core::duplicate::DuplicateFinder;
use czkawka_core::empty_files::EmptyFiles;
use czkawka_core::empty_folder::EmptyFolder;
@ -15,7 +16,7 @@ use czkawka_core::temporary::Temporary;
use czkawka_core::zeroed::ZeroedFiles;
use glib::Sender;
use gtk::prelude::*;
use gtk::WindowPosition;
use gtk::{ResponseType, WindowPosition};
use std::sync::atomic::{AtomicBool, Ordering};
use std::sync::Arc;
use std::thread;
@ -33,7 +34,9 @@ pub fn connect_button_search(
futures_sender_temporary: futures::channel::mpsc::Sender<temporary::ProgressData>,
futures_sender_zeroed: futures::channel::mpsc::Sender<zeroed::ProgressData>,
futures_sender_invalid_symlinks: futures::channel::mpsc::Sender<invalid_symlinks::ProgressData>,
futures_sender_broken_files: futures::channel::mpsc::Sender<broken_files::ProgressData>,
) {
let stop_sender = gui_data.stop_sender.clone();
let entry_info = gui_data.entry_info.clone();
let notebook_main = gui_data.main_notebook.notebook_main.clone();
let tree_view_included_directories = gui_data.upper_notebook.tree_view_included_directories.clone();
@ -74,6 +77,7 @@ pub fn connect_button_search(
let tree_view_similar_images_finder = gui_data.main_notebook.tree_view_similar_images_finder.clone();
let tree_view_zeroed_files_finder = gui_data.main_notebook.tree_view_zeroed_files_finder.clone();
let tree_view_invalid_symlinks = gui_data.main_notebook.tree_view_invalid_symlinks.clone();
let tree_view_broken_files = gui_data.main_notebook.tree_view_broken_files.clone();
let text_view_errors = gui_data.text_view_errors.clone();
let dialog_progress = gui_data.progress_dialog.dialog_progress.clone();
let label_stage = gui_data.progress_dialog.label_stage.clone();
@ -107,6 +111,9 @@ pub fn connect_button_search(
reset_text_view(&text_view_errors);
let glib_stop_sender = glib_stop_sender.clone();
let stop_receiver = stop_receiver.clone();
match to_notebook_main_enum(notebook_main.get_current_page().unwrap()) {
NotebookMainEnum::Duplicate => {
label_stage.show();
@ -129,9 +136,6 @@ pub fn connect_button_search(
}
let minimal_file_size = entry_duplicate_minimal_size.get_text().as_str().parse::<u64>().unwrap_or(1024);
let glib_stop_sender = glib_stop_sender.clone();
let stop_receiver = stop_receiver.clone();
let futures_sender_duplicate_files = futures_sender_duplicate_files.clone();
// Find duplicates
thread::spawn(move || {
@ -154,9 +158,6 @@ pub fn connect_button_search(
get_list_store(&tree_view_empty_files_finder).clear();
let glib_stop_sender = glib_stop_sender.clone();
let stop_receiver = stop_receiver.clone();
let futures_sender_empty_files = futures_sender_empty_files.clone();
// Find empty files
thread::spawn(move || {
@ -178,9 +179,6 @@ pub fn connect_button_search(
get_list_store(&tree_view_empty_folder_finder).clear();
let glib_stop_sender = glib_stop_sender.clone();
let stop_receiver = stop_receiver.clone();
let futures_sender_empty_folder = futures_sender_empty_folder.clone();
// Find empty folders
thread::spawn(move || {
@ -201,8 +199,6 @@ pub fn connect_button_search(
let numbers_of_files_to_check = entry_big_files_number.get_text().as_str().parse::<usize>().unwrap_or(50);
let glib_stop_sender = glib_stop_sender.clone();
let stop_receiver = stop_receiver.clone();
let futures_sender_big_file = futures_sender_big_file.clone();
// Find big files
thread::spawn(move || {
@ -224,9 +220,6 @@ pub fn connect_button_search(
get_list_store(&tree_view_temporary_files_finder).clear();
let glib_stop_sender = glib_stop_sender.clone();
let stop_receiver = stop_receiver.clone();
let futures_sender_temporary = futures_sender_temporary.clone();
// Find temporary files
thread::spawn(move || {
@ -249,9 +242,6 @@ pub fn connect_button_search(
get_list_store(&tree_view_similar_images_finder).clear();
let glib_stop_sender = glib_stop_sender.clone();
let stop_receiver = stop_receiver.clone();
let minimal_file_size = entry_similar_images_minimal_size.get_text().as_str().parse::<u64>().unwrap_or(1024 * 16);
let similarity;
@ -293,9 +283,6 @@ pub fn connect_button_search(
get_list_store(&tree_view_zeroed_files_finder).clear();
let glib_stop_sender = glib_stop_sender.clone();
let stop_receiver = stop_receiver.clone();
let futures_sender_zeroed = futures_sender_zeroed.clone();
// Find zeroed files
thread::spawn(move || {
@ -338,9 +325,6 @@ pub fn connect_button_search(
}
if music_similarity != MusicSimilarity::NONE {
let glib_stop_sender = glib_stop_sender.clone();
let stop_receiver = stop_receiver.clone();
let futures_sender_same_music = futures_sender_same_music.clone();
// Find Similar music
thread::spawn(move || {
@ -369,8 +353,6 @@ pub fn connect_button_search(
get_list_store(&tree_view_invalid_symlinks).clear();
let glib_stop_sender = glib_stop_sender.clone();
let stop_receiver = stop_receiver.clone();
let futures_sender_invalid_symlinks = futures_sender_invalid_symlinks.clone();
thread::spawn(move || {
@ -384,11 +366,38 @@ pub fn connect_button_search(
let _ = glib_stop_sender.send(Message::InvalidSymlinks(isf));
});
}
NotebookMainEnum::BrokenFiles => {
label_stage.show();
grid_progress_stages.show();
dialog_progress.resize(1, 1);
get_list_store(&tree_view_broken_files).clear();
let futures_sender_broken_files = futures_sender_broken_files.clone();
thread::spawn(move || {
let mut br = BrokenFiles::new();
br.set_included_directory(included_directories);
br.set_excluded_directory(excluded_directories);
br.set_recursive_search(recursive_search);
br.set_excluded_items(excluded_items);
br.find_broken_files(Some(&stop_receiver), Some(&futures_sender_broken_files));
let _ = glib_stop_sender.send(Message::BrokenFiles(br));
});
}
}
// Show progress dialog
if show_dialog.load(Ordering::Relaxed) {
dialog_progress.show();
let response = dialog_progress.run();
if response == ResponseType::DeleteEvent {
stop_sender.send(()).unwrap();
}
dialog_progress.hide();
}
});
}

View file

@ -28,6 +28,7 @@ pub fn connect_button_select(gui_data: &GuiData) {
hashmap.insert(NotebookMainEnum::Symlinks, vec!["all", "reverse", "custom"]);
hashmap.insert(NotebookMainEnum::Zeroed, vec!["all", "reverse", "custom"]);
hashmap.insert(NotebookMainEnum::Temporary, vec!["all", "reverse", "custom"]);
hashmap.insert(NotebookMainEnum::BrokenFiles, vec!["all", "reverse", "custom"]);
}
}

View file

@ -25,7 +25,9 @@ pub fn connect_compute_results(gui_data: &GuiData, glib_stop_receiver: Receiver<
let tree_view_zeroed_files_finder = gui_data.main_notebook.tree_view_zeroed_files_finder.clone();
let shared_empty_folders_state = gui_data.shared_empty_folders_state.clone();
let shared_empty_files_state = gui_data.shared_empty_files_state.clone();
let shared_broken_files_state = gui_data.shared_broken_files_state.clone();
let tree_view_big_files_finder = gui_data.main_notebook.tree_view_big_files_finder.clone();
let tree_view_broken_files = gui_data.main_notebook.tree_view_broken_files.clone();
let tree_view_invalid_symlinks = gui_data.main_notebook.tree_view_invalid_symlinks.clone();
let shared_big_files_state = gui_data.shared_big_files_state.clone();
let shared_same_invalid_symlinks = gui_data.shared_same_invalid_symlinks.clone();
@ -95,6 +97,15 @@ pub fn connect_compute_results(gui_data: &GuiData, glib_stop_receiver: Receiver<
let btreemap = df.get_files_sorted_by_names();
for (name, vector) in btreemap.iter().rev() {
// Sort
let vector = if vector.len() > 2 {
let mut vector = vector.clone();
vector.sort_by_key(|e| e.path.clone());
vector
} else {
vector.clone()
};
let values: [&dyn ToValue; 6] = [
&name,
&(format!("{} results", vector.len())),
@ -123,6 +134,15 @@ pub fn connect_compute_results(gui_data: &GuiData, glib_stop_receiver: Receiver<
for (size, vectors_vector) in btreemap.iter().rev() {
for vector in vectors_vector {
// Sort
let vector = if vector.len() > 2 {
let mut vector = vector.clone();
vector.sort_by_key(|e| e.path.clone());
vector
} else {
vector.clone()
};
let values: [&dyn ToValue; 6] = [
&(format!("{} x {} ({} bytes)", vector.len(), size.file_size(options::BINARY).unwrap(), size)),
&(format!("{} ({} bytes) lost", ((vector.len() - 1) as u64 * *size as u64).file_size(options::BINARY).unwrap(), (vector.len() - 1) as u64 * *size as u64)),
@ -151,6 +171,15 @@ pub fn connect_compute_results(gui_data: &GuiData, glib_stop_receiver: Receiver<
let btreemap = df.get_files_sorted_by_size();
for (size, vector) in btreemap.iter().rev() {
// Sort
let vector = if vector.len() > 2 {
let mut vector = vector.clone();
vector.sort_by_key(|e| e.path.clone());
vector
} else {
vector.clone()
};
let values: [&dyn ToValue; 6] = [
&(format!("{} x {} ({} bytes)", vector.len(), size.file_size(options::BINARY).unwrap(), size)),
&(format!("{} ({} bytes) lost", ((vector.len() - 1) as u64 * *size as u64).file_size(options::BINARY).unwrap(), (vector.len() - 1) as u64 * *size as u64)),
@ -264,6 +293,10 @@ pub fn connect_compute_results(gui_data: &GuiData, glib_stop_receiver: Receiver<
let vector = vf.get_empty_files();
// Sort
let mut vector = vector.clone();
vector.sort_by_key(|e| e.path.clone());
for file_entry in vector {
let (directory, file) = split_path(&file_entry.path);
let values: [&dyn ToValue; 3] = [&file, &directory, &(NaiveDateTime::from_timestamp(file_entry.modified_date as i64, 0).to_string())];
@ -359,6 +392,10 @@ pub fn connect_compute_results(gui_data: &GuiData, glib_stop_receiver: Receiver<
let vector = tf.get_temporary_files();
// Sort
let mut vector = vector.clone();
vector.sort_by_key(|e| e.path.clone());
for file_entry in vector {
let (directory, file) = split_path(&file_entry.path);
let values: [&dyn ToValue; 3] = [&file, &directory, &(NaiveDateTime::from_timestamp(file_entry.modified_date as i64, 0).to_string())];
@ -404,6 +441,15 @@ pub fn connect_compute_results(gui_data: &GuiData, glib_stop_receiver: Receiver<
let vec_struct_similar = sf.get_similar_images();
for vec_file_entry in vec_struct_similar.iter() {
// Sort
let vec_file_entry = if vec_file_entry.len() > 2 {
let mut vec_file_entry = vec_file_entry.clone();
vec_file_entry.sort_by_key(|e| e.path.clone());
vec_file_entry
} else {
vec_file_entry.clone()
};
// Header
let values: [&dyn ToValue; 10] = [
&"".to_string(),
@ -479,6 +525,10 @@ pub fn connect_compute_results(gui_data: &GuiData, glib_stop_receiver: Receiver<
let vector = zf.get_zeroed_files();
// Sort
let mut vector = vector.clone();
vector.sort_by_key(|e| e.path.clone());
for file_entry in vector {
let (directory, file) = split_path(&file_entry.path);
let values: [&dyn ToValue; 5] = [
@ -512,7 +562,7 @@ pub fn connect_compute_results(gui_data: &GuiData, glib_stop_receiver: Receiver<
}
Message::SameMusic(mf) => {
if mf.get_stopped_search() {
entry_info.set_text("Searching for empty files was stopped by user");
entry_info.set_text("Searching for same music was stopped by user");
} else {
let information = mf.get_information();
let text_messages = mf.get_text_messages();
@ -540,6 +590,15 @@ pub fn connect_compute_results(gui_data: &GuiData, glib_stop_receiver: Receiver<
let text: String = "-----".to_string();
for vec_file_entry in vector {
// Sort
let vec_file_entry = if vec_file_entry.len() > 2 {
let mut vec_file_entry = vec_file_entry.clone();
vec_file_entry.sort_by_key(|e| e.path.clone());
vec_file_entry
} else {
vec_file_entry.clone()
};
let values: [&dyn ToValue; 13] = [
&"".to_string(),
&(0),
@ -632,6 +691,10 @@ pub fn connect_compute_results(gui_data: &GuiData, glib_stop_receiver: Receiver<
let vector = ifs.get_invalid_symlinks();
// Sort
let mut vector = vector.clone();
vector.sort_by_key(|e| e.symlink_path.clone());
for file_entry in vector {
let (directory, file) = split_path(&file_entry.symlink_path);
let values: [&dyn ToValue; 5] = [
@ -663,6 +726,54 @@ pub fn connect_compute_results(gui_data: &GuiData, glib_stop_receiver: Receiver<
}
}
}
Message::BrokenFiles(br) => {
if br.get_stopped_search() {
entry_info.set_text("Searching for broken files was stopped by user");
} else {
let information = br.get_information();
let text_messages = br.get_text_messages();
let broken_files_number: usize = information.number_of_broken_files;
entry_info.set_text(format!("Found {} broken files.", broken_files_number).as_str());
// Create GUI
{
let list_store = get_list_store(&tree_view_broken_files);
let col_indices = [0, 1, 2, 3];
let vector = br.get_broken_files();
// Sort
let mut vector = vector.clone();
vector.sort_by_key(|e| e.path.clone());
for file_entry in vector {
let (directory, file) = split_path(&file_entry.path);
let values: [&dyn ToValue; 4] = [&file, &directory, &file_entry.error_string, &(NaiveDateTime::from_timestamp(file_entry.modified_date as i64, 0).to_string())];
list_store.set(&list_store.append(), &col_indices, &values);
}
print_text_messages_to_text_view(text_messages, &text_view_errors);
}
// Set state
{
*shared_broken_files_state.borrow_mut() = br;
if broken_files_number > 0 {
*shared_buttons.borrow_mut().get_mut(&NotebookMainEnum::BrokenFiles).unwrap().get_mut("save").unwrap() = true;
*shared_buttons.borrow_mut().get_mut(&NotebookMainEnum::BrokenFiles).unwrap().get_mut("delete").unwrap() = true;
*shared_buttons.borrow_mut().get_mut(&NotebookMainEnum::BrokenFiles).unwrap().get_mut("select").unwrap() = true;
} else {
*shared_buttons.borrow_mut().get_mut(&NotebookMainEnum::BrokenFiles).unwrap().get_mut("save").unwrap() = false;
*shared_buttons.borrow_mut().get_mut(&NotebookMainEnum::BrokenFiles).unwrap().get_mut("delete").unwrap() = false;
*shared_buttons.borrow_mut().get_mut(&NotebookMainEnum::BrokenFiles).unwrap().get_mut("select").unwrap() = false;
}
set_buttons(&mut *shared_buttons.borrow_mut().get_mut(&NotebookMainEnum::BrokenFiles).unwrap(), &buttons_array, &buttons_names);
}
}
}
}
// Returning false here would close the receiver and have senders fail
glib::Continue(true)

View file

@ -761,6 +761,30 @@ pub fn connect_popovers(gui_data: &GuiData) {
column_size_as_bytes: Some(ColumnsZeroedFiles::SizeAsBytes as i32),
column_modification_as_secs: None,
},
PopoverObject {
notebook_type: NotebookMainEnum::BrokenFiles,
available_modes: vec!["all", "reverse", "custom"].iter().map(|e| e.to_string()).collect(),
tree_view: gui_data.main_notebook.tree_view_broken_files.clone(),
column_path: Some(ColumnsBrokenFiles::Path as i32),
column_name: Some(ColumnsBrokenFiles::Name as i32),
column_color: None,
column_dimensions: None,
column_size: None,
column_size_as_bytes: None,
column_modification_as_secs: None,
},
PopoverObject {
notebook_type: NotebookMainEnum::Symlinks,
available_modes: vec!["all", "reverse", "custom"].iter().map(|e| e.to_string()).collect(),
tree_view: gui_data.main_notebook.tree_view_invalid_symlinks.clone(),
column_path: Some(ColumnsInvalidSymlinks::Path as i32),
column_name: Some(ColumnsInvalidSymlinks::Name as i32),
column_color: None,
column_dimensions: None,
column_size: None,
column_size_as_bytes: None,
column_modification_as_secs: None,
},
];
let popover_select = gui_data.popovers.popover_select.clone();

View file

@ -1,6 +1,6 @@
use crate::gui_data::GuiData;
use czkawka_core::{big_file, duplicate, empty_files, empty_folder, invalid_symlinks, same_music, similar_images, temporary, zeroed};
use czkawka_core::{big_file, broken_files, duplicate, empty_files, empty_folder, invalid_symlinks, same_music, similar_images, temporary, zeroed};
use futures::StreamExt;
use gtk::{LabelExt, ProgressBarExt, WidgetExt};
@ -17,6 +17,7 @@ pub fn connect_progress_window(
mut futures_receiver_temporary: futures::channel::mpsc::Receiver<temporary::ProgressData>,
mut futures_receiver_zeroed: futures::channel::mpsc::Receiver<zeroed::ProgressData>,
mut futures_receiver_invalid_symlinks: futures::channel::mpsc::Receiver<invalid_symlinks::ProgressData>,
mut futures_receiver_broken_files: futures::channel::mpsc::Receiver<broken_files::ProgressData>,
) {
let main_context = glib::MainContext::default();
@ -241,4 +242,35 @@ pub fn connect_progress_window(
};
main_context.spawn_local(future);
}
{
// Broken Files
let label_stage = gui_data.progress_dialog.label_stage.clone();
let progress_bar_current_stage = gui_data.progress_dialog.progress_bar_current_stage.clone();
let progress_bar_all_stages = gui_data.progress_dialog.progress_bar_all_stages.clone();
let future = async move {
while let Some(item) = futures_receiver_broken_files.next().await {
match item.current_stage {
0 => {
progress_bar_current_stage.hide();
label_stage.set_text(format!("Scanned {} files", item.files_checked).as_str());
}
1 => {
progress_bar_current_stage.show();
if item.files_to_check != 0 {
progress_bar_all_stages.set_fraction((1f64 + (item.files_checked) as f64 / item.files_to_check as f64) / (item.max_stage + 1) as f64);
progress_bar_current_stage.set_fraction((item.files_checked) as f64 / item.files_to_check as f64);
} else {
progress_bar_all_stages.set_fraction((1f64) / (item.max_stage + 1) as f64);
progress_bar_current_stage.set_fraction(0f64);
}
label_stage.set_text(format!("Checking {}/{} files", item.files_checked, item.files_to_check).as_str());
}
_ => {
panic!();
}
}
}
};
main_context.spawn_local(future);
}
}

View file

@ -443,3 +443,43 @@ pub fn create_tree_view_invalid_symlinks(tree_view: &mut gtk::TreeView) {
tree_view.set_vexpand(true);
}
pub fn create_tree_view_broken_files(tree_view: &mut gtk::TreeView) {
let renderer = gtk::CellRendererText::new();
let column: gtk::TreeViewColumn = TreeViewColumn::new();
column.pack_start(&renderer, true);
column.set_title("Name");
column.set_resizable(true);
column.set_min_width(50);
column.add_attribute(&renderer, "text", ColumnsBrokenFiles::Name as i32);
tree_view.append_column(&column);
let renderer = gtk::CellRendererText::new();
let column: gtk::TreeViewColumn = TreeViewColumn::new();
column.pack_start(&renderer, true);
column.set_title("Path");
column.set_resizable(true);
column.set_min_width(50);
column.add_attribute(&renderer, "text", ColumnsBrokenFiles::Path as i32);
tree_view.append_column(&column);
let renderer = gtk::CellRendererText::new();
let column: gtk::TreeViewColumn = TreeViewColumn::new();
column.pack_start(&renderer, true);
column.set_title("ErrorType");
column.set_resizable(true);
column.set_min_width(50);
column.add_attribute(&renderer, "text", ColumnsBrokenFiles::ErrorType as i32);
tree_view.append_column(&column);
let renderer = gtk::CellRendererText::new();
let column: gtk::TreeViewColumn = TreeViewColumn::new();
column.pack_start(&renderer, true);
column.set_title("Modification Date");
column.set_resizable(true);
column.set_min_width(50);
column.add_attribute(&renderer, "text", ColumnsBrokenFiles::Modification as i32);
tree_view.append_column(&column);
tree_view.set_vexpand(true);
}

View file

@ -83,6 +83,15 @@ pub fn opening_double_click_function_invalid_symlinks(tree_view: &gtk::TreeView,
gtk::Inhibit(false)
}
pub fn opening_double_click_function_broken_files(tree_view: &gtk::TreeView, event: &gdk::EventButton) -> gtk::Inhibit {
if event.get_event_type() == gdk::EventType::DoubleButtonPress && event.get_button() == 1 {
common_open_function(tree_view, ColumnsBrokenFiles::Name as i32, ColumnsBrokenFiles::Path as i32, OpenMode::PathAndName);
} else if event.get_event_type() == gdk::EventType::DoubleButtonPress && event.get_button() == 3 {
common_open_function(tree_view, ColumnsBrokenFiles::Name as i32, ColumnsBrokenFiles::Path as i32, OpenMode::OnlyPath);
}
gtk::Inhibit(false)
}
pub enum OpenMode {
OnlyPath,
PathAndName,

View file

@ -7,6 +7,7 @@ use crate::gui_upper_notepad::GUIUpperNotebook;
use crate::notebook_enums::*;
use crossbeam_channel::unbounded;
use czkawka_core::big_file::BigFile;
use czkawka_core::broken_files::BrokenFiles;
use czkawka_core::duplicate::DuplicateFinder;
use czkawka_core::empty_files::EmptyFiles;
use czkawka_core::empty_folder::EmptyFolder;
@ -52,6 +53,7 @@ pub struct GuiData {
pub shared_zeroed_files_state: Rc<RefCell<ZeroedFiles>>,
pub shared_same_music_state: Rc<RefCell<SameMusic>>,
pub shared_same_invalid_symlinks: Rc<RefCell<InvalidSymlinks>>,
pub shared_broken_files_state: Rc<RefCell<BrokenFiles>>,
//// Entry
pub entry_info: gtk::Entry,
@ -124,6 +126,7 @@ impl GuiData {
let shared_zeroed_files_state: Rc<RefCell<_>> = Rc::new(RefCell::new(ZeroedFiles::new()));
let shared_same_music_state: Rc<RefCell<_>> = Rc::new(RefCell::new(SameMusic::new()));
let shared_same_invalid_symlinks: Rc<RefCell<_>> = Rc::new(RefCell::new(InvalidSymlinks::new()));
let shared_broken_files_state: Rc<RefCell<_>> = Rc::new(RefCell::new(BrokenFiles::new()));
//// Entry
let entry_info: gtk::Entry = builder.get_object("entry_info").unwrap();
@ -155,6 +158,7 @@ impl GuiData {
shared_zeroed_files_state,
shared_same_music_state,
shared_same_invalid_symlinks,
shared_broken_files_state,
entry_info,
text_view_errors,
scrolled_window_errors,

View file

@ -14,6 +14,7 @@ pub struct GUIMainNotebook {
pub scrolled_window_zeroed_files_finder: gtk::ScrolledWindow,
pub scrolled_window_same_music_finder: gtk::ScrolledWindow,
pub scrolled_window_invalid_symlinks: gtk::ScrolledWindow,
pub scrolled_window_broken_files: gtk::ScrolledWindow,
pub tree_view_duplicate_finder: gtk::TreeView,
pub tree_view_empty_folder_finder: gtk::TreeView,
@ -24,6 +25,7 @@ pub struct GUIMainNotebook {
pub tree_view_zeroed_files_finder: gtk::TreeView,
pub tree_view_same_music_finder: gtk::TreeView,
pub tree_view_invalid_symlinks: gtk::TreeView,
pub tree_view_broken_files: gtk::TreeView,
pub entry_similar_images_minimal_size: gtk::Entry,
pub entry_duplicate_minimal_size: gtk::Entry,
@ -67,6 +69,7 @@ impl GUIMainNotebook {
let scrolled_window_zeroed_files_finder: gtk::ScrolledWindow = builder.get_object("scrolled_window_zeroed_files_finder").unwrap();
let scrolled_window_same_music_finder: gtk::ScrolledWindow = builder.get_object("scrolled_window_same_music_finder").unwrap();
let scrolled_window_invalid_symlinks: gtk::ScrolledWindow = builder.get_object("scrolled_window_invalid_symlinks").unwrap();
let scrolled_window_broken_files: gtk::ScrolledWindow = builder.get_object("scrolled_window_broken_files").unwrap();
let tree_view_duplicate_finder: gtk::TreeView = TreeView::new();
let tree_view_empty_folder_finder: gtk::TreeView = TreeView::new();
@ -77,6 +80,7 @@ impl GUIMainNotebook {
let tree_view_zeroed_files_finder: gtk::TreeView = TreeView::new();
let tree_view_same_music_finder: gtk::TreeView = TreeView::new();
let tree_view_invalid_symlinks: gtk::TreeView = TreeView::new();
let tree_view_broken_files: gtk::TreeView = TreeView::new();
let entry_similar_images_minimal_size: gtk::Entry = builder.get_object("entry_similar_images_minimal_size").unwrap();
let entry_duplicate_minimal_size: gtk::Entry = builder.get_object("entry_duplicate_minimal_size").unwrap();
@ -116,6 +120,7 @@ impl GUIMainNotebook {
scrolled_window_zeroed_files_finder,
scrolled_window_same_music_finder,
scrolled_window_invalid_symlinks,
scrolled_window_broken_files,
tree_view_duplicate_finder,
tree_view_empty_folder_finder,
tree_view_empty_files_finder,
@ -125,6 +130,7 @@ impl GUIMainNotebook {
tree_view_zeroed_files_finder,
tree_view_same_music_finder,
tree_view_invalid_symlinks,
tree_view_broken_files,
entry_similar_images_minimal_size,
entry_duplicate_minimal_size,
entry_big_files_number,

View file

@ -1,4 +1,5 @@
use czkawka_core::big_file::BigFile;
use czkawka_core::broken_files::BrokenFiles;
use czkawka_core::common_messages::Messages;
use czkawka_core::duplicate::DuplicateFinder;
use czkawka_core::empty_files::EmptyFiles;
@ -24,6 +25,7 @@ pub enum Message {
ZeroedFiles(ZeroedFiles),
SameMusic(SameMusic),
InvalidSymlinks(InvalidSymlinks),
BrokenFiles(BrokenFiles),
}
pub enum ColumnsDuplicates {
@ -104,6 +106,13 @@ pub enum ColumnsInvalidSymlinks {
Modification,
}
pub enum ColumnsBrokenFiles {
Name = 0,
Path,
ErrorType,
Modification,
}
pub const TEXT_COLOR: &str = "#ffffff";
pub const MAIN_ROW_COLOR: &str = "#343434";
pub const HEADER_ROW_COLOR: &str = "#272727";

View file

@ -29,6 +29,7 @@ pub fn initialize_gui(gui_data: &mut GuiData) {
let scrolled_window_same_music_finder = gui_data.main_notebook.scrolled_window_same_music_finder.clone();
let scrolled_window_invalid_symlinks = gui_data.main_notebook.scrolled_window_invalid_symlinks.clone();
let scrolled_window_zeroed_files_finder = gui_data.main_notebook.scrolled_window_zeroed_files_finder.clone();
let scrolled_window_broken_files = gui_data.main_notebook.scrolled_window_broken_files.clone();
let scrolled_window_included_directories = gui_data.upper_notebook.scrolled_window_included_directories.clone();
let scrolled_window_excluded_directories = gui_data.upper_notebook.scrolled_window_excluded_directories.clone();
@ -420,6 +421,34 @@ pub fn initialize_gui(gui_data: &mut GuiData) {
gtk::Inhibit(false)
});
}
// Broken Files
{
let col_types: [glib::types::Type; 4] = [glib::types::Type::String, glib::types::Type::String, glib::types::Type::String, glib::types::Type::String];
let list_store: gtk::ListStore = gtk::ListStore::new(&col_types);
let mut tree_view: gtk::TreeView = TreeView::with_model(&list_store);
tree_view.get_selection().set_mode(SelectionMode::Multiple);
create_tree_view_broken_files(&mut tree_view);
tree_view.connect_button_press_event(opening_double_click_function_broken_files);
gui_data.main_notebook.tree_view_broken_files = tree_view.clone();
scrolled_window_broken_files.add(&tree_view);
scrolled_window_broken_files.show_all();
let gui_data = gui_data.clone();
tree_view.connect_key_release_event(move |tree_view, e| {
if let Some(button_number) = e.get_keycode() {
// Handle delete button
if button_number == 119 {
basic_remove(&tree_view, ColumnsBrokenFiles::Name as i32, ColumnsBrokenFiles::Path as i32, &gui_data);
}
}
gtk::Inhibit(false)
});
}
}
// Set Included Directory

View file

@ -83,6 +83,7 @@ fn main() {
let (futures_sender_temporary, futures_receiver_temporary): (futures::channel::mpsc::Sender<temporary::ProgressData>, futures::channel::mpsc::Receiver<temporary::ProgressData>) = futures::channel::mpsc::channel(20);
let (futures_sender_zeroed, futures_receiver_zeroed): (futures::channel::mpsc::Sender<zeroed::ProgressData>, futures::channel::mpsc::Receiver<zeroed::ProgressData>) = futures::channel::mpsc::channel(20);
let (futures_sender_invalid_symlinks, futures_receiver_invalid_symlinks): (futures::channel::mpsc::Sender<invalid_symlinks::ProgressData>, futures::channel::mpsc::Receiver<invalid_symlinks::ProgressData>) = futures::channel::mpsc::channel(20);
let (futures_sender_broken_files, futures_receiver_broken_files): (futures::channel::mpsc::Sender<broken_files::ProgressData>, futures::channel::mpsc::Receiver<broken_files::ProgressData>) = futures::channel::mpsc::channel(20);
initialize_gui(&mut gui_data);
reset_configuration(&gui_data, false); // Fallback for invalid loading setting project
@ -102,6 +103,7 @@ fn main() {
futures_sender_temporary,
futures_sender_zeroed,
futures_sender_invalid_symlinks,
futures_sender_broken_files,
);
connect_button_select(&gui_data);
connect_button_stop(&gui_data);
@ -121,6 +123,7 @@ fn main() {
futures_receiver_temporary,
futures_receiver_zeroed,
futures_receiver_invalid_symlinks,
futures_receiver_broken_files,
);
connect_hide_text_view_errors(&gui_data);
connect_settings(&gui_data);

View file

@ -1,4 +1,4 @@
pub const NUMBER_OF_NOTEBOOK_MAIN_TABS: usize = 9;
pub const NUMBER_OF_NOTEBOOK_MAIN_TABS: usize = 10;
pub const NUMBER_OF_NOTEBOOK_UPPER_TABS: usize = 5;
// Needs to be updated when changed order of notebook tabs
@ -13,6 +13,7 @@ pub enum NotebookMainEnum {
SameMusic,
Zeroed,
Symlinks,
BrokenFiles,
}
pub fn to_notebook_main_enum(notebook_number: u32) -> NotebookMainEnum {
match notebook_number {
@ -25,6 +26,7 @@ pub fn to_notebook_main_enum(notebook_number: u32) -> NotebookMainEnum {
6 => NotebookMainEnum::SameMusic,
7 => NotebookMainEnum::Zeroed,
8 => NotebookMainEnum::Symlinks,
9 => NotebookMainEnum::BrokenFiles,
_ => panic!("Invalid Notebook Tab"),
}
}
@ -39,6 +41,7 @@ pub fn get_all_main_tabs() -> [NotebookMainEnum; NUMBER_OF_NOTEBOOK_MAIN_TABS] {
to_notebook_main_enum(6),
to_notebook_main_enum(7),
to_notebook_main_enum(8),
to_notebook_main_enum(9),
]
}

View file

@ -100,7 +100,7 @@ Then, for each selected tag by which we want to search for duplicates, we perfor
### Similar Images
It is a tool for finding similar images that differ e.g. in watermark, size etc.
The tool first collects images with specific extensions that can be checked - `["jpg", "png", "bmp", "ico", "webp", "tiff", "dds"]`.
The tool first collects images with specific extensions that can be checked - `["jpg", "png", "bmp", "ico", "tiff"]`.
Next, cached data is loaded from the file to prevent hashing the same file twice.
Cache entries which point to non-existing files are deleted automatically.
@ -123,25 +123,34 @@ Computed hash data is then thrown into a special tree that allows to compare has
Next, these hashes are saved to the file so images can be opened later without having to hash them again.
Finally, each hash is compared with the others; if the distance between them is less than the maximum distance specified by the user, the images are considered similar and removed from the pool of images to be searched.
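As an illustration only (not the exact implementation), a minimal sketch of that distance check, assuming the 8-byte hashes stored in `cache_similar_image.txt`; a Hamming-style distance is shown here as one common choice for perceptual hashes, the exact metric is not spelled out above:

```rust
// Minimal sketch of the similarity check; `Node` mirrors the 8-byte hash kept in the cache.
type Node = [u8; 8];

fn hamming_distance(a: &Node, b: &Node) -> u32 {
    a.iter().zip(b.iter()).map(|(x, y)| (x ^ y).count_ones()).sum()
}

fn main() {
    let first: Node = [0b1010_1010; 8];
    let mut second = first;
    second[0] ^= 0b0000_0011; // two bits differ

    let max_distance = 5; // stands for the user-selected maximum distance
    if hamming_distance(&first, &second) <= max_distance {
        println!("images are considered similar");
    }
}
```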
### Broken Files
This tool finds files which are corrupted or have an invalid extension.
First, files from a specific group (image, archive, audio) are collected, and then these files are opened.
If an error happens when opening a file, it means that the file is corrupted or unsupported.
Only some file extensions are supported, because I rely on external crates. Also, some false positives may be shown (e.g. https://github.com/image-rs/jpeg-decoder/issues/130), so always open the file to check if it is really broken.
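As a rough sketch of that idea for the image group only (the crate named below is an assumption, not confirmed here; the real tool also covers archives and audio and caches results):

```rust
// Open the file with a decoder; any error marks it as broken or unsupported.
// Assumes the `image` crate is available as a dependency.
use std::path::Path;

fn image_error(path: &Path) -> Option<String> {
    match image::open(path) {
        Ok(_) => None,                 // decoded fine -> not treated as broken
        Err(e) => Some(e.to_string()), // this kind of message would fill an error column
    }
}

fn main() {
    if let Some(error) = image_error(Path::new("photo.jpg")) {
        println!("Broken file: {}", error);
    }
}
```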
## Config/Cache files
For now, Czkawka stores only these files on disk:
- `czkawka_gui_config.txt` - stores configuration of GUI which may be loaded at startup
- `cache_similar_image.txt` - stores cache data and hashes which may be used later without needing to compute image hash again - DO NOT TRY TO EDIT THIS FILE MANUALLY! - editing this file may cause app crashes.
- `cache_similar_image.txt` - stores cache data and hashes which may be used later without needing to compute image hash again - editing this file may cause app crashes.
- `cache_broken_files.txt` - stores cache data of broken files
- `cache_duplicates_Blake3.txt` - stores cache data of duplicated files; to avoid too big a performance hit when saving/loading the file, only already fully hashed files bigger than 5MB are stored. Similar files with `Blake3` replaced by e.g. `SHA256` may appear when support for new hashes is introduced in Czkawka.
First file is located in this path
Config files are located in this path
Linux - `/home/username/.config/czkawka`
Mac - `/Users/username/Library/Application Support/pl.Qarmin.Czkawka`
Windows - `C:\Users\Username\AppData\Roaming\Qarmin\Czkawka\config`
Second with cache here:
Cache should be here:
Linux - `/home/username/.cache/czkawka`
Mac - `/Users/Username/Library/Caches/pl.Qarmin.Czkawka`
Windows - `C:\Users\Username\AppData\Local\Qarmin\Czkawka\cache`
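The cache-handling code above resolves these directories with `ProjectDirs::from("pl", "Qarmin", "Czkawka")`; a minimal sketch, assuming the `directories` crate (the project may use a compatible fork instead):

```rust
use directories::ProjectDirs; // assumption: the `directories` crate or a compatible fork

fn main() {
    if let Some(proj_dirs) = ProjectDirs::from("pl", "Qarmin", "Czkawka") {
        // e.g. /home/username/.config/czkawka and /home/username/.cache/czkawka on Linux
        println!("config dir: {}", proj_dirs.config_dir().display());
        println!("cache  dir: {}", proj_dirs.cache_dir().display());
    }
}
```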
## GUI GTK
<img src="https://user-images.githubusercontent.com/41945903/103002387-14d1b800-452f-11eb-967e-9d5905dd6db5.png" width="800" />
@ -165,12 +174,13 @@ There are several buttons which do different actions:
- Stop - button in the progress dialog that allows easily stopping the current task. Sometimes it may take a few seconds until all atomic operations end and the GUI can be used again
- Select - allows selecting multiple entries at once
- Delete - deletes all selected entries entirely
- Symlink - creates symlinks to selected files (the first file is treated as the original and the rest become symlinks)
- Save - save initial state of results
- Hamburger (parallel lines) - used to show/hide the bottom text panel which shows warnings/errors
- Add (directories) - adds directories to include or exclude
- Remove (directories) - removes directories to search or to exclude
- Manual Add (directories) - allows writing directory paths by hand (may be used to add directories not visible in the file manager)
- Save current configuration - saves current GUI configuration to configuration file
- Save current configuration - saves current GUI configuration to configuration file
- Load configuration - loads configuration from file and overrides the current GUI config
- Reset configuration - resets the current GUI configuration to default
@ -201,4 +211,4 @@ By default all tools only write about results to console, but it is possible wit
- **Manually adding multiple directories**
You can manually edit the config file `czkawka_gui_config.txt` and add the required directories. After that, load the configuration.
- **Slow checking of a small number of similar images**
If you checked before a big amount of images(several tens of thousands) and them still exists on disk, then information's about it are loaded from cache and save to it, even if you have check now only a few images. You can rename cache file `cache_similar_image.txt`(to be able to use it again) or delete it - cache will regenerate but with lower amount of entries it should load and save a lot of faster.
If you previously checked a large number of images (several tens of thousands) and they still exist on disk, then information about them is loaded from the cache and saved back to it, even if you are now checking only a few images. You can rename the cache file `cache_similar_image.txt` (to be able to use it again) or delete it - the cache will regenerate, but with a smaller number of entries it should load and save much faster.

View file

@ -1,5 +1,5 @@
#!/bin/bash
NUMBER="2.2.0"
NUMBER="2.3.0"
CZKAWKA_PATH="/home/rafal"
cd "$CZKAWKA_PATH"

View file

@ -1,5 +1,5 @@
#!/bin/bash
NUMBER="2.2.0"
NUMBER="2.3.0"
CZKAWKA_PATH="/home/rafal"
cd "$CZKAWKA_PATH"

View file

@ -1,17 +0,0 @@
#!/bin/bash
NUMBER="2.2.0"
CZKAWKA_PATH="/home/rafal"
cd "$CZKAWKA_PATH"
CZKAWKA_PATH="$CZKAWKA_PATH/czkawka"
rm -rf $CZKAWKA_PATH
git clone https://github.com/qarmin/czkawka.git "$CZKAWKA_PATH"
cd $CZKAWKA_PATH
git checkout "$NUMBER"
cd "$CZKAWKA_PATH/snap"
snapcraft
snapcraft login
snapcraft upload --release=stable "czkawka_${NUMBER}_amd64.snap"

View file

@ -1,6 +1,6 @@
name: czkawka # you probably want to 'snapcraft register <name>'
base: core18 # the base snap is the execution environment for this snap
version: '2.2.0' # just for humans, typically '1.2+git' or '1.3.2'
version: '2.3.0' # just for humans, typically '1.2+git' or '1.3.2'
summary: Czkawka - fast data cleaner written in Rust # 79 char long summary
description: |
Czkawka is very fast and feature rich cleaner which finds file duplicates, empty folders and files, duplicated music, similar images or the biggest files in selected directories.
@ -15,6 +15,7 @@ parts:
source: https://github.com/qarmin/czkawka.git
build-packages:
- libgtk-3-dev
- libasound2-dev
- curl
- gcc
- git
@ -31,7 +32,7 @@ parts:
apps:
czkawka:
command: bin/czkawka_gui
extensions: [gnome-3-34]
extensions: [gnome-3-28]
plugs:
- home
- removable-media