[-] Agility0971@lemmy.world 6 points 3 days ago

I was kicked in the head like this once. Flew a meter into the wall.

[-] Agility0971@lemmy.world 1 points 3 days ago

What do you mean that a file deduplication will take forever if there are duplicated directories? That the scan will take forever or that manual confirmation will take forever?

[-] Agility0971@lemmy.world 1 points 3 days ago

That sounds doable. I would however not trust myself to code something bug-free on the first go xD

[-] Agility0971@lemmy.world 1 points 4 days ago

This will indeed save space, but I don't want links either. I want unique files.

[-] Agility0971@lemmy.world 1 points 4 days ago

I had multiple systems which at some point were syncing with Syncthing, but over time I stopped using my desktop computer and the Syncthing setup went unmaintained. I had to remove the SSD from the old desktop, so I yoinked the home directory and saved it onto my laptop. As you can probably tell, a lot of stuff got duplicated and a lot of stuff diverged over time. My idea is to merge everything into my laptop's home directory and then go through the diverged files manually, as that would be less work. I don't think doing a backup with all my redundant files would be a good idea, as the initial backup would include other backups and a lot of duplicated files.

[-] Agility0971@lemmy.world -1 points 4 days ago

I did not ask for a backup solution, but for a deduplication tool.

81
Deduplication tool (lemmy.world)
submitted 5 days ago* (last edited 3 days ago) by Agility0971@lemmy.world to c/linux@lemmy.ml

I'm in the process of setting up a proper backup solution; however, over the years I've done a few copy-pastes of home directories from different systems as a quick and dirty solution. Now I have to pay my technical debt and remove the duplicates. I'm looking for a deduplication tool that will:

  • accept a destination directory
  • delete source locations after the operation
  • if a file's content is identical to one in the destination, delete the redundant copy
  • if a file's content is different, move it and rename it to avoid a name collision

I tried doing this in Nautilus, but it only looks at file names, not file content. E.g. if two photos have the same content but different names, it will still keep a redundant copy.

Edit: Some comments suggested duperemove, which uses btrfs' deduplication support. This replaces identical file content with pointers to the same on-disk location. This is not what I intend; I want to remove the redundant files completely.

Edit 2: Another quite cool solution is to use hardlinks. It replaces all occurrences of the same data with a hardlink. Then the redundant directories can be traversed, and whatever is a link can be deleted. The remaining files will be unique. I'm not going for this myself, as I don't trust myself to write a bug-free implementation.
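For anyone curious, the content-hash approach from the post above can be sketched in a few lines of Python. This is only an illustration, not a tested tool — the `dedup_merge` function and its behavior (delete source files whose content already exists in the destination, move the rest with a numeric suffix on name collisions) are my own assumptions about what the post is asking for:

```python
import hashlib
import shutil
from pathlib import Path

def file_digest(path: Path) -> str:
    """Return the SHA-256 hex digest of a file's content."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

def dedup_merge(source: Path, dest: Path) -> None:
    """Merge `source` into `dest`: delete files whose content already
    exists in `dest`, move the rest, renaming on name collisions.
    Illustrative sketch only -- run against a copy of your data first."""
    # Index destination content by hash.
    known = {file_digest(p) for p in dest.rglob("*") if p.is_file()}
    for p in sorted(source.rglob("*")):
        if not p.is_file():
            continue
        digest = file_digest(p)
        if digest in known:
            p.unlink()  # content already present: drop the redundant copy
            continue
        target = dest / p.name
        n = 1
        while target.exists():  # avoid name collisions
            target = dest / f"{p.stem}_{n}{p.suffix}"
            n += 1
        shutil.move(str(p), str(target))
        known.add(digest)
    # Clean up now-empty directories left behind in the source tree.
    for d in sorted((q for q in source.rglob("*") if q.is_dir()), reverse=True):
        if not any(d.iterdir()):
            d.rmdir()
```

Hashing every file means the scan is I/O-bound, so comparing file sizes first would be a cheap optimization, but the simple version above already matches the requirements in the post.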

[-] Agility0971@lemmy.world 15 points 4 weeks ago

The exciting part will be if they launch a passively cooled ARM-based laptop.

27
submitted 1 month ago* (last edited 1 month ago) by Agility0971@lemmy.world to c/linux@lemmy.ml

I've run passwd and sudo su; passwd to change the password for root and my own account. The password works when using sudo and su, but whenever pkexec prompts me, it accepts only the old password. I've rebooted my system to make sure that wasn't the issue.

Edit: Solved. Turns out the password was changed for the root account but not for my user account. I think the reason is that there are no password quality requirements on the root account, but there are on the default account in Ubuntu. Changing the password from the root account with passwd user worked fine.

[-] Agility0971@lemmy.world 17 points 3 months ago

Personally, I've had more issues tweaking Debian to just work as needed than Arch.

[-] Agility0971@lemmy.world 21 points 5 months ago

Meh, screen angle is constant. Not impressed until it supports screens with a constant angular velocity.

[-] Agility0971@lemmy.world 94 points 7 months ago

It says it's scraped and not leaked

82
submitted 8 months ago by Agility0971@lemmy.world to c/privacy@lemmy.ml
8
squad 6.0 (lemmy.world)

Has anyone had any success with Squad 6.0 yet?

[-] Agility0971@lemmy.world 9 points 10 months ago

Well, if they hadn't nerfed it, maybe it wouldn't have gone down so much.

[-] Agility0971@lemmy.world 22 points 11 months ago

I disagree with the color of the text. Too much contrast. May I suggest dark blue?


Agility0971

joined 1 year ago