fenndev

joined 10 months ago
[–] fenndev@leminal.space 15 points 4 months ago

I don't think one currently exists, but it would be an interesting project. There are plenty of trackers for CVEs, but when it comes to project ethics, acquisitions, and so on, there's a gap to fill.

The two main problems I can see are:

  1. How do you define 'negative'? An open source application being acquired is often a bad thing, but not always. An acquisition by FUTO is more likely to be viewed positively than an acquisition by Microsoft, but either can be interpreted positively or negatively depending on the person.

  2. Community involvement is absolutely critical. If I were running a service like this, I would realistically only keep up with the services I use and care about. I would need others to submit info that could then be verified.

[–] fenndev@leminal.space 1 point 4 months ago (1 children)

Sorry, I should clarify. I'm hoping for a setup like this:

  1. Browser makes a request to an eepsite
  2. The router sees the request is to a domain ending in .i2p and forwards the request to a service running on the router
  3. That service then performs the necessary encryption and establishes a connection to the I2P network.

I'd imagine it's a similar process for other protocols and networks. No idea if this is possible or desirable.
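For I2P specifically, steps 2–3 might look something like the sketch below, with i2pd running on the router itself. To be clear about what's real and what's assumed: i2pd's HTTP proxy genuinely defaults to port 4444, and the Unbound `local-zone` syntax is real, but the placeholder IP and the transparent-redirect idea are my own guesses, not a known-working recipe.

```
# /usr/local/etc/i2pd/i2pd.conf (fragment) – i2pd running on the router
[httpproxy]
enabled = true
address = 192.168.1.1   # LAN-facing address (assumed)
port = 4444             # i2pd's default HTTP proxy port

# Unbound on the router: answer *.i2p with a placeholder address so
# clients resolve it at all, then NAT-redirect that traffic to the
# proxy above. (Conceptual – transparent interception of .i2p is not
# an official i2pd feature.)
local-zone: "i2p." redirect
local-data: "i2p. IN A 10.255.0.1"
```

The same pattern (router-side daemon + DNS override + redirect rule) is presumably what the other networks would need too, each with its own daemon.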

 

TL;DR: Is it possible (and if so, desirable) to configure my OPNsense router to handle non-standard traffic itself, instead of needing to configure each client device manually? Examples of what I mean by 'non-standard traffic': Handshake, I2P, ZeroNet, and Tor.

[–] fenndev@leminal.space 4 points 4 months ago (5 children)

Any issues with your network lately? When DNS is down or flaky, Firefox and its forks take forever to start up.

[–] fenndev@leminal.space 7 points 5 months ago* (last edited 5 months ago)

I hope eventually we get an ARM-powered Framework.

Bought a Framework shortly after Linus Tech Tips invested in them. It was stolen out of my partner's car a few months later, and I haven't been able to justify (or afford) a replacement since.

[–] fenndev@leminal.space 3 points 5 months ago (1 children)

Oh. You're right. That worked. I feel really silly that I missed that.

Thank you so much!

[–] fenndev@leminal.space 2 points 5 months ago (3 children)

I have both web and websecure set up as entrypoints.

8
submitted 5 months ago* (last edited 5 months ago) by fenndev@leminal.space to c/selfhosted@lemmy.world
 

Edit: Thanks for the help, the issue is solved! Traefik's loadbalancer was set to route to port 8081 instead of the container's internal port 80. Whoops.
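Concretely, the fix was a one-line change to the service label below. Since Traefik shares the `fenndev_default` Podman network with Vaultwarden, it dials the container's IP directly, so the container's internal port (80) is what matters; the published host port (8081) never comes into play:

```
# vaultwarden.container – corrected label
Label=traefik.http.services.vault.loadbalancer.server.port=80
```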

Intro

Hi everyone. I've been busy configuring my homelab and have run into issues with Traefik and Vaultwarden running in Podman. I've already set up Home Assistant and Homepage successfully, but for the life of me I cannot get Vaultwarden working. I'm hoping a fresh pair of eyes can spot something I missed or offer some advice. I've tried to include all the information and logs relevant to the situation.

Expected Behavior:

  1. Requests for *.fenndev.network are sent to my Traefik server.
  2. Incoming HTTPS requests to vault.fenndev.network are forwarded to Vaultwarden
    • HTTP requests are upgraded to HTTPS
  3. Vaultwarden is accessible via https://vault.fenndev.network and utilizes the wildcard certificates generated by Traefik.
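The HTTP→HTTPS upgrade in step 2 can be expressed in Traefik v3 static configuration roughly like this (a generic sketch of the entrypoint redirect, not the actual file from this setup):

```yaml
# traefik.yml (static configuration) – generic sketch
entryPoints:
  web:
    address: ":80"
    http:
      redirections:
        entryPoint:
          to: websecure
          scheme: https
  websecure:
    address: ":443"
```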

Quick Facts

Overview

  • I'm running Traefik and Vaultwarden in Podman, using Quadlet
  • Traefik and Vaultwarden, along with all of my other services, are part of the same fenndev_default network
  • Traefik is working correctly with Home Assistant, AdGuard Home, and Homepage, but returns a 502 Bad Gateway error for Vaultwarden
  • I've verified that port 8081 is open on my firewall and my service is reachable at {SERVER_IP}:8081.
  • 10.89.0.132 is the internal Podman IP address of the Vaultwarden container

Versions

Server: AlmaLinux 9.4

Podman: 4.9.4-rhel

Traefik: v3

Vaultwarden: alpine-latest (1.30.5-alpine I believe)

Error Logs

Traefik Log:

2024-05-11T22:09:53Z DBG github.com/traefik/traefik/v3/pkg/server/service/proxy.go:100 > 502 Bad Gateway error="dial tcp 10.89.0.132:8081: connect: connection refused"

cURL to URL:

[fenndev@bastion ~]$ curl -v https://vault.fenndev.network
*   Trying 192.168.1.169:443...
* Connected to vault.fenndev.network (192.168.1.169) port 443 (#0)
* ALPN, offering h2
* ALPN, offering http/1.1
*  CAfile: /etc/pki/tls/certs/ca-bundle.crt
* TLSv1.0 (OUT), TLS header, Certificate Status (22):

Config Files

vaultwarden.container file:

[Unit]
Description=Password 
After=network-online.target
[Service]
Restart=always
RestartSec=3

[Install]
# Start by default on boot
WantedBy=multi-user.target default.target

[Container]
Image=ghcr.io/dani-garcia/vaultwarden:latest-alpine
Exec=/start.sh
EnvironmentFile=%h/.config/vault/vault.env
ContainerName=vault
Network=fenndev_default

# Security Options
SecurityLabelType=container_runtime_t
NoNewPrivileges=true                                    
# Volumes
Volume=%h/.config/vault/data:/data:Z

# Ports
PublishPort=8081:80

# Labels
Label=traefik.enable=true
Label=traefik.http.routers.vault.entrypoints=web
Label=traefik.http.routers.vault-websecure.entrypoints=websecure
Label=traefik.http.routers.vault.rule=Host(`vault.fenndev.network`)
Label=traefik.http.routers.vault-websecure.rule=Host(`vault.fenndev.network`)
Label=traefik.http.routers.vault-websecure.tls=true
Label=traefik.http.routers.vault.service=vault
Label=traefik.http.routers.vault-websecure.service=vault

Label=traefik.http.services.vault.loadbalancer.server.port=8081

Label=homepage.group="Services"
Label=homepage.name="Vaultwarden"
Label=homepage.icon=vaultwarden.svg
Label=homepage.description="Password Manager"
Label=homepage.href=https://vault.fenndev.network

vault.env file:

LOG_LEVEL=debug
DOMAIN=https://vault.fenndev.network 
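One way to sanity-check a 502 like this is to hit the container on its internal port directly. Vaultwarden's `/alive` health endpoint is real; the specific IP, network name, and the use of a throwaway curl container are assumptions matching the setup above:

```shell
# From the host: is anything listening on the container's *internal* port?
curl -s http://10.89.0.132:80/alive

# Or resolve the container by name from inside the shared Podman network:
podman run --rm --network fenndev_default docker.io/curlimages/curl \
  -s http://vault:80/alive
```

If the first command answers and Traefik still returns 502, the problem is almost certainly in the labels (wrong port or wrong service name) rather than in Vaultwarden itself.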
[–] fenndev@leminal.space 10 points 6 months ago (1 children)

If you want your environment to be consistent between desktops, keep it mostly stock. The default KDE theming and setup is pretty damn similar to Windows 10, and I've kept it stock ever since I started using it ~1½ years ago.

[–] fenndev@leminal.space 1 point 6 months ago* (last edited 6 months ago) (1 children)

Yeah, sure thing.

Abominable
Ad Astra
Aladdin (Live Action)
Aladdin
Alien - Covenant
Alien
Ant-Man and the Wasp
Ant-Man
Aquaman
Arrival
Austin Powers In Goldmember
Austin Powers - International Man of Mystery
Austin Powers - The Spy Who Shagged Me
Avengers - Endgame
Avengers - Infinity War
Baby Driver
Batman and Robin
Batman Forever
Batman
Batman Returns
Big Hero 6
Black Panther
Bohemian Rhapsody
Brave
Captain America - Civil War
Captain America - The First Avenger
Captain America - The Winter Soldier
Captain Marvel (2019)
Cars 2
Cars 3
Cars
Cinderella (Live Action)
Close Encounters of the Third Kind 40th Anniversary Edition
Deadpool 2
Deadpool
Detective Pikachu
Doctor Sleep
Doctor Strange
Everything Everywhere All At Once
Fantastic Beasts - The Crimes of Grindelwald
Finding Dory
Finding Nemo
First Man
From Up on Poppy Hill
Frozen II
Frozen
Gemini Man
Godzilla (1998)
Godzilla - King of the Monsters
Gremlins
Guardians of the Galaxy Vol 1
Guardians of the Galaxy Vol 2
Halloween
Harry Potter and the Chamber of Secrets
Harry Potter and the Deathly Hallows - Part 1
Harry Potter and the Deathly Hallows - Part 2
Harry Potter and the Goblet of Fire
Harry Potter and the Half-Blood Prince
Harry Potter and the Order of the Phoenix
Harry Potter and the Prisoner of Azkaban
Harry Potter and the Sorcerer's Stone
Howl's Moving Castle
Incredibles 2
Incredibles
Inside Out
Interstellar
Iron Man 2
Iron Man 3
Iron Man
IT (2018)
John Wick 3 - Parabellum
John Wick - Chapter 2
John Wick
Joker (2019)
Jumanji (1995)
Jurassic Park III
Jurassic Park
Jurassic Park - The Lost World
Jurassic World - Fallen Kingdom
Jurassic World
Kiki's Delivery Service
Kingsman - The Golden Circle
Kingsman - The Secret Service
Logan
Maleficent - Mistress of Evil
Maleficent
Mary Poppins Returns
Moana
Monsters, Inc
Monsters University
My Neighbor Totoro
Pacific Rim Uprising
Paprika
Ponyo
Ratatouille
Robin Hood
Spider-Man - Homecoming
Spider-Man - Into the Spider-Verse
Spirited Away
Starship Troopers
Star Trek - Beyond
Star Trek - Into Darkness
Star Trek (2009)
Tangled
The Cat Returns
The Expendables 2
The Expendables 3
The Expendables
The Good Dinosaur
The Hulk
The Jungle Book
The Karate Kid (1984)
The Lion King (Live Action)
The Lion King
The Little Mermaid
The Matrix
The Matrix -  Reloaded
The Matrix -  Revolutions
The Predator
The Princess and the Frog
The Secret Life of Pets 2
The Secret World of Arrietty
The Shining
The Wizard of Oz
Thor
Thor - The Dark World
Toy Story 2
Toy Story 3
Toy Story 4
Toy Story
Turning Red
Up
Us
Venom (2018)
Wall-E
Waterworld (2019)
Wonder Woman
Wreck-It Ralph
Zootopia
[–] fenndev@leminal.space 5 points 6 months ago (1 children)

AV1 is definitely what I'd like to do. I'm not aiming for maximum compatibility; small file size and high quality encodes are my goal. I can transcode if needed.

[–] fenndev@leminal.space 12 points 6 months ago (1 children)

Mhm, I'm aware. I just figured the nice folks here would likely have more experience with codecs and such than elsewhere!

(That, and, if I can build my own replacement Disney+, I would definitely want to share with friends.)

[–] fenndev@leminal.space 3 points 6 months ago (2 children)

The issue is storage costs. Currently they (and some Blu-ray shows I ripped) are taking up just over 12TB. I bought all of these movies when I had money to spend on stuff like that, but money is short and times are tough. "Storage is cheap" but my wallet is cheaper right now, aha.

 

cross-posted from: https://leminal.space/post/6179210

I have a collection of ~110 4K Blu-ray movies that I've ripped, and I want to take the time to compress and store them for use on a future Jellyfin server.

I know the very basics of ffmpeg and general codec information, but I have a specific set of goals in mind that I'm hoping someone can point me in the right direction on:

  1. Smaller file size (obviously)
  2. Image quality good enough that I cannot spot the difference, even on a high-end TV or projector
  3. Preserved audio
  4. Preserved HDR metadata

In a perfect world, I would love to convert the proprietary HDR and the Dolby Atmos audio into open standards, but the goals above are a good compromise.

Assuming that I have the hardware necessary to do the initial encoding, and my server will be powerful enough for transcoding in that format, any tips or pointers?
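For the AV1 route, a starting-point ffmpeg invocation might look like the following. This is a sketch under assumptions: your ffmpeg build includes libsvtav1, the source is 10-bit HDR10 (static HDR10 metadata generally passes through when signaled in the source bitstream; Dolby Vision does not survive this), and preset 6 / CRF 22 are just reasonable defaults to tune from:

```shell
# Sketch: SVT-AV1 encode that leaves audio and subtitles untouched.
#   -map 0        keep every stream from the source
#   tune=0        optimize for subjective visual quality (not PSNR)
#   yuv420p10le   10-bit pixel format, required for HDR10
ffmpeg -i input.mkv \
  -map 0 \
  -c:v libsvtav1 -preset 6 -crf 22 -svtav1-params tune=0 \
  -pix_fmt yuv420p10le \
  -c:a copy -c:s copy \
  output.mkv
```

Worth verifying the HDR metadata with `ffprobe` on one output file before batch-encoding the whole library.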

 


[–] fenndev@leminal.space 11 points 6 months ago

It is not an abuse of anyone's creative rights to convert music from a game you legally own into a different format.

 


 


14
submitted 7 months ago* (last edited 7 months ago) by fenndev@leminal.space to c/opensource@lemmy.ml
 

Shortly before the recent removal of Yuzu and Citra from GitHub, attempts were made to back up and archive both repos; it's my understanding that these backups, forks, etc. are fairly incomplete, lacking either full Git history or pull requests, issues, discussions, and so on.

I'm wondering if folks here have information on how to perform thorough backups of public, hosted Git repos (e.g. GitHub, GitLab, Codeberg). I'd also like to automate the process if I can.

git clone --mirror is something I've looked into as a baseline, with backup-github-repo looking like a decent starting point for what isn't covered by git clone.

The issues I can foresee:

  • Each platform builds its own tooling atop Git, like Issues and Pull Requests from Github
  • Automating this process might be tricky
  • Not having direct access/contributor permissions for the Git repos might complicate things, not sure

I'd appreciate any help you could provide.
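For the first bullet, the platform-specific data can be pulled over the public GitHub REST API while `git clone --mirror` handles the repo itself. A minimal sketch (the `/repos/{owner}/{repo}/issues` endpoint is real and returns pull requests as well as issues; the output layout and repo names are illustrative, and unauthenticated requests are rate-limited to 60/hour):

```python
"""Sketch: archive a GitHub repo's issues and PRs as JSON,
to sit alongside a `git clone --mirror` of the repo."""
import json
import urllib.request

API = "https://api.github.com"

def issues_url(owner: str, repo: str, page: int, per_page: int = 100) -> str:
    # The /issues endpoint includes PRs; state=all captures closed ones too.
    return (f"{API}/repos/{owner}/{repo}/issues"
            f"?state=all&per_page={per_page}&page={page}")

def fetch_all_issues(owner: str, repo: str) -> list:
    """Page through the issues endpoint until an empty page comes back."""
    issues, page = [], 1
    while True:
        with urllib.request.urlopen(issues_url(owner, repo, page)) as resp:
            batch = json.load(resp)
        if not batch:
            break
        issues.extend(batch)
        page += 1
    return issues

# Usage (network access + rate limits apply):
#   json.dump(fetch_all_issues("yuzu-emu", "yuzu"), open("issues.json", "w"))
```

Comments, reviews, and discussions each live behind their own endpoints, so a complete archive would repeat this pattern per resource type.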
