• 11 Posts
  • 784 Comments
Joined 4 years ago
Cake day: January 21st, 2021



  • The short answer is that Docker (and other containerization technologies) share the Linux kernel with the host. The Linux kernel is very complicated and shouldn’t be trusted to be vulnerability-free. Exploitable bugs are regularly discovered in the Linux kernel (and in Windows and Darwin). No serious company separates different tenants with just container technology. Look at GCP, AWS, DigitalOcean… they all use hardware virtualization, which is much simpler and much more likely to be secure (though even then bugs are found on occasion).

    So in theory it is secure, but it is just too complex to rely on. I’d say Docker is good for “mostly trusted” isolation: different organizations in the same company, or different software that isn’t actively trying to be malicious. But it shouldn’t be used to separate genuinely untrusted parties.
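
    You can see the kernel sharing for yourself. A minimal sketch (assuming Docker is installed and the alpine image is pullable) that compares the kernel version on the host and inside a container:

    ```python
    import subprocess

    def run(*cmd: str) -> str:
        return subprocess.run(cmd, capture_output=True, text=True, check=True).stdout.strip()

    host = run("uname", "-r")
    container = run("docker", "run", "--rm", "alpine", "uname", "-r")

    # Both report the same version: the container has no kernel of its own,
    # so a kernel exploit inside the container is an exploit against the host.
    print(f"host: {host}\ncontainer: {container}\nshared: {host == container}")
    ```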


  • kevincox@lemmy.ml to Selfhosted@lemmy.world · Mini pc arriving tomorrow · 6 days ago

    IMHO Arch is actually a great choice. There is a minimum update frequency you need to maintain (I don’t recall exactly; I think it is somewhere between 1 and 3 months), but if you keep up, and read the news before updates (and you are usually fine even if you don’t; usually the update will just refuse to run until you intervene), things are pretty seamless. I had many Arch machines running for >5 years with no issues and no reason to expect that to change. That span covers many major version updates on other distros, which are often not as seamless.
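
    The “read the news before updates” habit is easy to script. A minimal sketch, stdlib only, that prints recent headlines from the official Arch news feed before you update:

    ```python
    import urllib.request
    import xml.etree.ElementTree as ET

    # Fetch the official Arch Linux news RSS feed.
    with urllib.request.urlopen("https://archlinux.org/feeds/news/") as resp:
        root = ET.parse(resp).getroot()

    # RSS 2.0 puts entries under channel/item; show the five newest headlines.
    for item in root.findall("./channel/item")[:5]:
        print(item.findtext("pubDate"), "-", item.findtext("title"))
    ```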

    That being said, I am on NixOS now, which takes this to the next level. I am running nixos-unstable, but thanks to the way NixOS is structured I don’t need to worry about legacy cruft accumulating over many years of updates.

    And after all of that, I don’t think it really matters. Any major distro you pick, whether stable, release-based, or LTS, will be fine. They all have some sort of update path these days (unlike in the past, when some distros just recommended a re-install for major updates).


  • Only if they gain possession while the device is running with the drive decrypted, and they keep it running the whole time. That is a much higher bar than being able to turn the machine on at any time and then recover the key. For example, suppose this is a laptop you are flying with. Without auto-decryption you can simply turn it off and be very secure. With auto-decryption they can turn it on and then extract the key from memory (not easy, but definitely possible, and with auto-decryption they have as long as they need, including sending the device to whatever forensics lab is best equipped to extract the key).


    1. Wiping the drive is a lot easier: just overwrite the root key a few times (see the sketch after this list).
    2. If you store the key on a different drive, you can safely dispose of the data drive just by separating the two. (I do this on my home server, keeping the decryption key on a USB drive. If I need to ship the server or discard old hardware, I can just hold onto the thumb drive and not worry about the data being read.)
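
    To illustrate point 1: a toy sketch of why overwriting a tiny root key effectively wipes a huge disk. It uses the third-party cryptography package and made-up names; real full-disk encryption (LUKS and friends) adds passphrase-wrapped key slots on top of the same idea.

    ```python
    from cryptography.fernet import Fernet  # pip install cryptography

    root_key = Fernet.generate_key()  # a few dozen bytes of key material
    ciphertext = Fernet(root_key).encrypt(b"imagine terabytes of disk blocks here")

    print(Fernet(root_key).decrypt(ciphertext))  # normal operation: key present

    # "Wiping": destroy the few bytes of the root key, not the whole disk.
    root_key = b"\x00" * len(root_key)

    # The bulk ciphertext still exists, but without the root key it is
    # computationally unrecoverable.
    ```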

    Security is always about tradeoffs. On my home server unattended reboots are necessary, so it needs to auto-decrypt. But using encryption means I don’t need to worry about discarding broken hardware, or about traveling with the server where it may be inspected. For my laptop, desktop, and phone, where I don’t need unattended reboots, I require the encryption key on bootup.



  • That’s true. And I’m not saying B2 is bad; it is just something you should be aware of.

    Their automatic replication isn’t quite as seamless as GCS or S3, though. For example, deletes aren’t replicated, so you will need a cleanup strategy. Plus, once you 2x or 3x the price for replication, B2 isn’t as competitive on price. My point is that it is very easy to compare apples to oranges when looking at cloud storage providers, and it is important to be aware of that.
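
    A cleanup pass might, for example, diff the replica against the source and flag what the source no longer has. A rough sketch against B2’s S3-compatible API with boto3; the endpoint, bucket names, and the decision to actually delete are all placeholders you’d adapt:

    ```python
    import boto3  # pip install boto3; credentials come from the usual AWS config

    # B2 exposes an S3-compatible endpoint; this URL is a placeholder.
    s3 = boto3.client("s3", endpoint_url="https://s3.us-west-004.backblazeb2.com")

    def keys(bucket: str) -> set[str]:
        """Every object key in a bucket, following pagination."""
        pages = s3.get_paginator("list_objects_v2").paginate(Bucket=bucket)
        return {obj["Key"] for page in pages for obj in page.get("Contents", [])}

    # Objects in the replica but gone from the source were deleted upstream.
    for key in sorted(keys("my-replica-bucket") - keys("my-source-bucket")):
        print("would delete:", key)  # swap in s3.delete_object(...) once trusted
    ```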

    For me B2 is a great fit and I am happy with it, but I don’t want to mislead people.


  • I think it depends on your needs. IIUC their storage is “single location”, so a very significant natural disaster could take it offline or maybe even lose it. Something like S3 or Google Cloud Storage (depending on which durability class you select) is multi-location (as in significantly distinct geographical regions). So it is still very likely that you will never lose any data, but in extreme cases you potentially could.

    If I were storing my only copy of something it would matter a lot more (although even then you are best off storing with multiple providers, for social reasons as much as technical ones), but for a backup it is fine.


  • Robot vacuum cleaners aren’t great at cleaning, but they are very effective at keeping the dust down. You will still want to clean occasionally, but with a robot vacuum running regularly you can do it much less often, and the house feels cleaner in the meantime.

    I’m also lucky enough to be able to afford house cleaners now. It is such a nice gift to our family not to have to worry about doing these things. We can spend that time doing stuff together rather than cleaning, and we don’t think about how dirty the house is, or dread cleaning it, nearly as often. If you can afford it, I would highly recommend it. It definitely isn’t cheap, but many people have more expensive habits that bring less joy IMHO.


  • there will be scaling with all of its negative consequences on perceived quality

    In theory this is true. If you had a nice high-bitrate 1080p video, it might look better on a 1080p display than any quality of 1440p video would, due to loss while scaling. But in almost all cases, selecting a higher resolution will give better perceived quality because of the higher bitrate, even when it isn’t an integer multiple of the display size.

    It would also be more bandwidth-efficient to target the output size directly. But streaming services want to keep the number of different versions small. Often this is already >4 resolutions and 2-3 codecs. If they also wanted low/medium/high for each resolution, that would be a significant cost (the encoding itself, storage, and a reduction in cache hits). So they sort of squish resolution and quality together into one scale: 1080p isn’t just 1080p, it also serves as a general “medium” quality. If you want “high” you need to go to 1440p or 2160p even if your output is only 1080p.
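
    Rough arithmetic of why that cost adds up (the numbers below are illustrative, not any real service’s encoding ladder):

    ```python
    resolutions = ["480p", "720p", "1080p", "1440p", "2160p"]
    codecs = ["h264", "hevc", "av1"]
    quality_tiers = ["low", "medium", "high"]

    per_title_now = len(resolutions) * len(codecs)         # 15 renditions
    per_title_tiered = per_title_now * len(quality_tiers)  # 45 renditions

    # Every rendition must be encoded, stored, and cached separately,
    # so adding quality tiers triples the cost per title.
    print(f"{per_title_now} renditions per title today")
    print(f"{per_title_tiered} if every resolution also had quality tiers")
    ```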