🅿🅸🆇🅴🅻

  • 2 Posts
  • 21 Comments
Joined 1 year ago
Cake day: June 11th, 2023

  • True, but it varies from person to person, and it matters whether you have a small or big drive, how often you watch and rotate your media, and how large the media is. If you only have a 1TB SSD and often download and watch Blu-ray quality, 20 movies will fill it. It won’t be long until the same blocks get erased again, no matter how hard the SSD’s firmware tries to spread the usage and avoid reusing the same blocks.

    Anyway, my point is: aside from the noise and lower power consumption advantages, I wouldn’t use SSDs for a NAS - I regard them as consumables. Speed isn’t really an issue with HDDs.



  • 🅿🅸🆇🅴🅻@lemmy.world to Selfhosted@lemmy.world · SSD only NAS/media server? · edited 7 months ago

    > Failure rates for sdd are better than hdd

    I’m curious where you found this. Maybe they have lower DOA rates and a lower chance of failing in the first year, but SSDs have a limited usage lifetime / limited writes, so even if they don’t fail quickly, they wear out over time: performance degrades first, and they finally succumb in 5 years or less, even when lightly used (e.g. as OS drives).

    To avoid DOA / first-year issues with HDDs, just have the patience to fully scan them before use with a good disk-testing app.
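    For illustration, a bare-bones read scan in Python (the device name is hypothetical, and a dedicated tool like badblocks does this properly):

    ```python
    # Hedged sketch: read the whole device once and count unreadable regions.
    # Assumes Linux and root; triple-check the device name before running.
    DEVICE = "/dev/sdX"       # hypothetical new drive
    CHUNK = 4 * 1024 * 1024   # read 4 MiB at a time

    bad = 0
    offset = 0
    with open(DEVICE, "rb", buffering=0) as disk:
        while True:
            try:
                chunk = disk.read(CHUNK)
                if not chunk:              # end of device
                    break
                offset += len(chunk)
            except OSError:                # I/O error: unreadable sector(s) here
                bad += 1
                offset += CHUNK
                disk.seek(offset)          # skip past the bad region and continue
    print(f"scanned {offset} bytes, {bad} unreadable chunk(s)")
    ```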


  • From my experience, SSDs are more prone to failure and have limited writes. They are meant for running the OS, databases that need fast access, and games / apps. They are not meant for long-term storage and frequent overwrites, like movies, which usually means download, delete, repeat - and that wears out the memory quickly. One uses electrical charge to flip memory cells from 0 to 1 and vice versa; the other uses a magnetic layer, which supports far more overwrites of the same bit.

    If keeping important data on them, I would use them only in a redundant RAID configuration and/or with frequent backups, so I wouldn’t cry if one of them fails. And when they do fail, there are no recovery options like there are with HDDs (recovery is very expensive, but at least you have a chance).

    I also wouldn’t touch used server SSDs; their lifetime is already shortened from the start. I went through 3 rounds of enterprise-grade Intel SSD replacements in our company servers, each after about 3 years - they just wear out. For consumer / home SSDs the typical lifetime is 5 years, but that assumes minor / “normal” usage, i.e. as OS disks. Maybe power users could extend that by moving the swap/pagefile and temporary files (browser cache, logs, etc.) to a spinning disk, but that defeats the purpose of having an SSD for speed in the first place.

    If you have media (like movies) in mind, you’ll find sooner rather than later that you need more space, and with HDDs the price per GB is lower than with SSDs.

    If you have no issue with 1. noise, 2. speed (any HDD is fast enough for movie playback and decent for downloads), 3. concurrent access, or 4. physical shocks during transport, go with HDDs, even used ones.

    My two personal-opinion cents.



  • If you’re on Linux and need automation, GnuPG works, and you can use RSA keys. It’s slower than symmetric encryption for large files, but I’ve successfully encrypted database backups of several tens of GB with a 2048-bit key with no issues. The longer the key, the slower it gets. But it has the advantage that you only need to keep the public key on the machine you’re encrypting on, and can keep the private key safely stored away for when you need to decrypt. Unlike with symmetric encryption, where, if you need repeatable / automated encryption, you’d store the key in a config somewhere on the same machine in plain sight - and because the same key also decrypts, once it leaks you’re done for.
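    A minimal sketch of that workflow, assuming the recipient key below (hypothetical address) was imported beforehand and the private key is stored offline:

    ```python
    # Hedged sketch: public-key encryption of a backup via the gpg CLI.
    # Only the public key needs to exist on this machine.
    import subprocess

    def encrypt_backup(path: str, recipient: str = "backup@example.com") -> str:
        out = path + ".gpg"
        subprocess.run(
            ["gpg", "--batch", "--yes",      # non-interactive, overwrite output
             "--recipient", recipient,       # public key selects who can decrypt
             "--output", out,
             "--encrypt", path],
            check=True,                      # raise if gpg fails
        )
        return out

    encrypt_backup("db_backup.sql")
    # later, on the machine holding the private key:
    #   gpg --decrypt --output db_backup.sql db_backup.sql.gpg
    ```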

    Normally you would go with symmetric encryption and generate a good, random AES key each time you encrypt, use AES for the actual encryption (which is very FAST), and encrypt just the AES key with RSA / asymmetric. This complicates scripts a lot, and you end up with two dependent files to take care of (the encrypted target file and the file with the encrypted AES key). But this is the sane way of doing it, because asymmetric encryption isn’t meant for large data (and not just because of the slow speed). HTTPS and SSH work the same way: asymmetric for the key-exchange handshake (through public certificates), symmetric for the actual communication, with the key changed often.
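    A rough sketch of that hybrid scheme, assuming the Python cryptography package is installed; the file names are illustrative, and AES-GCM stands in here for the symmetric cipher:

    ```python
    # Hedged sketch: fresh AES key per file, AES-GCM for the data,
    # RSA-OAEP to wrap the key. Only the public key is needed to encrypt.
    import os
    from cryptography.hazmat.primitives import hashes, serialization
    from cryptography.hazmat.primitives.asymmetric import padding
    from cryptography.hazmat.primitives.ciphers.aead import AESGCM

    with open("public_key.pem", "rb") as f:
        public_key = serialization.load_pem_public_key(f.read())

    aes_key = AESGCM.generate_key(bit_length=256)   # new random key each run
    nonce = os.urandom(12)                          # standard GCM nonce size

    with open("backup.sql", "rb") as f:
        ciphertext = AESGCM(aes_key).encrypt(nonce, f.read(), None)

    # RSA encrypts only the small AES key; it can't handle large data anyway.
    wrapped_key = public_key.encrypt(
        aes_key,
        padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                     algorithm=hashes.SHA256(), label=None),
    )

    # The two dependent files mentioned above:
    with open("backup.sql.enc", "wb") as f:
        f.write(nonce + ciphertext)
    with open("backup.sql.key", "wb") as f:
        f.write(wrapped_key)
    ```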

    If no automation is necessary, use VeraCrypt containers. You can keep multiple files in a container, you have several symmetric algorithms to pick from, and you can control the number of iterations for key derivation. It’s debatable how much security it adds, but you can also chain up to 3 algorithms in your preferred order.

    The above covers the tools and, somewhat, the algos. For key lengths, see here. I wouldn’t go with RSA below 4096 bits these days; elliptic curves are preferred (256-bit+), or AES-256+ in CTR mode. And I’d stay away from lesser-known / less-scrutinized algos.

    As others have stated, any recommendation depends on your threat model: how powerful and resourceful the bad actors you’re protecting against are, how often you need to encrypt, how often you need to decrypt, the time span for which you need to protect the file, etc.



  • Reading as a kid about virus analysis and how viruses work, in a short column in a… newspaper. Yeah, they even listed full Windows Registry paths. I didn’t know what HKEY_LOCAL_MACHINE was, didn’t own a computer, and only knew some DOS commands, but I knew I wanted to do that job and decompile stuff (whatever that meant) and see how it worked. Just like dismantling (and ultimately destroying) toys to see the inner workings.

    After finally owning a computer and getting bored with the few games I had on Windows 95, being limited to Notepad, Internet Explorer (without an internet connection yet; or was it Netscape Navigator?) and Paint (at which I sucked, lacking any artistic talent), when I learned that I could just type stuff into Notepad, I borrowed a book about “programming” in HTML. Then Pascal, the pinnacle being a simple XOR encryption program with a goddamn white-on-blue “windows” interface with buttons (à la Midnight Commander). Writing TRIVIA “scripts” for mIRC channels made us gods. Then Delphi naturally followed, and I made my own tool to track how many hours I’d spent on dialup each month (yes, internet was very expensive) while listening to ’80s music on Winamp. Nothing was more interesting than that. Then I got a job and all of a sudden started making my own money by writing Delphi code. Up until then I hadn’t really been aware that my passion would also put food on the table. The rest is history.

    Programming in those days felt unreal. Felt like The Matrix. I knew that what I want to do for the rest of my life is look at text on a screen, hit CTRL+F9, see a crash, set some breakpoints, and ponder around the room or while taking a piss about what went wrong and how to solve it. I’m no Einstein, but I understood why science people dedicate their lifes to their work and disregard completely their social life.