Like the title says, I’m new to the self-hosting world. 😀 While I was researching, I found that many people dissuaded me from self-hosting an email server. Just too complicated and hard to manage. What other services do you think we should just leave to the currently available providers in the market, and why? 🙂 Thank you
E-mail
Okay, I understand that email hosting is bad for SENDING email, but what about only RECEIVING email? Isn’t it a good idea to keep my stuff private? I rarely send personal emails, and I’d like to avoid my data being used for marketing purposes. Is it bad to have SMTP/IMAP open on a dynamic IP address? Just asking your opinion.
I’m doing exactly that, and it works like a charm. Get DynDNS, a backup MX and an SMTP relay and you’re good, or get a domain provider like strato.de that already includes all three with the domain.
Spam is also manageable. I get maybe 1-2 per day that make it past the filter, and I do have to add some custom keyword filters from time to time, but that’s about it. Fetching updated filter lists and self-learning from past errors keeps the filter up to date and is completely automated.
Antispam is hell, just saying
Antispam is easy with a mix of greylisting and spamassassin
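For anyone curious what greylisting actually does: you temp-fail the first delivery attempt from an unknown (client IP, sender, recipient) triplet and only accept once the sender retries after a delay; most spam bots never retry. A rough, illustrative Python sketch of that logic (made-up names, not any real milter’s code):

```python
import time

# Minimal greylisting sketch (illustrative only, not production code).
# Real setups put something like postgrey in front of Postfix and then
# hand accepted mail to SpamAssassin for scoring.
GREYLIST_DELAY = 300          # seconds the sender must wait before retrying
seen_triplets = {}            # (client_ip, sender, recipient) -> first-seen timestamp

def greylist_check(client_ip, sender, recipient):
    """Return 'tempfail' on first contact, 'accept' once the retry delay has passed."""
    triplet = (client_ip, sender, recipient)
    now = time.time()
    first_seen = seen_triplets.get(triplet)
    if first_seen is None:
        seen_triplets[triplet] = now
        return "tempfail"     # SMTP 450: legit servers retry, most spam bots don't
    if now - first_seen < GREYLIST_DELAY:
        return "tempfail"     # retried too soon, keep deferring
    return "accept"           # passed greylisting; hand off to the spam filter
```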
A password manager because if anything goes wrong, you’ll be completely screwed.
What you SHOULD absolutely self host though is a password manager, so you can be in control of your most sensitive data.
Regarding email, I think everyone should absolutely self host it, but it’s less and less viable in this google/Microsoft duopoly world. But ideally everyone would self host it. The reason why people advise against it really comes down to lack of real competition, and the two tech giants dictating how we violate every RFC possible.
Wot?
In my opinion, cloud storage for (zero knowledge) backup. Your backup strategy should include a diversity of physical locations. I had a house fire a few years ago. Luckily, my data drives survived, but if they hadn’t, my cloud backup would’ve been invaluable.
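“Zero knowledge” here just means you encrypt locally before uploading, so the cloud provider only ever stores ciphertext. Purpose-built tools (restic, borg, rclone’s crypt remote) handle this for you; purely as an illustration of the principle, here’s a minimal Python sketch using the cryptography library (file names and key handling are placeholders):

```python
from cryptography.fernet import Fernet  # pip install cryptography

# Illustrative client-side ("zero knowledge") encryption before upload.
# In practice use a real backup tool (restic, borg, rclone crypt) instead.

# Generate once and keep the key OUTSIDE the cloud provider,
# e.g. in your password manager.
key = Fernet.generate_key()
fernet = Fernet(key)

with open("wedding_photos.tar", "rb") as f:        # placeholder local archive
    ciphertext = fernet.encrypt(f.read())

with open("wedding_photos.tar.enc", "wb") as f:    # this file is what gets uploaded
    f.write(ciphertext)

# Without the key, the provider (or anyone who breaches them) can't read a thing.
```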
Passwords:
-> You want to have immediate access to them, even if your house burns down.
Notes:
-> You want to be able to read the documentation on how to fix your self-hosted service, even when your self-hosted services are down.
Public reverse proxy:
-> A reverse proxy is only as safe as the applications behind it. And NO, most self-hosted applications are not hardened and haven’t had security audits.
(A reverse proxy with a forward authentication proxy is something different.)
Don’t host your own email server.
Just trust me.
Meh, been doing it for 5 years now with minimal issues. Had one issue come up where my domain was flagged as malicious, but it was solved in a few days with some emails to security vendors.
I think it’s important that those who can, and are educated enough to keep it running properly, do host their own. Hosting your own email should be encouraged if you’re capable, because it helps reduce the monopoly and keeps a little bit of power with those who want to retain email privacy.
I did for years quite successfully. Ultimately blocklists did me in, however - I don’t have the knowledge to resolve those in a timely manner and it became a headache I couldn’t tolerate at that time.
I agree with KN4MKB. I’ve been hosting my own mail server for decades. Not one issue. I use that in lieu of a mail service provider (Google immediately comes to mind), as their EULA will tell you that - since you’re using their service, on their servers - anything goes. Read the fine print on Gmail, and you’ll see. 😉
I did it anyway some time ago and I’m really happy with it. I’m using my own email addresses for absolutely anything by now.
They are not hard to set up and are easy to keep running (once going they pretty much just work). If you follow the right steps you can avoid being undeliverable and keep people from abusing your sending server (as a relay).
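The “right steps” mostly come down to publishing SPF, DKIM and DMARC records (plus proper reverse DNS) for your domain. As a hedged sketch, here’s one way to sanity-check the SPF/DMARC TXT records with dnspython; “example.com” is a placeholder, and DKIM is skipped because its record lives under your own selector name:

```python
import dns.resolver  # pip install dnspython

# Sanity-check that SPF and DMARC TXT records exist for a domain.
# "example.com" is a placeholder. DKIM lives at <selector>._domainkey.<domain>,
# which depends on your setup, so it's left out here.
domain = "example.com"

def txt_records(name):
    try:
        return [r.to_text() for r in dns.resolver.resolve(name, "TXT")]
    except (dns.resolver.NXDOMAIN, dns.resolver.NoAnswer):
        return []

spf = [r for r in txt_records(domain) if "v=spf1" in r]
dmarc = [r for r in txt_records(f"_dmarc.{domain}") if "v=DMARC1" in r]

print("SPF:", spf or "missing")
print("DMARC:", dmarc or "missing")
```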
Why?
People saying email, look into using external SMTP servers as relays. Your domain most likely comes with at least one email account with SMTP access. You can use that as a relay to send personal/business emails from your server using the provider’s reputable IP addresses.
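As a rough example of what that looks like in practice with Python’s smtplib, authenticating against the provider’s submission port and letting their infrastructure do the actual delivery (host, addresses and credentials below are placeholders):

```python
import smtplib
from email.message import EmailMessage

# Sending through the provider's SMTP server as an authenticated relay.
# Hostname, port and credentials are placeholders for whatever your mail/domain
# provider gives you; your own box never talks to the recipient's MX directly.
msg = EmailMessage()
msg["From"] = "me@example.com"
msg["To"] = "friend@example.net"
msg["Subject"] = "Sent via my provider's relay"
msg.set_content("Delivered from the provider's reputable IPs, not my home connection.")

with smtplib.SMTP("smtp.example.com", 587) as relay:
    relay.starttls()                                  # encrypt the session
    relay.login("me@example.com", "app-password-here")
    relay.send_message(msg)
```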
Don’t self-host email SMTP or public DNS. They’re hard to set up properly, hard to maintain, easy to compromise and end up used in internet attacks.
Don’t expose anything directly to the internet if you’re not willing to constantly monitor the vulnerability announcements, update to new releases as soon as they come out, monitor the container for intrusions and shenanigans, take the risk that the constant updates will break something etc. If you must expose a service use a VPN (Tailscale is very easy to set up and use.)
Don’t self-host anything with important data that takes uber-geek skills to maintain and access. Ask yourself, if you were to die suddenly, how screwed would your non-tech-savvy family be, who can’t tell a Linux server from a hot plate? Would they be able to keep functioning (calendar, photos, documents etc.) without constant maintenance? Can they still retrieve their files (docs, pics) with only basic computing skills? Can they migrate somewhere else when the server runs down?
lol
Also, check out “ciphermail”. It’s an end-to-end encryption mail server.
I’d say backups. At least they shouldn’t be only local. I follow the rule of threes: two local copies and one off-site with Backblaze. Yeah, it ties up a not insignificant amount of disk space I could use for other things, but dammit, I’m not losing my wedding photos, important system configurations, etc.
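If it helps, here’s a minimal sketch of automating that with rclone driven from Python; the “b2remote” remote name, bucket and paths are assumptions for the example (restic or borg pointed at a B2 bucket works just as well):

```python
import subprocess

# Sketch of a 3-2-1 style backup run: one extra local copy plus one offsite copy.
# "b2remote" is a placeholder rclone remote configured for a Backblaze B2 bucket;
# paths are examples. Run it on a schedule via cron or a systemd timer.
SOURCE = "/data/important"
LOCAL_COPY = "/mnt/backup-disk/important"        # second local copy, different disk
OFFSITE = "b2remote:my-bucket/important"         # offsite copy on Backblaze B2

for dest in (LOCAL_COPY, OFFSITE):
    subprocess.run(["rclone", "sync", SOURCE, dest, "--verbose"], check=True)
```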
Primary backups
Password manager. While some may cache on your client devices, by and large if your server goes down, no passwords.
Vaultwarden with SyncThing is a robust combo from what I hear. Everything is local.
Not necessarily. If you self-host SyncThing and use it to synchronise your password database across devices (for example KeePassXC’s .kdbx file), only the synchronisation goes down with your server.
Same with Bit/vaultwarden, all clients grab a copy of the vault from the server when they sync so if the server is offline all clients still “just work”.
Vaultwarden is perfect for that then, it does cache locally.
Email
Clearly, opening an RDP port to the internet. NEVER.
I have a load balancer on my network that has one port open on my home network. The load balancer is connected through Cloudflare and is encrypted on both sides. Is that okay?
Why did you choose to open a port if you use Cloudflare? Couldn’t you use a Cloudflare Tunnel in that case?
What do you mean by “clearly”? Open RDP without password protection?
I often use RDP to access my Windows 10 desktop.
The password isn’t enough. It’s not a hardened protocol and vulnerabilities are found in it with some regularity. There have been unauthenticated RCEs before, i.e. the nightmare scenario.
Those vulnerabilities come from humans clicking on files they’re not supposed to click on. NO way of communication is secure against that. Not even the magic of Tailscale. RDP offers 2FA and has an encrypted connection. It’s fine!
Even Microsoft recommends against opening rdp to the web and to use a VPN instead.
You’re playing with fire here.
Microsoft recommends against opening rdp to the web
As far as a few google searches got me: No, they don’t.
Lol, I work at an attack surface scanning company. Every freaking company I talk to, with very few exceptions, has at least one of these. If not a whole infrastructure. Then they cry, “how did we get ransomware?”
Don’t try to be clever and change the port from 3389 to something else either
Scanners can fingerprint traffic and just blast the other ports instead
I (foolishly) did this a few years ago and luckily I had account lockout enabled
Constant attempts all day long - they were even able to enumerate local users and try to log in as them (fortunately they never could, because the passwords were random KeePass ones)
Don’t do it, seriously
PSA for you guys that RDP over the net: turn that off, and use a VPN like WireGuard or Tailscale, or use something like Apache Guacamole.
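And if you’re not sure whether you still have it exposed, a quick check from outside your network against your public IP will tell you; a tiny Python sketch (the address is a placeholder):

```python
import socket

# Check whether RDP (TCP 3389) is reachable on a given address.
# Run this from OUTSIDE your network against your own public IP (placeholder below).
# If it connects, the port is exposed and belongs behind a VPN instead.
PUBLIC_IP = "203.0.113.10"   # placeholder (TEST-NET address)

try:
    with socket.create_connection((PUBLIC_IP, 3389), timeout=5):
        print("3389 is reachable from the internet -- close it / put it behind WireGuard.")
except OSError:
    print("3389 did not answer -- good.")
```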
What is wrong with that? Don’t they still need correct credentials to connect?
The service itself is insecure. You need to hide it behind a more secure setup if you want to expose it to the internet. It’s been a long while since I tried, but I have some foggy memories of an RDP server that would encapsulate the connection in an SSL tunnel and forward the connection to the remote machine rather than exposing the RDP service itself to the internet.
Definitely do your research on how to do it securely before you just set it up and open it to the wild.
VPN FTW
Oh sure, VPN is definitely the preferred way if you already have the infrastructure in place. My experience with the front-end RDP server was years ago as the sysadmin for a company. My experience is likely very out of date, and was very corporate-focused, rather than for an enthusiast.
Nowadays I try not to touch Windows, and haven’t used RDP in years.
These days there are so many bots scanning that you have to be so careful.
If self hosting from home… email servers
At home, your IP is likely blacklisted and/or your provider has blocked the necessary ports. Not to mention the layers of potential headaches dealing with spam blocklist DBs, especially if you don’t own your IP.
You can of course do custom setups allowing you to skirt these restrictions, but they can sometimes be a bit complicated and typically involve non-traditional customizations.
The login page to your NAS.
If your NAS is properly updated, and SSL is used, then the login screen is just as safe as any other web app with regular updates. I would ask why someone would want that.
It’s not. SSL in itself doesn’t make any exposed service safe, just safer. An updated service isn’t necessarily free of vulnerabilities.
The difference between exposing your login page and most other services is the attack surface. If someone gets into your NAS administration, game over. You’re getting hit with ransomware or worse.
If someone gets into my Calibre Web server, for instance, my vulnerability is much more limited. That runs in a Docker container that only has access to the resources and folders it absolutely needs. The paths to doing harm to anything besides my ebook library are limited.
I of course still use SSL, with my Calibre Web behind a reverse proxy, with long complex passwords, and I’ll probably soon move it to an OAuth login where I can use MFA (since it doesn’t support it natively itself). And there are more measures I could take beyond that, if I chose.
I’ll leave with this. ANY service, exposed publicly or not, should not have vulnerabilities. If there is any hint that your NAS webserver has vulnerabilities, it shouldn’t even be used internally. So to me, it does not matter. I don’t expose my NAS webserver because I have no reason to increase my attack surface that much.
But I’m comfortable exposing any of my internal services as needed, because I’ve personally checked the source code for vulnerabilities and have proper checks in place on top of regular security updates. I understand why others wouldn’t think the same way, as this takes a high level of confidence in your ability to assess the security posture of your systems and network. I’ve had penetration tests run against my network, and I conduct them myself for business.
It would be nice if we, and apps’ developers, always knew what the vulnerabilities are. They generally exist because the developer doesn’t know about them yet, or hasn’t found a solution yet (though ideally has been transparent about that). Zero-day exploits happen. There’s always a first person or group discovering a flaw.
If being up to date and using SSL was all it took, security would be a lot simpler.
No one security measure is ever foolproof, other than taking everything offline. But multiple used in tandem make it somewhere between inconveniently and impractically difficult to breach a system.
Not really an option when I’m providing file hosting services to a bunch of my friends.