I am working on moving most of our software stack to some servers running in UpCloud to achieve better uptime. There are, however, a few services which for different reasons cannot run outside of our company network, e.g. a service which controls a CD-burning robot, and a huge amount of audio files (~100 TB) which we simply aren't ready to move.
I am thinking of these options:
- exposing access to these with nginx and a whitelist of IP addresses (roughly like the sketch after this list)
- setting up an SSH bastion or some VPN connection giving access to the internal network
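For context, option 1 would look something like the following on the on-prem nginx; the hostnames, addresses and certificate paths are just placeholders:

```nginx
server {
    listen 443 ssl;
    server_name burner.example.com;                    # placeholder hostname

    ssl_certificate     /etc/nginx/certs/burner.crt;   # placeholder paths
    ssl_certificate_key /etc/nginx/certs/burner.key;

    # whitelist: only the cloud servers' public addresses may connect
    allow 203.0.113.10;   # UpCloud server 1 (placeholder)
    allow 203.0.113.11;   # UpCloud server 2 (placeholder)
    deny  all;

    location / {
        proxy_pass http://127.0.0.1:8080;              # internal CD-robot service (placeholder)
    }
}
```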
Do I have other, better options, or do any of the above make sense?
Given that one of those reasons is likely to be "security", I would absolutely avoid opening ports from your on-prem network to the cloud.
A VPN tunnel is probably the easiest to implement (either an OpenVPN server hosted cloud-side, or a peer-to-peer VPN like ZeroTier), but it effectively gives full access to the on-prem network at the point of ingress.
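If you go the ZeroTier route, joining both ends is only a couple of commands. A minimal sketch (the network ID below is a placeholder, and each member still has to be authorized in ZeroTier Central after joining):

```bash
# Run on both the cloud server and the on-prem box (Debian/Ubuntu-style install)
curl -s https://install.zerotier.com | sudo bash

# Join the same ZeroTier network on both ends (network ID is a placeholder)
sudo zerotier-cli join 0123456789abcdef

# Verify the node is ONLINE and the network interface came up
sudo zerotier-cli status
sudo zerotier-cli listnetworks
```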
Depending on your router/network configuration, you might be able to handle the ingress at the edge of your network: run the VPN client on your firewall and control traffic with firewall rules (I do this with my OPNsense setup). Otherwise, I would do a blend of options 1 and 2 and have a dedicated proxy box that runs the VPN client as well as an appropriate reverse proxy to reach the target servers, roughly like the sketch below.
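A rough sketch of what that proxy box's nginx config might look like, assuming the cloud end reaches it over the VPN at 10.8.0.2 and the on-prem targets sit on 192.168.1.x (all names, addresses and paths are placeholders; you would still lock the back-end hosts down with firewall rules so only the proxy box can reach them):

```nginx
server {
    # Listen only on the VPN interface's address, so nothing is exposed publicly
    listen 10.8.0.2:443 ssl;                            # placeholder VPN address
    server_name services.internal.example;              # placeholder name

    ssl_certificate     /etc/nginx/certs/internal.crt;  # placeholder paths
    ssl_certificate_key /etc/nginx/certs/internal.key;

    location /cd-robot/ {
        proxy_pass http://192.168.1.20:8080/;           # CD-burning robot controller (placeholder)
    }

    location /audio/ {
        proxy_pass http://192.168.1.30:8000/;           # audio file server (placeholder)
    }
}
```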