Nice photo! Where was it taken?
I’m all for it. Fight fire with fire.
I really hate when companies do that kind of crap. I just imagine a little toddler stomping around going “No! No! Nooo!”
Is there a way to host an LLM in a docker container on my home server but still leverage the GPU on my main PC?
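One common approach (assuming you're open to something like Ollama — swap in whatever inference server you prefer): run the model server on the GPU PC and bind its API to the LAN instead of localhost, then have the container on your home server talk to it over HTTP. A rough sketch, where the IP and the Open WebUI image are just placeholders for illustration:

```yaml
# On the GPU PC: run Ollama natively (or in Docker with --gpus all)
# and expose it on the LAN rather than just localhost:
#   OLLAMA_HOST=0.0.0.0 ollama serve
#
# On the home server: a container that speaks the Ollama API,
# pointed at the GPU PC's LAN IP (192.168.1.50 is a placeholder).
services:
  open-webui:
    image: ghcr.io/open-webui/open-webui:main
    ports:
      - "3000:8080"
    environment:
      - OLLAMA_BASE_URL=http://192.168.1.50:11434
```

The trade-off is that inference only works while the main PC is on, but you keep the container management on the server where it belongs.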
I really disagree that this is the “real issue”. Love it or hate it, Twitter is still one of the biggest social media sites out there. In this day and age, expecting any prospective Presidential candidate not to have a major presence on a platform like that is just silly.
I don’t use Twitter and don’t care for it, but seriously come on.
You’re not wrong about how important those keys are and how he definitely should have known better. But I at least have a little sympathy for the guy. Everyone makes mistakes from time to time, even with important stuff. Hopefully most people are lucky enough not to lose $40k on one, but unfortunately he wasn’t. Whether he should have known better or not, that just plain sucks.
I have a similar setup except I use pfSense as my router and pihole for DNS, but I’m sure you can get the same results with your setup. I’m running HAProxy as my reverse proxy, with configs for each of my docker containers, so any traffic on 443 or 80 gets sent to the container IP on whatever unique port it uses. I then have DNS entries for each URL I want to access the container by, with all of those entries just pointing to HAProxy. Works like a charm.
I have HAProxy running on the pihole itself, but there’s no reason you couldn’t just run that in its own container. pfSense also lets you install an HAProxy package to handle it on the router itself. I don’t know if OPNsense supports packages like that though.
You can even get fancy and do SSL offloading to access everything over HTTPS.
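For a rough idea of what that looks like in haproxy.cfg (hostnames, IPs, ports, and the cert path here are all placeholders, not my actual config):

```
# One frontend catches 443/80; host-header ACLs pick the backend.
frontend web_in
    bind *:443 ssl crt /etc/haproxy/certs/wildcard.pem   # SSL offloading
    bind *:80
    acl host_jellyfin hdr(host) -i jellyfin.home.lan
    use_backend jellyfin_bk if host_jellyfin

# Each backend is just the container IP and its unique port,
# plain HTTP since TLS was already terminated at the frontend.
backend jellyfin_bk
    server jellyfin1 192.168.1.20:8096 check
```

Then the pihole DNS entry for jellyfin.home.lan (and every other service URL) just points at the HAProxy box.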