#caching


Quick question for folks who understand HTTP caching on reverse proxies like Squid or Cloudflare. If I have a GET REST endpoint responding with 200 OK and the following headers:

Cache-Control: public, max-age=3600  
ETag: "123-a"

The proxy should cache the response and serve it without hitting the origin server again for the first hour, then revalidate with If-None-Match: "123-a" once the cached copy goes stale, right? Is there any reason why it wouldn't?
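For what it's worth, here is a minimal sketch of the origin side of that exchange, assuming a Flask app and a made-up /data endpoint (neither is from the post): a full 200 with Cache-Control and ETag on the first fetch, and a body-less 304 when the cache revalidates with If-None-Match after the hour is up.

from flask import Flask, request, make_response

app = Flask(__name__)
CURRENT_ETAG = '"123-a"'

@app.route("/data")
def data():
    # Once the cached copy goes stale, a well-behaved cache revalidates with
    # If-None-Match; if the ETag still matches, a body-less 304 is enough.
    if request.headers.get("If-None-Match") == CURRENT_ETAG:
        resp = make_response("", 304)
    else:
        resp = make_response('{"hello": "world"}', 200)
        resp.headers["Content-Type"] = "application/json"
    # Sent on both 200 and 304 so caches can refresh the freshness lifetime.
    resp.headers["Cache-Control"] = "public, max-age=3600"
    resp.headers["ETag"] = CURRENT_ETAG
    return resp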

#http #caching #etag

Although frequently misunderstood, the HTTP Cache-Control header is crucial: it specifies caching directives for both requests and responses. In its typical form it states whether and where a resource may be stored and its maximum age before it expires…

In our latest blog post, Kieran Larking highlights that the no-cache directive does not prevent caching, looks at the typical caching behaviour directives, and explains how to use them correctly to balance performance and security: pentestpartners.com/security-b
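As a rough, stdlib-only illustration of the directive differences the post covers (the paths and values below are made-up examples, not taken from the article): no-cache still allows a response to be stored but forces revalidation before reuse, while no-store forbids storing it at all.

from http.server import BaseHTTPRequestHandler, HTTPServer

class Handler(BaseHTTPRequestHandler):
    def do_GET(self):
        self.send_response(200)
        if self.path == "/public":
            # Any cache (browser or shared proxy) may keep this for an hour.
            self.send_header("Cache-Control", "public, max-age=3600")
        elif self.path == "/private":
            # Only the end user's browser may store it, not shared caches.
            self.send_header("Cache-Control", "private, max-age=600")
        elif self.path == "/no-cache":
            # May be stored, but must be revalidated before every reuse.
            self.send_header("Cache-Control", "no-cache")
        else:
            # Must not be stored anywhere at all.
            self.send_header("Cache-Control", "no-store")
        self.send_header("Content-Type", "text/plain")
        self.end_headers()
        self.wfile.write(b"hello\n")

if __name__ == "__main__":
    HTTPServer(("127.0.0.1", 8000), Handler).serve_forever()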

«#Joblib is a set of tools to provide lightweight pipelining in #Python. In particular:

  • transparent disk-caching of functions and lazy re-evaluation (memoize pattern)
  • easy simple parallel computing

Joblib is optimized to be fast and robust on large data in particular and has specific optimizations for #numpy arrays. It is BSD-licensed.»

joblib.readthedocs.io/en/stabl

joblib.readthedocs.io — Joblib: running Python functions as pipeline jobs (joblib 1.4.2 documentation)
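Since the quote landed on the #caching tag, here is a minimal sketch of Joblib's disk caching (the cache directory and the expensive_sum function are illustrative, not from the docs):

import numpy as np
from joblib import Memory

memory = Memory("./joblib_cache", verbose=0)

@memory.cache
def expensive_sum(n: int) -> float:
    # The first call computes the result and stores it on disk; later calls
    # with the same argument are read back from the cache instead.
    return float(np.sqrt(np.arange(n)).sum())

if __name__ == "__main__":
    print(expensive_sum(10_000_000))  # computed
    print(expensive_sum(10_000_000))  # served from the on-disk cache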

@0xF21D@infosec.exchange
Not necessarily, when you factor in that #Mastodon is the absolute worst #Fedi platform on the planet for #admins and server operators.

I'm sure #Mozilla found out, like most operators of large Mastodon instances, that their #S3 storage #costs ballooned thanks to Mastodon's abject refusal to let admins turn off media #caching for the whole #fediverse, forcing admins and operators to cache and store all media -- from all #instances -- that passes through their instance.

Simply put, Mastodon fucking sucks as a fedi platform for admins and operators, and I wouldn't be surprised to see more large instances folding in the next year.

It's an unsustainable architecture.

RE:
https://infosec.exchange/users/0xF21D/statuses/113636032334271805

Infosec Exchange — Robert [KJ5ELX] :donor: (@0xF21D@infosec.exchange): "Hard decision my a$$" https://mozilla.social/@mozilla/113635921087596367

@gerowen I'd say get a very cheap VPS and a domain, then reverse proxy to a box at home to avoid CPU and disk quotas. DDNS works if your ISP is cool with it, but with a VPS you can run a VPN and, more than likely, have access to much faster bandwidth. Also more privacy: your ISP won't see your traffic, though you may have to use proxychains to connect to some sites. I have pretty slow upload speeds, so the VPS is a nice way around that. #siege #caching proxy

This week I learned that full HTML page caching for #SSR #nodejs (without static pre-rendering at build time) needs some custom work,
e.g. setting up a Cloudflare Worker plus the Cloudflare cache in front, writing a Redis response cache, using Varnish, etc.

My naive thought was that this is already baked into #SvelteKit, #NuxtJS, #NextJS, etc.

Guess I'm a bit spoiled by Blitz Cache for CraftCMS (PHP), which just takes care of caching HTML responses + redirecting 🤓 #caching putyourlightson.com/plugins/bl

PutYourLightsOn — Blitz: Intelligent static page caching for lightning-fast sites. Blitz provides intelligent static page caching for creating lightning-fast sites with Craft CMS. It significantly improves a site's performance by…
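For anyone curious what the "write a Redis response cache" option looks like, here is a rough sketch of the pattern in Python/Flask purely for illustration (the Node frameworks above would do the same thing in JS; the route, key scheme and 60-second TTL are made up):

import redis
from flask import Flask, request

app = Flask(__name__)
cache = redis.Redis(host="localhost", port=6379)

def render_page(path: str) -> str:
    # Stand-in for the expensive server-side render.
    return f"<html><body>rendered {path}</body></html>"

@app.route("/", defaults={"path": ""})
@app.route("/<path:path>")
def page(path):
    key = f"html:{request.path}"
    cached = cache.get(key)
    if cached is not None:
        return cached.decode(), 200, {"Content-Type": "text/html", "X-Cache": "HIT"}
    html = render_page(request.path)
    cache.setex(key, 60, html)  # keep the rendered HTML for 60 seconds
    return html, 200, {"Content-Type": "text/html", "X-Cache": "MISS"}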

Is anyone aware of a more recent alternative to apt-cacher-ng? I would like to run an apt cache on a small local server. What bugs me about apt-cacher-ng is that:

  • It does not support HTTPS (alright, could work around that with reverse proxying)
  • It stores the backend admin password in plain text

(Snooping around in the docs, I realised that all documentation about #caching and #mirroring #apt #repos is fairly outdated. E.g. both the #Debian and #Ubuntu docs warn that creating a mirror requires "massive" amounts of storage space, then quantify that as around 60 to 80 GB (hihihi).)

An alternative that is often mentioned is to use a general-purpose caching proxy like Squid… I may resort to that.