> in the sense that against 200 million in revenue, 50,000 and 30 are in the same ballpark
I don't understand how those are in the same ballpark. I thought saying something is in the same ballpark meant they are similar in scale, and the implication is that a little-league team does not play in the same ballpark as a Major League team. They are in the same category (baseball), but not at all the same level.
At a big enough scale, previously large differences are effectively 0.
50k/mo is 600,000/yr vs 360/yr at 30/mo. That's existential for a 1MM/yr company. Neither registers on the balance sheet of a 1B/yr company; both are closer to 0 than to being a major cost.
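As a sketch of that arithmetic (the revenue and cost figures are the ones from this thread; the labels are made up):

```python
# Annualize the two monthly costs and compare them to two company sizes.
monthly_costs = {"big SaaS bill": 50_000, "small SaaS bill": 30}  # hypothetical labels
annual = {name: cost * 12 for name, cost in monthly_costs.items()}

for revenue in (1_000_000, 1_000_000_000):  # $1MM/yr vs $1B/yr company
    for name, cost in annual.items():
        print(f"${revenue:>13,}/yr revenue: {name} = ${cost:,}/yr ({cost / revenue:.4%})")
```

At $1B/yr the shares are 0.06% and 0.00004%, both effectively zero at that scale, while 600k against $1MM revenue is 60% of it.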
But saying that 200 million and 30 are in the same ballpark is not true in 99.99% of contexts.
Even 50k and 30 I would not say are in the same ballpark. I've worked for major corporations, and of course a cost saving of 50k/month would not register for the overall company, but it probably would for my team. A saving of 30/month is probably not worth spending any considerable amount of time on in most non-personal contexts.
Well, https://jmail.world/jacebook-logo.png is 670KB by itself and loaded on initial load, so I think they might have blown your suggested traffic budget and still have some optimization to do.
Because, inexplicably, there's random pixel-level noise baked into the blue area. You can't see it unless you crank up the contrast, but it makes the bitmap hard to compress losslessly. If you remove it with a threshold blur, the appearance doesn't change at all, but the size drops to 100 kB. Scale it down to a more reasonable size and you're down to 50 kB.
None of this is due to "modern web development". It's just a dev not checking reasonable asset sizes before deploying/compiling, which has happened in web, game dev, desktop apps, server containers, etc. etc.
This should be an SVG (a few kB after proper compression), or if properly made as a PNG it'd probably be around 20 kB.
The dev not having the common sense to check the file size, and apparently not realising that the PNG format was being grossly misused for this purpose (not even a single flat tone of white for the J and the corners, let alone for the blue background), is modern web development.
What is that noise, actually? It's clearly not JPEG artifacts. Is it dithering from converting from a higher bit-depth source? There do appear to be very subtle gradients.
I would bet it's from AI upscaling. The dark edges around high-contrast borders, plus the pronounced and slightly off-colour anti-aliased edges (especially visible on the right side of the J), remind me of upscaling models.
If it's large enough for, say, 2x or more "retina" usage: a very minor soften filter, then color quantization to get well under 256 colors (often 32-64 is achievable), and oxipng with zopfli can go a long way. Just moving from RGB to a palette brings PNG sizes down a lot, and reducing the palette to around 32 colors helps a bit too. It just depends on the material.
That said, the actual size of this image is massive compared to where it needs to be. Looking at it, it should definitely be possible to quantize down to 32-64 colors and reduce the dimensions to even 4x the render size... let alone just use an SVG, as others have mentioned.
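To illustrate why moving from RGB to a palette helps so much, here's a rough pre-compression size comparison (the 512x512 dimensions are assumed for illustration, not the actual logo's):

```python
# Raw PNG pixel data before DEFLATE, ignoring the 1 filter byte per scanline.
def raw_bytes(width, height, bytes_per_pixel):
    return width * height * bytes_per_pixel

w = h = 512                    # assumed dimensions for illustration
rgb = raw_bytes(w, h, 3)       # truecolor RGB: 3 bytes per pixel
indexed = raw_bytes(w, h, 1)   # 8-bit palette: 1 byte per pixel
print(rgb, indexed, rgb / indexed)  # palette is 1/3 the raw data
```

Going below 256 colors doesn't shrink the raw data further unless you reach PNG's 4/2/1-bit indexed depths (16 colors or fewer), but a smaller palette makes the index stream more repetitive, so DEFLATE (and zopfli) compresses it much better — which is where quantizing to 32-64 colors pays off.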
Oh, ding ding! Opening in a hex editor, there's the string "Added imperceptible SynthID watermark" in an iTXt chunk. SynthID is apparently a watermark Google attaches to its AI-generated content. This is almost certainly the noise.
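For anyone who wants to check without a hex editor, PNG chunks are simple to walk with the stdlib; this sketch lists the text-metadata chunks (tEXt/iTXt) where a note like the SynthID one lives:

```python
import struct

PNG_SIG = b"\x89PNG\r\n\x1a\n"

def png_chunks(data: bytes):
    """Yield (type, payload) for each chunk: 4-byte big-endian length,
    4-byte type, payload, 4-byte CRC, per the PNG spec."""
    assert data[:8] == PNG_SIG, "not a PNG"
    pos = 8
    while pos < len(data):
        length, ctype = struct.unpack(">I4s", data[pos:pos + 8])
        yield ctype.decode("ascii"), data[pos + 8:pos + 8 + length]
        pos += 12 + length  # length + type + payload + CRC

def find_text_chunks(data: bytes):
    """Return the tEXt/iTXt chunks, where watermark notes would appear."""
    return [(t, p) for t, p in png_chunks(data) if t in ("tEXt", "iTXt")]
```

Running `find_text_chunks` over the downloaded logo bytes and searching the payloads for "SynthID" should surface the same iTXt chunk.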
Fair enough. I just loaded some pages and some of them are even bigger than 2 MB. But then again, those static resources are cached client-side, so unless you have 450 million unique visitors who each only ever visit one URL on your site, you are looking at significantly less per pageview. I reloaded the frontpage with caching enabled and it was ~30 kB of data transfer.
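A quick sketch of how much caching changes the per-pageview average, using the ~2 MB cold load and ~30 kB cached reload mentioned above (the pageviews-per-visitor figures are made up):

```python
def avg_transfer_kb(cold_kb, warm_kb, pageviews):
    """Average transfer per pageview when static assets are fetched once
    (cold) and served from cache on every later pageview (warm)."""
    return (cold_kb + warm_kb * (pageviews - 1)) / pageviews

print(avg_transfer_kb(2048, 30, 1))   # one-and-done visitor: the full ~2 MB
print(avg_transfer_kb(2048, 30, 10))  # 10 pageviews: ~232 kB average
```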
There might be some confusion here, as there is no refusal at all.
As stated in the blog post, we (Prosody) have been accepting (only) serverAuth certificates for a long time. However, this is technically in violation of the relevant RFCs, and not the default behaviour of TLS libraries, so it's far from natural for software to implement this.
Only one implementation has been discovered so far which was not accepting certificates unless they included the clientAuth purpose, and it was already updated 6+ months ago.
This blog post is intended to alert our users, and the broader XMPP community, about the issue that many were unaware of, and particularly to nudge server operators to upgrade their software if necessary, to avoid any federation issues on the network.
Sorry, I probably misunderstood; I thought there was resistance to updating the corresponding specs. I understand that maintainers of non-XMPP specs might refuse to update them, but at least this behaviour could be standardized for XMPP specifically.
Yeah, the resistance is outside the XMPP community. However we have a long history of working with internet standards, and it's disappointing to now be in an era where "the internet" has become just a synonym for "the web", and so many interesting protocols and ideas get pushed aside because of the focus on browsers, the web and HTTPS.
The article literally talks about how one of the server implementations does exactly that:
> Does this affect Prosody?
> Not directly. Let’s Encrypt is not the first CA to issue server-only certificates. Many years ago, we incorporated changes into Prosody which allow server-only certificates to be used for server-to-server connections, regardless of which server started the connection. [...]
This relies on the browser, and it doesn't even set relevant browser security features like CSP. It also loads scripts from multiple domains, such as umami.is and esm.sh, and includes Google Fonts. It uses Vercel for hosting.
So you are trusting, at the very least, the author, umami, esm.sh (including the authors of all of those packages), and Vercel.
This is not how you code a security-sensitive app.
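For reference, a locked-down Content-Security-Policy header for such an app would look roughly like this (the directive values are illustrative, not audited for this particular app):

```http
Content-Security-Policy: default-src 'none'; script-src 'self'; style-src 'self'; font-src 'self'; connect-src 'self'; img-src 'self'
```

Loading scripts from esm.sh or fonts from Google would require adding those origins to script-src/font-src, i.e. the policy would make the third-party trust explicit — which is exactly the point above.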
There is also a database; it's just IndexedDB/localStorage in your browser.
Using static in-browser pages to encrypt data is an old idea. This solves none of the problems with that idea; it's just glitzy AI-written marketing copy.
> A "cloud admin" bootcamp graduate can be a useful "cloud engineer"
That is not true. It takes a lot more than a bootcamp to be useful in this space, unless your definition of useful is copy-pasting some CDK without knowing what it does.
I think the parent comment is snark. They're saying that since many Firefox users are asking "Let me turn off AI features, please!" about features they don't want at all, and few to no Firefox users are asking "Let me turn on AI features!" (because few to no Firefox users want AI features in the first place), Mozilla is making AI features opt-out to "satisfy" the "want" of turning them off.