
The post said 450 million pageviews, likely since November. If we make the very generous assumption that each pageview is a megabyte (very generous based on my own experience scanning billions of websites), that's 450 TB of traffic in total. If you really did 450 TB per month, you would need slightly more than one gigabit line (and hence VPS), but not more than two. With Hetzner that traffic would cost you €450 or $535.
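Spelled out as a rough sketch (decimal units, a 30-day month, and a fully saturated line assumed):

    pageviews = 450_000_000          # from the post
    bytes_per_view = 1_000_000       # the generous 1 MB-per-pageview assumption
    total_tb = pageviews * bytes_per_view / 1e12
    print(f"total transfer: {total_tb:.0f} TB")             # 450 TB

    # what a saturated 1 Gbit/s line can move in a 30-day month
    gbit_month_tb = 1e9 / 8 * 86_400 * 30 / 1e12
    print(f"1 Gbit/s for a month: {gbit_month_tb:.0f} TB")  # ~324 TB, so more than one line but fewer than two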

Did I get something wrong?




Well, https://jmail.world/jacebook-logo.png is 670 KB by itself and fetched on the initial page load, so I think they might have blown your suggested traffic budget and still have some optimization to do.

How is that image 670 KB!? Definitely some low-hanging optimization fruit there.

Edit: dang, even pngcrush can't get it below 580 KB. Disappointing performance on PNG's part.


Because, inexplicably, there's random pixel-level noise baked into the blue area. You can't see it unless you crank up the contrast, but it makes the bitmap hard to compress losslessly. If you remove it with a threshold blur, the appearance doesn't change at all, but the size drops to 100 kB. Scale it down to a more reasonable size and you're at 50 kB.
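If you want to see the noise without cranking the contrast yourself, a quick Pillow check does it (filename is a placeholder, exact counts will vary):

    from PIL import Image

    img = Image.open("jacebook-logo.png").convert("RGB")
    colours = img.getcolors(maxcolors=1_000_000) or []  # None if there are even more
    print(len(colours), "distinct colours")              # a clean two-colour logo would report a handful
    print(sorted(colours, reverse=True)[:5])             # the dominant colours vs. the near-duplicate noise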

Modern web development never ceases to amaze me.


None of this is due to "modern web development". It's just a dev not checking reasonable asset sizes before deploying/compiling, and that has happened in web, game dev, desktop apps, server containers, etc.

This should be an SVG (a few kB after proper compression), or, if properly made as a PNG, it'd probably be around 20 kB.


The dev not having the common sense to check file size and apparently not realising that the PNG format was being grossly misused for this purpose (by not even having a single tone of white for the J and the corners, let alone for the blue background) is modern web development.

Right, so you mean that this is unique and inherent to web dev and specifically modern web dev.

What is that noise, actually? It's clearly not JPEG artifacts. Is it dithering from converting a higher bit-depth source? There do appear to be very subtle gradients.

I would bet it's from AI upscaling. The dark edges around high-contrast borders, plus the pronounced and slightly off-colour antialiased edges (especially visible on the right side of the J), remind me of upscaling models.

Not even the white is pure. There are at least #FFFFFD, #FFFFFB and #FEFEFE pixels sprinkled all over the #FFFFFF.

If it's large enough for, say, 2x or more "retina" usage: a very minor soften filter, then color reduction to well under 256 colors (often 32-64 works) via quantization, and oxipng with zopfli can go a long way. Just getting down to a palette instead of RGB brings PNG sizes down a lot, and reducing the palette to around 32 colors helps a bit more. It just depends on the material.

That said, the actual size of this image is massive compared to where it needs to be. Looking at it, you should definitely be able to quantize down to 32-64 colors and shrink it to even 4x the render size... let alone just use an SVG, as others have mentioned.
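Roughly that pipeline as a Pillow sketch, with oxipng/zopfli as the final pass (filenames, blur radius, and target size are placeholder guesses, and oxipng is a separate tool that has to be installed):

    import subprocess
    from PIL import Image, ImageFilter

    img = Image.open("jacebook-logo.png").convert("RGB")
    img = img.filter(ImageFilter.GaussianBlur(radius=0.5))  # very minor soften
    img = img.resize((256, 256), Image.Resampling.LANCZOS)  # e.g. 2x a 128 px render size
    img = img.quantize(colors=32)                           # palette instead of RGB
    img.save("jacebook-logo-small.png", optimize=True)

    # final squeeze with oxipng's zopfli mode
    subprocess.run(["oxipng", "-o", "max", "--zopfli", "jacebook-logo-small.png"], check=True)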


I'd bet that it's AI generated, resulting in the funky noise.

Oh, ding ding! Opening it in a hex editor, there's the string "Added imperceptible SynthID watermark" in an iTXt chunk. SynthID is apparently a watermark Google attaches to its AI-generated content. That is almost certainly the source of the noise.
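You don't even need a hex editor; PNG chunks are length-prefixed, so a few lines of Python will list the text chunks (filename is a placeholder):

    import struct

    with open("jacebook-logo.png", "rb") as f:
        assert f.read(8) == b"\x89PNG\r\n\x1a\n"   # PNG signature
        while True:
            header = f.read(8)
            if len(header) < 8:
                break
            length, ctype = struct.unpack(">I4s", header)
            data = f.read(length)
            f.read(4)                               # skip the CRC
            if ctype in (b"tEXt", b"iTXt"):
                print(ctype.decode(), data[:80])
            if ctype == b"IEND":
                break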

Make it an SVG and it's down to 1kb.

I got it down to 200 KB using a normal PNG encoder and just limiting the number of colors, but it should really be replaced with a tiny SVG.

Fair enough, I just loaded some pages and some of them are even bigger than 2 MB. But then again, those static resources would be cached client-side. So unless you have 450 million unique visitors who only ever go to one URL on your site, you are looking at significantly less per pageview. I reloaded the front page with caching enabled and it was ~30 kB of data transfer.

For high traffic, check places like datapacket (no affiliation), or, slightly cheaper, places like onlyservers (no paid affiliation, was a customer), or even find a transit provider, a colo, and a server yourself. For $535 a month or less you can get a pretty good amount of bandwidth.

Isn’t part of Vercel’s value proposition a robust global CDN in front? Seems quite a bit different than one sweaty VM in Helsinki.

Genuine question: How is that a value proposition when Cloudflare offers a CDN for free with unlimited bandwidth, that you could just put in front of the sweaty VM in Helsinki?

Not trying to be obtuse, I really don't get how other providers can compete with that. I can't imagine Vercel's CDN is so significantly superior that it's worth it.


For that matter, the entire site could be in a Cloudflare Worker with all the content in R2 (no egress fees, just storage). Likely to barely exceed the $5 baseline price for a paid account. (Not sure about the storage costs, but likely well under $100/mo.)
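The Worker itself would be JS/TS, but since R2 speaks the S3 API, getting the content into the bucket is a couple of lines of boto3 (account ID, keys, and bucket name below are placeholders):

    import boto3

    s3 = boto3.client(
        "s3",
        endpoint_url="https://<ACCOUNT_ID>.r2.cloudflarestorage.com",
        aws_access_key_id="<R2_ACCESS_KEY_ID>",
        aws_secret_access_key="<R2_SECRET_ACCESS_KEY>",
    )
    s3.upload_file(
        "jacebook-logo.png", "jmail-assets", "jacebook-logo.png",
        ExtraArgs={"ContentType": "image/png", "CacheControl": "public, max-age=31536000"},
    )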

Yes, and I didn't mean to imply that a single VPS is all you'd need. But I wanted to put things into perspective for the other posters who claimed that you couldn't possibly serve a site like this from a single machine, purely in terms of performance.

Some people don't realize how big machines get. A single ordinary server can have a 4x100Gbps connection and 256 physical CPU cores.

That's not worth 45k. It's barely worth anything for a typical website, tbh.

Well, each view of an 'Epstein file' is a PDF with images embedded, so I think your 1 MB might not be so generous.


