Hacker News | new | past | comments | ask | show | jobs | submit | grenadier21's comments

Seconding CloudFlare's private CA. It's simpler and arguably more secure (fewer people to talk to, fewer moving parts, and most importantly, no need for backward compatibility), so there's really no excuse.


It's not that, it's the fact that Google almost never asks you if you're using a Google account. That pretty much gives Google-account-havers a fast pass to the Internet, which gives Google even more power than it really should have. Also, they're using the captchas to train their AIs, which is an unfair extraction of labor.


I wonder what's going to happen to all the incredible Flash media, especially games, produced over the years. I hope someone's archiving them, because I'd hate for us to lose them.


This should be a solemn reminder about proprietary (especially binary) file formats. They serve a purpose but are an awful way to archive things of value.

I truly hope that we have learned our lesson from Flash and choose formats like SVG (human readable and an open standard) from now on for things like animations and games.


> I truly hope that we have learned our lesson from Flash and choose formats like SVG (human readable and an open standard) from now on for things like animations and games.

At least with a flash game, you can often just download the .swf file and run it however you like.

With HTML-based stuff, you're often reliant on a server being up. Authoring tools are also just not what they were with flash - the sheer number of quality vector-based games and animations has dropped like a rock. I'm not sure if this is something Adobe will resolve eventually, but at the moment it's just kinda sad.


Don't forget rendering engines. Even if you download everything you need, there's a chance the game will look weird (or won't work at all) because the developers only tried one browser.

For all its faults, whether a Flash game worked or not came down to "did you install the plugin?".


Not true. Browsers work with file addresses on your system. You can download an HTML file and all its JS and CSS, and if you navigate to the path of the HTML on disk, the website will just behave as if it were served by a static server.


While true for simple things, this is one of those things that’s highly variable depending upon how you built your app. If all your JS is included via script tags and all your image assets are referenced directly, sure, probably.

When anything is dynamically generated things get iffy, though. It’s one of the big problems both Google and services like the Wayback Machine and Pinboard have had with searching and preserving content.

Of course gaming as a whole has also evolved, so it’s less likely you’d ever get a purely single player game in the first place, and who knows if the game will even load in 10 years if it can’t connect to the server to see if you’re registered or not.


> Browsers work with file addresses on your system.

Not for AJAX requests they don't!


If that were the case, 'The Web' wouldn't be pushed like it is right now. 'The Web' is a way to take control from users and put it into the hands of the ones who control the servers. People work on fully fledged binary programs compiled to 'The Web' that your computer just executes, while the other half happens on the server. That's as proprietary and inaccessible as it gets.


It's a great reminder that Flash at the time was way beyond any other technology for the designers and animators who were the primary users of it.

And the fact is that most open technologies take years to become as full-featured as the closed-source ones. By that time, the trend the artists were following has gone out of fashion.

By the way, if we think about it: are there any actual tools yet that allow you to program SVGs as easily as you could once program Flash? I understand it's technically possible to compile from Flash sources to SVG, but are there actual tools yet?


> any actual tools that allow you to program SVGs

There's Synfig Studio that's open source and supports SVG export: https://www.synfig.org/


Even if you use open formats, a web app quite likely has a server-side dependency, whereas single-user Flash apps were usually self-contained. So it's really a requirement as well that the source code and the assets are open.


I used to do some Flash Game Development. My gut tells me that most games had game assets that were self-contained. It's a lot easier to author an SWF that way (from my own experience).


It's a reminder that browsers are such huge beasts that no one can viably fork them when they feel the browsers are regressing.


Then use a succession of non-viable forks that are rebased periodically from mainline but with most of the crap choices removed.


This. Works in proprietary formats will be lost to time after supporting vendors lose in the marketplace.

At best, there will be crippled, lossy exports -- lossy because vendors chasing lock-in and network effects don't want to make it easy for customers to leave.

I've cited the demise of Opcode and Studio Vision in the past when making this argument, but that's niche. The death of Flash drives the point home with far greater weight.


There was a time when the web was not able to handle SVG and Flash was the only option for rich interactive websites.


They'll still be downloadable and playable in a standalone player.

Also it would technically be possible for someone to write a WebAssembly flash player. Although I don't know if anyone will feel particularly motivated to.


It's always been possible to have free flash players. Gnash is one example. But they've never been feature complete or particularly fast. Maybe if Adobe opened their player things would improve.


The advantage now that flash is no longer being developed is that a player could concentrate on supporting the features used by the most popular content. Like dosbox it might not perfectly replicate the environment (at least to start with) but it could run the games everybody wants.


Adobe formats are notorious for bloat and scope creep. Implementing a conformant PDF reader would require you to implement a full 3D engine.


And implement Flash.

But that's hardly unique to Adobe. Any document format that allows (arbitrary) other things to be embedded is subject to that. A Word document with other things embedded in it is no different. (Just like correctly rendering or editing an OpenDocument file would require you to implement half of SVG and MathML – while they're open specifications it doesn't necessarily mean you have an implementation at hand you can use.)


People are already working on it. You can check it out here: https://news.ycombinator.com/item?id=17629173


Case in point: I tried to run the orisinal games, like Winterbells, for my daughter and found that they no longer work on Firefox. I was heartbroken - some of these flash games are art and should be archived.


You can just download Chrome, it has Flash built-in. Until they kill it in Chrome as well, of course.


Flashpoint[1] claims to have archived over 10,000 games for Flash and other web plugins.

[1] http://bluemaxima.org/flashpoint/


Maybe someone could create a WASM implementation of Flash? I wonder if we could convince Adobe to open source Flash player after they finally kill it for good.


At one point Mozilla was working on a flash reimplementation - http://mozilla.github.io/shumway/ - not sure of the current status but it's a good place to start looking.


I'm pretty sure shumway is dead judging from bugzilla and lack of activity in that repo

https://bugzilla.mozilla.org/describecomponents.cgi?product=...


> I hope someone's archiving them

Why not do it yourself? Archive your favorite games. Datahoarding is a fun hobby that can help others.


there are a lot of tools that will convert flash to html5, although I don't know if any of them support dealing with user input.


noscript support that makes you wait eight seconds, in a technology meant to make things *faster*, isn't really support, is it?


Agree. To put it another way in hope of driving this home:

It is sad when the brightest minds in the world decide that the fix for "flash of unstyled content" is to show no content at all for close to 10 seconds.

Seriously: 8 seconds mandatory waiting would have been considered slow even 10 years ago and the only reason it passes now is because Google got a stranglehold on most of the web.

Edit: improve last paragraph


If you turn off all scripts AMP has a <noscript> block that disables the 8s timeout. The OP is blocking only external scripts, which not surprisingly looks a lot like a very bad network connection.

(Disclosure: I work at Google on making ads be AMP)


Why 8s though? That’s well into “give up on loading this page, close the tab and try somewhere else” territory.

Perhaps a better option is finding ways to prevent content jumping around so much while assets are loading.


> Why 8s though? That’s well into “give up on loading this page, close the tab and try somewhere else” territory.

That 8s timeout is for loading the AMP JS from the CDN. You want a time limit that separates "you're on a slow connection, keep waiting" and "just give up, it's not worth it". I suspect it was set by looking at network graphs, but I don't know.

What the OP is doing, blocking JS and also ignoring <noscript>, is bizarre, and something you should expect to break sites.

> Perhaps a better option is finding ways to prevent content jumping around so much while assets are loading.

AMP does that very well, but only by taking control of the process of rendering, which requires JS.

(Disclosure: I work on making ads be AMP.)


No, the page loads perfectly fine if JavaScript is disabled. It's only if you go out of your way to break the page in the most difficult way possible that it will fall back to having a delay.


Like parent said, if a page takes 8s to load, I sure as heck don't mind seeing some partial results so I can start reading in those 8s.


No it's not. All the AMP websites I've seen were just as bad as their non-AMP counterparts. Reddit's AMP pages in particular, are slow and immensely frustrating to use.


But how do you know which 1 or 2% of the time it's happening?


If you don't notice that the article is missing some paragraphs, then the article has had a bad editor.


I mostly read random people's programming-related blog posts. Knowing whether the article is just kind of bad at explaining what it's trying to say, or whether important parts of it are missing, is really not obvious.

If paragraphs of text go missing, maybe you notice it instantly, but if it's images or code blocks that aren't explicitly mentioned in the body text of the article and are just left there to contextualize or demonstrate what the author is talking about, how will you know that something is missing? More importantly, is it even possible to develop a heuristic that catches almost all cases where parts are missing, without lots of false positives for cases that are just inexperienced authors writing bad articles?


blogs aren't usually amp, i've never seen reader mode screw up wordpress or anything. i'll concede it's probably not ideal for code blocks though.


Could some rebel make things like fake road bumps that lidars can't pick up, but humans can?


Protecting AI against spoofing attacks is a pretty active area of research right now.


Protecting neural networks from adversarial attacks, I think you mean. Protecting good A.I. from spoofing attacks is exactly the same as protecting human drivers from spoofing attacks.


Or the other way around - trick the lidar into thinking a road bump is there when there is no bump.


That's how literally every VDOM-based framework looks, including React. Elm isn't special in this regard. It is special in many other regards, however, and I appreciate it for that. It's also interesting as one of the few languages that have parametric polymorphism but no ad-hoc polymorphism, which leads to some interesting designs. I don't like how only the BDFL is allowed to do certain things, like create new operators in external libraries, but overall, it's a very pleasant experience.


"I don't like how only the BDFL is allowed to do certain things, like create new operators in external libraries"

Isn't this true for, like, 95% of languages? Yes, Elm breaks from its Haskell heritage in this regard, but this is a feature that's very commonly missing.


Fair enough, but my main problem with it was that it existed, and then they removed it, which was very sad.


"one of the few languages that have parametric polymorphism but no ad-hoc polymorphism"

I wonder what others are? Would like to try them.


OCaml and Standard ML are two others, and I'd say OCaml is established enough that Elm is not really breaking new ground here. The main difference is that OCaml has other features you tend to use instead, particularly the module system.


OCaml is a very fun language in my experience. Apparently they were going to add ad-hoc polymorphism in the form of modular implicits, but I have no idea what happened to that.


It's due around the time OCaml multicore support and Half Life 3 are released.


Can't the module system be used for ad hoc polymorphism?
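It can, in the explicit sense: an OCaml module value is basically a record of functions you pass by hand, rather than a dictionary the compiler infers for you. A rough sketch of that mechanism, rendered in Rust rather than OCaml just for concreteness (all names here are invented for illustration):

```rust
// A "module signature": a record of functions over some type T,
// standing in for an OCaml module type such as a SHOW signature.
struct ShowDict<T> {
    show: fn(&T) -> String,
}

// "Instances" are ordinary functions/values, not compiler-resolved impls.
fn show_i32(x: &i32) -> String { x.to_string() }
fn show_bool(x: &bool) -> String { x.to_string() }

// The generic function takes its dictionary explicitly -- roughly
// what passing a first-class module does in OCaml.
fn print_it<T>(dict: &ShowDict<T>, x: &T) -> String {
    (dict.show)(x)
}

fn main() {
    let show_int = ShowDict { show: show_i32 };
    let show_b = ShowDict { show: show_bool };
    assert_eq!(print_it(&show_int, &42), "42");
    assert_eq!(print_it(&show_b, &true), "true");
}
```

The difference from true ad-hoc polymorphism is that nothing resolves the dictionary for you at the call site; modular implicits (mentioned elsewhere in the thread) are the proposal to automate exactly that step in OCaml.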


Partial evaluation of the vtable strikes me as a very elegant idea! That way static dispatch just becomes a special case of virtual dispatch. It also reminds me of the devirtualization optimization, and, tangentially, of how Rust does the opposite thing, by making boxed traits implement the traits too, which makes dynamic dispatch a special case of static dispatch.
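To make the Rust side concrete: the standard library provides this forwarding for some traits, and for your own trait you write one impl yourself (the trait and type names below are invented for illustration):

```rust
trait Greet {
    fn greet(&self) -> String;
}

struct En;
impl Greet for En {
    fn greet(&self) -> String { "hello".to_string() }
}

// Forward the trait through the box: now Box<dyn Greet> is itself
// a Greet, so statically dispatched generic code accepts it.
impl Greet for Box<dyn Greet> {
    fn greet(&self) -> String { (**self).greet() }
}

// A generic, statically dispatched function...
fn shout<T: Greet>(x: &T) -> String {
    x.greet().to_uppercase()
}

fn main() {
    // ...works with a concrete type (pure static dispatch)...
    assert_eq!(shout(&En), "HELLO");
    // ...and with a trait object, where the vtable call happens
    // inside the forwarding impl: dynamic dispatch as a special
    // case of static dispatch.
    let boxed: Box<dyn Greet> = Box::new(En);
    assert_eq!(shout(&boxed), "HELLO");
}
```

With that one forwarding impl, code written against the generic bound transparently accepts trait objects as well.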

