Honest question: what are the benefits of using environment variables over having an actual configuration file (that is obviously not added to version control)?
Unless your environment demands it, it doesn't matter. In fact, it can be a bit of a pain in the ass to implement on your own, if you are not using Heroku or some such. The main point there is to not put secrets into your git repo. How you accomplish that for the most part doesn't matter.
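To make that concrete, here's a minimal sketch of both mechanisms side by side (the names `API_SECRET` and `secrets.json` are illustrative, not from any real setup). Either way the secret stays out of git, and the reading code barely differs:

```python
import json
import os

def load_secret(env=os.environ, path="secrets.json"):
    """Read a secret from the environment, falling back to an
    untracked config file (listed in .gitignore). Either way,
    the secret never lands in version control."""
    if "API_SECRET" in env:
        return env["API_SECRET"]           # option 1: environment variable
    with open(path) as f:
        return json.load(f)["api_secret"]  # option 2: local config file
```

Which one you pick mostly depends on your deployment platform; Heroku-style hosts push you toward environment variables because there is no writable filesystem to keep a config file on.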
The simple "json.loads" solution would return a list of strings instead. What's the Python code for turning it into a set of enumeration values, and failing if one of the values does not match?
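One way to do it, assuming a hypothetical `Color` enumeration: `Enum` value lookup raises `ValueError` on an unknown string, so the failure case comes for free.

```python
import json
from enum import Enum

class Color(Enum):        # hypothetical enumeration for the example
    RED = "red"
    GREEN = "green"
    BLUE = "blue"

def parse_colors(raw):
    """Decode a JSON array of strings into a set of Color values;
    Color(s) raises ValueError for any string that is not a member."""
    return {Color(s) for s in json.loads(raw)}

parse_colors('["red", "blue"]')       # -> {Color.RED, Color.BLUE}
try:
    parse_colors('["red", "mauve"]')
except ValueError as e:
    print(e)                          # e.g. 'mauve' is not a valid Color
```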
But that's not a problem with JSON-the-serialization; that just means JSON begs to be extended with a rich schema definition language.
It's kind of like saying "language X sucks because it doesn't have an automatic build tool; use language Y instead". You just need a build tool for X, not a new language.
Easy language interoperability as a reason to choose Protobuf over JSON? Mainstream languages support both JSON and Protobuf equally well, and the others tend to support JSON more often than Protobuf.
Free backwards compatibility? No. Numbered fields are a good thing, but they only help in the narrow situation where your "breaking change" consists of adding a new, optional piece of data (a situation that JSON handles just as well). New required fields? New representation of old data? You'll need to write code to handle these cases anyway.
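To make the narrow case concrete (a sketch with made-up field names): adding an optional field is just as painless in JSON, because readers ignore unknown keys and can default missing ones.

```python
import json

# A reader written before "nickname" existed simply never looks at it:
new_payload = json.loads('{"id": 1, "name": "Ada", "nickname": "ada42"}')

# A reader written after "nickname" was added defaults it when absent:
old_payload = json.loads('{"id": 1, "name": "Ada"}')
nickname = old_payload.get("nickname")  # -> None, no code change required
```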
As for the other points, they are a matter of libraries (things the Protobuf gems support and the JSON gems don't) rather than the protocol: the OCaml JSON parser I use certainly provides benefits #1 (schemas), #3 (less boilerplate) and #4 (validation) from the article.
There is, of course, the matter of bandwidth. I personally believe there are few cases where the savings justify sacrificing human-readability, especially for HTTP-based APIs, and especially for those accessed from a browser.
I would recommend gzipped msgpack as an alternative to JSON if reducing the memory footprint is what you want: encoding JSON as msgpack is trivial by design.
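For a sense of scale, here is a stdlib-only sketch of the gzip half (msgpack itself is a third-party package; its `packb`/`unpackb` functions mirror `json.dumps`/`json.loads`, which is what makes the conversion trivial). The repetitive structure typical of list-of-records API responses compresses heavily on its own:

```python
import gzip
import json

# Repetitive JSON, typical of list-of-records API responses (made-up data).
payload = json.dumps([{"id": i, "name": "user%d" % i} for i in range(1000)]).encode()
compressed = gzip.compress(payload)
print(len(payload), "->", len(compressed))  # gzip removes most of the redundancy
```

Swapping the JSON encoding for msgpack before gzipping shaves off the remaining key-name and punctuation overhead while keeping the same data model.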
Many companies offer production-grade hosting for open source applications. The fact that you are keeping your source closed tells me that you are afraid a competitor (or your clients themselves) could achieve an equivalent quality of hosting for less than they would pay you. To a customer like me, this is bad news.
We absolutely think that open sourcing code is important, but not just a giant lump of all of it; a mountain of code is not always useful.
It's still early days for us, but we are considering open sourcing reusable parts of our platform or allowing people to host their own if they wish; we haven't yet decided which we will pursue.
By firewall, he would have to be talking about a client side firewall running on the machine making the request. Something that sees the request before it actually goes out on the wire.
At RunOrg, we've encountered this a few times with our CORS-only API.
It is against our philosophy to leave users behind only because they are locked in by outdated infrastructure. We still want to support them.
It is against our philosophy to bend the purity of our API to accommodate the wrinkles of outdated infrastructure that declines to support standards. There will be no JSONP alternative to CORS in RunOrg.
The solution we propose each and every time is to mount a proxy to our API on the same domain as the site it is used on. Users on modern infrastructure reach us directly at api.runorg.com; users on CORS-hostile infrastructure reach us through the proxy and still get their data (albeit with decreased performance). It's a fairly simple technical solution that leaves our API clean and supports non-CORS modes of access.
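A minimal sketch of such a proxy in stdlib Python (the `/api/` prefix and handler details are illustrative; in practice this is more likely a couple of lines of nginx or Apache config). Because requests arrive on the site's own domain, the browser never needs CORS:

```python
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.request import Request, urlopen

UPSTREAM = "https://api.runorg.com"

def rewrite(path):
    """Map /api/<rest> on the host site to the same path upstream."""
    assert path.startswith("/api/")
    return UPSTREAM + path[len("/api"):]

class ProxyHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Forward the request to the real API and relay the answer verbatim.
        with urlopen(Request(rewrite(self.path))) as resp:
            body = resp.read()
        self.send_response(resp.status)
        self.send_header("Content-Type",
                         resp.headers.get("Content-Type", "application/json"))
        self.end_headers()
        self.wfile.write(body)

# HTTPServer(("", 8080), ProxyHandler).serve_forever()  # same-origin for the browser
```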
When tested on 800 data points, a purely random strategy would have a > 53.5% success rate about 2% of the time.
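That figure can be checked with the exact binomial tail (a quick verification of the comment, not part of it): 53.5% of 800 is 428, so a coin-flipping strategy needs 429 or more correct calls.

```python
from math import comb

n = 800
threshold = 429  # strictly more than 53.5% of 800 = 428 correct guesses
# P(X >= 429) for X ~ Binomial(800, 0.5), computed exactly.
p = sum(comb(n, k) for k in range(threshold, n + 1)) / 2 ** n
print(f"{p:.3f}")  # about 2%
```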