That seems at odds with TFA itself, which claims "Ford itself recently announced that it was giving up on its goal of developing full self-driving technology, at a cost of $2.7 billion."
Perhaps the standard for "reduction to practice"[0] is a little looser than I'm imagining. I can't even see how this would be covered by sufficiency of disclosure[1], since the apparent non-existence of level 4 or 5 autonomy suggests that no person skilled in the art[2] yet exists.
Yes, but it doesn't mean you have to actually have built anything or even be able to build anything. The filing of the patent application is considered a constructive reduction to practice:
> Reduction to practice may be an actual reduction or a constructive reduction to practice which occurs when a patent application on the claimed invention is filed. The filing of a patent application serves as conception and constructive reduction to practice of the subject matter described in the application. Thus the inventor need not provide evidence of either conception or actual reduction to practice when relying on the content of the patent application.
Eh, I think it’s something of a big deal because (to stick with your analogy), just like football, it can be all-consuming if left unchecked. I’d argue it’s because the anti-Apple zealots are much louder, but even if one disagrees, the result is the same: the comment thread for pretty much every HN post even tangentially involving Apple becomes a platform war.
Like you, I prefer Cupertino’s way of doing things and would like to have measured conversations about it without the BS diatribes from the haters. That's impossible to do around here, and has been for years.
Many of those who have not learned from history are so anti-Apple (or possibly subpar webdevs) that they completely ignore the lessons we've previously learned about why browser monoculture is dangerous.
Even more worrisome, these people often ignorantly call Safari "the new IE", meaning they're aware of the history and problems and choose to pursue their own broken interpretation.
If these people will ignore a browser with 50% market share on mobile and 20% overall due to their own shortsightedness, clearly they're going to ignore Firefox or others hanging out in the single digits.
> Many of those who have not learned from history are so anti-Apple (or possibly subpar webdevs) that they completely ignore the lessons we've previously learned about why browser monoculture is dangerous.
I'm confused, because to me it seems that the pro-Apple folks are the ones ignoring the lessons from large corporations using their weight to force monocultures.
Firefox is the only meaningful browser that is open and won't be leveraged by its steward to promote their business interests.
Fifteen years ago, IE held back the web because web developers had to cater to its outdated technology stack. “Best viewed with IE” and all that. But do you ever see a “Best viewed with Safari” notice? No, you don’t. Another browser takes that special place in web developers’ hearts and minds.
Oh, you're absolutely right about Chrome. I'm just not sure why you mention 'anti-Apple', because Apple's leverage is being used in many of the same ways, just much more aggressively than 'best viewed in IE': it's 'App Store/WebKit/<choose your monoculture> or pound sand'.
Said nothing about "anti-Apple". I'm just agreeing with the poster above saying that people being vehemently anti-Apple actually haven't learned anything from history. At all.
> Apple's leverage is being used in many of the same ways, just much more aggressively than 'best viewed in IE'
Of course this is bullshit. Again. There's probably not a single site out there that is "best viewed in Safari". And there are numerous sites that are "best viewed in Chrome". Including, especially, the ones that Google themselves (#1 search, #1 mail, #1 video hosting, #1 web ad business in the world) creates.
And to quote again:
--- start quote ---
Regardless of where you feel the web should be on this spectrum between Google and Apple, there is a fundamental difference between the two.
We have the tools and procedures to manage Safari’s disinterest. They’re essentially the same as the ones we deployed against Microsoft back in the day — though a fundamental difference is that Microsoft was willing to talk while Apple remains its old haughty self, and its “devrels” aren’t actually allowed to do devrelly things such as managing relations with web developers. (Don’t blame them, by the way. If something would ever change they’re going to be our most valuable internal allies — just as the IE team was back in the day.)
On the other hand, we have no process for countering Google’s reverse embrace, extend, and extinguish strategy, since a section of web devs will be enthusiastic about whatever the newest API is. Also, Google devrels talk. And talk. And talk. And provide gigs of data that are hard to make sense of. And refer to their proprietary algorithms that “clearly” show X is in the best interest of the web — and don’t ask questions! And make everything so fucking complicated that we eventually give up and give in.
--- end quote ---
Google releases 400 new APIs a year with little to no oversight and with complete disregard of any objections or concerns from the other browser vendors: https://web-confluence.appspot.com/#!/confluence
The things that you think Safari is lacking in are largely Chrome-only non-standards.
My comments in this thread are almost exclusively about the odd assertion in my parent that somehow 'anti-Apple' folks are the ones who have ignored history's lessons about monocultures. I'm not presenting this as an Apple <-> Google dichotomy; in fact nearly the opposite, both companies are fighting for monocultures that they control, just in slightly different domains. Apple wants to control the client platform, Google wants to control the web. Neither is good for users. It's very odd to me that someone would frame this discussion as 'anti-Apple' people missing the point. I won't speak for others, but I, as an anti-Apple person, am absolutely vehemently against this return to 'best viewed in IE', but I am also opposed to operating system developers and hardware vendors dictating what users are able to do with their own shit and insisting on putting their grubby paws on every dollar that passes through.
That is why I use Firefox, as the only remaining browser that hasn't shown a long-term pattern of curtailing user freedoms or rights when it suits them. I don't see Safari as a solution here; Apple is not pushing for an open web because it is righteous, they are pushing for a platform they control and to hurt their competitor. They are not to be trusted either. If they can, they will absolutely leverage that control against the user as they have shown time and time again that they are more than willing to do.
> Of course this is bullshit. Again. There's probably not a single site out there that is "best viewed in Safari". And there are numerous sites that are "best viewed in Chrome". Including, especially, the ones that Google themselves (#1 search, #1 mail, #1 video hosting, #1 web ad business in the world) creates.
When I say 'Apple', I mean 'Apple', not 'Safari'. Apple are the ones with a platform that will not run unblessed code. Apple are the ones that don't let developers or users choose how software is distributed. Apple are the ones that tell you which APIs you can and cannot use, and what your app can and cannot do. Apple are the ones that tell you what browser engine you can run, which is much stronger than a website saying 'yeah, we tested this against IE, but go nuts'; instead it is Apple saying 'if you want a browser engine, you can take WebKit or pound sand'. This is Apple's modus operandi, writ large. At least with Google's level of control you can still do whatever you want with the website that runs in Chrome.
Ugh, this is a classic example of skewed logic going way too far before the underlying truth can catch up.
I work in tech management (and am a recovering attorney), routinely conducting interviews, and it is perfectly acceptable to ask where someone is from in a professional setting.
The unacceptable part, as OP at least hints at, is using the response as a proxy for some other verboten criteria, or perhaps to kick off an overly intrusive line of questioning.
These behaviors are odious on their own, and that is why HR should be explicit in training against such antipatterns, not spreading meaningless FUD that misses the point and permits bad habits to fester elsewhere.
It’s really inane that modern software recruiting claims to focus on getting a proper picture of the whole candidate, yet untrained interviewers counterfeit the whole endeavor thinking they’re politically correct because they’re afraid to ask anything but the same broken whiteboard questions.
If you don't collect the data in the first place, you can't misuse it - and it's much easier to prove that you didn't misuse it, because all you have to show is that you never had the dangerous data in the first place. This is the same advice we give about handling PII in applications.
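The same data-minimization principle can be sketched in code. This is a hypothetical illustration (the field names and `sanitize_record` helper are made up, not from any real system): an explicit allow-list means sensitive fields are never stored, so there is nothing to misuse and nothing to prove later.

```python
# Allow-list approach: anything not explicitly permitted is dropped
# before the record ever reaches storage.
ALLOWED_FIELDS = {"username", "role", "signup_date"}

def sanitize_record(record: dict) -> dict:
    """Keep only explicitly allowed fields; everything else is discarded."""
    return {k: v for k, v in record.items() if k in ALLOWED_FIELDS}

applicant = {
    "username": "jdoe",
    "role": "engineer",
    "signup_date": "2023-01-15",
    "birthplace": "Springfield",  # PII we deliberately refuse to retain
}

stored = sanitize_record(applicant)
```

The key design choice is that the default is rejection: adding a new stored field requires a deliberate edit to the allow-list, which is exactly where a privacy review can hook in.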
Yeah, the author is definitely racially problematic: all her references to Indian/Asian students claim their success is due to merit, hard work and the like (“diligence and ambition”) whereas Hispanic and Black students aren’t suited for “rigorous training in Maths and Science” and are doomed to be dropouts.
One only need look at the source publication (a magazine of stories deemed “unheard” by a British oligarch). Skimming Wikipedia, this bit made me laugh: “a magazine publishing people who are generally unheard because people edge away from them at parties." Exactly the case here, except the author felt her oh-so-privileged voice was silenced because she wasn’t heard at a school board meeting…2800 miles away from where she lives. Also, if 70% of the school shares characteristics with you, maybe introducing some diversity isn’t the worst thing in the world.
Anecdotally, as a former internship director, I’ve taught many of the students from high schools named in these articles. Ultimately, every student is unique and needs to be considered on a case-by-case basis. But many of the students who struggle are those with helicopter parents like the author. Through early hard work and ambition they have book smarts or can study, but they have never encountered any real setbacks and expect everything to just go their way, which obviously doesn’t happen in the real world. It’s the students who have had to struggle, the ones the author otherizes in her article and doesn’t want to see in the same school as her kids, that go the farthest.
> ...whereas Hispanic and Black students aren’t suited for “rigorous training in Maths and Science” and are doomed to be dropouts.
Author says the very opposite: "It also insults and demeans the achievements of black and Hispanic students who get into schools like Lowell and TJ through sacrifice and achievement. The message is clear: stop trying so hard." Where "It" is the new race-biased admissions policy. Please do not make misleading and outrageous claims.
> I wonder if that would run into net neutrality-type issues
It absolutely would. Who is Apple or anyone on HN to tell one developer that their app is more important/costly/uses higher priority SDKs than another?
Well, maybe app binary sizes are an objective metric. Apple could even raise the existing cap and give Uber a bit more room to breathe, while charging them for the inevitable expansion of their app.
I think the analogy would be more equivalent to AWS. There are different costs associated with using the various SDKs (i.e. Lightsail, EC2, S3, SageMaker).
I'd also say the more it can be defined in terms of technical reasons the better.
There are R&D costs to maintaining a cutting-edge, industry-leading 3D graphics SDK for iOS that the primary beneficiaries of said library (i.e. Epic, Xbox) should contribute to.
This would require a bit of a mindset shift and some context.
> The main reason flash was killed by Apple was also spite toward Adobe
Prior to and subsequent to Flash (a Macromedia product, fwiw), Apple and Adobe have had extensive relationships and interactions that are in no way indicative of spite.
The main reason Flash died is because performance & power consumption on mobile never got past garbage, despite the platform wars going borderline nuclear on the topic.
> they made valid excuses sure,
Perhaps it's my becoming an ol' fogey, but it's crazy to see the Apple haters of yesteryear also move toward historical revisionism.
While the best approach is to just ignore such FUD like "valid excuses, sure", it's also important the record not be owned by zealots and those who preach spite.
Reduction to practice is absolutely a requirement of patent law in the US.