In practice, you end up with apps that dynamically build up predicates in different sections of the code. And when you combine that with many predicates, many tables, and other constraints such as ordering or aggregates, things get complex pretty quickly.
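To make the "dynamically built predicates" point concrete, here is a minimal sketch of how different parts of an app might each contribute a filter, with the final WHERE clause assembled at the end. The names (`build_query`, `filters`) are illustrative, not from any particular framework:

```python
# Hypothetical sketch: each code path appends its own predicate, and the
# final parameterized SQL is assembled once all filters are collected.

def build_query(table, filters, order_by=None):
    """Assemble a parameterized SELECT from dynamically gathered filters."""
    clauses, params = [], []
    for column, op, value in filters:
        clauses.append(f"{column} {op} ?")
        params.append(value)
    sql = f"SELECT * FROM {table}"
    if clauses:
        sql += " WHERE " + " AND ".join(clauses)
    if order_by:
        sql += f" ORDER BY {order_by}"
    return sql, params

# Different sections of the app contribute predicates independently:
filters = [("status", "=", "active")]               # added by one module
filters.append(("created_at", ">", "2023-01-01"))   # added by another
sql, params = build_query("users", filters, order_by="created_at")
```

With many such modules each adding predicates, the app has no single place where the full query shape is known, which is exactly the situation where hand-tuned execution plans break down.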
Even if you have the best developers who understand all the ins and outs of the dataset, re-implementing an optimizer in the app is rarely the right choice.
No, it's categorically not good to make an application more fragile. A weakness restated is not a strength. Every database should have an optimizer, period.
If a database system does not support joins, aggregations, or subqueries - as is the case with most realtime NoSQL solutions - an optimizer becomes fairly trivial. Optimizers are needed for analytical workloads. That's why most optimizers are evaluated on analytical benchmarks (e.g. TPC-H, TPC-DS), not transactional/realtime ones (TPC-C).
I did not state that an optimizer should not exist for a database - I think that's key, actually - but rather that the tradeoff they made this time around was fundamentally good: at least for now, it forces the developer to think about application performance.
If that happens to make an application more fragile, I think that is more of a code organization/tooling issue than anything else.
You don't have to be a DBA to understand how using one index vs. another will affect the performance - and thus the conversion rate - of your application.