I think many academics are often specialized in one area of their expertise and overfit in that dimension. Journalists pick this up and promote those views a bit too much, which results in suboptimal decisions driven by skewed public perception.
We need to promote holistic thinking that considers multiple dimensions, not just the one in which academics are proficient.
> many academics are often specialized in one area of their expertise and overfit in that dimension
An economist saying a national-security measure costs this much is fine. Where it goes off the rails is in turning costs into damnation without accounting for what one gets in return. In an attention-driven media environment, that sells.
The problem is that there simply isn't an efficient solution for everything. At some point, every problem has only solutions with pros and cons.
France could do it because it is a big, rich country, but smaller countries don't have a viable choice. In another universe, the same reasoning could have been applied to France too.
It's a balance that's impossible to tilt entirely one way or the other.
So no amount of extra information can help when it's a matter of opinion at the end of the day.
This one caught me completely off guard when opening YouTube for the first time on an iPad: I accidentally tapped a wrong button and got stuck in a "please subscribe to Premium" modal. No amount of swiping or tapping outside the popup would help; the only thing left was to kill the entire app.
This experience put a major dent in my perception of the "Apple has the most intuitive UI" narrative.
The YouTube app has non-standard and nonsensical UX on every platform. It's Google's fault, nothing to do with Apple.
Case in point: the YouTube app for Apple TV. Everything (pausing, playing, changing subtitles) works opposite to the standard player found in every other app. You cannot use the main button to pause and resume, for example. Recently they broke swiping, too. Normally, you swipe on the remote with a light touch to navigate between UI elements such as squares in a grid or items in lists; it's very fluid, like a touchscreen. But the YT app has added severe "inertia" to touch gestures, so you now have to "fling" your finger across the remote trackpad to navigate. Everything feels syrupy and stuck.
YouTube and Amazon's Prime Video app are the two worst apps I've ever used on Apple TV. I believe they both use some non-native UI toolkit that doesn't respect native OS conventions and doesn't even try to. Pretty incredible given the size and budgets of these companies.
The YouTube app does the exact same thing on Android. I ran into this just yesterday on my gf's phone, as I'd just added her to my family plan, tried to verify the settings on her phone, and it trapped me on an upsell screen for YT Premium that I had to kill the app to get out of.
Maybe it's just the circles I run in, but these are not evergreen questions in my experience. I don't even know what "go back" is supposed to mean here, or for that matter what it would mean in a Windows application. Is there a system-level "go back" in WinAmp/Excel/SimCity/Photoshop I've never seen before?
On iOS, the task switcher and app closure can't be overridden by apps. You swipe right to return to the previous application, and you can swipe back left within a couple of seconds if you didn't intend to do that.
You swipe up to remove an application from the stack, and all of the application's processes are killed.
Background processing has strict limits, you need permissions to run longer than that, and for some use cases there is no recourse: the OS swaps you out or freezes the app.
If you want an app to work in the background, don't kill it, period. Push notifications are handled by the OS and are not hindered by this.
I haven't been using Reddit in any capacity since they started giving their content away for LLM training, so I can't help you with that. But looking at 4-5 third-party applications right now, they all have a left arrow at the top left to go back.
They are all very different applications with very different designs, yet the arrow is there.
To be honest, I was baffled by your question for a second or so, because I had never thought about it; the method is so universal that I wasn't conscious of it at all.
I feel that some people are just too old to get used to a swipe-based UI. I mean friends of mine who keep buying the only phones that still have (screen-based) back and home buttons.
It's not just that you need to get used to gestures; it's that they are not discoverable at all, and they can be awkward to perform with mobility issues, old hands, short fingers, etc. It's easy to make the wrong gesture, e.g. the phone detects a swipe down instead of left to right, even more so if you are holding it in one hand. So it's finicky and frustrating to have to rely on gestures as the only way of doing a common action. What's so wrong with a simple navigation bar? It doesn't take up any more space than the hideous notch at the top.
I could get used to touch gestures if they were more consistent and tolerant enough for wrong inputs. It may work in one app but not another. One app expects me to swipe from left to right to go back, another wants me to swipe from top to bottom for the same thing. It may mark an email as unread if I start the swipe a pixel too far away from the screen edge. On Android swipe gestures may vary even on different phones from the same brand.
In iOS, tapping the top edge of the screen means "scroll to the top". Except in the Photos app, where it means "scroll to the top of the current section, or almost the top, or do nothing and make the user guess whether they just tapped the wrong way".
Meanwhile when there's an X button or arrow to the left I always know what it's going to do aside from one or two overly creative Android apps.
Is that what's going on? So many touch gestures seem to rely on landing in the right 2mm diameter area, but the minimum reliable resolution for touch seems to be a 4mm diameter circle. It's even worse for my father, even though cognitively, he would have no trouble understanding the hypothetical requirement. It's also noticeably worse during the depths of winter.
Hmm, that would explain the frustrated pokey interaction I often see elderly people throw at touchscreens... you know, chin up, peering down through their reading glasses, going for a 4th, 5th try at something...
It's not about "getting used to" it; I feel like the gesture is simply less practical. It involves either using the "circle" to assist with the gesture (creating a black void on the screen that you need to plan your use of the phone around), or relying on the swipe, which 1) is not as reliable in my opinion and 2) can be triggered accidentally.
For me it's like claiming that touchscreens in cars are the future and people are just too old to get used to them.
Maybe saying "too old" is indeed disrespectful. What I meant was more that my kids grew up with swiping everywhere, whereas we grew up with (hardware, then touchscreen) buttons, and the older we get, the harder it is to adapt to new interaction paradigms.
Swipes are of course nice because they allow the same interactions without taking up any screen real estate. And I have to say they're quite consistent across the iOS apps I use.
I feel that some people forget that the reason it's easy for them is that they learned that swipe-based UI ages ago.
When I get handed an iPhone, I have no clue how to even open an additional tab in Safari; finger gestures don't do what I expect, nor is there a lick of indication of how to do anything. It's all just memorized magical incantations at this point. But hey, you're familiar with them, so it's easy to bash everyone who isn't in your ecosystem.
It is telling that so many of the comments here assume the person stuck with the impractical thing could easily have requested it in a different format. The assumption that they never thought to ask whether something more convenient was available, and are just willfully toiling with the inconvenient thing, is kind of insulting.
Also, in the construction industry you get an updated drawing file a day before bidding closes. Good luck getting the GC to send more detailed files (which they themselves got elsewhere) in that time. You're better off sending it to your estimation department in India and letting them work through the night to put together the new estimates.
The fees for those are still often low compared to the US rates posted above. Credit cards are also not popular here, so while I do own one, I suspect the average percentage paid by a merchant remains low.
Amex also offers pretty good rates to low-volume merchants here to broaden acceptance, to my understanding.
The rates I posted are the full range, because yes, it varies.
You suspect the average percentage is low, but try getting a payment processor agreement and see what you actually pay overall within two years. With fixed costs included, it may end up even above the rates I mentioned, plus there's the jeopardy to your business when fraud does occur and the issuer blocks you from accepting any payments, or worse, accuses you of being the fraudster.
We have been well educated by the financial system and Visa/Mastercard to believe this technology is for our own good. Many in the financial industry denounce their predatory practices: a cartel of two or three that has imposed its dictates for decades. Things are finally changing; resistance will continue, but you will see QR codes or some alternative settle in.
I'm torn between "yes, these experiments are way too expensive and the knowledge is too niche to be really useful" and "we said this about A LOT of things and found utility in surprising ways, so it could be a gamble worth taking".
That's the problem with cutting-edge research: you don't even know if you will ever need it, or if a trillion-dollar industry is waiting for just one number to be born.
Yes, we don't really know. But at some point the gamble is just too big.
Because the costs aren't just numbers. They represent hundreds or thousands of person-years of effort. You're proposing that a large number of people should spend their entire lives supporting this (either directly as scientists, or indirectly through funding it) - and maybe end up with nothing to show for it.
And there are the opportunity costs. You could fund hundreds of smaller, yet still substantial, scientific efforts across many different fields for the cost of just one particle accelerator of the size we think is sufficient to yield new observations.
It's an important distinction.