
These types of takes seem pretty ignorant of the percentage of workers and students that are using artificial intelligence tools daily.

It is ignorant because it ignores any economic improvement that comes from, say, a machinist figuring out G-code issues, or a culinary student learning safer fermentation, etc.

This is pretty typical of finance or public relations people trying to make their thinking easier by only focusing on one variable at a time.

Don't know the exact data, but I can refer to a Peter Diamandis quote:

> A survey by Microsoft and LinkedIn revealed that 85% of Gen Z uses AI at work, followed by Millennials at 78%, Gen X at 76%, and Boomers at 73%.

Whether you want to believe that or not, you can think for yourself about whether getting ChatGPT to successfully fix your espresso machine by telling you which $0.35 gasket to replace... has had a positive, growing impact on the economy or not.

To me it looks like it has, and it will continue to do so despite the venture-money-laundering/borderline-fraud issues that are clouding the picture.



Is "students using AI daily" really a positive example you want to reach for, when the primary application there is students getting ChatGPT to write their essays so they don't have to actually learn anything or think critically about the material? Inventing a way to figuratively copy-paste from Wikipedia, but with a much lower chance of getting caught, isn't a net good just because it makes MAU numbers go up.


> students getting ChatGPT to write their essays so they don't have to actually learn anything

You know, it's funny, but my math professor in high school made the same argument when I modeled a calculus problem on my computer to arrive at the answer numerically instead of analytically.

In elementary school, calculators were banned because "students aren't learning anything."

Then in high school and college, Wikipedia was banned because "students aren't learning anything, they're just looking up facts."

But by the end of college, open-laptop exams were popular, because doing fast research during an exam is good, actually.

Is writing repetitive uninsightful essays nobody wants to read a useful skill for humans?

PS: graphing calculators were super banned in high school, even after we were allowed plain calculators. I hear that these days graphing calculators are required. Progress marches on.


Yes, learning to write is an incredibly useful skill. Learning to communicate effectively, to synthesize information into a coherent essay, to organize your thoughts and put them down on paper: these are all essential skills. How is this even a question?


Yes, and none of that is a high school essay. High school essays are an extremely poor tool for teaching effective communication.

As evidence I submit ChatGPT's own waffly style of writing. It learned that from humans, who on average are poor communicators. The "high school/college essay" style of writing is something new employees have to actively unlearn in their first few years in the workforce.

edit: If I were a teacher right now, I would give my students a 2,000-word essay written by ChatGPT on $topic and ask them to redline the printout. That teaches them the actually useful skills of editing and fact-checking, since it looks like producing words has become commoditized.


Say what? That is exactly what a high school essay is. Many of them are of poor quality, because students need to learn. Learning algebra in school and getting a bunch of problems wrong doesn't mean you are not learning math. You don't learn to do anything without failing at it initially.

ChatGPT has a "waffly" style of writing because it is incapable of thinking; it's simply trying to predict the next word.


I realized we probably come from different educational environments. Maybe mine just wasn’t as effective as yours.

If you learned good/effective communications skills from high school essays, kudos! I did not, but I did enjoy the process and writing essays was one of my fav activities in school. Just that the “effective” part came way later :)


You were indeed learning to communicate more effectively. Maybe more slowly than was possible if you were in a very poor educational environment, but you were learning. And more importantly and in context: you were learning in a way that AI/LLM cannot.

Let's try to ensure that no one else has to learn in a substandard environment, instead of abandoning teaching kids how to write and communicate and replacing it with something that can never do it as well as a human.


"producing words has become commoditized"

It absolutely has not


It definitely has. It's been commoditized for years, actually. Just go on Upwork and see the sad rates that copywriters ask for. Producing words is super commoditized.

Producing good words not so much.


I mean, I assumed that "good" was implied. Anything can be done poorly.


I'd argue that if writing is an algorithm that has been cracked, we should ask whether it's still important to learn. I have to imagine there was a similar debate when the calculator became popular. I do wonder how we should measure student understanding next.


I am a professional technical writer. I find ChatGPT to be a poor writer, and a very poor technical writer. Only minimal reasoning or critical thinking is in evidence, but I find that humans cannot see the missing critical thinking when reading what ChatGPT writes. Claude is a little better, and much better stylistically. I think thinking has already fallen by the wayside in the general population; when there is writing that evidences little thinking, it is not writing that is the issue.


Is it? I seriously doubt it, and writing essays that get skimmed, graded, and thrown in the trash is a pointless task anyway.

People who want to cheat will always find ways to cheat. But if you genuinely want to learn something new, then LLMs make it an order of magnitude easier, because they tell you exactly where to start and which resources to consider, giving you completely tailored answers to your problem or project, even if they're still often wrong. Google becoming increasingly useless in recent years didn't help things either.

I would not be surprised if, in 20 years, multimodal models are the main way for most people to learn. There's always a shortage of teachers; they're underpaid and forced to deal with an increasing amount of parent bullshit, and half of them don't have a good grasp of the topics they're teaching or just no longer give a fuck. Eventually there simply won't be enough of them, and a personalized automated solution will just be better and cheaper. College professors? Probably in 10 years already, and both they and the students will be happier for it: the former because they can finally focus on research, and the latter for actually having a competent tutor instead of someone who has no aptitude for teaching but is forced to do it regardless.


> writing essays that get skimmed, graded and thrown into trash is a pointless task anyway.

Since when is writing essays and getting expert feedback pointless if you want to learn how to write?

> There's always a lack of teachers, they're underpaid

So rather than prioritizing teaching we should conduct a moonshot project to build a homeschooling technonanny that uses enough electricity to power the Eastern Seaboard so we can educate our kids without human interaction?


If you want to learn how to write, you mainly need to read more, and fixing grammar and punctuation is automated nowadays. If you feel like you're getting expert writing feedback from a high school teacher, then you won't be writing any novels any time soon anyway.

> we should conduct a moonshot project to build a homeschooling technonanny that uses enough electricity to power the Eastern Seaboard so we can educate our kids without human interaction

Actually, unironically yes, since you only have to do pretraining once. A model that can then run inference locally on smartphones and be used for many years would amortize its creation cost through the utility it provides within a few years. It wouldn't even have to be AGI-tier, just good enough to work with.
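The amortization claim is easy to sanity-check with a back-of-envelope calculation. Every number below is a made-up assumption purely for illustration, not real cost data:

```python
# Back-of-envelope amortization of a one-time pretraining run.
# All figures are hypothetical assumptions chosen for illustration.
training_cost = 1_000_000_000   # assumed $1B one-time pretraining cost
students = 50_000_000           # assumed learners running local inference
lifetime_years = 5              # assumed useful lifetime of the model

cost_per_student_year = training_cost / (students * lifetime_years)
print(f"${cost_per_student_year:.2f} per student per year")  # → $4.00 per student per year
```

Even with far more pessimistic assumptions, a one-time training cost spread over years of local inference drops to pocket change per learner, which is the point being made here.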

A single teacher educating 30 people vastly underperforms 1-on-1 tutoring. It's a "good enough" system that doesn't work very well for most people; it's just the only thing we have. Hell, most of it is just following a fixed script that repeats every year, answering a few questions, and grading against ground truth.

A personalized teaching system that knows your interests and skills would be able to motivate and explain far more effectively by adapting the script to your pace and difficulty. Being available 24/7 for questions and having a personality you like would certainly help too. There's an 11-year-old video [0] from the ex-teacher CGP Grey that turned out to be pretty prescient and elaborates a bit more on the general idea.

All of this could even be done sustainably, running training during peak solar hours, on excess wind power, and the like, but as long as there's demand it doesn't make sense to leave your expensive and rapidly depreciating GPUs idle. And there's demand because people have this misconceived notion that AGI is somehow winner-take-all, making it a race where everyone's flailing about like a headless chicken. In reality, whoever gets there first will have a short first-mover advantage for half a year, and then everyone else will replicate their work, probably with an open-source version a year and a half later, tops.

[0] https://www.youtube.com/watch?v=7vsCAM17O-M


This is a perfect example of how AI hype breaks people's brains. It's a Bond villain scheme, not a way to improve public education. People learn best from other people and always have. Computers in the classroom have always been a net negative.


What do we count as "using AI"? How much AI do you have to use at work to be considered "using AI"? I used AI once for work a year ago, when it was popular to try; does that count?

I remember when Alexa was considered "AI", or when AlphaGo was considered AI, or when any algorithm was considered "AI". But these days, we typically mean "LLMs".

So any claim without data seems pretty unlikely to be true unless sources are cited, and even then I would want to see the methodology.


"a culinary student learning safer fermentation"

What could possibly go wrong?


Well reasoned, but citing a Microsoft survey is not informative.


> Gen X at 76%, and Boomers at 73%

Ok now those are some damn surprising numbers. At least as per the boomer what's-a-computer stereotype.


> which $0.35 gasket to replace

I’m sorry, but can it actually do that??


If some site out there has it, it can tell you.

Just like Google should, but hasn't done for a decade.



