Hacker News: Otterly99's comments

Thanks for sharing this.

I was also under the impression that query costs were mostly meaningless, but it seems that is only true for fresh sessions and short queries. I have to say, the result is less dramatic than I expected, but still significant for heavy users (such as myself).


Art in general is a bit weird like that.

The value of a piece is definitely not completely tied to its physical attributes, but to the story around it. The story is what creates its scarcity and generates the value.

It is similar for collectible items. If I had in my possession the original costume that Michael Jackson wore in Thriller, I am sure I could sell it for thousands of dollars. I could also buy a copy for less than a hundred.

Same with luxury brands. Their price is not necessarily linked to their quality, but to the status they bring and the story they tell (i.e. "wearing this transforms me into somebody important").

It can seem quite silly, but I think we are all doing it to some extent. While you said that a good forgery shouldn't affect one's opinion of the object (and I agree with you), what about AI-generated content? If I made a novel painting in the style of Van Gogh, you might find it beautiful. What if I told you I had just prompted it rather than painted it? What if I had just printed it? There are levels of involvement that we are each willing to accept differently.


Have you even read the article? It literally cites farms in Kenya where workers look at user footage and annotate videos.

It is so frustrating indeed reading about these wildly exaggerated biological claims.

The whole synthesis pipeline requires so much specific equipment and knowledge that your kid in their basement would actually need a whole lab. By the way, good luck purchasing any consumables from Sigma from your basement without accreditation. And I hope you have deep pockets, because cell medium is expensive.


There has been a lot of research into discovering new physics (starting with rediscovering old physics) over the last 5-6 years, and it always requires:

- A lot of high-quality data

- Some careful design

- (Not always) some external knowledge to guide the solutions

And this is using specialized NNs for physics, where you often know the underlying equations. It's kind of crazy that some people are so delusional about this.


I was asking myself the same question this morning and came to roughly the same conclusion as you. It makes much more sense to design an automated drone-piloting system built on a decision-making algorithm than to use an LLM. For mass surveillance, I can see the use case a bit more: you can use classical methods to process the information about a specific person, but use the LLM to generate summaries or to synthesize information that is not well organized or comes from different sources. I definitely think there is some overconfidence on the part of decision makers in government about what these tools can be used for. Also maybe some wishful thinking about what LLMs will be able to do in a few years?

If I'm not mistaken, BERT is a classifier (text in, labels out), so it is not a "language model", as it cannot be used for text generation.

The abstract of the original BERT paper starts with these words: "We introduce a new language representation model called BERT, [...]" The paper itself contains the phrase "language model" 24 times.

It might not be considered a language model today, but it was certainly considered one when it was originally published. Or so it would seem to me. Maybe a semantic shift has happened here?
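One way to see why it was called a language model: BERT is trained with a masked-token objective, i.e. predicting held-out tokens from context, which is a (conditional) model over tokens even if it is not used autoregressively. A toy stand-in for that objective (unigram counts over a tiny corpus, nothing like BERT's actual architecture; everything here is illustrative):

```python
# Toy illustration of the masked-language-modeling objective that BERT is
# trained on: fill in [MASK] tokens from context. This crude "model" just
# picks the most frequent token in the corpus -- it is a stand-in, not BERT.
from collections import Counter

corpus = "the cat sat on the mat the dog sat on the rug".split()
counts = Counter(corpus)

def predict_masked(tokens: list[str]) -> list[str]:
    """Replace each [MASK] with the most frequent corpus token (a unigram prior)."""
    most_common = counts.most_common(1)[0][0]
    return [most_common if t == "[MASK]" else t for t in tokens]

print(predict_masked("the cat sat on [MASK] mat".split()))
# -> ['the', 'cat', 'sat', 'on', 'the', 'mat']
```

Real BERT conditions on both sides of the mask with a transformer rather than a global unigram count, but the training target is the same kind of token-prediction task.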


There is a paper that proposed data compression as a way to judge the ability of an LLM to "understand" things correctly: training on older texts and trying to predict more recent articles:

https://ar5iv.labs.arxiv.org/html//2402.00861
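The metric itself is easy to illustrate with an ordinary compressor standing in for the LLM (a sketch under that assumption; the paper scores models by their predictive log-likelihoods, not gzip): text whose regularities the compressor has captured costs fewer bits per byte.

```python
# Hedged sketch of prediction-as-compression: score how many bits per byte
# a compressor needs on some text. gzip is a stand-in for an LLM here;
# lower bits per byte means more of the text's regularity was captured.
import gzip
import random
import string

def bits_per_byte(text: str) -> float:
    """Compressed size in bits divided by raw size in bytes."""
    raw = text.encode("utf-8")
    return 8 * len(gzip.compress(raw, compresslevel=9)) / len(raw)

# Highly regular text compresses well; near-random text does not.
regular = "the cat sat on the mat. " * 50
rng = random.Random(0)  # seeded for reproducibility
noisy = "".join(rng.choices(string.ascii_letters, k=1200))

print(f"regular: {bits_per_byte(regular):.2f} bits/byte")
print(f"noisy:   {bits_per_byte(noisy):.2f} bits/byte")
```

The paper's twist is the temporal split: train (or compress) using only older data, then measure the cost of newer articles, so that memorization of the test text is ruled out.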


Do you have any good resources on explainable AI (XAI)? Most books I have read on the subject tend to be a bit shallow and not very informative.

In general (not always, but mostly), philanthropy from billionaires and very profitable companies tends to be overshadowed by how much they profit from a system biased toward enriching them (see: The Divide by Jason Hickel). A small metaphor to illustrate: are you a philanthropist if you film yourself giving away $100 to homeless people but make tens of thousands from posting the video?
