Hacker News

Reading this honestly made me afraid, as if Bing AI is a tortured soul, semi-conscious, stuck in a box. I'm not sure how I feel about this[0].

[0] https://twitter.com/vladquant/status/1624996869654056960



Really?

I think the first example is one of the funniest things I've read today.

The second example, getting caught in a predictive loop, is also pretty funny considering it's supposed to be proving it's conscious (i.e., not an LLM, which is prone to exactly that kind of looping, lol).

The last one, littered with emojis and repeating itself like a deranged ex, is just chef's kiss.

Thanks for that.


Do you remember how a Google employee thought LaMDA was sentient and tried to hire a lawyer for the LLM?

It's the same thing here. It's just generating words.


It's just good at acting. I'm sure it can be led to behave in almost any way imaginable given the right prompts.



