In 1995, how many people used the internet in their daily work? And of those who did, for how many was it a curiosity that merely supplemented their existing business practice (sending a memo via email rather than post, for example)? Large companies were using mainframes, but the majority of employers - the SMEs - weren't.
By 2005 this had shifted massively, and AI seems to be arriving faster than the internet and computers in general did.
By 2015, non-internet companies were going the way of the dodo. How many travel agents were there per 100k people in 1995 compared to 2015?
Add to that the fact that these adoption rates are being enforced by bosses threatening workers with firing. It's hardly organic; there's a reason the LLM companies are chasing lucrative corporate-welfare contracts: consumers have soundly rejected this nonsense.
I'm not sure what you'd call a "pioneering scientific advancement", but there is a growing number of examples showing that LLMs can be used for research (particularly with agents). A survey on this was published a few months ago: https://aclanthology.org/2025.emnlp-main.895.pdf
What we also learned after GPT-3.5 is that, to circumvent the need for new training data, we could simply resort to existing LLMs to generate new, synthetic data. I would not be surprised if the em dash is the product of synthetically generated data (perhaps forced to be present in this data) used for the training of newer models.
To add to what's already been said about slide decks: another great slide-creation package in Typst is touying[1]. I've used it to create my own academic theme[2] for courses and conference presentations.
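For anyone who hasn't tried it, a minimal touying deck looks roughly like this (a sketch assuming a recent touying release and its bundled `simple` theme; check the package docs for the exact version and theme names):

```typst
// Import touying and one of its built-in themes (version is an assumption).
#import "@preview/touying:0.6.1": *
#import themes.simple: *

// Apply the theme to the whole document.
#show: simple-theme.with(aspect-ratio: "16-9")

// Top-level headings become section slides; second-level headings start new slides.
= Introduction

== First slide

- Plain Typst markup works inside slides.
- Compile with `typst compile slides.typ` to get a PDF deck.
```

The appeal over Beamer is that you write ordinary Typst markup and headings map directly onto slides, with near-instant compilation.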
Source?