I used DSPy in production, then ripped out the bloat, as it gave me literally nothing of added value in practice but a lot of friction when I needed precise control over the context. Avoid!
I see whole teams pushed by C-level going all in on spec-driven + TDD development. The devs hate it because they are literally forbidden to touch a single line of code, but the results speak for themselves: it just works, and the pressure has shifted to the product people to keep up. The whole tooling to enable this had to be worked out first: all Cursor and heavy use of a tool called Speckit, connected to Notion to pump out documentation, and Jira.
I don't think it's implemented that harshly or enforced that hostilely, but they have these strict procedures now on how the code is to be developed. The procedure they follow is all centered around automated code generation. So they simply... don't write code anymore; in practice, it's not part of the job description, so to speak. He wasn't happy, I could tell, but he also acknowledged it was working very well.
I also thought it was pushing things to the limit, but I think this is just the founder of a successful company deciding engineering was going to transform to this way of working. A huge bet, but the implementation didn't feel amateurish or ad hoc. Just not very pleasant for most devs to work that way. I'm sure some will look elsewhere. I know I would!
They are hiring "architects", or do we call them analysts? The impression is we're going back to analysts drawing those old-school UML-like diagrams etc. Also, a lot of the devs are on the brink of just quitting, because it's "not programming" anymore. So not only will you still need devs, or people massaging those specs, you'll also need enough "product" people to keep that engine fed! If your management isn't lazy, I can see headcount continuing to rise within such companies. That doesn't mean the work will be ...satisfying for devs.
yeah i'm not gonna be an AI company's guinea pig just because the c-suite wants to sign me up. "the results" you mean AI-psychosis and dunning-kruger syndrome?
Like I said, devs don't like it. He said productivity went up 3-4x. "It works". There was no question of denying that as far as he was concerned. At the same time he was going to look for another job as it was just painful to work like that.
3-4x from what? There are companies out there that are so dysfunctional and over-managed that they can't code anything to begin with. I wouldn't be surprised if an LLM only solution helped them. I did a contract job that was a week of basic CRUD work for a company, and I was told it took them years to do as much as I did.
I have also seen CEOs do exactly what you're describing, only to find the 3-4x productivity improvement is actually 3-4x more lines of code. And more bugs. Unfortunately there are a lot of C-suite people who would rather have captive users locked into a buggy app everyone hates, one that constantly acts like it's shipping helpful stuff, than an app people actually want to use.
It says a lot about US academic culture that they think in terms of hiring. There is an important educational commitment in the role of professor, at least in Europe. Hiring serves the betterment of your own goals and is almost orthogonal to the educational mission. A lot of ethical problems find their root in this conflicted mission statement of doing professional, competitive scientific research while at the same time educating graduates.
On the contrary, my experience of US academia has been that (graduate) students are very much students, who take a lot of classes, are graded seriously with the possibility of failing, are mentored rigorously (the author even says "the classic one hour a week meeting", which I also witnessed there), and in fact enroll in a program more than they are hired directly.
I did my PhD in France where we were legally employees like any other and did 100% research with like 100 hours training over the three years which could be 5min MOOCs counting for hours or classes the professors would sign us off on. We were hired by a specific researcher for a specific topic, unlike US students who join a broader program and explore their own directions more. My mentoring was drinking coffee with my advisor and colleagues and the odd e-mail exchange the day before turning in a paper.
I believe Germany and quite a few other European countries are similar. Any country that does three-year PhDs is bound to cut back on the student part of things.
I suspect the number of startups will skyrocket over the next few years. Fired engineers will start to compete against the establishment that fired them. Competition may get a lot fiercer for a while.
"only for private pharma/bio/tech firms to add a thin layer of additional research (or design) on top"
Citation needed.
Going to market costs billions and takes a decade. That doesn't sound like a thin layer. I'm not disputing that fundamental research in academia is essential fuel to keep innovation engines running, but the contributions of biotech are not "thin".
It can be. See GLP-1. Yes, whoever first came up with that approach is brilliant. But then the lemmings followed, and now a half dozen or so companies are peddling more or less the same product. And it comes at the cost of whatever isn't getting investment at scale instead.
I hate typing strings of syntax. So boring. Never saw the appeal. I do like tinkering with ideas, concepts, structure... just not the mechanical interaction part. I'm not the best typist... then again, it's the same with playing Factorio. I love the concept of building structures, but fighting the UI to communicate my ideas is such a drag...
Because typing in text and syntax is now becoming irrelevant and mostly taken care of by language models. Computational thinking and semantics, on the other hand, will remain essential to the craft, and always have been.