I wholeheartedly agree. Computing professions such as software engineering used to feel like, "Wow, they're paying me to do this!" Yes, there was real work involved, but for many of us it never felt like drudgery, and we produced, shipped, and made our customers, managers, and other stakeholders happy. I remember a time (roughly 20 years ago) when zealous enthusiasts would proudly profess that they'd work for companies like Apple or Google for free if they could work on their dream projects.
Times have changed. The field has become much more serious about making money; fantasies about volunteering at Apple have been replaced with fantasies about very large salaries and RSU grants. Simultaneously (and I don't think coincidentally), the field has become less fun. I recognize how privileged this sounds, talking about "fun," given that for most of humanity, work isn't about fun or personal fulfillment but about making the money required to house, feed, and clothe themselves and their loved ones. Even with its drudgery, corporate life beats the working conditions and abuse that many other occupations endure.
Still, let's pour one out for a time when the interests and passions of computing enthusiasts did line up with the interests of the corporate world.
My take is that there used to be a significant overlap between hobbyist-style exploration/coding and what industry wanted, especially during the PC revolution, when companies like Apple and Microsoft were founded by hobbyists selling their creations to other people. This continued through the 1990s and the 2000s; we know the story of how Mark Zuckerberg started Facebook from his Harvard dorm room. I am a 90s kid who was inspired by the stories of Steve Jobs and Bill Gates to pursue a computing career. I was also inspired by the researchers at Bell Labs and Xerox PARC.
The “hacker-friendliness” of software industry employment has been eroding for the past decade or so, and generative AI is another factor that strengthens the position of business owners and managers. Perhaps this is the maturing of the software development field. Back when computers were new and few people were skilled in computing, employment was more favorable for hobbyists. Over time the frontiers of computing have been settled, reducing the need for explorers, who have been sidelined in favor of different types of workers. LLMs are another step; while I’m not sure that LLMs could do academic research in computer science, they are already capable of doing software engineering tasks that undergraduates and interns could do.
I think what some of us are mourning is the closing of a frontier, of our figurative pastures being turned into suburban subdivisions. It’s bigger than generative AI; it’s a field that is less dependent on hobbyists for its future.
There will always be other frontiers, and even in computing there are still interesting areas of research and areas where hobbyists can contribute. But I think much of the software industry has moved in a direction where its ethos is different from the ethos of enthusiasts.
I’ve come to the same conclusion, though my line of work was research rather than software engineering. “He who pays the piper calls the tune.” It was fun as long as I enjoyed the tunes being called, but the tunes changed, and I became less interested in playing.
I am now a tenure-track community college professor, evaluated entirely on my teaching and service. While teaching a full course load is intense, and while my salary is nowhere near what a FAANG engineer makes, I get three months of summer break and one month of winter break every year to rejuvenate and work on personal projects, with nobody telling me what research projects to work on, how frequently to publish, or how fast to ship code.
This quote from J. J. Thomson resonates with me, and it’s more than 100 years old:
"Granting the importance of this pioneering research, how can it best be promoted? The method of direct endowment will not work, for if you pay a man a salary for doing research, he and you will want to have something to point to at the end of the year to show that the money has not been wasted. In promising work of the highest class, however, results do not come in this regular fashion, in fact years may pass without any tangible results being obtained, and the position of the paid worker would be very embarrassing and he would naturally take to work on a lower, or at any rate a different plane where he could be sure of getting year by year tangible results which would justify his salary. The position is this: You want this kind of research, but, if you pay a man to do it, it will drive him to research of a different kind. The only thing to do is to pay him for doing something else and give him enough leisure to do research for the love of it." (from https://archive.org/details/b29932208/page/198/mode/2up).
That was the original strategy for universities: teaching was the job, and research was the side product of having some very smart people with free time, until some "genius" decided it was better to have professors compete for money to pay directly for their research. This transformed a noble and desirable profession into just another money-chasing activity.
I remember when I first learned about GNUstep in 2004, when I was in high school. It's a shame GNUstep never took off; we could have had an ecosystem of applications running on both macOS and Linux with native GUIs.
I'd like to give my perspective as a computer science professor at Ohlone College, which is a two-year community college located in Silicon Valley. I used to work as an AI researcher in industry (but not in large language models) before becoming a tenure-track instructor in Fall 2024.
Our core computer science curriculum consists of five courses: (1) an introductory programming course taught in a procedural subset of C++, (2) an object-oriented programming course taught in C++, (3) a data structures and algorithms course taught in C++, (4) a discrete mathematics course, and (5) an assembly language course that also covers basic computer architecture. Students who pass all five courses are prepared to transfer to a four-year university to complete their undergraduate computer science programs. The majority of our students transfer to San Jose State University or California State University, East Bay, though many transfer to University of California campuses, typically UC Davis, UC Santa Cruz, UC Merced, and UC Irvine.
Because I teach introductory freshman- and sophomore-level courses, I feel it is vital for students to have a strong foundation in basic programming and basic computer science before using generative AI tools, and thus I do not accept programming assignments completed with them. I admit that I'd take a different, more nuanced stance if I were teaching upper-division or graduate-level courses. I have found that students who rely on generative AI for programming tend to struggle more on exams, and they also tend to lack an understanding of the programming language constructs used in the generated code.
With that said, I recognize that generative AI tools are likely to become more powerful and cheaper over time. As much as I dislike this brave new world where students can cheat with even less friction, we professors need to stay on top of things, so I will be spending the entire month of June (a third of my summer break) getting up to speed with large language models, both from a user's point of view and from an AI research point of view.
Whenever my students wonder whether it's worth studying computer science in light of the current job market and anxieties about AI replacing programmers, I tell them two things. The first is that computers and computation are fascinating subjects in their own right. Even if AI dramatically reduces the number of software engineering jobs, there will still be a need for people who understand how computers and computation work.
The second is that economic conditions are not permanent. I was a freshman at Cal Poly San Luis Obispo in 2005, when computer science enrollment bottomed out in the United States. In high school, well-meaning counselors and teachers warned me about the job market after the dot-com bust and about outsourcing to India and other countries. I was an avid Slashdot reader, and the advice I kept reading was to forgo studying computer science and earn a business degree instead. However, I was a nerd who loved computers and had started programming at nine years old. I even wrote an essay in high school saying that I'd move to India if that's where all the jobs were going to end up. The only other majors I could imagine at the time were mathematics and linguistics, and neither was known for excellent job prospects. Thus, I decided to major in computer science.
A funny thing happened while I was at Cal Poly. Web 2.0, smartphones, cloud computing, and big data took off during my undergraduate years. My classmates and I were able to get internships at prestigious companies, even during the economic crisis of 2008-09. Upon graduation, I did an internship at a major Japanese tech company and then started a PhD program at UC Santa Cruz, but many of my classmates ended up at companies like Microsoft, Apple, and Google, just in time for the tech industry to enter an extended gold rush from roughly 2012, when Facebook went public, until 2022, when interest rates started to rise. Many of my classmates made out like bandits financially. Me? I made different choices, going down a research/academic path; I still live in an apartment and have no stock to my name. I have no regrets, except maybe not getting into Bitcoin in 2011 when I first heard about it.... Though I'm not "Silicon Valley successful," I'm living a much better life today than I was in high school, when my parents' low income qualified me for Pell Grants and subsidized student loans to help pay for my Cal Poly education.
I still believe in the beauty of an undergraduate curriculum that encourages critical thinking and the development of problem-solving skills, as opposed to merely teaching the industry topics du jour. Specific tools come and go; my 2005 Linux system administration knowledge didn't cover systemd and Wayland since they didn't exist at the time, but my copy of Introduction to Algorithms by Cormen et al. and my Knuth volumes remain relevant.
I was just thinking about this a few days ago, and not just for the CPU (for which we have RISC-V and OpenPOWER) but for an entire system, including the GPU, audio, disk controllers, networking, etc. I think a great target would be mid-2000s graphics and networking; I could go back to a 2006 Mac Pro without too much hardship. Having a fully open equivalent of mid-2000s hardware would be a boon for open computing.
I love the BSDs; I have the most experience with FreeBSD, I regularly use macOS, and lately I’ve been learning NetBSD because of its rump kernel.
With that said, given the decline of commercial Unix and the dominance of Linux, POSIX has, in my opinion, become less important, and Linux itself has become the de facto standard. I prefer the BSDs to Linux for their design and documentation, but Linux has better hardware support, and the FOSS ecosystem, especially the desktop, is increasingly embracing Linuxisms such as Wayland and systemd. The FOSS BSD ecosystems are too small to counter this Linuxization of the Unix world, and I feel that Apple does not pay much attention to the BSD side of macOS these days.
I don’t expect the BSDs to die, but I do believe they’ll need to find ways to adapt to an increasingly Linux-dominated FOSS ecosystem.
That’s what happened to my 2006 Core Duo MacBook after about three or four years of use. It was an excellent laptop that was quite user-serviceable (I upgraded the RAM and hard drive), but I did have problems with the palmrests, and the Ethernet port stopped working after four years.
It was my first Apple laptop and I have fond memories of using it during my college years.
I had one of those machines in university too, with the same stained/cracked palmrests. That said, I also paid for extended AppleCare and had the whole top case swapped for free multiple times throughout the three years the coverage lasted.
When I was a broke student I would buy MacBooks with broken palm rests at a discount, drop them off at Apple for a free repair (under extended warranty), and flip them. Three hours of my time turned into €100 of profit, back when minimum wage was €6/hour.
I did the same years later, buying up first-generation iPod Nanos and trading them in for sixth-generation models because of the battery recall.
There is a lot of good computer science being done, but the computer science community today is vastly larger than it was in the 1960s and 1970s, when Dijkstra, Knuth, Wirth, and others became legends. There are now many subfields of CS, each with its own deep literature and legendary figures. It’s difficult to be a modern Dijkstra or Knuth for these reasons, though to be fair, it was an impressive feat for Dijkstra to be Dijkstra and for Knuth to be Knuth even in their heyday. It’s simply easier to get famous in an upstart field than in a mature one.
I think there are two typical paths to widespread visibility across CS subfields: (1) publishing a widely-adopted textbook, and (2) writing commonly-used software. For example, many computer scientists know about Patterson and Hennessy due to their famous computer architecture textbooks, and many computer scientists know about people like Jeff Dean due to their software.
Reading more academically oriented literature, such as the ACM’s monthly periodical “Communications of the ACM,” is also a good way to get acquainted with the latest developments in computer science.
I prefer the classic Microsoft Office toolbar/menu interface to the ribbon, but then again, I grew up on Office 97. It recently dawned on me that the ribbon has existed for nearly 20 years; it debuted in Microsoft Office 2007. There is now an entire generation of computer users who have never used a pre-ribbon version of Microsoft Office.
I don't know what it's like to use modern Microsoft Office with no experience of toolbars and menus and then switch to LibreOffice, which still uses traditional toolbars and menus.
I prefer traditional toolbars and menus, but I remember Microsoft conducting user studies when developing the original Office 2007 ribbon, which showed that the ribbon was more productive for beginners and casual users. Given that many office suite users are casual users of word processors, spreadsheets, and presentation tools, Microsoft Office may be more productive for them than LibreOffice. It would be good if LibreOffice conducted similar user studies.