CLOS: Integrating Object-Oriented and Functional Programming (2004) [pdf] (dreamsongs.com)
122 points by mepian on Aug 12, 2023 | 87 comments


On the related subject of the Meta Object Protocol, Gregor Kiczales talked at a meeting of the Vancouver Lisp Users Group in 2006, which they called a MOP retrospective [1]. He gave some thoughts on subjects like why more recent languages don't have MOPs. The link below is to archive.org's copy of the page and the .ogg recording of the talk along with Q&A throughout and after.

[1] https://web.archive.org/web/20060821003818/http://bc.tech.co...


I'm not a hotshot developer, but I'd guess a lot of recent languages (e.g. golang) are structured around keeping things simple (not necessarily easy) and uniform, so that there aren't many surprises for new developers and the company can more or less turn coding into something like an assembly line (to use an overly simplistic metaphor).

Contrast that with languages like Lisp, with crazy amounts of flexibility that really cater to small teams of rockstar developers (to use another overly simplistic metaphor), and you can see the problem. Most companies want the former, not the latter.


Go was made popular and successful for one reason: Google made and used it, and this was at the height of Google's coolness. Other companies followed the bandwagon, or were founded by former Google developers who brought it with them. Its success is entirely secondary to how many people like Go or the features it has.

As far as simplicity goes, I both agree and disagree. Rust is not a simple language. Other languages like F#, Scala, or Kotlin that have arisen in the same timeframe as Go are also not simple. But they do try to make some hard things easier, like concurrency or type safety.


A considerable portion of what made Go popular is things related to development and deployment, in addition to a Plan 9-derived standard library expanded with various everyday-useful modules.

The fact that you quickly got a static binary that just worked even if you removed glibc, that the standard library provided a far less broken network interface by default instead of BSD sockets ("Revenge of the XTI", it should be called :D), that the language was garbage collected, and that said standard library:

- made UTF-8 default and norm

- provided a bunch of often-needed things like a sensible HTTP client & server

- provided reasonably good concurrency with syntactic sugar from the language

All of that meant that a lot of what people would previously have written in Python for various "ops" support programs would suddenly be a way better experience in Go.


Well, Google also made Dart and it went nowhere. So it's not necessarily that.

Go had a simple, familiar syntax that's easy to pick up, decent tooling from the get-go, good enough speed and memory use, and promised easy parallelism. It also arrived at 1.0 relatively quickly and quite mature, with few rough edges and a full-featured standard lib.

Most "indie" languages don't have those even after 10 years of development.


Go's good parts are basically the parts inherited from Bell Labs - the channels are an echo of Plan9's 9p protocol being used everywhere, and the style syntactically and in metaprogramming capability remains close to C, but with refinements like using Unicode. Which adds up to it being a decent language for making networked software and small "one-thing-well" utilities.

If it were actually used in production on Plan9, a lot of the idioms would become that much more powerful, as more of the interfacing glue could be handled by the shell and 9p namespaces, but as it is, it gets bogged down by compatibility issues.


Some of the things that people found confusing in Go (GOPATH etc.) are due to the fact that Go's native environment is Plan 9, down to using Plan 9 libraries and compilers before the full self-hosting rewrite was done.


Is F# not easy? You're just writing functions and there's low syntactic (curly braces, defining classes, etc.) overhead which means a high signal-to-noise ratio. I think it might be a little more complicated than Python (because of static vs dynamic typing) but you can go quite far without introducing more complicated features.

I personally think it would be easier to teach a beginner F# (just the language, not the ecosystem like ASP.NET or Spring) than C# or Java.


I tried learning F# with nearly zero .NET knowledge and found it a horrific experience. In comparison, Python, Perl, and many other languages were a cakewalk. It wasn't the FP paradigm, but doing anything useful without already knowing the .NET libraries and C#. I have most of the books and they're worthless to me, as they assume I'm a seasoned .NET enterprise developer. I eventually gave up after trying several times.


I would argue that F# is far simpler at its core than the cores of Rust, Scala, and Kotlin. It only gets complicated when you get into advanced features, and it is still a very pragmatic language.


I don't really buy into this view of Go. Go doesn't keep things simple and uniform. It's more limited in many ways than other languages, but its feature set is largely arbitrary, and not particularly simple. Limited and simple are not synonyms.


> Go doesn’t keep things simple and uniform

This is a bold statement, care to back that up with examples? In my experience go is the most uniform language I have used due in large part to having very good abstractions available in the standard library, as well as the way interfaces are used.

> Limited and simple are not synonyms

Agreed. Simple refers to how much context you need to keep track of, as opposed to the capabilities of the system (which is how I interpret limited). What aspects of Go do you find limited?


> This is a bold statement, care to back that up with examples?

I would argue it's less bold than claiming Go does keep things simple and uniform. It's weird to require someone to prove that a language doesn't have nice properties.

In any case, one glaring example of non-uniformity is how slices and maps are magically generic. I would also argue that Go's error handling solution of returning a pair is not very simple. Returning two things when only one of them should ever be populated is incidental complexity, and adds cognitive load. Something like a sum type is simpler, as you only get a value or the error.
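A minimal sketch of the error-handling pattern being criticized (the function name is illustrative): the conventional `(value, error)` pair is returned even though exactly one of the two is ever meaningful, and nothing in the type system enforces that invariant.

```go
package main

import (
	"errors"
	"fmt"
)

// div returns the conventional Go (value, error) pair. Only one of the
// two results is meaningful at a time, but callers must remember that
// convention themselves; the compiler does not enforce it.
func div(a, b int) (int, error) {
	if b == 0 {
		return 0, errors.New("division by zero")
	}
	return a / b, nil
}

func main() {
	q, err := div(10, 2)
	if err != nil {
		fmt.Println("error:", err)
		return
	}
	fmt.Println(q) // prints 5
}
```

A sum type (as in OCaml's `result` or Rust's `Result`) would make "value or error, never both" a fact of the type rather than a convention.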


It's not about if it has or doesn't have nice properties. It's about going against conventional wisdom. Conventional wisdom is often wrong, so go right ahead, but you need at least some evidence in a public forum when you say things that are generally established not to be true.


People tend to say Lisps cater towards smaller dev shops. Clojure spends a lot of time trying to keep things simple though. They are relatively opinionated about a lot of stuff. I've seen some relatively large projects with a lot of devs using it too. So I don't think your statement is universally true


I prefer Go over Lisp on the cover because I prefer the explicit, low magic code of Go with all its boiler plate and using my brain power on business problems not what makes me more productive or feel better / clever while doing my work.

Horses for courses - I will never be a language designer / developer with this attitude.


> I prefer Go over Lisp on the cover because I prefer the explicit, low magic code of Go with all its boiler plate and using my brain power on business problems not what makes me more productive or feel better / clever while doing my work.

This seems like a false dichotomy. Boilerplate-heavy code is the opposite of focusing on business logic. Incidental complexity ends up being intertwined with business logic everywhere. More powerful languages allow programmers to do clever tricks and make a mess, but they don't require it. Those languages also allow programmers to abstract away incidental complexity and expose business logic more clearly.

I think the future of programming languages is going to be in finding a balance that allows powerful abstractions while still making the semantics as clear as possible.

Though I have to say, I don't really see what Go buys you over something like OCaml, which is still very explicit (in many ways more so than Go) while still providing powerful abstraction facilities. It even competes with Go on the compile-time front.


> Though I have to say, I don't really see what Go buys you over something like OCaml, which is still very explicit (in many ways more so than Go) while still providing powerful abstraction facilities.

I mean, the thing it provides is that it doesn't provide powerful abstraction facilities. There's a fundamental limit to how bad the code can get. You can wind up with really gronky code, but you can't experience the horrible abstraction collapse that is experienced in extension heavy OO programming or certain flavors of functional programming.

You also can't get the heights of truly amazing abstractions, but that's a trade-off a lot of devs are willing to make.

Edit: I should clarify. The above reasoning is generally only true for enterprise projects. I'm a pretty big advocate for go in enterprise settings for the above reason as well. But in my own time, when I'm programming for fun, I tend to go with elixir/common lisp/scala or something else fun.


> I mean, the thing it provides is that it doesn't provide powerful abstraction facilities. There's a fundamental limit to how bad the code can get.

Right, and I'm suggesting that Go isn't an improvement over OCaml in that regard. The abstraction facilities on OCaml are very explicit/non-magical. It basically comes down to parametric polymorphism (which I think Go finally has now? Or will soon?) and the module system/functors.
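For context (the thread predates wide familiarity with the feature): Go has had parametric polymorphism via type parameters since Go 1.18. A minimal sketch, with an illustrative function name:

```go
package main

import "fmt"

// Map is parametric over T and U; since Go 1.18 this kind of
// polymorphism no longer requires interface{} or code generation.
func Map[T, U any](xs []T, f func(T) U) []U {
	out := make([]U, 0, len(xs))
	for _, x := range xs {
		out = append(out, f(x))
	}
	return out
}

func main() {
	doubled := Map([]int{1, 2, 3}, func(x int) int { return x * 2 })
	fmt.Println(doubled) // prints [2 4 6]
}
```

Before 1.18, only the built-in slices, maps, and channels were generic in this way, which is the non-uniformity mentioned upthread.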


> The abstraction facilities on OCaml are very explicit/non-magical.

I'm talking about PPXs when I talk about magic in OCaml.


I guess I have a hard time seeing how PPXs are worse than `go generate`.


My understanding is that PPXs allow arbitrary AST manipulations, which allow syntax extensions, whereas `go generate` just allows code generation.


Go hits that niche of "read from a queue/endpoint, write to a queue/endpoint", and there is an awful lot of that going on. It's just practical for getting (some kinds of) work done.

> I don't really see what Go buys you over something like OCaml

Easy installation on windows for one.


> using my brain power on business problems not what makes me more productive

This doesn't make any sense. It's like saying you prefer hitting rocks together to make music because you don't have to use all your brain power at operating something complex like the sax or the piano.

Tools enable, or kill, creativity. I'd rather not use my brain power fighting an obtuse tool, as Go can be, but instead choose to become a violin, or Lisp, virtuoso. The real issue is that many people don't want to become virtuosos; they just want to learn as little of their tools as they can and still make rent. That's fine, but let's not blame the tool, then.

You know who doesn't like rockstars and soloists? Large companies, so they make us feel as if we're weird, unruly, opinionated, just because we don't conform to the Standard Programmer mould to fit inside the cubicle. Whenever I hear someone mention rockstar engineers, I wonder if they mean it as an insult, instead of something to be encouraged and celebrated.


To abuse the metaphor: Lisp and Lispers are completely tone-deaf. There's an enormous number of people writing code who don't care one whit about "elegance" or "power" or "beauty" or whatever weird intangibles Lisp has pretensions to. These people (such heretics) in fact don't care about software unto itself at all. They only care about using software to accomplish some goal, finish some task, solve some real (earthly) problem.

> The real issue is that many people don't want to become virtuoso, but just want to learn as little from their tools to make rent.

I would love to be a virtuoso pianist or painter or singer. I can't fathom the desire to be a virtuoso programmer - it would be like being a virtuoso widget assembler. I mean, sure, our bosses wish all of us were virtuoso widget assemblers. But me, by myself? Nah. And don't give me that Protestant work ethic spiel about taking pride in your work - I have plenty of other things I can be proud of instead.

> That's fine, but let's not blame the tool, then.

A tool that no one uses isn't some lonely pitiable misunderstood emo child - it's just useless. It's dead simple: if lisp were the key to building fantastically successful software we would all swallow our pride and use the stupid parens language. I don't know how such a clear and obvious argument never gets through to people in language communities. If you have to fight, for decades, to convince people that your tool is superior and still no one is using it, you should reflect deeply on whether you're wrong.


I hardly think being a virtuoso programmer is something to shun. Not everyone has to do it, but it's not a bad thing.

Also, as processor speeds have basically stopped increasing while we get more and more cores, things like Clojure's atoms, STM, and core.async make it really easy to do parallel computations without the utter insanity that using mutexes or futures or whatever leads to. Clojure definitely makes it way easier to write clearly correct multi-threaded code in very few lines. You would be shocked at what you get out of the box. So there is a Lisp that targets exactly what you want: something for laymen to use to get the job done correctly, far more easily than the competition. I've seen FAANG engineers screw up simple locks. If Clojure can hand you the tools to get that done, why wouldn't you try it (despite your aversion to parens)?


> I can't fathom the desire to be a virtuoso programmer - it would be like being a virtuoso widget assembler.

Only if all the programming you do is assembling widgets, which given your tone might be the case. For some it is a form of art, just like a paintbrush or a musical instrument.

Imagine if we were in a bizarro world where you go to violin university not because you like it, but because it's a good career that pays well; after 20 years doing 40 h/wk playing the violin for a nameless Big Orchestra playing day in day out what is frankly trite crap for pointy haired composers, you'd find it very hard to imagine a world where people play the violin for fun.

I sense a LOT of hostility on your part against me defending what is just, simply a tool, because Big Orchestra has killed your creativity and imagination. Again, it is not the tool's fault. Nor am I fighting for the world to adopt this tool as you claim. It's not a tool that meshes well with pointy haired bosses, and sadly they rule the world.

In any case I'm not a "lisper", which seems to be a convenient category to put people in so they can be berated by your enormous frustration. I doubt all this rage comes because of deep seated trauma with S-expressions themselves. Glancing at your recent comment history, it sounds like you might need a holiday, Internet stranger.


> Only if all the programming you do is assembling widgets, which given your tone might be the case

Oh hello there my old friend no true Scotsman. I'm a compilers researcher so gee hmm I wonder if maybe possibly I'm not just simply aggrieved.

> For some it is a form of art, just like a paintbrush or a musical instrument.

There's no such thing. Even I, a literal researcher, ultimately answer to the capital motive (the university's). You're not an artist. We are not artists insofar as we're paid for a service.

>Imagine if we were in a bizarro world where you go to violin university not because you like it, but because it's a good career that pays well; after 20 years doing 40 h/wk playing the violin for a nameless Big Orchestra playing day in day out what is frankly trite crap for pointy haired composers, you'd find it very hard to imagine a world where people play the violin for fun.

This isn't bizarro world or bizarre. This is the world. I'm very good at compilers research but I did not pick it for my PhD because I enjoyed it (as the username suggests, I did math in undergrad) but because it pays the best. I trade time for money just like you, just like everyone else.

> I sense a LOT of hostility on your part against me defending what is just, simply a tool, because Big Orchestra has killed your creativity and imagination. Again, it is not the tool's fault.

Whenever someone says this I'm like I didn't come to you attacking lisp but you did come here and start a debate about whether people just want to pay rent. So I'm hostile supposedly for responding to your condescension but you're not hostile for... being condescending? The math isn't mathing as they say.

>It's not a tool that meshes well with pointy haired bosses, and sadly they rule the world.

You're alluding to the deep cultural depth and brilliance of a... comic strip. And unironically. You really think this is a valuable contribution to any conversation?

> I doubt all this rage comes because of deep seated trauma with S-expressions themselves.

I'll say it again: there's no rage or trauma, there's only you and your condescending remarks about paying rent and lack of virtue or virtuosity or whatever. If you hadn't said that people "just want to pay rent" I wouldn't have bothered to say anything at all.


> people writing code that don't care one wit about "elegance" or "power" or "beauty" or whatever

Lisp can’t force you to take full advantage of what it’s good at, and if you prefer not to, it’s probably a waste of time. As the saying goes, you can write Fortran in any language.


>This doesn't make any sense. It's like saying you prefer hitting rocks together to make music because you don't have to use all your brain power at operating something complex like the sax or the piano.

It's more like saying that he'd rather play the piano and get musical results faster, than study some instrument much fewer people use, with 200 microtonal keys, 10 pedals, additional sliders and rotary switches, and a microprocessor that you can study a 500 page guide to learn and create custom sounds for, and the music scores for which look like line noise.


This is correct. Thank you for the defense.


One of the ideas of Lisp is that you can extend/change the language to bring it near the actual domain level. The so-called "Embedded Domain Specific Language" is common in Lisp.


My gut says this is the actual reason why lisp remains niche; everyone wants to make their own DSLs all the time.


> everyone wants to make their own DSLs all the time.

Including people not working in Lisp.

- Code generation.

- Frameworks.

- Embedded interpreters.

The Unix project was a hotbed of DSLs: shell, lex, yacc, make, troff, pic, eqn, awk, dc, bc, ... the C preprocessor started as an external utility.

Javascript, Python, Ruby: all hotbeds of frameworking, monkey-patching, generating ....


The problem with this is that it's the inverse of "rule of least power"; making the most powerful programming language possible means there's no way to know if your program is actually correct and no tools to find out.


Your gut is wrong in this instance.


In my VERY limited experience this is only maintainable in very capable hands. They all start off OK and then creep and bloat.


I'd argue that's usually the case, because writing a little language inside your normal language is full of problems, and compiler writing is neither taught much anymore nor well supported (this is not a dig at the programmers; it's more that we're told this stuff is magic and should be left to the wizards).

So then you finally find yourself needing to do something that requires a DSL, and you quickly end up on the annoying, self-destructive treadmill of growing from a small, simple, grokkable config file into something that should be put out of its misery (along with the misery it forces on developers).

FWIW, some languages make it (IMO) much easier to handle, if one decides from the start to "embed" the DSL. Lisps are infamous for this, but it works just as well with Ruby, with builder patterns in various languages (though these can be a bit more error-prone), or with less-liked languages like poor old Tcl.
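The "embedded via builder pattern" route mentioned above can be sketched even in Go. This is a purely illustrative toy (all type and method names are invented, and it is not a real query library), but it shows the shape: the host language's method chaining stands in for custom syntax.

```go
package main

import (
	"fmt"
	"strings"
)

// Query is a tiny query "DSL" embedded via method chaining.
type Query struct {
	table string
	cols  []string
	where string
}

// From starts a chain; Select and Where extend it, each returning
// the builder so calls can be stacked fluently.
func From(table string) *Query                { return &Query{table: table} }
func (q *Query) Select(cols ...string) *Query { q.cols = cols; return q }
func (q *Query) Where(cond string) *Query     { q.where = cond; return q }

// String renders the accumulated pieces as SQL-ish text.
func (q *Query) String() string {
	s := "SELECT " + strings.Join(q.cols, ", ") + " FROM " + q.table
	if q.where != "" {
		s += " WHERE " + q.where
	}
	return s
}

func main() {
	q := From("users").Select("id", "name").Where("age > 21")
	fmt.Println(q) // prints SELECT id, name FROM users WHERE age > 21
}
```

The trade-off is the one discussed upthread: the embedded form can only bend as far as the host language's syntax allows, whereas a Lisp macro can reshape the syntax itself.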


To be fair, I don't think I'd want someone who's not very capable working on anything important, no matter what language they're using. I know it happens all the time, but I don't think that's a particularly compelling argument.


In my not so limited experience, this is true of all programming.


Go concurrency is lots of magic.


It's also dramatically more useful than CLOS, and its real-world impact has far outweighed that of CLOS. Having excellent concurrency features built into a language turns out to be pretty important.


There is a big difference between "a lot of recent languages" and "golang".

Also, Google solves Google's problems. They do not give a fuck about the rest of the world. If what they do helps you personally, then adopt it. If not, leave them to it.


For anyone finding CLOS interesting, "The Art of the Metaobject Protocol" is the next logical step. A good way to dig deep into how CLOS is built.

It should be a mandatory read for anyone looking to design a programming language with object concepts.


I think I'd recommend reading Sonya Keene's Object-Oriented Programming in Common Lisp: A Programmer's Guide to CLOS first though. I really don't know any other ways of learning all the things like method combinations and the like that make the MOP so powerful. All the CLOS tutorials I am aware of lack a real discussion of these features of CLOS.


Unrelated to your point: Sonya Keene's book is the only one I know of that was authored and typeset using Symbolics's in-house publishing system Concordia (besides the Symbolics documentation sets, of course).

Concordia (and its display component, Document Examiner) had a pretty novel hypertext approach to authoring. All the content was made up of individual notes that you could link together, either with explicit links or with implicit one-follows-the-other links. I don't know if that's how Sonya used it, but when I authored some documents in Concordia, I discovered that it was very easy to focus on addressing specific points in an almost throwaway fashion: write a note on some subject you want to address; if you don't like it, write another take on the same subject. Now you have the option to put either note into the book flow, refine them in parallel, and defer editorial decisions to the very end. The process is very reminiscent of exploratory programming in Lisp. In other words, the Symbolics people figured out how to write books the same way they wrote their code. Ultimate vertical integration.


"Lisp Lore: A Guide to Programming the Lisp Machine" by Bromley&Lamson was also written using Concordia AFAIK. The Symbolics documentation also has Jensen&Wirth "Pascal User Manual and Report" and Harbison & Steele, C: A Reference Manual.


> Symbolics's in-house publishing system Concordia

What you describe regarding Concordia sounds very intriguing. I can search for information on Concordia, but do you have any goto references (books, papers, videos) on it that you'd recommend?


I once recorded a demo video, running Concordia on an actual Symbolics Lisp Machine.

https://vimeo.com/83886950


Cool video, interesting. It does look like a very complicated way of creating and entering content, though, compared to just typing in a buffer with some markup language. When I first saw the video I thought it was just textual hyperlinks, sort of what we have with clickable text in Emacs; Info and help-mode are Emacs apps that use that quite heavily. But then I looked up Concordia on Wikipedia, and I see it has the same ancestor as Texinfo :).

But very interesting to see; I'll watch your other videos when I have more time, it is a bit of history. Thanks for recording and uploading them. Is that machine still alive and running, or did you record this on your Linux port?


One doesn't need to use the menus, one can use the keyboard commands. The UI provides several ways to interact: key commands, interactive command line and menus.

> I see it has same ancestor as texinfo

Genera comes with a Scribe-based markup language and formatter.

> Is that machine still alive and running, or did you record it on your Linux port?

I made this video years ago on a Lisp Machine. The new emulator for the Mac & Linux is many times faster and runs silent on something like a MacBook... Thus it's much more convenient to use that for a demo, unless the software does not run there. The emulator has its own native code format and, for example, lacks emulation of the console hardware (graphics hardware).


> One doesn't need to use the menus, one can use the keyboard commands. The UI provides several ways to interact: key commands, interactive command line and menus.

> Genera comes with a Scribe-based markup language and formatter.

You mean humanity hasn't come very far in human-computer interaction since those days? :-). Just kidding; it sounds like they were quite modern back in the '80s. I saw the other video on YT about their graphics software and hardware. While it looks relatively simple compared to modern image editors, modellers, fx and animation packages, it still feels like they had all the right ideas. What do you think put them out of business? Just the economy, or some more technical reason?

> The new emulator for the Mac & Linux is many times faster and runs silent on something like a MacBook.

Yeah, I saw another video and noticed "machdd" or something similar in the modeline, so I assumed you made it on a Mac.

> lacks emulation of the console hardware (graphics hardware)

That explains why all the demos are black and white.

I don't have so much time to install and configure virtual machines and programs, but one beautiful day I'll try it, just for the curiosity; I have seen the repo on GH.


> What do you think put them out of the business? Just the economy or some other more technical reason?

The main reason was the end of the cold war and the end of the high-tech war. Means there were too few commercial customers. Where they had commercial customers (like the Graphics & Animation business), there was a disruption by other technology, like SGIs (RISC CPUs with powerful graphics accelerators) and also Windows NT. The graphics software was sold to Nichimen and ported to SGIs and Windows NT.

> That explains why all the demos are black and white.

All the Symbolics early consoles were black & white, so all the software was using b&w. Typically the machines had an additional color screen, then with an additional color graphics board. All driven by Lisp. But the megapixel color screens and graphics boards were very expensive. They also might have been too slow to use as an interactive console screen.

The emulator supports graphics. It's X11, and one can use color graphics, but the graphics & animation software hasn't been ported to X11 AFAIK. It's just that the normal tools don't use color in their UI, though there were applications which used color.

> I have seen the repo on GH.

Don't expect too much. That's an old, unsupported emulator, which has a bunch of technical problems.


> Means there were too few commercial customers. Where they had commercial customers (like the Graphics & Animation business), there was a disruption by other technology, like SGIs (RISC CPUs with powerful graphics accelerators) and also Windows NT. The graphics software was sold to Nichimen and ported to SGIs and Windows NT.

And now even SGI is out of business. It is a little unfortunate, but I think there is a history lesson to learn. Symbolics, SGI, Sun, Xerox, IBM, AT&T: they are all gone from the software business, more or less. I mean, IBM, AT&T and Xerox are alive, but they are just a shadow of what they once were, at least on the software front. It seems like all companies that target high-end industry with big profits and ignore the consumer market fade away.

Compare that to Microsoft, which exploded in market share after DOS/Windows, and Intel, which exploded after the 8086/8088. It just shows how important it is to put the technology out to consumers. Not because mass consumers will create so much value (they will do that too, of course), but foremost because they will learn how to use the technology, and once they come to businesses and have to solve problems, they will use it.

I think that is a problem Symbolics faced. They ran on dedicated hardware that probably cost a fortune and was used for specialized problems, while worse technology was cheaper and more accessible. People used what was accessible, and when that generation grew up and went to work, of course it was cheaper to let them use what they had learned than to buy specialized hardware and train them in a specialized language. I think the same thing happened to SGI when the big graphics software names released their software for Windows. I think it is a circle, or a rolling stone: it is important to put the technology and knowledge out in the hands of people.

It is a bit sad that LW and Franz are keeping their software behind locked gates instead of letting it out for free. I bet some middle-tier manager is sitting at Boeing as of this writing, trying to figure out how to save $$$ by cutting that crazy-expensive, expert-knowledge Lisp thing out of their software stack, just to get a promotion or a bigger bonus.

If LW and Franz are going to survive and not go the same way as Symbolics, Sun & Co, they should probably rethink their strategy and license their stuff free for GPL/non-commercial use, similar to what Qt and some other companies do. Perhaps SBCL is good enough, but the Lisp community needs more and better tools. In expert hands Emacs is a superb tool, but it is not the average mass tool.

It is a bit of a shame. I think Lisp is such a great tool for software engineering and application development, but it is so underused because the knowledge pool is so small and the best tools seem to be locked away behind a hefty price tag. If/when those two, LW and Franz, are gone, Lisp will be seen even more as an academic exercise rather than a useful practical tool.

I don't know, perhaps I am wrong, just thinking loud.


> If LW and Franz are going to survive and not go the same way as Symbolics, Sun & Co, they should probably rethink their strategy and license their stuff free for GPL/non-commercial use, similar to what Qt and some other companies do.

I'd think there were something like 30 commercial implementations of Common Lisp. LispWorks and Franz are still alive; none of the others are. Personally I'm happy that they exist, and I fear what you propose would kill them very quickly. There are also a few in-house implementations "alive".

> and the best tools are locked away behind the pricey tag

If there would be a business opportunity someone else can pick it up.

LispWorks reused bits and pieces of CMUCL. But the stuff they've added provided real value: robust ports to different platforms, an extensive implementation and a portable GUI layer.

If someone sees a business, they could easily layer something on top of SBCL or provide other improvements.

Scieneer tried that with CMUCL by adding concurrent threading for multi-core machines.

Clozure CL was alive while there was the expertise of the old hackers. Once they were gone it was difficult to keep it ported to new platforms and to fix hairy bugs. They tried to have a business model with an open sourced variant of Macintosh Common Lisp.

We'll see dev tools financed by big companies like Microsoft, Oracle, Apple, Google, ...

Then there are a bunch of companies trying to provide tools (IntelliJ, ...) or alternative languages (Scala, ...).

The specialized niche languages tend to have capable implementations with large price tags. See for example the commercial Smalltalk implementations of Cincom or SICStus Prolog ( https://sicstus.sics.se/order4.html ).

Other financing models tend to be fragile... or depend on other sources... like research/academic funding.


> LispWorks and Franz are still alive.

Yes, but what makes us believe they won't meet the same fate as all the others?

> fear what you propose would kill them very quickly

Why? Is their technology that bad? I don't think so. I think they need to let people use their stuff, show the masses the good things about Lisp and their tools, and let the masses learn how to use those tools.

> If someone sees a business, they could easily layer something on top of SBCL or provide other improvements.

It is not just that. Layering something on top is a lot of work :). It is software, everything is possible, but who has the manpower and time? However, people do layer stuff on top. We do see people making cool and interesting things, but it is relatively few enlightened persons. We don't see a mass movement; perhaps the masses need to see a "killer app" or something, like when JavaScript got Chrome and went from verboten on every computer to the No. 1 language (more or less)?

I think the business is made by giving the technology to people and letting them use it. Once individuals use it and realize the power of their tools, I believe it will see more use in businesses too. As I said, I think it is a "vicious circle", as they say here in Sweden.

> Once they were gone it was difficult to keep it ported to new platforms and to fix hairy bugs.

Yes, that is a normal thing. Once the experts are gone, the platform is dead. To survive, a platform needs to attract new people and bring them up to expert level. But every platform also has to adapt and to be well documented. I wish to write an extension for SBCL, but I don't see any writing anywhere on how to proceed, so I have to look through the source code. It is not impossible, but it is more tedious than it perhaps should be. The best thing about Emacs is actually the documentation, and the openness, I think. I don't know for sure, but it seems so.

> The specialized niche languages tend to have capable implementations with large price tags. See for example the commercial Smalltalk implementations of Cincom or SICStus Prolog

Anyone still using Smalltalk? :) Yes, of course, I agree. What you describe is how the business was done, but I don't think history speaks in its favor. I don't know, all this is just my speculation of course, but I think that is an obsolete view of the business. It seems that all those specialized companies that target big biz are sooner or later out of that biz. What would it mean for Franz to lose Boeing? Probably quite a lot. I don't know. I am not sure Oracle is doing as great as they did in the past either. There is a lot of inertia too; big customers can't easily switch. I mean, Cobol is still there, but that does not mean Cobol as a platform is doing well.

> Other financing models tend to be fragile... or depend on other sources

I didn't mean they should switch to some other way of doing business and financing. I just mean they should let their tools go free for the masses, for people making open-source code for non-commercial use. For businesses they should still sell and charge, of course. I don't know, perhaps I am wrong; but which hobbyist like me will pay LW €400 for the basic license? Or whatever the price is now. They won't make money from hobbyists, students, indie devs and the like, so they might as well let those people use the thing for free for non-commercial and open-source use like other companies do. Perhaps if they have good tooling, GUI builders and so on, people will use it to create some interesting stuff, more people would learn how to use their software, and that, I think, would play in their favor in the long run.

By the way, it is worth mentioning that Microsoft, Oracle, Apple and Google give away their tooling basically for free, because they have realized that they want people to learn and use their tools. I don't know, perhaps there is no money in traditional tooling like GUI builders at all in the world of web technologies.

Don't get me wrong, I don't wish them anything bad, but I think the Lisp community is so small and fragile that better tools would perhaps help. I think it is a shame because it is a nice language for rapid prototyping and development, with lots of potential that humanity is perhaps slowly losing; but now I am perhaps being a bit too dramatic and cheesy :).


> Yes, but what makes us believe they won't meet the same fate as all the others?

Switching to an open source model would mean that only 1% of the users would pay anything. To make that viable you need a good business model for a much larger or a different market. The business model is vastly different with a vastly different product.

Just open sourcing the thing is an easy way to kill an existing company, which now has a small but paying customer base.


No, no; that wasn't what I meant; I clarified in the last comment: release things for free, for already non-paying customers, for non-commercial use only, Qt style. I find it hard to believe Boeing would count in that group :).


How do you enforce that? Experience shows that most companies ignore that code is only for non-commercial use, and it's hard to enforce against small companies.

I also doubt that all customers are like Boeing.


The issue with console hardware is that some applications use old code paths that directly call into the most basic framebuffer the Symbolics Lisp Machines had, and that is not supported on X11. Thus said software works neither on OpenGenera nor on UX400, UX1200, and NXP1000 physical Lisp machines (all of which lack console hardware and use X11).

The only exception, AFAIK, is the MacIvory, which has a special subsystem that emulates console hardware on top of Macintosh Toolbox calls done over Sun RPC.


The MacIvory has the option to use a NuBus color graphics card directly from Lisp: a NuVista board.


Yes, but that's used by the COLOR package, whose interfaces are IIRC mostly available on the X11 client as well (minus acceleration).

The full TV package was redirected only on the MacIvory, not on Unix-based setups; notes from OpenGenera suggest that the plan was to fix the application packages to use the new interfaces instead.


Thank you!


I'm glad Rainer recommended his video, because I learned Concordia purely by word of mouth and exploration. There's a paper by Janet H. Walker, the principal on Document Examiner/Concordia, from a hypertext conference (https://dl.acm.org/doi/10.1145/317426.317448), but it's more about DE.


Agreed. It's easier to learn how to do practical programming with CLOS from Keene's book. AMOP is more useful for those who want to know more about how things work under the hood, or who maybe want to build their own implementation of CLOS or something similar, and easier to understand after you've absorbed Keene.


The book "Object-Oriented Programming: The CLOS Perspective" is a better intro than just diving into AMOP. It's essentially a collection of papers, but it really helps contextualize a lot of the early design thinking, gives a gentle introduction to the MOP, compares CLOS with other languages including C++, and showcases some application uses back then.


One of the core points about metaobject protocols is that while the object system is important to language design, a good language, to me, also exposes the object system to users of the language and makes it flexible for them too.

From a review (https://www.adamtornhill.com/reviews/amop.htm),

> The metaobject protocol behaves like an interface towards the language itself, available for the programmer to incrementally customize. Just as we can specialize and extend the behavior of the classes we define in our own applications, we can now extend the language itself using the very same mechanisms.
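
Python's metaclasses are a deliberately small metaobject protocol, and they give a taste of what the quote describes: class creation itself becomes something you can specialize with the same subclassing mechanism you use in application code. A minimal sketch (the `Registered`/`Plugin` names are invented for illustration):

```python
# A metaclass specializes how classes themselves are created: here, every
# subclass of Plugin registers itself at definition time, with no decorator
# or explicit registration call at the use site.
class Registered(type):
    registry = {}

    def __new__(mcls, name, bases, namespace):
        cls = super().__new__(mcls, name, bases, namespace)
        mcls.registry[name] = cls  # hook into class creation itself
        return cls

class Plugin(metaclass=Registered):
    pass

class CsvLoader(Plugin):  # merely defining the class registers it
    pass
```

The same trick is how many ORMs and plugin systems work: the "language level" act of defining a class is extended with ordinary, user-visible code.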

One of the points that has rather surprised me is that we devs have not more broadly explored what we could do with our metaobject protocols. We haven't had a boom in metaprogramming, we haven't seen a large-scale return of AOP; we've been letting objects stay dumb, unextended objects for a long time now.

The one sizable exception I have on my radar is React's Higher Order Components, where components/classes were wrapped/composed in other classes. Slices of object behavior were spliced into place.

Now that's been replaced with hooks, which invert the relationship: the function up front composes all the behavior it might want as hooks that it calls.

I don't know enough about how Rust macros are made and used. My vague impression is that they are very scarcely considered. Maybe I just missed the discussions, but I'd expect there to be lots of blogging about this topic if metaprogramming were really as fertile a field as one would expect.


> One of the points that has rather surprised me is that we devs have not more broadly explored what we could do with our metaobject protocols. We haven't had a boom in metaprogramming, we haven't seen a large-scale return of AOP; we've been letting objects stay dumb, unextended objects for a long time now.

The other day I showed some newer devs how they could replace a good 10+ lines of for/index/loop/conditional code, whose purpose was to find either the next or the previous element following an element that matched a condition (with wrap-around semantics), using a zip and a filter. Good old "functional" approach. "That's cool," they said, and then got rid of me as quickly as they could. They had been close to pushing their changes. And indeed did.
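
The original snippet isn't shown, so the exact semantics here are a guess, but a zip-and-filter version of "find the element after/before the first match, with wrap-around" might look like this in Python:

```python
# Find the element after (or before) the first element matching pred,
# wrapping around at the ends of the list. Returns None if nothing matches.
def next_after(items, pred):
    pairs = zip(items, items[1:] + items[:1])    # each element with its successor
    match = next(filter(lambda p: pred(p[0]), pairs), None)
    return match[1] if match else None

def prev_before(items, pred):
    pairs = zip(items, items[-1:] + items[:-1])  # each element with its predecessor
    match = next(filter(lambda p: pred(p[0]), pairs), None)
    return match[1] if match else None
```

For example, `next_after([1, 2, 3, 4], lambda x: x == 4)` wraps around and yields `1`, with no index arithmetic or explicit loop.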

It used to be the case that a sizable number of my peers were interested in furthering their craft. Those that "just ship it, it works, ok?" were tolerated. Now it seems that enthusiast peers interested in "exploration" and "discovery" are few and far between. I'm surrounded by people who do their hours and clock out tickets.

So no, I am no longer surprised by this lack of meta programming interest. Or any other “higher level” linguistic pursuits.


You might find this interesting:

Metaobject Protocols for Julia https://drops.dagstuhl.de/opus/volltexte/2022/16759/pdf/OASI...


> We haven't had a boom in metaprogramming, we haven't seen a large-scale return of AOP; we've been letting objects stay dumb, unextended objects for a long time now.

As user of Java, .NET and C++ ecosystems, I beg to differ, those subjects are quite present.


Agreed - another good resource is the original Flavors paper. Many of the parts are there, like multiple inheritance and method combination, but without the MOP and multiple dispatch.

https://www.softwarepreservation.org/projects/LISP/MIT/nnnfl...


I essentially use "abbreviated" CLOS.

In CL, defstruct structures walk on both sides of the CLOS line. They're not as robust as full-boat CLOS classes, but they're not just a bag of fields either. They fully participate in CLOS dispatch, you can do composition with defstruct, and you can create custom constructors and such. So they fit right in with many "dumb object", or perhaps "slightly clever object", use cases without all of the wordiness of a full CLOS class.

Mind, I'm not an experienced CLOS user. I'm not choosing defstruct over defclass for any other reason save that I find it less verbose and it offers the functionality that I need. I'm sure my code might be idiomatically different were I to use true CLOS classes instead of defstructs, but I can't say right now that I'm "missing" anything.

But I do think it's cool that defstructs participate very well with CLOS and that it's not a "my way or the highway" type of system; it just integrates really well with the rest of CL.


> They're not as robust as full boat CLOS classes

Structures support single inheritance, CLOS classes support multiple inheritance. The main reason for structures is that they are faster to use by default - but they are less flexible. For example: CLOS classes support updates of existing objects when the class definition changes, structures don't support updates.


My feeling on this is that Common Lisp implementations should support classes in a way that idiomatic usage is just about as fast as structures. This is nontrivial, but I think it could be done.


There was a time (the early 1990s) when structures were dependably and significantly faster than classes, and you had to use structures for any application that was even remotely speed-sensitive.

I haven't metered CLOS in a while, but since modern hardware is around 100x faster now and modern Common Lisp compilers are a lot better, I suspect structures are no longer faster than classes or if they are, classes are fast enough that their extra flexibility is more important except in a very small set of apps where raw speed is paramount.

I would encourage CL programmers who stick to structures for speed reasons to try rewriting some of your code with classes and meter it. You might be pleasantly surprised.


The problem comes from dynamism. Accessors to class slots are generic functions, and in current implementations cannot be inlined in general, as new methods can be added to them, even if the class of the argument is known statically. Even direct access to slots suffers from the effects of dynamism, since classes can be redefined at any time and the slots moved around.

So, what's needed is a general way to get efficient code for such things even in the face of dynamic redefinition. This is possible, using generated code snippets that are relinked into code when definitions change, but I don't know of any Common Lisp compiler that currently does this.


AFAIK most of the current CL compilers memoize dynamic dispatch as well as slot vector lookup. They clear these caches if a relevant class or method gets redefined.

Of course this precludes the use of some MOP functions like slot-value-using-class; if you use those the system won't cache the slot lookup mechanism.
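
As a rough illustration of the technique (a Python sketch, not CL; real implementations cache per call site and per slot rather than in a plain dict): a generic function memoizes which method it resolved for each argument class, and any (re)definition clears the cache.

```python
# Sketch of dispatch memoization: resolve the applicable method once per
# argument class, cache it, and invalidate the cache when methods change.
class Generic:
    def __init__(self):
        self.methods = {}  # class -> implementation
        self.cache = {}    # class -> memoized resolved implementation

    def add_method(self, cls, impl):
        self.methods[cls] = impl
        self.cache.clear()  # (re)definition invalidates the dispatch cache

    def __call__(self, obj, *args):
        cls = type(obj)
        impl = self.cache.get(cls)
        if impl is None:
            # Slow path: walk the class precedence list (Python's MRO here)
            # for the most specific applicable method.
            for c in cls.__mro__:
                if c in self.methods:
                    impl = self.methods[c]
                    break
            else:
                raise TypeError(f"no applicable method for {cls.__name__}")
            self.cache[cls] = impl  # fast path next time
        return impl(obj, *args)

class Shape: pass
class Circle(Shape): pass

describe = Generic()
describe.add_method(Shape, lambda self: "some shape")
```

After the first call on a `Circle`, dispatch is a single dict lookup; redefining a method pays the resolution cost again exactly once per class.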


Yes, implementations will do what they can, but none of that can be as fast as a simple structure field access when the type of the structure is known at compile time. That compiles to a single instruction.


CLOS is more than fast enough for games (see e.g. Kandria; there are comments on HN about it, though it's easy enough to do one's own benchmark and see how easy it is to get hundreds of frames per second even with thousands of objects updated and drawn through methods every frame), and if you do have some numerical generic functions in a tight loop, you can optimize them further with libraries like https://github.com/alex-gutev/static-dispatch

The book Object-Oriented Programming: The CLOS Perspective (1993) even includes as its final chapter/paper "Efficient Method Dispatch in PCL", which is essentially a memoization technique, so this has been a concern (or, thanks to such techniques, not the biggest concern) for a long time.

The dev experience of using defstruct is bad enough when I want to change something about it that I just default to defclass. I can see some people not liking the somewhat more verbose syntax (I don't mind it myself), but of course that's a barrier trivially removed...


Someone was looking into doing this on Robert Strandh's lisp implementation. The existence of first-class global environments on it allows for saving off the entire environment on a defmethod and recompiling all methods for static dispatch every time a new defmethod occurs. This was being talked about at least 5 years ago; not sure if any work has been done on it in the interim.


Strandh also has his Call Site Optimization idea, which allows dynamic redefinition of things without having to recompile everything (just replace jumps to small code snippets at the call sites.)


A bit more from RPG about CLOS:

https://dreamsongs.com/CLOS.html

The CLOS spec was a collaboration of:

Linda G. DeMichiel, Richard P. Gabriel (RPG for short): both from Lucid, Inc.

Sonya E. Keene, David A. Moon : both from Symbolics, Inc.

Daniel G. Bobrow, Gregor Kiczales : both from Xerox PARC

The actual CLOS Specification:

https://dreamsongs.com/Files/concepts.pdf

https://dreamsongs.com/Files/Functions.pdf

Published here: https://dl.acm.org/doi/10.1145/885631.885632

There is also a CLOS MOP spec, but that's not a part of the Common Lisp standard:

http://lispm.de/docs/Publications/Lisp%20Language%20Manuals/...


Do any of the languages that are popular today have object systems that are as powerful and flexible as CLOS?


Julia, perhaps? Although calling it popular is maybe a stretch. Its multiple-dispatch system is even more ingrained in the language than CLOS (all functions are generic by default). It's very much a Lisp in Matlab's clothing.
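
To make "all functions are generic, dispatching on every argument" concrete, here is a toy Python sketch of multiple dispatch (illustrative only: it matches argument types exactly, whereas Julia and CLOS also handle subtyping and order methods by specificity):

```python
# Toy multiple dispatch: a generic function selects an implementation by
# the types of *all* its arguments, not just the first (receiver) one.
def multi(fn):
    table = {}

    def dispatch(*args):
        impl = table.get(tuple(type(a) for a in args))
        if impl is None:
            raise TypeError(f"no method of {fn.__name__} for these argument types")
        return impl(*args)

    def method(*types):
        def register(impl):
            table[types] = impl  # one entry per signature
            return impl
        return register

    dispatch.method = method
    return dispatch

class Asteroid: pass
class Ship: pass

@multi
def collide(a, b): ...

@collide.method(Asteroid, Ship)
def _(a, b):
    return "big explosion"

@collide.method(Ship, Ship)
def _(a, b):
    return "gentle bounce"
```

The behavior of `collide` depends on the combination of both argument types, which single-dispatch OO (one receiver, `a.collide(b)`) cannot express without double-dispatch boilerplate.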


... ish? There's a lot of furniture that's bolted into Julia's runtime in terms of task management and so on, that isn't really like a Lisp runtime? It doesn't have that same "It's Turtles All the Way Down" self-consistent vibe that a Lisp has.

On paper Julia looks like a great piece of kit. Once I worked in it professionally for a couple months, it started to leave a bad taste in my mouth.


Ruby has a metaobject protocol. The net effect of the language isn't as powerful as Common Lisp, since you don't have code-as-data in method definitions, but Ruby's object system is comparably powerful to CLOS (though made in the image of Smalltalk); it comes with a meta protocol for methods, objects, classes, modules, even lexical variables.

https://ruby-doc.org/3.2.2/Object.html

https://ruby-doc.org/3.2.2/Method.html

https://ruby-doc.org/3.2.2/Class.html

https://ruby-doc.org/3.2.2/Module.html

https://ruby-doc.org/3.2.2/Binding.html


Julia, as answered in the sibling comment.

Clojure has a subset, and I guess Racket is quite flexible with its language modeling framework.

Going out on a limb, C++ is also quite flexible when one looks beyond plain classes and combines them with type parameters in templates and compile-time reflection via template metaprogramming and concepts (although it is quite clunky).



