Am Di., 3. März 2020 um 16:27 Uhr schrieb Lassi Kortela <>:
> No one would forbid you or R7RS-large to solely use #f as defaults in
> their APIs. For most APIs it is possibly a very good idea based on your
> arguments.

While I don't want to stop the R7RS-large project from making
whichever decision it wants, I've tried to think of even one example
where a non-#f default turned out to be better, and I can't. That
doesn't mean there isn't one. Perhaps an "escape hatch" in the language
is good.

> As a Scheme programmer, I value my freedom. :)
> Besides, you cannot really force people to write good APIs. Even with
> keyword arguments, the possibilities to design rotten interfaces are
> endless...

That's definitely true. Arguably, any use of keyword arguments is
non-ideal design. They exist because humans cannot design ideal systems
as large as Emacs or AutoCAD. At some point you just get tired and have
to ship something; to know which design is right you would have to
predict the future, and you cannot. Keyword arguments are "damage
control" for these situations: if you can't make something ideal, you at
least have a much better chance of avoiding obvious mistakes that cannot
be fixed later.
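For illustration, keyword-style arguments can even be emulated in portable R7RS with a plist-style rest list; a rough sketch, with all names invented:

```scheme
;; get-key scans a (key value key value ...) rest list and
;; returns default if the key is absent.
(define (get-key args key default)
  (cond ((null? args) default)
        ((eq? (car args) key) (cadr args))
        (else (get-key (cddr args) key default))))

;; Callers name only the arguments they care about, in any order,
;; so adding a new option later breaks no existing call sites.
(define (make-window . args)
  (list (get-key args 'width 640)
        (get-key args 'height 480)))

(make-window 'height 200)  ; => (640 200)
```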

That being said, some language features are notorious "traps" - due to
common traits of human personality, they cause people to spend a lot of
time in unproductive pursuits. Regular expressions are a classic one. I
argue that intricate lambda lists are another. "Hey, this language is so
powerful that I can design _just the right_ arguments for my procedure!"

I've done that more than a few times, and the arguments never turned out
to be "just right"...

>      > Scheme is a better language because '() and #f are different objects.
>     After a year of intensive Scheme usage, I'm still not sure which way is
>     better. Coming from CL, the lack of nil-punning took a lot of mental
>     adjustment.
> CL and Scheme look like very different languages to me.

True. Everything is so messy without named let :)

> Logically, '() and #f are very different things. Why lump them together?
> You lose abstraction and expressiveness.

Sometimes that's the goal. Differentiating things is only good if the
difference is meaningful to the programmer. If it's meaningless, then
it's confusing that there's a difference.

If there is a difference but it doesn't look meaningful, it may also be
a sign that one hasn't fully grasped the concept behind it.

For example, to many programmers the difference between symbols and identifiers may not be meaningful. Say, because they come from a language without hygienic macros.

Once you have understood the difference, you have understood the concept of hygienic macros.
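A standard illustration of that difference (not from the original mail): syntax-rules identifiers carry their binding context, so a macro-introduced name cannot capture a caller's variable with the same spelling:

```scheme
;; The tmp introduced by the macro is a different identifier from
;; any tmp at the call site, even though both print as the symbol tmp.
(define-syntax swap!
  (syntax-rules ()
    ((_ a b)
     (let ((tmp a))
       (set! a b)
       (set! b tmp)))))

(let ((tmp 1) (x 2))
  (swap! tmp x)
  (list tmp x))  ; => (2 1), hygiene prevents capture
```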

JavaScript's null vs undefined is a good example. I can never remember
what the point of "undefined" is, and in what situations it's different
from null.

In Scheme, you could transparently add an "undefined" value as well.
Just add to the specification that "(if #f #f)" and procedures called
solely for their side effects shall return "undefined" rather than some
unspecified value. It may not make much sense from a logical point of
view, but it makes some sense for the REPL, which can choose not to
print the "undefined" value.
So, if you interpret "undefined" as "nothing", there is a rationale for
why it differs from null in JavaScript. But without any such
interpretation, it is meaningless.
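One way to picture that REPL behaviour, assuming a hypothetical unique token (nothing here is standard Scheme; the names are made up):

```scheme
;; A sketch of the proposal: "undefined" is a unique, unforgeable
;; object, and the REPL's printer simply skips it.
(define undefined (list 'undefined))   ; eq?-unique token

(define (undefined? x) (eq? x undefined))

(define (repl-print value)
  (unless (undefined? value)
    (write value)
    (newline)))
```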

That '() and #f really mean different things is shown by the many
list-processing procedures where it is natural to distinguish between an
empty-list result and a false result.
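A concrete case where the distinction pays off is alist lookup: assq uses #f for "not found", which leaves '() free to be a legitimate stored value. A small example:

```scheme
;; With nil-punning, "found, with the empty list as value" and
;; "not found" would collapse into the same result.
(define alist '((tags . ()) (name . "r7rs")))

(assq 'tags alist)   ; => (tags), i.e. found, value is '()
(assq 'owner alist)  ; => #f, i.e. not found
```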