Subject: Re: Legal ANSI comment form?
From: Erik Naggum <erik@naggum.no>
Date: 28 Aug 2002 01:14:05 +0000
Newsgroups: comp.lang.lisp
Message-ID: <3239486045052686@naggum.no>

* Frode Vatvedt Fjeld
| My concern is not to push CL towards being less readable than need be.

Curiously, we have exactly the same concern.  How about that?

| > [...] How do you /read/ #+feature?  What does it /mean/ to you?
|
| Well, how do you read out #+(or), exactly?

I would appreciate an answer to my questions before you fire off
counter-questions as if this were some sort of political campaign.

Since I understand how (or) works in general in Common Lisp, I find no
problems reading it exactly as it stands: First, use the following expression
when the feature expression holds.  And second, which is never.  Likewise,
(and) always holds.  I also think of (or x y z) as the first true value of
x, y, and z and (and x y z) as the first false value of x, y, and z.
Contrary to how many people read (< a b c), I read it as testing whether a,
b, and c, in that order are monotonically increasing values.  This is
probably because I read < as "less than", I think of the symbol as
indicating growth and increasing values.  Color me different, but long ago,
I discovered that so much of mathematics became unduly hard to grasp if I
thought in terms of operations because the operations quickly became so
involved that I could not efficiently abstract them away.  It is crucial that
one understands the relationships involved rather than the operations to
compute the values if you want to think mathematically.  I see that people
who think in terms of operations never get very far in mathematics, either, so
I think that I got this part right.

The identity under veljunction is false and the identity under conjunction is
true, which is precisely what (or) and (and) stand for.
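Spelled out at the listener (the return values follow directly from the
standard definitions of or and and with no arguments):

(or)    => nil            ; identity under veljunction: always false
(and)   => t              ; identity under conjunction: always true

;; hence, as feature expressions:
#+(or)  (never-read)      ; (or) never holds, so the form is skipped
#-(and) (never-read)      ; (and) always holds, so #- skips the form, too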

| My position is to mold that language into being as useful and readable as
| possible, rather than holding it to some measure of conceptual purity.

If there is a conflict here, you have the conceptual purity wrong.  I think
you are massively overreacting to something quite different from what is at
hand, and I cannot determine what it is.

| No, #+false or even #+ignore would mean "process the following when the
| feature that is guaranteed not to be present in any system is present".

The specification makes no such guarantee for any individual /feature/ -- you
had to create it ex nihilo.  (or) and
(and), however, embody that guarantee for /feature expressions/.

| Which in the current language can be spelled out as #+(and x (not x)), which
| can be reduced to #+(or).

It is nothing if not silly first to invent a convoluted form that does not
follow from the premises and then to reduce it (I am not aware of any rules
that make such a reduction possible, though), but I guess this is how you
miss the elegance of (or) and (and).  I see this as evidence of a lack of
appreciation of the identity values of operators, which seems incongruous
and unexpected.
I hope you will find the time to stop warring so stupidly and tell me /why/
you invented this convoluted form and how you reduced it, in that order.

| By endorsing #+(or) you have clearly already made that leap from "does this
| feature exist?" to "evaluate this expression", so I really can't understand

I have seldom seen less honest argumentation and equivocation in this
newsgroup.  I have not made any leap like you say I have.  Damn you for
imputing such stupidity to me!  Just because you see things differently does
not give you any right to make claims about what other people think.
Do you understand this much?  What /is/ it you fight against?  What possible
motivation could you have to engage in such dirty politics just because you
do not "like" an existing facility and have to invent your own bogosity?

The language of feature expressions does not involve evaluation.  Why do you
fail to understand this?  Read the goddamn specification and understand what
it says!  I believe, however, that the reason you are so confused is that you
think a stupid stunt like #+ignore is a good idea and want to abuse the
feature expression for some pseudo-natural language processing without
actually understanding the mechanism.  You have argued for a position that
is inconsistent with the actual mechanism and argue against positions that
are consistent with the existing facility.  Since you /invent/ "guarantees"
that some features will never be in a system without any support for
such a claim from the specification, you are clearly not talking about the
existing feature framework, but one where feature names "mean" something to
the reader when interpreted in the reader's natural language, and you ignore
the counter-argument that the vocabulary that programmers would draw from to
invent "meaningful" symbols is unbounded, whereas the vocabulary of the
feature expression language is well-defined.  Then you have the gall to
claim that others /also/ "evaluate" the feature expression and the dishonesty
to put words in other people's mouths.  This is clearly not a "fight" you
think you can win with argumentation alone, but have to fight it out with
idiotic politics and blame-throwing and imputing positions to your
opposition that are far easier to refute than their real positions.  I really
had thought higher of you than to expect this kind of shit from you.

| I never branded anyone's understanding of this, I was describing the
| language/notation that would result from adopting #+(or) etc.

It seems that you make up your mind about consequences without thinking very
carefully about any arguments.  I think you just "feel" that #+(or) is ugly
and react emotionally and unintelligently to it without examining your own
reaction.  Then you blame other people for your emotional reaction and
impute all sorts of evil intentions to them when none such exist.  Right?

That you think something bad will result from A does not mean that A is bad
/until/ it is taken to the extreme, which normal people would not do.  To
argue that nothing but the extreme position exists is unbelievably stupid
and counterproductive.  The fact that /you/ think "false" will never be on
the feature list and others "ignore" and "never" and perhaps "nil" or whatnot,
however, is cause for concern.  There is no natural restriction on the words
that some people will think are never going to be used.  People use symbols
in their natural language, for crying out loud.

| What do you expect the average new CL student's reaction to seeing #+(or)
| will be?

I expect the average new Common Lisp programmer to want to know what it
means before they react emotionally like some retarded and judgmental idiot
who cannot deal with the unexpected and goes screaming to mom when he has to
learn to deal with it.  I expect the Common Lisp programmer to be smarter
than your average person who judges based on his first impression and never
thinks again.

| Fuzzy pre-understanding is a good thing, it's how everyone gets by in a
| complex world.

Sure.  But programming is about real and fundamental understanding of the
world you want to control and not just to "get by in a complex world".  That
is, unless you want to communicate lack of precision and fuzzy thinking, in
which case a lot of stupid things that people do should be tolerated by their
programming languages.  I do not want to go down that road.  I see that you
have taken that road and are quite happy with it.  Then you have the gall to
argue that you fight against useless conceptual purity.  How fucking annoying!

| Also in a learning situation it is almost essential, so long as the implied
| understanding follows the pre-understanding.

It is crucial that when it does not, the student /thinks/ and does not leap to
judgment based on his pre-understanding emotional responses.  More often
than not, the pre-understanding is faulty and needs to be updated or even
entirely replaced.  If you are the kind of person who judges first and never
thinks, I should think programming computers would be so frustrating that at
the very least you never learn Lisp.

| But I believe in this case the issue is rather about something quite
| different, which might be called fuzzy post-understanding.  After you have
| understood something, you keep it in your active vocabulary by means of any
| number of fuzzy mechanisms.  I believe that's how the brain works.

I do not disagree with this in general terms, but I find it stunning that you
use this theory to reject something you have clearly not thought much about.
The use of a feature with a name that one hopes should communicate to other
programmers a desire not to use it is remarkably naïve.  This would be a good
argument in favor of obscenities instead of normal words.

It is only when you do not understand how feature expressions work that you
can dream up something like #+ignore.  If you understood how they work, you
would not even /want/ to invent this, but would already understand that you
could achieve it with #+(or) and #-(and).  But if you first see #+ignore, it
takes really good understanding to realize that what people mean is #+(or).
(Thanks again, Paul.)  #+ignore spreads because it looks OK to the untrained
eye, not because it is linguistically sound.  The idea that one should test
for named features that are not even going to be present is severely misguided.

| In my view, #+(or) provides very little for that part of the brain to work
| with (in other words, it's unreadable), and for no particularly good reason
| other than some sense of purity that, in this smallest and simplest of
| languages, is worthless.

So much hostile judgment and so little desire to understand.  I could cry.

You have not even tried to understand it, but have made up your mind long
before understanding set in that this was unreadable.  Such judgmentality is
really harmful to achieving useful results.  Mind you, I paid no attention
to #+ignore /until/ I thought about it and saw how awfully stupid it is.  It
was "acceptable" because I ignored the actual meaning of #+ and looked only
at the word "ignore".  This also caused me to misread feature expressions
that were more complex.  From what you say, you do exactly the same thing:
You do not find #+(or) readable /because/ you think #+ignore should make
sense when it clearly does not.

| > If you wish to make sense and convince people that such abuse should be
| > tolerated, you have to remove some of the emotional ties you have
| > presented and back it up with reasoning.  OK?
|
| I tried to, actually.

You failed.  You keep imputing positions to me that I do not hold.  I hate
it when people lack the willingness or perhaps ability to distinguish between
what they think somebody must have meant and what they have heard them say.

Like, today, I learned that Fraunhofer and Thomson want a small license fee
(USD 0.75) for every MP3 decoder (not just the encoders), and a rather naïve
friend of mine was outraged -- not because of the facts he had just learned
himself, but because he had to invent some motives for this when he could
not accept the position emotionally, and then got all worked about these

| The short form is this: 1. We need a form-scope comment syntax.

"Form-scope"?

| 2. Since we already have #+ and #-, using up another macro character might
| not be worth it (but if the distaste for "intellectual sloppiness" is very
| strong, it might be worth it after all, #; could be a strong candidate).

I think #; is a really good idea.  I have embellished it with a count, using
the usual infix numeric argument to skip that many expressions.  This may
not be a great idea, but it is conveniently available.

(defun sharp-semi (stream char count)
  (declare (ignore char))
  (dotimes (i (or count 1))
    (read stream t nil t))
  (values))

(set-dispatch-macro-character #\# #\; 'sharp-semi)

This means we can write (foo bar #;zot quux) and (foo bar #2;zot quux)
instead of (foo bar #|zot|# quux) and (foo bar #|zot quux|#), respectively.
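Assuming sharp-semi has been installed in the current readtable as above, a
quick check at the listener (primary return value shown):

(read-from-string "(foo bar #;zot quux)")   => (FOO BAR QUUX)
(read-from-string "(foo bar #2;zot quux)")  => (FOO BAR)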

| 3. #+(or) etc. is too cryptic.

This is an emotional argument from ignorance, and consequently has no weight
at all.

| 4. So define a less cryptic false feature expression.

So learn the language and use it to its fullest extent.  Shying away from
certain forms of expression because some people might not get it is wrong.