From: Steve Haflich

Subject: Re: Bug in macrolet?

Date: 1999-7-6 23:21

   From: Jeff Dalton <jeff at aiai.ed.ac.uk>

   Subject: Re: Bug in macrolet? 
   To: Steve Haflich <smh at franz.com>, Antonio Leitao <aml at gia.ist.utl.pt>
   Cc: <allegro-cl at cs.berkeley.edu>
   
   Steve Haflich <smh at franz.com> wrote:
   
   I would certainly *expect* an error when both running interpreted
   and when the code is compiled.

This "expect" concept isn't appropriate when analyzing what a standard
says.  It is appropriate only when drafting such a standard, and when
considering whether you like the standard.  I certainly agree that the
interpreter behavior violates a CL macro programmer's notion about the
"whenness" of variables, and it would probably be better in this case
if some sort of error were signalled (because the code isn't portable
and is probably massively confused), but I'll argue with some examples
below that it is very hard to get a herd of CL programmers to agree on
the details.
   
   So ... here is a relevant passage (from the "Special Operator FLET,
   LABELS, MACROLET" entry in the Hyperspec):
   
      The macro-expansion functions defined by macrolet are defined in
      the lexical environment in which the macrolet form appears.
      Declarations and macrolet and symbol-macrolet definitions affect
      the local macro definitions in a macrolet, but the consequences are
      undefined if the local macro definitions reference any local
      variable or function bindings that are visible in that lexical
      environment.
   
   The remaining question is how does Allegro manage to get the behaviour
   it does, since it is (I'd say) rather unexpected?  And why?
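
The defined half of that passage -- outer macrolet definitions being
visible to the expander functions of an inner macrolet -- can be
illustrated with a small conforming sketch (hypothetical names, not
from the thread):

```lisp
;; The inner expander body calls SQUARE at macroexpansion time.
;; SQUARE is an outer MACROLET definition, so per the quoted passage
;; it is part of the lexical environment in which the inner expander
;; is defined.  (SQUARE 3) expands to (* 3 3) and evaluates to 9
;; while NINE is being expanded, so (NINE) expands to the literal 9.
(defun nine-demo ()
  (macrolet ((square (x) `(* ,x ,x)))
    (macrolet ((nine () (square 3)))
      (nine))))
```

Calling (nine-demo) returns 9 in any conforming implementation; only
macro definitions, not variable or function bindings, are relied on
from the surrounding environment.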

Neil explains it correctly below.

   From: Neil Goldman <isi.edu at goldman>
   
   It doesn't seem all that hard to come up with this behavior.  The ANS seems
   to say that the body of a lexical macro (macrolet) definitely has access to
   PART of the lexical environment in which it is defined (declarations and
   other lexical macros) but NEED NOT have access to lexical variables (which
   of course would make no sense).  If the interpreter simply makes the ENTIRE
   lexical environment available, you would get the behavior being described.  
   
   What is surprising about the ANS is that it does not require an error to be
   signalled when a lexical variable is referenced.  I suspect (but with no
   explicit evidence) that they simply intended to allow (but not require) an
   implementation to let a reference to a variable X be interpreted as a
   reference to a global definition of X (e.g., a defconstant) in this
   context, even if a lexical binding of X intervened, not that they intended
   to allow an implementation to actually make a lexical binding of X visible
   in interpreted code.  But that is pure speculation on my part.
   
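The conforming way to get a lexical value into a local macro's
expansion is to pass the form as a subform of the macro call, rather
than have the expander body reference the binding.  A minimal sketch
(hypothetical names):

```lisp
;; Undefined consequences:  (macrolet ((m () x)) (m))
;; inside a function whose parameter is X -- the expander body
;; references a lexical variable binding.
;;
;; Defined:  the variable appears as a subform of the macro call,
;; so the expander only sees the symbol X as data and splices it
;; into the expansion, where it is evaluated at execution time.
(defun pair-of (x)
  (macrolet ((m (v) `(list ,v ,v)))
    (m x)))        ; expands to (LIST X X)
```

Here (pair-of 1) returns (1 1) portably, because the expander never
touches the binding of X, only the form X.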
>>And why?
That is certainly a good question -- I don't think anyone would WANT
the behavior you have uncovered, even if the ANS allows it.  I've
already agreed.  Now let's see if we can deal with the details.

The ANS says that the results are undefined if reference is made to a
lexically-defined function or variable.  (The ANS elsewhere gives
reference to block names the same status.)  But it does _not_ say that
the macrolet body does not see surrounding definitions.  Indeed, the
implication of the text is that it _does_ see them.  Here is an
example in silly but conforming code:

(defmacro with-breakpoint (func &rest subforms &environment e)
  (let ((form (macroexpand `(,func ,@subforms) e)))
    `(progn (when *breakpoint*
              (break "Breakpoint: ~s" ',form))
            ,form)))

(defun foo (a)
  (macrolet ((bar (z) `(* ,z ,z)))
    (flet ((bar (x) (1+ (bar x))))
      (with-breakpoint (bar a)))))

It is very clear that when the compiler compiles the innermost call to
bar, either the flet bar must be visible in the lexical environment in
order to shadow the macrolet, or else the environment at that point
must somehow have the macrolet removed from it.  To me it seems
implementationally simpler just to leave the flet in the environment.

In summary, and IMO, the violation of expectations comes about because
CL is not congruent in these two things.  CL more or less defines the
_defined_ semantics in such a way that macroexpand time is distinct
from execution time, but the lexical name shadowing of both the
function and variable namespaces merges macroexpand-time names and
execution-time names.  This is fundamentally illogical, but it derives
from the history of Lisp as an interpreted language where these two
times were not separate -- fsubrs and all that.
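That shadowing is directly observable: MACRO-FUNCTION, given an
environment object, returns nil when a local function binding shadows
a local macro definition of the same name.  A small sketch
(hypothetical names, and assuming only the standard behavior of
MACRO-FUNCTION with an &environment argument):

```lisp
;; KIND-OF-BAR asks, at macroexpansion time, whether BAR names a
;; macro in the environment E at its point of use.  Because the FLET
;; binding of BAR shadows the MACROLET, (macro-function 'bar e)
;; returns nil and KIND-OF-BAR expands to 'FUNCTION.
(defun probe-bar ()
  (macrolet ((bar (z) `(* ,z ,z)))
    (flet ((bar (x) (1+ x)))
      (macrolet ((kind-of-bar (&environment e)
                   (if (macro-function 'bar e) ''macro ''function)))
        (list (bar 5) (kind-of-bar))))))
```

(probe-bar) returns (6 FUNCTION): the call (bar 5) goes to the flet,
and the probe confirms that the macrolet definition is no longer
visible at that point in the environment.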