Subject: Re: Storing macro dependencies in the in-memory function object (was Re: When to recompile/reeval?)
From: Erik Naggum <>
Date: 20 Oct 2002 13:51:01 +0000
Newsgroups: comp.lang.lisp
Message-ID: <>

* Tim Bradshaw
| Lisp applications which take a long time to recompile should, I guess,
| either be doing something very complex in the way of really hairy
| macros, say, or be huge.

  Just to clarify: More than 100 ms is a long time if you have to do it in
  order to test-run a macro.  That is sufficient to interrupt your flow of
  thinking and working.  When I develop with Allegro CL and Emacs, I tell
  it not to compile with (setq fi:lisp-evals-always-compile nil) on the
  Emacs side, and I happily go about M-C-x'ing definitions, run stuff in
  the background, and test code.  Code that matures is saved, compiled, and
  loaded, but code in development generally runs interpreted.  I really
  like that Allegro CL makes this so streamlined and painless.
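
  To make this concrete, here is a trivial sketch (the macro and the
  function are made up for illustration, and whether the evaluator
  re-expands macros on every call is implementation-dependent, but
  Allegro CL's interpreter behaves this way):

    (defmacro with-logging (&body body)
      `(progn (format t "~&entering~%") ,@body))

    (defun try-it ()                     ; M-C-x'ed, left interpreted
      (with-logging (+ 1 2)))

    (defmacro with-logging (&body body)  ; now change the macro ...
      `(prog1 (progn (format t "~&entering~%") ,@body)
         (format t "~&leaving~%")))

    ;; (try-it) picks up the new expansion without any recompilation,
    ;; whereas a compiled TRY-IT would keep the old expansion until it
    ;; was recompiled or re-evaluated.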

| My system of ~22,000 lines takes 13 seconds to build from cold, or ~26
| seconds to build + dump an image.

  And I think Unix `ls´ takes a long time to run on directories with 1000
  files, so Emacs `dired´ is something other than instantaneous.

| Sometimes if I change a major macro I just blow everything away and
| rebuild, because I know my system declarations have bugs.

  Precisely.  And you cannot do this every time you change a macro.

  I have always developed mini-languages, even when I worked mainly in C,
  and macros give me the opportunity to reprogram things and have code that
  reasons about the code I write.  Macros do a lot of the work in my code.
  When I want to change something important, it is always in the macros,
  and minor changes can have major effects.  Neat protocol descriptions are
  usually reworked by macros to produce reams of code that is extremely
  hard to follow, but very efficient.  Perhaps I "overuse" macros, but the
  ability to run interpreted correctly is essential.  If macros were harder
  to use, or function redefinition did not work, I would not be able to
  write this kind of code at all.  What I would have done then is hard to
  tell, but that I would have ended up with less intelligent and messier
  code is virtually guaranteed.
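
  A made-up sketch of what I mean by "reworked by macros" (READ-BYTES
  and the whole DEFINE-MESSAGE shape are invented here for
  illustration, not taken from any of my actual code):

    (defmacro define-message (name &rest fields)
      ;; Each field is a (name size) pair.
      `(progn
         (defstruct ,name ,@(mapcar #'first fields))
         (defun ,(intern (format nil "READ-~A" name)) (stream)
           (,(intern (format nil "MAKE-~A" name))
            ,@(loop for (field size) in fields
                    append (list (intern (string field) :keyword)
                                 `(read-bytes stream ,size)))))))

    ;; (define-message header (version 1) (length 4) (checksum 2))
    ;; expands into a DEFSTRUCT and a READ-HEADER whose body is the
    ;; boring field-by-field decoding nobody wants to write by hand.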

  E.g., I have been working on and off on a regular expression machine that
  does as much as possible at compile-time, such as maintain multiple paths
  through the graph in parallel (breadth-first) instead of backtracking and
  wasting lots of time if the order of alternatives is not optimal.  This
  came up when I sat down to reimplement the Common Lisp reader: if I could
  compute the integer value while reading a potential number digit by digit
  and make other decisions along the way, I could exercise the cache more
  fully and read only once from uncached memory.  Writing code that does
  this manually is very, very hard, especially if you want to get it right.
  Writing macros that arrange multiple execution paths in parallel is not
  easy, but at least it is much easier than writing what they produce.
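
  The digit-by-digit idea itself is simple enough to sketch (the
  function name is mine, and the real reader also has to deal with
  floats, ratios, packages, and the readtable, so this is only the
  flavor of it):

    (defun read-token-value (string)
      ;; Accumulate the integer value while classifying the token in a
      ;; single pass over the characters.
      (let ((value 0)
            (integerp (plusp (length string))))
        (loop for char across string
              for digit = (digit-char-p char 10)
              do (if digit
                     (setf value (+ (* value 10) digit))
                     (setf integerp nil)))
        (if integerp
            value
            (intern (string-upcase string)))))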

| It annoys me that no such tool exists though, especially as one of the
| things the system above does (or did) was to construct these kinds of
| dependency relationships for C++/CORBA systems automatically...

  This may be related to the curse of macros: You cannot in general expect
  to understand fully what some code would compile to without being the
  compiler.  In inferior languages, the code you write is probably the code
  the machine will run.  In Common Lisp with lots of macros, the code you
  write is only data for lots of other programs before it becomes code for
  the compiler to arrange for the machine to run.  It takes some getting
  used to.
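
  The only way to see what a form becomes is to ask the expander, and
  even then the answer is implementation-dependent, which is rather the
  point:

    (macroexpand-1 '(dolist (x '(1 2 3)) (print x)))
    ;; One implementation returns a DO loop, another a BLOCK/TAGBODY;
    ;; stack user-written macros on top of that and the distance between
    ;; what you wrote and what the compiler sees only grows.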

Erik Naggum, Oslo, Norway

Act from reason, and failure makes you rethink and study harder.
Act from faith, and failure makes you blame someone and push harder.