Subject: Re: Lisp advocacy misadventures
From: Erik Naggum <>
Date: 25 Oct 2002 22:56:26 +0000
Newsgroups: comp.lang.lisp
Message-ID: <>

* Tim Daly, Jr.
| I was talking with a friend of mine about Lisp.  He said that people
| write things in C because of speed.

  But this is incorrect.  People use C because it /feels/ faster.  Like, if
  you build a catapult strong enough that it can hurl a bathtub with
  someone crouching inside it from London to New York, it will feel /very/
  fast both on take-off and landing, and probably during the ride, too,
  while a comfortable seat in business class on a transatlantic airliner
  would probably take less time (except for getting to and from the actual
  plane, of course, what with all the "security"¹) but you would not /feel/
  the speed nearly as much.


| I said that Lisp will not necessarily cause a program to be slow, and in
| fact, because it lets you write a better program, things may even get
| much faster.  He said 'like what?'
| Hmm.

  Better algorithms and type systems are well known, among those who
  actually study these things, to produce better performance.  It is often very
  hard to implement better algorithms correctly and efficiently in C
  because of the type poverty of that language.  Yes, you get to tinker
  with the bits as fast as the machine can possibly tinker, but, and this
  is the catch, you get to tinker with the bits.  If you are not super smart
  and exceptionally experienced, the compiler will produce code that is
  faster than yours.  If this holds from assembly to C, it holds from C to
  Common Lisp, given that you want to do exactly the same thing.

  The core problem is that C programmers think they can get away with doing
  much less than the Common Lisp programmer causes the computer to do.  But
  this is actually wrong.  Getting C programmers to understand that they
  cause the computer to do less than minimum is intractable.  They would
  not /use/ C if they understood this point, so if you actually cause them
  to understand it in the course of a discussion, you will only make them
  miserable and hate their lives.  People are pretty good at detecting that
  this is a likely outcome of thinking, and it takes conscious effort to
  brace yourself and get through such experiences.  Most people are not
  willing even to /listen/ to arguments or information that could threaten
  their comfortable view of their own existence, much less think about it,
  so when you cannot answer a C programmer's "arguments" that his way of
  life is just great the way it is, it is a pretty good sign that you let
  him set the agenda once he realized that his way of life was under threat.
  Since you have nothing to defend, your self-preservation instinct will
  not activate hitherto unused parts of your brain to come up with reasons
  and rationalizations for what you have done, so you will not be aware that
  you have been taken for a ride before it is over and you have "lost".

  If you deny people the opportunity to defend something they feel is under
  threat, however, some people go completely insane with rage and actually
  believe that you threaten them on purpose and that you willfully seek to
  destroy something very valuable to them.  But some of the time, you
  meet people who /think/ and who are able to deal with threats in a calm
  and rational way because they realize that the threat is all in their head
  and it will not go away just because they can play word games with people
  and stick their head in the sand.  If it /is/ the threat they feel it is,
  they realize they had better pay some real attention to it instead of
  fighting off the messenger so they can feel good about themselves again.

  Much of the New Jersey approach is about getting away with less than is
  necessary to get the /complete/ job done.  E.g., perl is all about doing
  as little as possible that can approximate the full solution, sort of like
  the entertainment industry's special effects and make-believe, which
  for all practical purposes /are/ the real thing.  Regular expressions are a
  pretty good approximation to actually parsing the implicit language of
  the input, too, but the rub with all these 90% solutions is that you have
  /no/ idea when they return the wrong value because the approximation
  destroys any ability to determine correctness.  Most of the time, however,
  the error is large enough to cause a crash of some sort, but there is no
  way to do transactions, either, so a crash usually causes a debugging and
  rescue session to recover the state prior to the crash.  This is deemed
  acceptable in the New Jersey approach.  The reason they think this also
  /should/ be acceptable is that they believe that getting it exactly right
  is more expensive than fixing things after crashes.  Therefore, the whole
  language must be optimized for getting the first approximations run fast.

  See how elegantly this forms a completely circular argument?  But if you
  try to expose this circularity, you necessarily threaten the stability
  of the whole house of cards and will therefore be met with incredible
  hostility and downright hatred, and you will not even hear about the
  worst fits of insane rage until years later when some moron thinks he can
  get back at you for "hurting" him only because his puny brain could not
  handle the information he got at the time.

| Well, I'm blinded by the very misconceptions that led me to this point,
| and I'm not sure what to tell him.  Can you help me out?

  Ask him why he thinks he should be able to get away with unsafe code,
  core dumps, viruses, buffer overruns, undetected errors, etc., just because
  he wants "speed".

Erik Naggum, Oslo, Norway

Act from reason, and failure makes you rethink and study harder.
Act from faith, and failure makes you blame someone and push harder.