Subject: Re: garnet performance issues
From: Erik Naggum <erik@naggum.no>
Date: 1999/02/12
Newsgroups: comp.lang.lisp
Message-ID: <3127786748613194@naggum.no>

* Duane Rettig <duane@franz.com>
| Your arguments sound like the "quality" argument (i.e. "do it right the
| first time", "strive for zero defects"), and mine comes from a slightly
| different point of view that I believe is in harmony with the quality
| perspective, but which is often forgotten.

  um, no, not quite.  I don't believe it is possible to do anything right
  until you know what "right" means, and this is not available a priori,
  despite what all the lectures in programming methodology and systems
  design would have poor students believe.  (if it were, we would have had a
  perfect society, too -- law is nothing more than programming society, so
  if it were possible to plan for everything, it would have been worked out
  many millennia ago.  I'm amazed that people think large-scale experiments
  in top-down design like dictatorships and planned economies would fare
  any better in any other endeavor involving creative people.)  therefore,
  you have to go through various iterations, and your design must be
  prepared for this, but that is not to say that you should _not_ expect
  success, or actively prohibit it.

  the issue I want to raise is whether people should approach the first
  version with "let's try to get this right" vs "we're going to throw this
  one out, so let's not waste too many resources on it".  the whole point
  with wasting the first version is to waste _only_ the first version, at
  worst.  people who set out to do something that will be a waste in the
  eyes of managers who cannot understand why their employees do not know
  enough to get it right the first time, will _not_ learn all there is to
  learn and will _not_ ensure that only the first version gets wasted, but
  will waste version after version.

  now, if they do get the first version right, and don't need to waste it at
  all, that's a very commendable result, but if you only set out to
  "prototype", this is no longer an option, either by choice of language or
  tools or something else that everyone knows was intended to be wasted,
  and would feel bad if it were to go into production.  that's what I think
  is wrong about using specific languages to prototype things and talking
  about a language as "good for prototyping".  it is invariably understood
  as "not good for production code".  I have to fight this impression.

  in conclusion, it is hard enough work to do it right the second time,
  when you have all the data from the first try available, without also
  having to do it a third time because you weren't _allowed_ to go all
  the way in the first attempt.

| Another way to put it is "Don't hang on to the first thing you did."

  my argument is that the reverse is equally valid: don't _always_ throw
  out the first thing you did.

| However, I submit that lispers prototype constantly, and that every time
| you type "(defun foo () ...)" to the prompt, you are prototyping
| something that you _intend_ to throw away (otherwise you would have put
| it into a file!).

  ok, that's the gist of your disagreement.  I think intent to throw away
  is the bad part about prototyping.  throwing away is something you should
  be prepared to do without much concern, but you should strive to make it
  usable, too.  if you intend to throw it away, the threshold to keep it is
  set too high.  now, what if it actually does work the way you wanted it
  to?  do you still throw it away?  if you prototype in Lisp and write
  production code in C, you would have to, right?  that's bad, not only
  because doing something you have already done over again is stifling,
  but also because the problems in C are different from the problems in
  Lisp, so you have to do a lot of redundant work.

| The world is full of programmers stuck on their own dinosaurs because
| they won't let go.

  well, letting go is as much a part of caring about something as not
  letting go.  sadly, our cultures are built by people who couldn't let go,
  so everything around us is shaped by an unhealthy conservatism, and
  because people are afraid of having to suffer as much pain every time
  they manage to get something to work somewhat right, we have a lot of
  stuff that was shaped by a trial-and-error process that stopped as soon
  as the errors were tolerably small.  had people been able to let go, and
  use their knowledge from what failed, perhaps we could move on.  in my
  view, the patent system was supposed to let inventors capitalize on the
  works of others and make _significant_ improvements, while the patent
  effectively forbade the meager insignificant improvements.  a wonderful
  mechanism that should be applauded and made much stronger, it has
  instead turned into a monster that stifles innovation by admitting
  overly broad and senseless patents, because the wrong kind of people
  approve them.  this is just another example of how society is shaped by
  conservatives who have no clue _what_ they are conserving, only that
  change is bad.
  
| In which case you have already rewritten (at least parts of) your
| application several times over.  But to be free to do this rewriting, you
| have to have made a correct estimate of how long the project will take.
| If you factor in the prototyping (call it "learning curve", or "concept
| development") time, you will have the time to do this.

  ironically, you always have time to do that, despite what everyone says.
  the key is to get something to the point where it does not fail, and then
  you can change everything as long as it continues not to fail.  people
  always have time to improve things, and they embrace things that get
  improved in oh so minor ways.

| But if they hear bad news from someone they know can and has done the job
| on time in the past, they tend to be much happier than when they hear
| good news from someone that they know they are going to have to plan
| "slop" time for.  And for those programmers who gain the reputation of
| getting work done on time, the subject of prototyping should rarely even
| come up.

  well, there's often a "fuzz factor" you can add to or multiply with what
  people say.  a good manager learns the fuzz factor and doesn't want to
  change it.  some programmers get all the big work done in the first few
  days and then spend a lot of time fixing minor bugs, while others tinker
  with fundamentals 95% of the time and whip up a marvelous piece of art in
  the last few days.  the key to managing is to understand what you deal
  with, and programmers are predictable _one_ at a time, even when they are
  (supposedly) part of a team.

  incidentally, getting something to work is child's play.  where you need
  the expertise is in making it not fail, or fail gracefully.  and I don't
  think a prototype that only shows how it works is useful at all, since
  you learn nothing about surviving the failure modes by having something
  that only "works".

  anyway, I like Common Lisp because I can write enough serious code in it
  to watch how it actually deals with a hostile world (specifically, socket
  code that distributes functionality across hundreds of computers over
  thousands of little stretches of cable and routers that people upgrade
  and add bogus routes to and all kinds of shit) and then rewrite it before
  the hostilities result in harming anything.  also, Allegro CL is the
  first environment I have used which allows me to use all I know and learn
  about the hostile world without writing tons and tons of code.  when
  something causes a crash only once a year, you ignore it in C, because
  dealing with it might change your whole design and basically send you
  back to the drawing board for a year; that isn't worth the cost, so you
  just suffer the crash once a year.  it might take a week or a month to
  redesign to survive such a problem in Common Lisp, but you can actually
  afford to take care of it.  this means that the next time you have to
  deal with this kind of problem, the competent Common Lisp programmer gets
  it right, while the competent C programmer still doesn't know what it
  would take to solve that particular problem because he hasn't been there.
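
  as a rough sketch of that last point (fetch-from-peer and the
  route-flap condition are made-up stand-ins, not the actual socket
  code), this is the kind of local, affordable change the condition
  system allows: the rare failure is handled right where it occurs
  instead of sending you back to the drawing board:

    (define-condition route-flap (error)
      ((peer :initarg :peer :reader route-flap-peer))
      (:report (lambda (c s)
                 (format s "lost route to ~A" (route-flap-peer c)))))

    (defun fetch-from-peer (peer)
      ;; stand-in for the real socket work; here it just signals the
      ;; kind of failure we want the system to survive.
      (error 'route-flap :peer peer))

    (defun fetch-with-fallback (peers)
      ;; try each peer in turn; a route-flap from one peer is handled
      ;; locally instead of taking the whole system down.
      (dolist (peer peers (error "no peer reachable"))
        (handler-case (return (fetch-from-peer peer))
          (route-flap (c)
            (format *error-output* "~&skipping peer: ~A~%" c)))))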

#:Erik
-- 
  Y2K conversion simplified: Januark, Februark, March, April, Mak, June,
  Julk, August, September, October, November, December.