Subject: Re: garnet performance issues
From: Erik Naggum <>
Date: 1999/02/14
Newsgroups: comp.lang.lisp
Message-ID: <>

* Erik Naggum <> writes:
| the whole parameterized type nonsense is a testament to bad language
| design.  it is completely devoid of meaning.

* Juliusz Chroboczek <>
| I beg to disagree.

  um, what part of it do you disagree with?

  in case it hasn't been clear, I have been trying to show that Common Lisp
  _has_ what C++ calls parameterized types, and has had it since way before
  C++ was conceived.  the overall context here is "why reject Common Lisp?"
  and one argument was: "it doesn't have parameterized types", where
  "parameterized type" is a language-specific feature in C++, much hyped
  because the language lacks everything that would make this a _natural_
  thing in C++.  the whole idea of adding parameterized types to a language
  that doesn't have a useful way of representing type information at
  run-time is just silly.
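  to make this concrete, here is a minimal sketch of what Common Lisp's
  compound type specifiers already give you; all of this is standard
  TYPEP behavior, no extensions assumed:

```lisp
;; compound type specifiers are the "parameterized types" of Common
;; Lisp, and they are ordinary data you can test against at run-time.
(typep 42 '(integer 0 255))           ; => T
(typep 300 '(integer 0 255))          ; => NIL
(typep "abc" '(string 3))             ; => T
(typep (list 1 2 3) '(cons integer))  ; => T, a cons whose car is an integer
```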

  so, I'm not arguing against _types_.  criminy.  I'm not arguing against
  the ability to reason about code.  how could I be a strong proponent of
  type inference and an author of code analysis tools if I were?  I'm not
  arguing
  about the usefulness of type calculus.  geez, my University education was
  all about static typing and verification and all that stuff that has yet
  to prove itself because it's just too hard on programmers to tell these
  things all they need to know.

  I'm arguing against the stupid disdain for dynamic typing and run-time
  type information (in C++ terms) except as an inefficient last resort when
  it's too hard to do it statically.

| Whether type declarations are compulsory in the code, optional, or not
| written at all, and whether they are statically or dynamically checked
| is more or less irrelevant to this discussion.

  I think they are irrelevant only to your own discussion, which doesn't
  address anything people here have actually argued against.  my argument
  is that you cannot both wish to retain type information _and_ drop it.
  if your language can't speak about its types, any benefit of this type
  business is in efficiency in the compiler.  if you want to reason about
  types, you just can't do it in C++, and it is entirely irrelevant to any
  abstract meaning of "parameterized type" -- C++ does not instantiate it
  in a useful way, while Common Lisp does.
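  and "speak about its types" is meant literally; a small sketch of the
  standard machinery, nothing implementation-specific:

```lisp
;; SUBTYPEP answers questions about the type lattice itself, at
;; run-time, with no templates or compiler magic involved.
(subtypep 'fixnum 'integer)          ; => T, T   (fixnum is a subtype)
(subtypep 'integer 'fixnum)          ; => NIL, T (and not the other way)
(subtypep '(integer 0 255) 'number)  ; => T, T
```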

| In this view, being able to express notions such as `list of integers' is
| a perfectly natural request.

  there is a point at which you have to ask of any theory what it will
  change for the better if people adopted it.  theories that are very nice
  and clean and powerful which change nothing in the ways people think or
  act in practice are ineffectual in the extreme.  I want theories that are
  so powerful that people change their ways _completely_, because what
  people do today is mostly stupid, especially in the language design area.

  do we have the concept of a "list of integers" in Common Lisp?  yes.  is
  it different from any other list, operationally and notationally?  no.
  did C++ have the concept of a "list of integers"?  _no_.  it then chose
  to add a list implementation that _had_ to be specialized on the type.
  of course, this is a pain.  so naturally the non-solution "parameterized
  types" had to be invented.  this is not because of any lack of usefulness
  of typeful languages, not because people do not want "list of integers",
  but because C++ has no concept of a _type_hierarchy_ that would allow
  "list of any-type" to be expressed usefully.  all C++ can do is "list of
  some-type", and since its types do not form a hierarchy, but are disjoint
  (even the classes: there is no "universal superclass"), instantiation of
  the type becomes a _necessity_.  the concept of "parameterized types" was
  born out of this necessity.
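  for the record, when you do want a checkable "list of integers", you can
  name one as a derived type; this is a sketch, and INTEGER-LIST and its
  predicate are names I am making up here:

```lisp
;; a derived "list of integers" type built from SATISFIES; the
;; predicate must be a named function, so define it first.
(defun integer-list-p (x)
  (and (listp x) (every #'integerp x)))

(deftype integer-list ()
  '(and list (satisfies integer-list-p)))

(typep '(1 2 3) 'integer-list)     ; => T
(typep '(1 :two 3) 'integer-list)  ; => NIL
```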

  now, in a type calculus, you don't talk about types as any special kind
  of parameter, it's your _favorite_ kind of parameter: you compute types,
  and the computation of course takes types as arguments.  making up
  "parameterized
  type" as a phrase means you _don't_ usually compute or specify types
  abstractly.  in other words, you need the terminology because it is
  something people need to be aware that they can do, and there is no space
  for it in their existing vocabulary.
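  concretely, DEFTYPE is where you see that a type parameter is just a
  parameter; OCTET-VECTOR below is a name I am inventing for the sketch:

```lisp
;; DEFTYPE takes a lambda list like any other definer; the body is
;; ordinary Lisp that computes a type specifier.
(deftype octet-vector (&optional (length '*))
  `(vector (unsigned-byte 8) ,length))

(typep (make-array 4 :element-type '(unsigned-byte 8)
                     :initial-element 0)
       '(octet-vector 4))            ; => T
```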

| A number of people ... are even working on calculi in which you have the
| full power of the lambda-calculus at the level of types -- very natural
| if you've been brought up on Lisp, and like to have full access to the
| language at compile-time.

  well, here's my biggest gripe with the explicit-typing crowd: they have a
  notion of a dichotomy between run-time and compile-time, and it's a moral
  dichotomy: "what's good at compile-time is bad at run-time, and what's
  good at run-time is bad at compile-time".  macros in Lisp aren't
  different from any other function, they are just called under slightly
  different conditions.  the _same_ language is used.  why did they need a
  _new_ language to get a lambda-calculus for types?  well, I think it's
  because the language in question is so badly designed that they had to.
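  the point about macros deserves an example; REPEAT is a trivial made-up
  macro, but note that the expander body is plain Lisp with the whole
  language available at what C++ would call compile-time:

```lisp
;; the expander function runs at macroexpansion time, in the same
;; Lisp: here it runs LOOP then to unroll a fixed repetition count.
(defmacro repeat ((n) &body body)
  `(progn ,@(loop repeat n append body)))

;; (repeat (3) (incf *count*)) expands into three INCF forms.
(defparameter *count* 0)
(repeat (3) (incf *count*))
*count*                              ; => 3
```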

  in all the literature I have read about types, and it's considerable,
  this notion is hidden under a lot of consequences of holding it, but
  never made explicit.  I think the reason it is not explicit is that it is
  genuinely stupid, and like so many other things in our cultures and
  traditions that are stupid, smart people find smart ways around them, and
  it's fundamentally risky to stand up and say "this is stupid" without
  having a much better idea, because millions of people will flock to the
  defense of the stupid old traditions without even considering the other
  idea unless it's so intuitively superior that people go "ah, of course",
  instead.  so real progress happens only when some genius is willing to
  ignore the stupid ways and do something that opens up entirely new paths.
  in this case, inventing parameterized types has done nothing to help us
  get out of the stupid ways.  if people were somewhat less insistent on
  obeying the stupid ways of their ancestors, maybe we also could have more
  respect for their _smart_ ways.  but I digress.  (and don't get me
  started on foreign aid to countries where a family's prestige is
  determined by how many male children they have.)

| You may then express types such as `vector of int of length 42'

  (vector integer 42)

| or `cons is for any n:int a function from vectors of length n to vectors
| of length n+1'.  (I don't think the latter can be expressed in Common
| Lisp, not even with SATISFIES.)

  I assume some function like ADJUST-ARRAY was intended, not CONS, since
  CONS doesn't do that and it's good that it cannot be expressed in Common
  Lisp.  (why do people who want to make theoretical points so often miss
  the details in ways that show that they have genuine disdain for them?)

  recently, an implementation of core concepts in programming by contract
  was posted here, in Common Lisp.  you can certainly express the above in
  such a system.  (I know the argument to follow: "but does it help the
  compiler produce better code?" -- more on that just below.)
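  as a sketch of how the n to n+1 property above could be stated as a
  run-time contract (VECTOR-SNOC is a name I am making up; the contract
  system posted here would let you say it more declaratively):

```lisp
;; the "vector of length n in, length n+1 out" property as a plain
;; run-time assertion; a contract system would attach this to the
;; function's interface instead of burying it in the body.
(defun vector-snoc (v x)
  (let* ((n (length v))
         (result (make-array (1+ n))))
    (replace result v)
    (setf (aref result n) x)
    (assert (= (length result) (1+ n)))  ; the postcondition
    result))

(length (vector-snoc (vector 1 2 3) 4))  ; => 4
```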

| (No, those calculi are not ready for general consumption by programmers,
| although they've been remarkably successful in theorem proving.)

  I think you underestimate practitioners, but hey, that's the norm when
  people invent theories that have _zero_ effect on the way they work.

  here's my view on this type business: it makes no difference whatsoever
  whether you compute types at run-time or at compile-time, you can reason
  about the code just the same.  (people who scream and shout that they
  can't, have yet to discover the simple concept of a type hierarchy with
  embracive, abstract types like NUMBER instead of "integer of 16 bits"
  being disjoint from "integer of 32 bits".)  it makes no difference to
  anything except performance whether the system has to figure out the type
  of an object at run-time or at compile-time, and since we have a type
  hierarchy for everything, the way to move something into the compile-time
  arena is either by type inference or hints from the programmer.  and if
  the programmer wants to ensure that he does not see type mismatches, he
  can either write code for it or declarations for it, which are identical
  in conceptual terms.  (in particular, if you check for a type, you know
  that whatever was checked has that type afterwards, just like you would
  if you had a declaration -- so why do the strong-typing crowd get so anal
  about this?)
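  the equivalence is easy to see side by side; the two functions below are
  a sketch, but the knowledge the compiler gets is the same:

```lisp
;; checking and declaring are two spellings of the same knowledge:
;; after CHECK-TYPE returns, X is known to be a fixnum, exactly as
;; the declaration promises in the second version.
(defun add1-checked (x)
  (check-type x fixnum)        ; run-time check, signals an error if not
  (1+ x))

(defun add1-declared (x)
  (declare (type fixnum x))    ; compile-time promise, same information
  (1+ x))

(add1-checked 41)              ; => 42
```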

  despite all the hype and noise about parameterized types in C++, you
  still can't use them in any useful way at run-time.  all you can hope for
  is
  that the compiler did all the useful things by the time the code was cast
  in stone, and hopefully didn't make any mistakes (which the C++ template
  cruft has been suffering from in all implementations for years).  in my
  view, this restriction on type calculus is phenomenally stupid, and
  anyone who works with strong typing after accepting the dichotomy between
  run-time and compile-time is unlikely to contribute anything of serious
  value, simply because the mind-set is too narrow.  in particular, if it
  is necessary to study how something behaves in a run-time situation in
  order to come up with a breakthrough idea, the person who believes in
  compile-time will have to invent his own _language_ and write a compiler
  for it, which is just so much work that no sane person would want to do
  it just to prove a simple point, and if someone did it anyway, you would
  get a new language with a number of features that couldn't be used
  anywhere else, and nothing would come of it until C++ incorporated a
  braindamaged version of it so people could argue against Common Lisp not
  adopting "modern trends in object-orientedness" or whatever their
  argument was.

  in brief, the strong typing community are solving the wrong problems the
  wrong ways.  the results of their work increasingly depend on accepting
  that certain operations are _bad_ at run-time, and they provide the world
  with their results in the shape of immature, specialized languages that
  cater to their own aesthetics and very few other people's actual needs.

  strong-typing language researchers prove their points with languages.
  Lisp programmers and theoreticians alike prove them with macros.

  so, to repeat myself: I'm not arguing against type theories, not arguing
  against the usefulness of various forms of type inference, not arguing
  against reasonable requests, not arguing against the desire to reason
  about code, not arguing against tools to ensure correctness and all that
  good stuff.  I am arguing against the belief that types belong in the
  compiler and not in the running system, and I am arguing that the
  specific idea of a "parameterized type" is moot in Common Lisp, both
  because we have typed objects and because we have a type hierarchy for
  _all_ types with specialization on disjoint types.  I am arguing that
  parameterized types are _necessary_ when you accept disjoint types and
  believe in the compiler as your only savior.

  I hope this avoids any more arguments as if I were deriding or did not
  understand type theory.  I have a specific gripe with one of its core
  premises, not the random, unlearned hostility to theories as such that
  theoreticians who have a random, unexperienced hostility to practice so
  frequently believe their critics suffer from.