Subject: Re: Big Numbers
From: Erik Naggum <erik@naggum.net>
Date: Sat, 20 Oct 2001 21:18:01 GMT
Newsgroups: comp.lang.lisp
Message-ID: <3212601477682467@naggum.net>

* Dieter Menszner
| The [Ada] Software I had to maintain e.g. produced constraint errors
| whenever the incoming data didn't fit into the datatypes.  Which is not
| completely wrong.  But the exception handler routines didn't know how
| to deal with the situation and just terminated the program.

  It suddenly dawned on me that this illustrates that the static typing
  people have yet to figure out what input to programs is all about.
  Neither user nor program input is statically typed.  Look at all the
  messy coding you have to engage in to "convert" user input to the
  _expected_ type in most languages, and how the most convenient way of
  dealing with these things is to default to _strings_ that are converted
  on demand to the expected type for the expression in which they are used.
  A major "feature" of several languages is precisely that they achieve
  "dynamic types" through string representation and type conversions.  The
  same goes for that XML abomination, which is barely able to express the
  "structure" of the input, but not any typed values other than "string".
  Several database interfaces also choose "string" as their "dynamic type"
  representation.
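
  As a sketch of the contrast (using the standard read-from-string, which
  returns the parsed object and the position at which parsing stopped):
  where the string-based languages carry "42" around and convert it on
  demand, the reader hands back a typed object immediately.

```lisp
;; read-from-string parses a string according to the object's own
;; syntax and returns a typed object, not another string.
(read-from-string "42")      ; => 42, an integer
(read-from-string "42.0")    ; => 42.0, a float
(read-from-string "42/6")    ; => 7, the ratio reduced to an integer
(read-from-string "\"42\"")  ; => "42", a string only when written as one
```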

  Now, you may be able to control types statically in carefully controlled
  environments.  Source code and compilation in a system environment that
  will never, ever change (it is instead _replaced_ in toto) is perhaps
  such a case.  Everything else, _every_ input source to a program, is a
  source of problems (if you pardon the pun).  This is why it is wrong for
  integers parsed from an input source to be limited in size by hardware.
  This is why it is wrong for a program to make its expectations explicit
  before it knows what it is dealing with.  This is why Common Lisp's read
  is the right interface to deal with user input: It returns objects read
  from the character stream input source, parsed according to their _own_
  inherent syntactic structure.  The standard read function is too weak to
  deal with all it needs to deal with to be usable beyond the expectation
  to receive Common Lisp objects as input, but that is an implementation
  restriction that can be lifted, not an issue with the interface as such.
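
  The point about hardware limits can be seen directly in the reader (a
  minimal sketch; whether the result is a fixnum is implementation-
  dependent, but that it is an integer is not):

```lisp
;; The reader returns a bignum when the digits exceed the fixnum range;
;; no hardware limit is imposed on integers parsed from input.
(let ((n (read-from-string "123456789012345678901234567890")))
  (list n
        (typep n 'integer)    ; => T, always
        (typep n 'fixnum)))   ; => NIL on typical implementations
```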

  How humans deal with the unexpected defines an important part of their
  personality.  Some cannot deal with the unexpected at all, and work very
  hard to force the world into being what they expect it to be.  This can
  go really bad when they try to force people to be what they expect them
  to be, such as by treating people not as people, but as manifestations of
  one's expectations about them.  It appears that few people are able to
  work like read does: Deal with whatever there is they are exposed to.
  Most of the friction on USENET can be condensed to a failure to deal with
  people on their own terms -- as witness, all the moralists who claim that
  others should behave some different way and all those who reject _every_
  notion of being told how to behave, even if they are actually not told.

  The enormous number of security problems found in programs written in C,
  with its amazingly unintelligent fixed buffer size design, shows us that
  even people one would expect (!) to be able to remember that buffer
  overruns are _the_ major cause of problems, still cannot bring themselves
  to write code without using that horribly broken design paradigm because
  they are so heavily influenced and consequently fooled by static typing.
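
  The contrast with the fixed-buffer design is simple to demonstrate (a
  sketch; the input string here merely stands in for an arbitrarily long
  line from some external source):

```lisp
;; read-line allocates a string exactly as long as the input line
;; actually is; there is no fixed buffer for the caller to overrun.
(with-input-from-string (s (make-string 100000 :initial-element #\x))
  (length (read-line s nil "")))
;; => 100000
```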

  One might argue that static typing is not necessarily synonymous with
  disjoint hardware-oriented types, but the main reason people want static
  typing is the perceived ability to produce more efficient machine code
  from having the programmer over-specify type information.  This also
  means that the value of a type specification is lost if it is a general
  type like "integer" instead of a specific number of bits, such as would
  be the case if the language supported arbitrarily wide integers.  Since
  there is so little value to these people in using any type in the type
  hierarchy above the hardware-supported types, this naturally leads to
  preferring _disjoint_ types, indeed _only_ supporting disjoint types.
  This is probably a good idea within a very restricted domain, that in
  which all "possible" values can be known a priori.  A priori knowledge
  has great appeal to some people, but they seem to get flustered when
  facing the unknown, the uncertain, or the unexpected.  This observation
  leads me to believe that those who want static typing want it for a far
  more important reason than ease of compiler design, neat type theories,
  and more efficient code: It presents an easy-to-understand universe with
  very few exceptional situations and where everything is somehow in order.
  Such is not the universe we live in, nor should we attempt to make the
  one we live in like that.
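
  The non-disjoint alternative is worth spelling out: in Common Lisp,
  fixnum and bignum sit under the general type integer, and arithmetic
  moves between them transparently.  A small sketch:

```lisp
;; The largest fixnum is, by definition, the last hardware-sized
;; integer; one past it is still an integer, just no longer a fixnum.
(list (typep most-positive-fixnum 'fixnum)        ; => T
      (typep (1+ most-positive-fixnum) 'fixnum)   ; => NIL
      (typep (1+ most-positive-fixnum) 'integer)) ; => T
```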

  In order to make input from the external world, which must be treated as
  unknowable a priori, work at all, the only way you can survive is to use
  some form of dynamic typing and types with no upper limit to precision.
  The way Common Lisp does this with its reader is still geared towards an
  expectation that it will receive Common Lisp input, but the core ideas of
  the syntax of Common Lisp are fundamentally _correct_: Objects identify
  their types syntactically, usually with a very low syntactic overhead,
  and the reader returns the appropriate type object after parsing it,
  potentially having dealt with any anomalies and errors in the input.
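
  A sketch of that core idea in action: each object on a stream carries
  its own syntactic type marker, and read returns the corresponding
  object, one after the other, with no expectations declared up front.

```lisp
;; Read a stream of mixed objects; each identifies its own type.
(with-input-from-string (s "42 3.14 22/7 foo \"bar\"")
  (loop for object = (read s nil s)    ; s doubles as the eof marker
        until (eq object s)
        collect (cond ((integerp object)  'integer)
                      ((floatp object)    'float)
                      ((rationalp object) 'ratio)
                      ((symbolp object)   'symbol)
                      ((stringp object)   'string))))
;; => (INTEGER FLOAT RATIO SYMBOL STRING)
```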

  I think the reader in the Common Lisp system is extremely under-utilized,
  probably because of the large number of "security problems" related to
  its current design and implementation, such as remembering to turn off
  *read-eval* and the inability to control the behavior of what would be
  interned as symbols.  If it were better understood as the _right_ way to
  deal with user input (as opposed to _expecting_ certain kinds of input),
  it would be used more and would probably also be much improved, but as
  long as people stick to the statically-typed _model_ of doing user
  input, they will both work too hard for worse results and have to
  modify their code every time they make a small change to the
  syntax or the type system or their expectations.  Notice how the Common
  Lisp _compiler_ is a lot more adaptable than that because it uses read to
  get the source code and objects into the internal form it can deal with.
  The same should _ideally_ apply to any Common Lisp application.
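
  A minimal guard against the "security problems" mentioned above (a
  sketch only; the function name read-untrusted is hypothetical, and real
  input handling would also need control over the interning of symbols,
  which the standard reader does not offer):

```lisp
;; Disable #. evaluation and turn reader errors into a return value
;; instead of letting them terminate the program.
(defun read-untrusted (string)
  (let ((*read-eval* nil))             ; #. in input now signals an error
    (handler-case (read-from-string string)
      (reader-error () (values nil :bad-input))
      (end-of-file  () (values nil :incomplete)))))

(read-untrusted "(1 2 3)")                ; => (1 2 3), 7
(read-untrusted "#.(delete-file \"x\")")  ; => NIL, :BAD-INPUT
(read-untrusted "(1 2")                   ; => NIL, :INCOMPLETE
```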

///
-- 
  Norway is now run by a priest from the fundamentalist Christian People's
  Party, the fifth largest party representing one eighth of the electorate.
-- 
  The purpose of computing is insight, not numbers.   -- Richard Hamming