Subject: Re: Reply to Ousterhout's reply (was Re: Ousterhout and Tcl ...)
From: Erik Naggum <>
Date: 1997/04/18
Newsgroups: comp.lang.scheme,comp.lang.scheme.scsh,comp.lang.lisp,comp.lang.tcl,comp.lang.functional,comp.lang.c++,comp.lang.perl.misc,comp.lang.python,comp.lang.eiffel
Message-ID: <>

* Charles Lin
| Erik Naggum ( wrote:
| || * Charles Lin
| || | If one had to choose a single type for everything, a string is a pretty
| || | good choice.  Why not a number?  How would you represent a string with a
| || | number?
| || excuse me?  what you call a "string" already _is_ a number.  computers
| || don't have characters.  display devices and printers do.
| || one of the first lessons of computer science _should_ have been internal
| || and external representation are so wildly different concepts that they
| || cannot even be confused, except, of course, by people who are fully aware
| || of the difference, but proceed to attempt to obliterate it.
|      Egads.  You must have a slow newsserver.  You're at least the third
| person to mention this, plus there have been follow-ups to that, and
| follow-ups to the follow-ups.  To summarize.  Yes, you can use numbers to
| encode anything.  Computers use binary representations for everything.
| Second, no one would want to use numbers to program in.  The point was
| given that a programming language has one type, what would be the most
| convenient type to use.  If your only type was integers, then strings
| would be a complete pain.  Just write a program that just prints "Hello,
| world".  I don't think you want to look up the ASCII representation just
| to force the use of numbers as types.

I'm sorry you didn't get the point simply because you think I have a slow
newsfeed.  at issue is that you must be the only person here who thinks
that "everything is a number" is equivalent to "everything is a decimal
digit string".  it isn't.  it never was.  the point is that your "string"
is merely the expression of a number in base 256, and the digits look like
characters because that is what happens when you throw those base-256
digits at your display device.  (adjust for other bases as appropriate.)
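to make the base-256 point concrete, here is a small sketch (mine, not part of the original exchange, and in Python purely for illustration): each byte of a "string" is treated as one digit of a single number in base 256.

```python
def bytes_as_number(s: bytes) -> int:
    """Interpret each byte as one base-256 digit, most significant first."""
    n = 0
    for b in s:
        n = n * 256 + b
    return n

# "Hi" is the two-digit base-256 numeral (72, 105):
print(bytes_as_number(b"Hi"))  # 72 * 256 + 105 == 18537
```

the characters you see are just what happens when those digits hit a display device that maps 72 to "H" and 105 to "i".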

my second point also escapes you completely.  the internal representation
of a number is _not_ a digit string.  the internal representation of a
character is _not_ "the ASCII representation" (by which one must assume you
mean "string of decimal digits").  the idea is that there is a difference,
a very important difference, in fact, between internal and external
representation.  I get the impression that you don't understand this
difference.  whether a sequence of bits or bytes is given the external
representation of a decimal digit string, a string, an image on a bitmap
display, etc, is completely irrelevant to that sequence.
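a sketch of that irrelevance (my illustration, in Python): one internal value, several external representations, none of which is "the" value.

```python
value = 1997  # internal: a machine integer (a bit pattern), not a digit string

decimal_text = str(value)             # "1997"       -- decimal digit string
hex_text = format(value, "x")         # "7cd"        -- hexadecimal digit string
raw_bytes = value.to_bytes(2, "big")  # b'\x07\xcd'  -- two base-256 digits

# all three are external renderings of the same internal value
assert int(decimal_text) == int(hex_text, 16) == int.from_bytes(raw_bytes, "big")
```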

strings are excellent for external representation, and that's where they
should be used.  internal representation should be efficient for the
machine.  when and if you need to read or print a value, use the external
representation, but as long as you work with it internally, use something
more efficient.  where you define yourself relative to "external" is of
course an open issue.
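the discipline this suggests, sketched in Python (my example, not from the post): convert at the boundaries, once on the way in and once on the way out, and compute with the efficient internal representation in between.

```python
def total(lines):
    """Parse external decimal strings once, sum as machine integers,
    and render the result as a string only when it is output."""
    nums = [int(line) for line in lines]  # external -> internal, once
    return str(sum(nums))                 # internal -> external, once

print(total(["12", "30", "58"]))  # "100"
```

the alternative — keeping everything as strings and reparsing on every operation — is exactly the pain the "one type" languages impose.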

from what I read here, it seems that even John Ousterhout has seen this
light in Tcl 8.0.

btw, only the Lisp family has a consistent interface between internal and
external representation, with the read/write functions.  when non-Lispers
try to reinvent this wheel, they get it completely wrong, such as Tcl.
some don't even understand the difference between internal and external
representation in the first place, so are precluded from appreciating this
lesson from the Lisp family.
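a rough analogue of the Lisp read/write contract, sketched in Python for readers who don't have a Lisp handy (the round-trip property is the point; the functions named here are Python's, not Lisp's): write produces an external representation that read maps back to an equal internal object.

```python
import ast

data = {"numbers": [1, 2, 3], "name": "example"}

text = repr(data)                  # internal -> external, roughly Lisp's WRITE
restored = ast.literal_eval(text)  # external -> internal, roughly Lisp's READ

assert restored == data  # the round trip preserves the object
```

in Lisp this contract holds uniformly through READ and WRITE with *print-readably*; in most other languages it holds only for a subset of values, which is the wheel-reinvention being complained about.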

I'm no longer young enough to know everything.