From: Erik Naggum
Subject: Re: Reply to Ousterhout's reply (was Re: Ousterhout and Tcl ...)
Date: 1997/04/18
Message-ID: <3070347275198364@naggum.no>
X-Deja-AN: 235675170
References: <5ihaol$n3g@Masala.CC.UH.EDU> <5ilnmp$rg9@engnews2.Eng.Sun.COM>
 <334EDC93.2376@maths.anu.edu.au> <5it19k$n8k@m1.cs.man.ac.uk>
 <3353200E.52CD@maths.anu.edu.au> <5j0gq5$jbi@mimsy.cs.umd.edu>
 <3070209881177993@naggum.no> <5j5o3a$9rn@mimsy.cs.umd.edu>
Mail-Copies-To: never
Organization: Naggum Software; +47 2295 0313; http://www.naggum.no
Newsgroups: comp.lang.scheme,comp.lang.scheme.scsh,comp.lang.lisp,comp.lang.tcl,
 comp.lang.functional,comp.lang.c++,comp.lang.perl.misc,comp.lang.python,
 comp.lang.eiffel

* Charles Lin
| Erik Naggum (erik@naggum.no) wrote:
| || * Charles Lin
| || | If one had to choose a single type for everything, a string is a
| || | pretty good choice.  Why not a number?  How would you represent a
| || | string with a number?
|
| || excuse me?  what you call a "string" already _is_ a number.  computers
| || don't have characters.  display devices and printers do.
|
| || one of the first lessons of computer science _should_ have been
| || internal and external representation are so wildly different concepts
| || that they cannot even be confused, except, of course, by people who
| || are fully aware of the difference, but proceed to attempt to
| || obliterate it.
|
| Egads.  You must have a slow newsserver.  You're at least the third
| person to mention this, plus there have been follow-ups to that, and
| follow-ups to the follow-ups.  To summarize: yes, you can use numbers to
| encode anything.  Computers use binary representations for everything.
| Second, no one would want to use numbers to program in.  The point was,
| given that a programming language has one type, what would be the most
| convenient type to use?  If your only type was integers, then strings
| would be a complete pain.  Just write a program that prints "Hello,
| world".  I don't think you want to look up the ASCII representation just
| to force the use of numbers as types.

I'm sorry you didn't get the point simply because you think I have a slow
newsfeed.  at issue is that you must be the only person here who thinks
that "everything is a number" is equivalent to "everything is a decimal
digit string".  it isn't.  it never was.  the point is that your "string"
is merely the expression of a number in base 256, and the digits look like
characters because that is what happens when you throw those base-256
digits at your display device.  (adjust for other bases as appropriate.)

my second point also escapes you completely.  the internal representation
of a number is _not_ a digit string.  the internal representation of a
character is _not_ "the ASCII representation" (by which one must assume
you mean "string of decimal digits").  the idea is that there is a
difference, a very important difference, in fact, between internal and
external representation.  I get the impression that you don't understand
this difference.

whether a sequence of bits or bytes is given the external representation
of a decimal digit string, a string, an image on a bitmap display, etc, is
completely irrelevant to that sequence.  strings are excellent for
external representation, and that's where they should be used.  internal
representation should be efficient for the machine.  when and if you need
to read or print a value, use the external representation, but as long as
you work with it internally, use something more efficient.
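to make the base-256 point concrete, here is a small Common Lisp sketch.
(the two function names are invented for this illustration, and it assumes
a character code like ASCII where CHAR-CODE fits in one octet.)

  ;; view a string as the digits of an integer in base 256, most
  ;; significant digit first.  CHAR-CODE yields the numeric value of
  ;; each "digit".
  (defun string->integer (string)
    (reduce (lambda (n char) (+ (* n 256) (char-code char)))
            string :initial-value 0))

  ;; the inverse: peel off base-256 digits and hand them back to the
  ;; display device as characters.
  (defun integer->string (n)
    (if (zerop n)
        ""
        (concatenate 'string
                     (integer->string (floor n 256))
                     (string (code-char (mod n 256))))))

  (string->integer "Hi")   ; => 18537, i.e., 72 * 256 + 105
  (integer->string 18537)  ; => "Hi"

the bits are the same either way; only the external presentation changes.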
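and a minimal sketch of that read-compute-print pattern, using the
standard READ and WRITE functions (the input text is just an example):
the external, textual form is parsed into an internal object once, all
the work is done on that internal object, and an external form is
produced again only when the result has to be shown.

  (with-input-from-string (in "(1 2 3 4)")
    (let ((numbers (read in)))        ; external text -> internal list, once
      (write (reduce #'+ numbers))))  ; internal number -> external text, once
  ;; prints 10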
where you define yourself relative to "external" is of course an open
issue.  from what I read here, it seems that even John Ousterhout has seen
this light in Tcl 8.0.

btw, only the Lisp family has a consistent interface between internal and
external representation, with the read/write functions.  when non-Lispers
try to reinvent this wheel, they get it completely wrong, as Tcl does.
some don't even understand the difference between internal and external
representation in the first place, so are precluded from appreciating this
lesson from the Lisp family.

#\Erik
-- 
I'm no longer young enough to know everything.