Re: Octet vs Char (Re: strings draft) Paul Schlie 27 Jan 2004 00:40 UTC

Which is the very reason that specifying a character encoding for Scheme's
character set would break the assumption of I/O-port and string encoding
neutrality made by most existing implementations, and by the programs
hosted on those implementations.

Because the standard "goes out of its way to *not* assume a particular
character encoding and repertoire", it enables implementations to adopt
the host's native character encoding as their own, thereby forming an
equivalence between characters and raw native-byte data. This in turn
enables the use of Scheme in a wide variety of applications that require
raw byte I/O and data buffers, using Scheme's native I/O ports and strings
to do so.
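
For instance (a minimal sketch, assuming an implementation whose characters
are single octets and perform no encoding translation at the ports; the
name copy-bytes is mine, not standard), copying a binary file needs nothing
beyond the standard R5RS character-port procedures:

  ;; Copy a file byte-for-byte using only the standard
  ;; character-port procedures.  This works precisely because the
  ;; implementation treats each character read or written as one
  ;; raw octet.  (copy-bytes is a hypothetical name.)
  (define (copy-bytes in-name out-name)
    (call-with-input-file in-name
      (lambda (in)
        (call-with-output-file out-name
          (lambda (out)
            (let loop ((c (read-char in)))
              (if (not (eof-object? c))
                  (begin (write-char c out)
                         (loop (read-char in))))))))))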

This alone would seem to be a good reason to continue not specifying
Scheme's character encoding, enabling it to remain equivalent to the host's
native byte-data encoding. New standard Scheme procedures could then be
defined to encode and manipulate encoded data stored within native-byte
Scheme strings, as required for whatever specific purpose is desired (a
sketch of such a procedure follows below); and/or, alternatively, distinct
"world-aware" character and string types, with whatever magical properties
they may require, could be defined independently of the existing
encoding-agnostic, byte-oriented I/O ports and strings.
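
As a rough illustration of that layering (utf8-ref is a hypothetical name,
the sketch decodes only one- and two-byte UTF-8 sequences, and error is
assumed available, as in most implementations), such a procedure can be
written entirely on top of today's byte-oriented strings:

  ;; Hypothetical sketch: return the UTF-8 scalar value beginning
  ;; at index i of a native-byte string s.  Only 1- and 2-byte
  ;; sequences are handled, to keep the example short; longer
  ;; sequences would follow the same pattern.
  (define (utf8-ref s i)
    (let ((b0 (char->integer (string-ref s i))))
      (cond ((< b0 #x80) b0)                    ; plain ASCII octet
            ((and (>= b0 #xC0) (< b0 #xE0))     ; two-byte sequence
             (let ((b1 (char->integer (string-ref s (+ i 1)))))
               (+ (* (- b0 #xC0) 64) (- b1 #x80))))
            (else (error "unhandled lead byte" b0)))))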

-paul-

bear <xxxxxx@sonic.net> wrote:
> Hear, Hear.  The standard goes out of its way to *not* assume
> a particular character encoding and repertoire; it follows that
> code relying on a particular character encoding in order to do
> binary I/O is nonportable.