The punctuation/symbol distinction was introduced by Unicode and retrofitted into older character sets.  Unicode doesn't officially explain the difference, but punctuation marks are part of a given script (though the Latin ones have spread very widely now), whereas symbols are basically script-independent.  There is some inherent vagueness: / is classified as punctuation even though its specialized use in mathematics and programming is as a division operator, i.e. a math symbol.  I agree that the distinction is not functional in ASCII.

However, to avoid confusion, I suggest you change the SRFI's terminology from "punctuation" to "other", and define it negatively as "characters that are neither controls, letters, digits, nor space".  This would rename ascii-punctuation? to ascii-other?.
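For concreteness, here is a minimal sketch of that negative definition in portable Scheme.  The control test is written out by hand, and char-alphabetic?/char-numeric? stand in for whatever letter and digit predicates the SRFI itself ends up providing:

    ;; C0 controls (0-31) and DEL (127).
    (define (ascii-control? ch)
      (let ((cc (char->integer ch)))
        (or (< cc #x20) (= cc #x7F))))

    ;; "Other": printable ASCII that is neither a letter,
    ;; a digit, nor the space character.
    (define (ascii-other? ch)
      (let ((cc (char->integer ch)))
        (and (< cc #x80)                   ; ASCII range only
             (not (ascii-control? ch))
             (not (char-alphabetic? ch))   ; letters
             (not (char-numeric? ch))      ; digits
             (not (char=? ch #\space)))))

This accepts exactly the 32 printable ASCII characters that remain after removing the 52 letters, 10 digits, and the space from the 95 printable characters.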

On Fri, Nov 22, 2019 at 7:48 AM Lassi Kortela <xxxxxx@lassi.io> wrote:
After reviewing SRFI 14
(https://srfi.schemers.org/srfi-14/srfi-14.html#StandardCharsets),
I found more confusing terminology:

char-set:punctuation    !"#%&'()*,-./:;?@[\]_{}
char-set:symbol         $+<=>^`|~

Apparently ASCII makes a distinction between "punctuation" and "symbol"
characters. A table on Wikipedia confirms that:
<https://en.wikipedia.org/wiki/ASCII#Character_set>.

The distinction seems arbitrary. It's likely to cause confusion and be
of little use to Scheme programmers. Does Unicode make such a
distinction, and does merging both of the above character classes into
one "punctuation" class make sense?

If we put ascii-symbol? into SRFI 174, people would probably think it
means the set of characters allowed in Scheme symbols.
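To make that ambiguity concrete (ascii-symbol? here is the hypothetical predicate under discussion, shown with the ASCII-class reading):

    (ascii-symbol? #\$)   ; => #t  ($ is in the ASCII "symbol" class)
    (ascii-symbol? #\a)   ; => #f  (a is a letter), even though #\a
                          ;        is perfectly valid in a Scheme symbol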