path: root/tests/tokens.py
author    Owen Jacobson <owen@grimoire.ca>    2017-11-11 01:51:06 -0500
committer Owen Jacobson <owen@grimoire.ca>    2017-11-11 15:42:13 -0500
commit    16d94a6e50eb81de9d9d438e1cce0746928597f3 (patch)
tree      e1cb628d34c49690128722a33cc1d19d7dcffb23 /tests/tokens.py
parent    e4fb8604aa2fc572a3aeeace1c32de7339d346b5 (diff)
Introduce input ports.
Ports are the Lisp abstraction of files and streams. Actinide ports additionally guarantee a peek operation, which makes ``tokenize`` (now ``read_token``) callable as a Lisp function: it takes a port and reads one token from it.

This is a substantial refactoring. Because most of the tokenizer state is now captured in closures, it is no longer practical to test individual states in isolation; however, the top-level tokenizer tests exercise the full state space.
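A minimal sketch of what such a port might look like, assuming a StringIO-backed implementation (the ``StringPort`` name, its methods, and the backing stream are illustrative, not taken from the Actinide source)::

    import io

    class StringPort:
        # A toy input port over a string: read() consumes one character,
        # peek() looks at the next character without consuming it.
        def __init__(self, text):
            self._stream = io.StringIO(text)

        def peek(self):
            # Remember the position, read one character, then seek back so
            # the character remains available to the next read().
            pos = self._stream.tell()
            ch = self._stream.read(1)
            self._stream.seek(pos)
            return ch

        def read(self):
            # Consume and return the next character ('' at end of input).
            return self._stream.read(1)

A tokenizer written against this interface can ``peek`` to decide whether the current token has ended without consuming the first character of the next one, which is what makes a single-token ``read_token(port)`` entry point workable.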
Diffstat (limited to 'tests/tokens.py')
-rw-r--r--    tests/tokens.py    4
1 file changed, 4 insertions, 0 deletions
diff --git a/tests/tokens.py b/tests/tokens.py
index 0027fb2..3eb58b8 100644
--- a/tests/tokens.py
+++ b/tests/tokens.py
@@ -48,6 +48,10 @@ def whitespace_characters():
 def tokens():
     return one_of(symbols(), strings(), open_parens(), close_parens())
 
+# Generates a string which may not be empty, but which does not contain a token.
+def nontokens():
+    return one_of(whitespace(), comments(), just(''))
+
 # Generates at least one character of whitespace.
 def whitespace():
     return text(whitespace_characters(), min_size=1)
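The strategies above (``one_of``, ``text``, ``just``) are Hypothesis strategies. A hedged sketch of how ``nontokens()`` might feed a property test, assuming ``read_token`` returns the token's text, that a port can be built from a string (the ``StringPort`` sketch above), and that generated comments include their terminating newline::

    from hypothesis import given

    @given(nontokens(), tokens())
    def test_read_token_skips_leading_nontokens(padding, token):
        # Assumptions (not from the Actinide source): read_token consumes
        # leading whitespace and comments and returns the text of the first
        # token it finds; a comment string carries its trailing newline, so
        # the token stays outside the comment.
        port = StringPort(padding + token)
        assert read_token(port) == token

The design intent of ``nontokens()`` reads naturally here: any mix of whitespace, a comment, or nothing at all may precede a token without changing what the tokenizer yields.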