Regression tests for the recent breakage of the slice operator when string numbers are enabled:
tokenize:string_numbers() -- report numeric literals as strings in TDATA
tokens = tokenize_string("abc = x[1..5]")
test_equal("tok_parse 1..5 #1", "1", tokens[1][5][TDATA])
test_equal("tok_parse 1..5 #2", "..", tokens[1][6][TDATA])
test_equal("tok_parse 1..5 #3", "5", tokens[1][7][TDATA])
tokenize:string_numbers()
tokens = tokenize_string("abc = x[1..$]")
test_equal("tok_parse 1..$ #1", "1", tokens[1][5][TDATA])
test_equal("tok_parse 1..$ #2", "..", tokens[1][6][TDATA])
test_equal("tok_parse 1..$ #3", "$", tokens[1][7][TDATA])
Fixed the slice operator when string numbers are enabled.
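The likely root of the breakage: when the lexer is scanning a numeric literal and hits a `.`, it must look one character ahead to decide whether the dot begins a decimal fraction or the `..` slice operator. A minimal Python sketch of that lookahead (a hypothetical tokenizer for illustration, not the actual `std/tokenize.e` implementation):

```python
def tokenize(src):
    """Sketch: split identifiers, numbers, '..', and punctuation.
    Numbers are returned as strings (the 'string numbers' mode)."""
    tokens = []
    i = 0
    while i < len(src):
        ch = src[i]
        if ch.isspace():
            i += 1
        elif ch.isalpha() or ch == '_':
            j = i
            while j < len(src) and (src[j].isalnum() or src[j] == '_'):
                j += 1
            tokens.append(src[i:j])
            i = j
        elif ch.isdigit():
            j = i
            while j < len(src) and src[j].isdigit():
                j += 1
            # A '.' only continues the number when it is NOT the start
            # of the '..' slice operator -- the lookahead this fix needs.
            if j + 1 < len(src) and src[j] == '.' and src[j + 1] != '.':
                j += 1
                while j < len(src) and src[j].isdigit():
                    j += 1
            tokens.append(src[i:j])
            i = j
        elif src[i:i + 2] == '..':
            tokens.append('..')
            i += 2
        else:
            tokens.append(ch)
            i += 1
    return tokens

print(tokenize("abc = x[1..5]"))
# ['abc', '=', 'x', '[', '1', '..', '5', ']']
```

Without the one-character lookahead, `1..5` would be consumed as the number `1.` followed by `.5`, which is exactly the kind of mis-split the tests above guard against.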