Euphoria Ticket #439: tokenize fails [N..M]

Recent breakage: the tokenizer mishandles the `..` slice operator inside a subscript (e.g. `x[1..5]`) when string numbers are enabled.

-- enable string numbers before tokenizing, so number tokens come back as strings
tokenize:string_numbers()
tokens = tokenize_string("abc = x[1..5]")

-- "1..5" must split into the three tokens "1", "..", "5"
test_equal("tok_parse 1..5 #1", "1",  tokens[1][5][TDATA])
test_equal("tok_parse 1..5 #2", "..", tokens[1][6][TDATA])
test_equal("tok_parse 1..5 #3", "5",  tokens[1][7][TDATA])

tokenize:string_numbers()
tokens = tokenize_string("abc = x[1..$]")

test_equal("tok_parse 1..$ #1", "$", tokens[1][7][TDATA])
-- test_equal("tok_parse 1..$ #1", "$", tokens[1][5][TDATA])
 

Details

Type: Bug Report
Severity: Major
Category: Library Routine
Assigned To: jeremy
Status: Fixed
Reported Release: 4279
Fixed in SVN #: 4285, 4286
Milestone: 4.0.0RC2

1. Comment by jeremy Nov 24, 2010

Added tests into tests/t_tokenize.e

2. Comment by jeremy Nov 24, 2010

Fixed the slice operator when the string-numbers option is enabled.
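The actual fix lives in Euphoria's std/tokenize.e (SVN 4285/4286, not shown here). As an illustration only, the Python sketch below shows the kind of one-character lookahead such a fix needs: a number scanner that greedily accepts `.` as a fraction separator would split `1..5` into `1.` and `.5`, so on seeing a `.` inside a number it must peek one character ahead and, if another `.` follows, end the number and emit `..` as the slice operator.

```python
# Illustrative sketch only -- NOT Euphoria's std/tokenize.e implementation.
# A minimal tokenizer for identifiers, numbers, and the ".." slice operator.

def tokenize(src):
    toks = []
    i = 0
    while i < len(src):
        c = src[i]
        if c.isspace():
            i += 1
        elif c.isdigit():
            j = i
            while j < len(src) and src[j].isdigit():
                j += 1
            # Lookahead: a '.' continues the number as a fraction only if
            # the character after it is NOT another '.'; otherwise the '.'
            # belongs to the ".." slice operator, so the number ends here.
            if j < len(src) and src[j] == '.' and src[j + 1:j + 2] != '.':
                j += 1
                while j < len(src) and src[j].isdigit():
                    j += 1
            toks.append(src[i:j])
            i = j
        elif src[i:i + 2] == '..':
            toks.append('..')
            i += 2
        elif c.isalpha() or c == '_':
            j = i
            while j < len(src) and (src[j].isalnum() or src[j] == '_'):
                j += 1
            toks.append(src[i:j])
            i = j
        else:
            # single-character tokens: '=', '[', ']', '$', ...
            toks.append(c)
            i += 1
    return toks

# With the lookahead, both ticket examples split correctly:
#   tokenize("abc = x[1..5]") -> ['abc', '=', 'x', '[', '1', '..', '5', ']']
#   tokenize("abc = x[1..$]") -> ['abc', '=', 'x', '[', '1', '..', '$', ']']
```

Without the lookahead branch, `1..5` would scan as the two number tokens `1.` and `.5`, which is exactly the breakage the tests above guard against.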
