Thread: Tokenizer-Bug?

  1. #1 ReneMiner (thinBasic MVP)

    Tokenizer-Bug?

    I have some problems with the classic tokenizer: it does not stop at the end of the string. Doesn't the end of the string automatically count as an EOL?

    Make sure the last line of this example is a comment.

    ' #Filename "test_TokenizerFinished.tBasicU"
    
    
    Uses "console", "tokenizer"
    
    ' --------------------------------------------------------------------
    Function TBMain()
    ' --------------------------------------------------------------------
       ' init tokenizer:
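       ' (my reading of the constants below, just as a hint: #, $ and % get
       '  registered as alpha characters so they become part of tokens,
       '  while "." is registered as a delimiter)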
      Tokenizer_Default_Char("#", %TOKENIZER_DEFAULT_ALPHA)
      Tokenizer_Default_Char("$", %TOKENIZER_DEFAULT_ALPHA)
      Tokenizer_Default_Char("%", %TOKENIZER_DEFAULT_ALPHA)
      Tokenizer_Default_Char(".", %TOKENIZER_DEFAULT_DELIM)
      
      PrintL test(Load_File(APP_SourceFullName))
      
      PrintL "done. Key to end"
      WaitKey
        
      
    End Function    
    
    ' --------------------------------------------------------------------
    Function test(ByVal sCode As String) As String
    ' --------------------------------------------------------------------
       ' usually this function scans for a line like  ' #Filename "name.tBasic[U]"
       ' the situation was similar:
       ' within the script to test there was no #Filename directive
       ' found in my project,
       
       ' so I simply replaced the token to search for with a dummy
      
      Local sToken             As String
      Local lPos, lMain, lSafe As Long
      
      lPos = 1
      Do  ' adding "Until lPos >= StrPtrLen(StrPtr(sCode))" here
          ' would solve the situation
        lSafe = lPos    ' trap Error
        
        Tokenizer_GetNextToken(sCode, lPos, lMain, sToken)
        PrintL "Token " & sToken
       
        ' this would also solve the situation, but it discards the last token:
        
        ' If lPos > StrPtrLen(StrPtr(sCode)) Then 
        '  lMain = %TOKENIZER_FINISHED 
        ' EndIf
       
        Select Case lMain
          Case %TOKENIZER_FINISHED 
            PrintL "no dummy found"
            Exit Do
          Case %TOKENIZER_ERROR
            PrintL "token-error"
            lPos = lSafe
            Tokenizer_MoveToEol(sCode, lPos)
          
          Case %TOKENIZER_DELIMITER
            Print "delimiter "
            
            If sToken = "'" Then
              Tokenizer_GetNextToken(sCode, lPos, lMain, sToken)
              PrintL "' " & sToken
              
              If Ucase$(sToken) = "#DUMMY" Then
          '   no, it is not, so I omit this...
              EndIf
            EndIf
            Tokenizer_MoveToEol(sCode, lPos)   
        ' at the end of sCode, lPos should be higher than Len(sCode),
        ' and when requesting a token at a lPos higher than Len(sCode),
        ' lMain should hold %TOKENIZER_FINISHED
          
          Case %TOKENIZER_EOL 
            PrintL "EOL"
            Nop
          Case Else
            PrintL "move to EOL"
            Tokenizer_MoveToEol(sCode, lPos)
        End Select
      Loop
    
      Function = "none found"
     
      PrintL "End test()"
    
    End Function 
    
    ' comment at the end!
    ' comment at the end!
    ' comment at the end of the code to test!
    
    Weird guessing:

    I know that I request more tokens after finding the '-delimiter and then send the position to EOL.
    There is no EOL at the very end, I guess? Or does it always return only the EOL of the previous line?

    If not, shouldn't lMain hold %TOKENIZER_FINISHED in the next loop iteration,
    once lPos > StrPtrLen(StrPtr(sCode))?
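
    For what it's worth, here is a minimal sketch of the guard hinted at in the code comments above. It is purely my own workaround, not the module's intended behaviour, and it only shows the loop condition, not the full '-delimiter handling: leave the loop as soon as lPos has run past the end of sCode, so a missing trailing EOL cannot keep the loop alive. As noted in the code, depending on where the check sits the last token may still get dropped.

      Local sToken      As String
      Local lPos, lMain As Long

      lPos = 1
      Do
        Tokenizer_GetNextToken(sCode, lPos, lMain, sToken)
        PrintL "Token " & sToken

        ' normal exit, if the module reports it:
        If lMain = %TOKENIZER_FINISHED Then Exit Do

        ' skip the rest of the current line; on the last line this
        ' pushes lPos past the end of sCode:
        Tokenizer_MoveToEol(sCode, lPos)

        ' fallback exit: treat "position past end of string" as finished,
        ' even if lMain never reports %TOKENIZER_FINISHED
      Loop Until lPos > StrPtrLen(StrPtr(sCode))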
    Last edited by ReneMiner; 23-01-2016 at 12:22.

  2. #2 ErosOlmi (thinBasic author)

    Well, it is not really a bug, but a special, unexpected situation arising from using Tokenizer_GetNextToken in conjunction with Tokenizer_MoveToEol.

    I will have a look.
    Last edited by ErosOlmi; 23-01-2016 at 16:49.

  3. #3 ErosOlmi (thinBasic author)

    I hope to have fixed the ... not-a-bug.

    Attached is a new Tokenizer module to replace the one in \thinBasic\Lib\.

    Let me know
    Eros
    Attached Files
