
python3.2 3.2.2
test.test_tokenize.Test_Tokenize Class Reference
Inheritance diagram for test.test_tokenize.Test_Tokenize
Collaboration diagram for test.test_tokenize.Test_Tokenize


Public Member Functions

def test__tokenize_decodes_with_specified_encoding
def test__tokenize_does_not_decode_with_encoding_none

Detailed Description

Definition at line 678 of file test_tokenize.py.
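This class exercises the private generator tokenize._tokenize(readline, encoding), which backs the public tokenize.tokenize() and tokenize.generate_tokens() helpers: when an encoding is given, the byte lines returned by the readline callable are decoded before tokens are produced; when encoding is None, readline is expected to return str lines which are tokenized as-is. As an illustrative sketch (not part of the generated documentation), the same decoding behaviour can be observed through the public tokenize.tokenize() API, which detects the encoding itself and yields an initial ENCODING token:

    import io
    import tokenize

    source = '"ЉЊЈЁЂ"\n'.encode('utf-8')
    # tokenize.tokenize() expects a readline callable that returns bytes.
    tokens = list(tokenize.tokenize(io.BytesIO(source).readline))
    # tokens[0] is the ENCODING token; tokens[1] is the decoded STRING token,
    # i.e. the same 5-tuple shape checked in the tests below.
    print(tokens[1][1])   # prints the decoded literal: "ЉЊЈЁЂ"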


Member Function Documentation

def test.test_tokenize.Test_Tokenize.test__tokenize_decodes_with_specified_encoding(self)

Definition at line 680 of file test_tokenize.py.

00680 
00681     def test__tokenize_decodes_with_specified_encoding(self):
00682         literal = '"ЉЊЈЁЂ"'
00683         line = literal.encode('utf-8')
00684         first = False
00685         def readline():
00686             nonlocal first
00687             if not first:
00688                 first = True
00689                 return line
00690             else:
00691                 return b''
00692 
00693         # skip the initial encoding token and the end token
00694         tokens = list(_tokenize(readline, encoding='utf-8'))[1:-1]
00695         expected_tokens = [(3, '"ЉЊЈЁЂ"', (1, 0), (1, 7), '"ЉЊЈЁЂ"')]
00696         self.assertEqual(tokens, expected_tokens,
00697                          "bytes not decoded with encoding")

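The five-element tuples compared against expected_tokens follow the tokenize module's (type, string, start, end, line) layout: type 3 is token.STRING, and the (row, column) pairs give the token's start and end positions on line 1. A quick check of that mapping (an illustrative snippet, not part of the test file):

    import token
    print(token.STRING)        # 3
    print(token.tok_name[3])   # 'STRING'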

def test.test_tokenize.Test_Tokenize.test__tokenize_does_not_decode_with_encoding_none(self)

Definition at line 698 of file test_tokenize.py.

00698 
00699     def test__tokenize_does_not_decode_with_encoding_none(self):
00700         literal = '"ЉЊЈЁЂ"'
00701         first = False
00702         def readline():
00703             nonlocal first
00704             if not first:
00705                 first = True
00706                 return literal
00707             else:
00708                 return b''
00709 
00710         # skip the end token
00711         tokens = list(_tokenize(readline, encoding=None))[:-1]
00712         expected_tokens = [(3, '"ЉЊЈЁЂ"', (1, 0), (1, 7), '"ЉЊЈЁЂ"')]
00713         self.assertEqual(tokens, expected_tokens,
00714                          "string not tokenized when encoding is None")
00715 

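With encoding=None the generator emits no initial ENCODING token, which is why this test strips only the trailing end token ([:-1]) rather than both ends. This mode corresponds to the public tokenize.generate_tokens() helper, which consumes str lines directly; a minimal sketch (not taken from the test file):

    import io
    import tokenize

    source = '"ЉЊЈЁЂ"\n'
    # generate_tokens() expects a readline callable that returns str objects.
    for tok in tokenize.generate_tokens(io.StringIO(source).readline):
        print(tok)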


The documentation for this class was generated from the following file: test_tokenize.py