A token counting and analysis extension for the tourist-core library. Automatically extracts, counts, and reports tokens (like SQL operations, API calls, etc.) from method execution traces captured by tourist-core.
Tourist-token builds on tourist-core's method interception capabilities to provide token-based analysis of execution flows. It counts occurrences of specific tokens (extracted from shots) and generates hierarchical reports showing token distribution across method calls.
Use Cases:
- Count SQL statement types (SELECT, INSERT, UPDATE, DELETE) across method calls
- Track API endpoint invocations
- Monitor cache hits/misses
- Analyze any token-based patterns in execution flows
Features:
- Token Extraction: Multiple strategies for extracting tokens from shots
  - Self extractor (shot picture as token)
  - Regex-based extraction with capture groups
- Token Decorators: Chain transformations on extracted tokens
  - Case conversion (upper/lower)
  - Filtering (whitelist)
  - Mapping (token replacement)
  - Null handling
- Hierarchical Counting: Aggregates token counts across nested method calls
- Multiple Output Formats: Write reports to console, files, or custom outputs
- Async Writing: Optional asynchronous I/O for minimal performance impact
- Thread-Safe: Isolated token counting per execution thread
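The extractor and decorator features above follow the classic decorator pattern: each decorator wraps another extractor and transforms its result. A minimal sketch of the idea — the interface and class names below are simplified stand-ins, not the library's actual API:

```java
// Hypothetical, simplified versions of the extractor contracts described above.
interface TokenExtractor {
    String extractToken(String shotPicture);
}

// "Self" strategy: the shot picture itself is the token.
class SelfExtractor implements TokenExtractor {
    public String extractToken(String shotPicture) {
        return shotPicture;
    }
}

// Decorator: upper-cases whatever the wrapped extractor returns.
class UpperDecorator implements TokenExtractor {
    private final TokenExtractor inner;

    UpperDecorator(TokenExtractor inner) {
        this.inner = inner;
    }

    public String extractToken(String shotPicture) {
        String token = inner.extractToken(shotPicture);
        return token == null ? null : token.toUpperCase();
    }
}

public class DecoratorChainDemo {
    public static void main(String[] args) {
        // Chain: self extraction, then upper-casing.
        TokenExtractor chain = new UpperDecorator(new SelfExtractor());
        System.out.println(chain.extractToken("select")); // prints SELECT
    }
}
```

Because every decorator implements the same interface it wraps, chains of any length compose without the caller knowing how many layers are involved.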
Requirements:
- Java 6 or higher
- tourist-core 1.1.0-SNAPSHOT
- JUnit 4.12 (for tests)
- EasyMock 3.4 (for tests)
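For Maven users, a dependency declaration along these lines is likely needed. The coordinates below are an assumption inferred from the io.tourist package namespace and the tourist-core version listed above; verify them against the project's actual pom.xml:

```xml
<!-- Assumed coordinates; verify against the project's actual pom.xml -->
<dependency>
    <groupId>io.tourist</groupId>
    <artifactId>tourist-token</artifactId>
    <version>1.1.0-SNAPSHOT</version>
</dependency>
```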
Basic Extraction:
// Self extractor - shot picture becomes the token
SelfTokenExtractor extractor = new SelfTokenExtractor();
// Regex extractor - extract from patterns
RegexTokenExtractor extractor = new RegexTokenExtractor();
extractor.setRegex("SQL: (\\w+) .*"); // Extracts SQL operation type
extractor.setGroup(1); // Use first capture group

Token Decorators (Chainable):
// Convert to uppercase
UpperTokenExtractorDecorator upperDecorator = new UpperTokenExtractorDecorator();
upperDecorator.setTokenExtractor(baseExtractor);
// Filter to specific tokens
FilterTokenExtractorDecorator filterDecorator = new FilterTokenExtractorDecorator();
Set<String> allowedTokens = new HashSet<String>(Arrays.asList("SELECT", "INSERT", "UPDATE")); // explicit type argument keeps Java 6 compatibility
filterDecorator.setTokenSet(allowedTokens);
filterDecorator.setDefaultToken("OTHER");
filterDecorator.setTokenExtractor(upperDecorator);
// Map tokens to categories
MapTokenExtractorDecorator mapDecorator = new MapTokenExtractorDecorator();
Map<String, String> tokenMap = new HashMap<String, String>();
tokenMap.put("SELECT", "READ");
tokenMap.put("INSERT", "WRITE");
mapDecorator.setTokenMap(tokenMap);
mapDecorator.setTokenExtractor(filterDecorator);

// Create the listener
TokenCounterTourEventListener listener = new TokenCounterTourEventListener();
// Configure token extraction
listener.setTokenExtractor(yourTokenExtractor);
// Configure output
listener.setTokenCounterNodeWriter(yourWriter);
// Add to tourist
Set<TourEventListener> listeners = new LinkedHashSet<TourEventListener>();
listeners.add(listener);
tourist.setTourEventListenerSet(listeners);

Console Writer:
ConsoleTokenCounterNodeWriter writer = new ConsoleTokenCounterNodeWriter();
writer.setSerializer(new TextTokenCounterNodeSerializer());

File Writer:
FileTokenCounterNodeWriter writer = new FileTokenCounterNodeWriter();
writer.setFile("/path/to/output.txt");
writer.setAppend(true); // Append to existing file
writer.setAsync(true); // Write asynchronously
writer.setSerializer(new TextTokenCounterNodeSerializer());
writer.init(); // Initialize before use
// When done
writer.destroy(); // Clean up resources

Custom Writer:
public class CustomWriter extends IOTokenCounterNodeWriter {
    @Override
    public Writer createWriter() throws IOException {
        return new StringWriter(); // or any Writer implementation
    }
}

The default TextTokenCounterNodeSerializer produces hierarchical reports:
[com.example.Service.processData()][SELECT=5, INSERT=2][1245 ms]
  [com.example.dao.UserDao.findUsers()][SELECT=3][456 ms]
  [com.example.dao.UserDao.saveUser()][INSERT=1][123 ms]
  [com.example.dao.OrderDao.findOrders()][SELECT=2, INSERT=1][666 ms]
Format: [fully.qualified.Class.method()][token1=count, token2=count][duration ms]
For failed methods:
[com.example.Service.processData()][java.io.IOException]
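This bracketed line format is easy to post-process. As an illustration only — the parser class below is hypothetical, not part of tourist-token:

```java
import java.util.LinkedHashMap;
import java.util.Map;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

// Hypothetical helper for reading report lines of the form
// [method()][token1=n, token2=m][duration ms]; not part of the library.
public class ReportLineParser {

    // The duration group is optional so failure lines like
    // [method()][java.io.IOException] still match.
    private static final Pattern LINE = Pattern.compile(
            "\\[(.+?)\\]\\[(.+?)\\](?:\\[(\\d+) ms\\])?");

    public static Map<String, Integer> parseTokenCounts(String line) {
        Map<String, Integer> counts = new LinkedHashMap<String, Integer>();
        Matcher m = LINE.matcher(line.trim());
        if (m.matches()) {
            for (String pair : m.group(2).split(",\\s*")) {
                String[] kv = pair.split("=");
                if (kv.length == 2) {
                    counts.put(kv[0], Integer.parseInt(kv[1]));
                }
            }
        }
        return counts;
    }
}
```

Note that failure lines carry an exception class name instead of `token=count` pairs, so this sketch simply returns an empty map for them.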
Complete Example:
// 1. Setup token extraction pipeline
SelfTokenExtractor baseExtractor = new SelfTokenExtractor();
UpperTokenExtractorDecorator upperDecorator = new UpperTokenExtractorDecorator();
upperDecorator.setTokenExtractor(baseExtractor);
FilterTokenExtractorDecorator filterDecorator = new FilterTokenExtractorDecorator();
Set<String> sqlOps = new HashSet<String>(Arrays.asList("SELECT", "INSERT", "UPDATE", "DELETE"));
filterDecorator.setTokenSet(sqlOps);
filterDecorator.setDefaultToken("OTHER");
filterDecorator.setTokenExtractor(upperDecorator);
// 2. Setup output writer
FileTokenCounterNodeWriter writer = new FileTokenCounterNodeWriter();
writer.setFile("token-report.txt");
writer.setAppend(true);
writer.setAsync(true);
writer.setSerializer(new TextTokenCounterNodeSerializer());
writer.init();
// 3. Create and configure listener
TokenCounterTourEventListener listener = new TokenCounterTourEventListener();
listener.setTokenExtractor(filterDecorator);
listener.setTokenCounterNodeWriter(writer);
// 4. Add to tourist configuration
TouristImpl tourist = new TouristImpl();
Set<TourEventListener> listeners = new LinkedHashSet<TourEventListener>();
listeners.add(listener);
tourist.setTourEventListenerSet(listeners);
// 5. Use in your code
Camera camera = tourist.getCamera();
if (camera.isOn()) {
    camera.shot("SELECT"); // Token will be counted
    camera.shot("INSERT");
}
// 6. Cleanup when done
writer.destroy();

SelfTokenExtractor:
The shot picture itself becomes the token. Useful when you control the shot content.
camera.shot("SELECT"); // Token: "SELECT"
camera.shot("INSERT"); // Token: "INSERT"

RegexTokenExtractor:
Extract tokens using regular expressions with capture groups.
RegexTokenExtractor extractor = new RegexTokenExtractor();
extractor.setRegex("Operation: (\\w+)");
extractor.setGroup(1);
// Shot: "Operation: SELECT" → Token: "SELECT"

UpperTokenExtractorDecorator: Convert to uppercase
// "select" → "SELECT"

LowerTokenExtractorDecorator: Convert to lowercase
// "SELECT" → "select"

FilterTokenExtractorDecorator: Whitelist tokens
// Allowed: {"READ", "WRITE"}
// "READ" → "READ"
// "DELETE" → null (or defaultToken if set)

MapTokenExtractorDecorator: Map tokens to categories
// Map: {"SELECT" → "READ", "INSERT" → "WRITE"}
// "SELECT" → "READ"
// "INSERT" → "WRITE"

NullReplacerTokenExtractorDecorator: Replace null tokens
// null → "UNKNOWN"

TokenCounterNode:
Represents a method invocation with its token counts:
- tour: The Tour instance (method context)
- tokenCountedMap: Map<String, MutableInteger> of token counts
- children: List of child nodes (nested method calls)
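A simplified sketch of such a node, inferred from the field descriptions above — this is illustrative, not the library's actual source (plain Integer stands in for MutableInteger, and a method name stands in for the Tour context):

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

// Simplified sketch of a token-counting tree node; not tourist-token's source.
public class TokenCounterNodeSketch {
    private final String methodName; // stands in for the Tour context
    private final Map<String, Integer> tokenCountedMap = new HashMap<String, Integer>();
    private final List<TokenCounterNodeSketch> children = new ArrayList<TokenCounterNodeSketch>();

    public TokenCounterNodeSketch(String methodName) {
        this.methodName = methodName;
    }

    // Increment the count for a token observed during this method call.
    public void count(String token) {
        Integer current = tokenCountedMap.get(token);
        tokenCountedMap.put(token, current == null ? 1 : current + 1);
    }

    // Nested method calls become child nodes, forming the report hierarchy.
    public void addChild(TokenCounterNodeSketch child) {
        children.add(child);
    }

    public Map<String, Integer> getTokenCountedMap() { return tokenCountedMap; }
    public List<TokenCounterNodeSketch> getChildren() { return children; }
    public String getMethodName() { return methodName; }
}
```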
MutableInteger:
A mutable integer wrapper for efficient counting:
MutableInteger counter = new MutableInteger(); // value = 1
counter.increment(); // value = 2
counter.incrementBy(5); // value = 7

Tourist-token uses ThreadLocal storage for:
- Node stacks (tracking nested calls)
- Root nodes (per-thread execution trees)
This ensures complete isolation between concurrent executions.
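Per-thread isolation of this kind is typically achieved with ThreadLocal-backed structures. A sketch of the pattern — class and method names here are illustrative, not tourist-token's internals:

```java
import java.util.ArrayDeque;
import java.util.Deque;

// Illustration of the ThreadLocal pattern described above; not the
// library's actual implementation.
public class PerThreadStack {

    // Each thread lazily gets its own stack; threads never see each
    // other's entries.
    private static final ThreadLocal<Deque<String>> NODE_STACK =
            new ThreadLocal<Deque<String>>() {
                @Override
                protected Deque<String> initialValue() {
                    return new ArrayDeque<String>();
                }
            };

    public static void push(String node) {
        NODE_STACK.get().push(node);
    }

    public static String pop() {
        return NODE_STACK.get().pop();
    }

    public static int depth() {
        return NODE_STACK.get().size();
    }
}
```

Pushing on one thread leaves every other thread's stack empty, which is exactly the isolation property the listener relies on for concurrent executions.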
- Async Writing: Enable setAsync(true) on file writers to minimize I/O overhead
- Regex Complexity: Simple patterns perform better than complex ones
- Decorator Chains: Each decorator adds minimal overhead, but keep chains reasonable
- Token Map Size: Large token maps or sets have memory implications
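As a rough illustration of the async-writing idea from the list above, report output can be handed to a single background thread so traced code never blocks on I/O. All names below are assumptions, not the library's implementation:

```java
import java.io.IOException;
import java.io.Writer;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;

// Hypothetical sketch of asynchronous report writing; not tourist-token's code.
public class AsyncWriterSketch {
    private final Writer target;
    private final ExecutorService executor = Executors.newSingleThreadExecutor();

    public AsyncWriterSketch(Writer target) {
        this.target = target;
    }

    // The calling (traced) thread only enqueues; the actual I/O happens
    // on the background thread.
    public void write(final String report) {
        executor.submit(new Runnable() {
            public void run() {
                try {
                    target.write(report);
                    target.flush();
                } catch (IOException e) {
                    // a real writer would log or surface this error
                }
            }
        });
    }

    // Drain pending writes before shutdown, mirroring the destroy()
    // lifecycle shown for FileTokenCounterNodeWriter above.
    public void destroy() throws InterruptedException {
        executor.shutdown();
        executor.awaitTermination(5, TimeUnit.SECONDS);
    }
}
```

A single-threaded executor keeps report lines in submission order, which matters for hierarchical output.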
mvn clean install

Note: Requires tourist-core to be installed in your local Maven repository first.
mvn test

Spring Configuration:
<bean id="tokenExtractor" class="io.tourist.token.extractor.SelfTokenExtractor"/>
<bean id="upperDecorator" class="io.tourist.token.extractor.decorator.UpperTokenExtractorDecorator">
<property name="tokenExtractor" ref="tokenExtractor"/>
</bean>
<bean id="fileWriter" class="io.tourist.token.writer.FileTokenCounterNodeWriter"
init-method="init" destroy-method="destroy">
<property name="file" value="token-report.txt"/>
<property name="append" value="true"/>
<property name="async" value="true"/>
<property name="serializer">
<bean class="io.tourist.token.serializer.TextTokenCounterNodeSerializer"/>
</property>
</bean>
<bean id="tokenListener" class="io.tourist.token.listener.TokenCounterTourEventListener">
<property name="tokenExtractor" ref="upperDecorator"/>
<property name="tokenCounterNodeWriter" ref="fileWriter"/>
</bean>
<bean id="myTourist" parent="tourist-parent">
<property name="tourEventListenerSet">
<set>
<ref bean="tokenListener"/>
</set>
</property>
</bean>

Packages:
- extractor: Token extraction strategies and decorators
- listener: Event listener for token counting
- model: Data structures (TokenCounterNode, MutableInteger)
- serializer: Output formatting
- writer: Output destinations (console, file, I/O)
See project documentation for licensing information.