Summary
Config resolution performance is critical for ESLint's effectiveness in large monorepos and real-time editor integrations (VSCode, IntelliJ, etc.). While @eslint/eslintrc has basic directory-based config caching, there are several high-impact optimization opportunities that could dramatically improve performance in production environments.
Context
In large monorepos with hundreds/thousands of files:
- Editors repeatedly resolve configs for the same files during typing
- CI/CD systems resolve configs for files in similar directory structures
- Each config resolution involves expensive operations (file I/O, module loading, pattern matching, object merging)
Performance Impact
Testing with a fork implementing basic config file caching shows:
- 42.64x speedup (4163.5% improvement) for repeated config resolutions
- 951,192 configs/sec (cached) vs 22,310 configs/sec (uncached)
- Benchmark: 10,000 config resolutions across typical directory hierarchies
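For context, those throughput numbers come from timing a tight loop of repeated resolutions. A minimal sketch of that benchmark shape (the `benchmarkResolutions` name and signature are illustrative, not the fork's actual harness):

```js
// Hypothetical benchmark shape: resolve configs for the same file paths
// repeatedly and report throughput in resolutions per second.
function benchmarkResolutions(resolveConfig, filePaths, iterations) {
    const start = process.hrtime.bigint();

    for (let i = 0; i < iterations; i++) {
        for (const filePath of filePaths) {
            resolveConfig(filePath);
        }
    }

    const elapsedSeconds = Number(process.hrtime.bigint() - start) / 1e9;

    // Total resolutions divided by wall-clock time.
    return (iterations * filePaths.length) / elapsedSeconds;
}
```

Running this once against the uncached factory and once against the cached fork yields the two configs/sec figures above.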
Proposed Optimizations
1. Config File Loading Cache
Current behavior: Config files (.eslintrc.js, .eslintrc.json, etc.) are re-loaded from disk on every directory traversal, even for the same file path.
Opportunity:
- Cache parsed config file contents by absolute path
- Invalidate cache when files change (via file watchers or manual invalidation)
- Avoid redundant fs.readFileSync(), JSON.parse(), and yaml.load() calls
Implementation notes:
- Cache at ConfigArrayFactory level
- Key: absolute file path
- Value: parsed config data
- Provide clearConfigCache() for manual invalidation
- Consider LRU eviction for memory management
Reference implementation: See jdmiranda/eslintrc@perf/config-cache
2. Plugin Resolution Cache
Current behavior: Plugins are resolved and loaded via require() on every config that references them (config-array-factory.js:1076-1110).
Opportunity:
- Cache resolved plugin modules by plugin name + resolution base path
- require() calls are expensive (module resolution, parsing, execution)
- Plugins are typically immutable once loaded
Impact: In a monorepo with 100 files using the same plugin, this eliminates 99 redundant require() calls per plugin.
Implementation notes:
```js
const pluginCache = new Map(); // key: `${pluginName}@${basePath}`, value: loaded plugin

_loadPlugin(name, ctx) {
    const cacheKey = `${name}@${ctx.pluginBasePath}`;

    if (pluginCache.has(cacheKey)) {
        return pluginCache.get(cacheKey);
    }

    // ... existing resolution logic ...

    pluginCache.set(cacheKey, dependency);
    return dependency;
}
```
3. Config Merge Optimization
Current behavior: Config merging creates new objects and performs deep cloning (config-array.js:135-156).
Opportunity:
- Reduce object allocations during merge operations
- Use structural sharing where possible
- Consider copy-on-write strategies for frequently merged configs
Impact: In deep directory hierarchies (10+ levels), configs are merged repeatedly. Reducing allocations improves both CPU and memory usage.
Implementation notes:
- Profile mergeWithoutOverwrite() to identify hot paths
- Consider caching merged config results keyed by config element combinations
- Explore immutable data structures for structural sharing
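One possible shape for the merged-result cache, assuming results can be keyed by the identity of the two config elements being combined (the `mergeConfigs` helper and the WeakMap layout are hypothetical):

```js
// Hypothetical sketch: memoize merge results per (parent, child) pair so that
// deep hierarchies re-merging the same elements reuse a single result object.
// WeakMaps keep entries from pinning config objects in memory.
const mergeCache = new WeakMap(); // parent element -> (child element -> merged)

function mergeConfigs(parent, child, merge) {
    let perParent = mergeCache.get(parent);

    if (!perParent) {
        perParent = new WeakMap();
        mergeCache.set(parent, perParent);
    }

    if (perParent.has(child)) {
        return perParent.get(child);
    }

    // merge wraps the existing deep-merge logic (e.g. mergeWithoutOverwrite).
    const merged = merge(parent, child);
    perParent.set(child, merged);
    return merged;
}
```

Because callers receive the same frozen object for the same pair of inputs, this also enables the structural sharing mentioned above. Caching by identity is only safe if config elements are treated as immutable after creation.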
4. Glob Pattern Matching Cache
Current behavior: Override patterns are tested against file paths using minimatch on every config extraction (override-tester.js:189-198).
Opportunity:
- Cache pattern test results: Map<pattern_hash + file_path, boolean>
- Minimatch pattern compilation is already cached, but match results are not
- In editors, the same file is tested against the same patterns repeatedly
Implementation notes:
```js
class OverrideTester {
    constructor() {
        this.matchCache = new Map(); // key: relativePath, value: boolean
    }

    test(filePath) {
        const relativePath = path.relative(this.basePath, filePath);

        if (this.matchCache.has(relativePath)) {
            return this.matchCache.get(relativePath);
        }

        const result = this.patterns.every(/* ... */);

        this.matchCache.set(relativePath, result);
        return result;
    }
}
```
5. Config Validation Cache
Current behavior: Schema validation runs on every config extraction (config-validator.js).
Opportunity:
- Cache validation results by config content hash
- JSON schema validation via AJV is CPU-intensive
- Most configs are validated repeatedly without changes
Implementation notes:
- Generate content hash (fast hash function, not cryptographic)
- Cache validation outcomes: Map<content_hash, validation_result>
- Clear cache on config changes
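A rough sketch, assuming FNV-1a as the fast non-cryptographic hash (the `validateConfig` wrapper is hypothetical; a production version would need a collision strategy, e.g. also storing and comparing the serialized config on a cache hit):

```js
// FNV-1a: a fast, non-cryptographic 32-bit string hash.
function fnv1a(str) {
    let hash = 0x811c9dc5;

    for (let i = 0; i < str.length; i++) {
        hash ^= str.charCodeAt(i);
        hash = Math.imul(hash, 0x01000193) >>> 0;
    }

    return hash;
}

const validationCache = new Map(); // content hash -> validation result

function validateConfig(config, validate) {
    const key = fnv1a(JSON.stringify(config));

    if (validationCache.has(key)) {
        return validationCache.get(key);
    }

    // validate wraps the existing AJV schema validation.
    const result = validate(config);
    validationCache.set(key, result);
    return result;
}
```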
6. File Watcher Integration
Current behavior: No automatic cache invalidation when config files change. Applications must manually call clearCache().
Opportunity:
- Provide optional file watcher integration for automatic cache invalidation
- Watch config files (.eslintrc.*, package.json) and invalidate caches when modified
- Particularly valuable for editor integrations and development servers
Implementation notes:
```js
class CascadingConfigArrayFactory {
    enableFileWatching() {
        const watcher = chokidar.watch([
            '**/.eslintrc.*',
            '**/package.json'
        ], { cwd: this.cwd });

        watcher.on('change', filePath => {
            this.invalidateCache(filePath);
        });
    }
}
```
Considerations:
- Make optional (opt-in via configuration)
- Use efficient watchers (e.g., chokidar, native fs.watch)
- Clean up watchers on factory disposal
Use Case Impact
VSCode ESLint Extension
- Config resolution happens on every file edit/save
- Current: repeated file I/O and parsing overhead
- With optimizations: near-instant config resolution after warmup
Monorepo CI/CD
- 1000+ files linted in parallel or sequence
- Current: redundant plugin loading and config parsing
- With optimizations: 40x+ faster config resolution phase
Development Servers (Next.js, Vite, etc.)
- Config resolution on every hot-reload
- Current: noticeable lag in large projects
- With optimizations: imperceptible latency
Backward Compatibility
All proposed optimizations can be implemented as internal improvements without breaking changes:
- Cache management is transparent to consumers
- Existing clearCache() API remains functional
- File watching is opt-in
Questions for Maintainers
- Priority: Which optimizations would provide the most value for the ESLint ecosystem?
- File watching: Is there appetite for built-in file watching, or should this remain an application-level concern?
- Memory management: Should caches have configurable size limits or LRU eviction?
- Contribution: Would you accept a phased PR implementing these optimizations?
References
- Fork with config file caching: https://github.com/jdmiranda/eslintrc/tree/perf/config-cache
- Benchmark implementation: https://github.com/jdmiranda/eslintrc/blob/perf/config-cache/benchmark.js
- Related discussion: https://github.com/eslint/eslint/discussions (if applicable)
I'm happy to contribute PRs for any/all of these optimizations based on maintainer feedback and priorities.