PSSA scheduling change #2266
Conversation
Pull request overview
This PR refactors the PSSA (PowerShell Script Analyzer) scheduling mechanism from a single global queue to per-file queues with cancellation support. This prevents analysis jobs from accumulating when PSSA rules take longer than the 750ms debounce delay, significantly improving responsiveness during active editing.
Key changes:
- Replaced global cancellation token with per-file cancellation tokens stored in `CorrectionTableEntry`
- Modified `DelayThenInvokeDiagnosticsAsync` to handle individual files with their own analysis state
- Enhanced logging to include elapsed time for analysis operations
Reviewed changes
Copilot reviewed 3 out of 3 changed files in this pull request and generated 8 comments.
| File | Description |
|---|---|
| src/PowerShellEditorServices/Services/Analysis/AnalysisService.cs | Core refactoring to implement per-file analysis queues with cancellation tokens, synchronization logic using Interlocked operations, and new task management pattern |
| src/PowerShellEditorServices/Services/Analysis/PssaCmdletAnalysisEngine.cs | Added stopwatch timing to log analysis duration for debugging purposes |
| test/PowerShellEditorServices.Test/Services/Symbols/PSScriptAnalyzerTests.cs | Updated tests to work with the new API signature that requires CorrectionTableEntry parameter |
```csharp
CancellationTokenSource cancellationSource = new();
CancellationTokenSource oldTaskCancellation = Interlocked.Exchange(ref fileAnalysisEntry.CancellationSource, cancellationSource);
if (oldTaskCancellation is not null)
{
    try
    {
        oldTaskCancellation.Cancel();
        oldTaskCancellation.Dispose();
    }
    catch (Exception e)
    {
        _logger.LogError(e, "Exception occurred while cancelling analysis task");
    }
}

_ = Task.Run(() => DelayThenInvokeDiagnosticsAsync(file, fileAnalysisEntry), cancellationSource.Token);
```
Copilot AI commented on Dec 19, 2025:
There's a potential race condition where the CancellationTokenSource could be disposed (line 153) while it's still being used to create the Task.Run on line 161. Consider holding a local reference to the cancellation token before potentially disposing the source, or ensure the token is extracted before disposal.
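One way to address this (a sketch only; `file`, `fileAnalysisEntry`, and `DelayThenInvokeDiagnosticsAsync` are the names from the diff above) is to capture the token before the new source is published via `Interlocked.Exchange`, so the `Task.Run` call no longer needs to touch the source after a concurrent request may have replaced and disposed it:

```csharp
CancellationTokenSource cancellationSource = new();
// Capture the token before publishing the source: CancellationToken is a
// struct copy, so starting the task does not dereference the source again.
CancellationToken token = cancellationSource.Token;

CancellationTokenSource oldTaskCancellation =
    Interlocked.Exchange(ref fileAnalysisEntry.CancellationSource, cancellationSource);
if (oldTaskCancellation is not null)
{
    try
    {
        oldTaskCancellation.Cancel();
        oldTaskCancellation.Dispose();
    }
    catch (Exception e)
    {
        _logger.LogError(e, "Exception occurred while cancelling analysis task");
    }
}

_ = Task.Run(() => DelayThenInvokeDiagnosticsAsync(file, fileAnalysisEntry), token);
```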
Copilot AI commented on Dec 19, 2025 (on the same diff hunk as above):
The new per-file cancellation behavior introduced in this change lacks test coverage. Consider adding tests that verify: 1) A new analysis request cancels the pending request for the same file, 2) Multiple files can be analyzed concurrently without interfering with each other, and 3) The cancellation mechanism properly cleans up resources.
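A sketch of what the first case could look like in xUnit (the method and fixture names here, e.g. `StartScriptDiagnostics` and `fakeAnalysisEngine`, are assumptions for illustration, not the actual test API):

```csharp
[Fact]
public async Task NewRequestSupersedesPendingRequestForSameFile()
{
    // Queue an analysis request; it sits in the 750ms debounce window.
    analysisService.StartScriptDiagnostics(new[] { scriptFile });

    // A second request for the same file should cancel the first one.
    analysisService.StartScriptDiagnostics(new[] { scriptFile });

    // After the debounce window has passed, only the second request should have run.
    await Task.Delay(TimeSpan.FromMilliseconds(1500));
    Assert.Equal(1, fakeAnalysisEngine.InvocationCount);
}
```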
```csharp
// delay period while the second one is ticking away.

foreach (ScriptFile scriptFile in filesToAnalyze)
TaskCompletionSource<ScriptFileMarker[]> placeholder = new TaskCompletionSource<ScriptFileMarker[]>();
```
Copilot AI commented on Dec 19, 2025:
The TaskCompletionSource should be created with TaskCreationOptions.RunContinuationsAsynchronously to prevent potential deadlocks. Without this option, any continuations attached to placeholder.Task will run synchronously on the thread that completes the task (via SetResult or SetException), which could lead to performance issues or blocking behavior.
```diff
- TaskCompletionSource<ScriptFileMarker[]> placeholder = new TaskCompletionSource<ScriptFileMarker[]>();
+ TaskCompletionSource<ScriptFileMarker[]> placeholder = new TaskCompletionSource<ScriptFileMarker[]>(TaskCreationOptions.RunContinuationsAsynchronously);
```
Does #1838 apply here?
```diff
  public ConcurrentDictionary<string, IEnumerable<MarkerCorrection>> Corrections { get; }

- public Task DiagnosticPublish { get; set; }
+ public Task DiagnosticPublish;
```
Copilot AI commented on Dec 19, 2025:
Field 'DiagnosticPublish' can be 'readonly'.
```diff
- public Task DiagnosticPublish;
+ public readonly Task DiagnosticPublish;
```
DiagnosticPublish gets modified by Interlocked.CompareExchange on line 371 of AnalysisService.cs
```diff
- public Task DiagnosticPublish { get; set; }
+ public Task DiagnosticPublish;

+ public CancellationTokenSource CancellationSource;
```
Copilot AI commented on Dec 19, 2025:
Field 'CancellationSource' can be 'readonly'.
```diff
- public CancellationTokenSource CancellationSource;
+ public readonly CancellationTokenSource CancellationSource;
```
CancellationSource gets modified by Interlocked.Exchange on line 147 of AnalysisService.cs
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
PR Summary
Change from a single PSSA queue to per-file queues. When a new analysis request comes in, it cancels all pending analysis requests for the same file.
Fixes #2260
PR Context
This change helps if you have PSSA rules that take more than 750ms to run. As the code is being edited, it prevents the number of pending analysis jobs from growing out of control.
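The scheduling pattern described above can be sketched in isolation (a simplified model: the actual change swaps tokens with `Interlocked.Exchange` on a per-file `CorrectionTableEntry`, and this toy version is not equivalently thread-safe):

```csharp
using System;
using System.Collections.Concurrent;
using System.Threading;
using System.Threading.Tasks;

internal sealed class PerFileAnalysisQueue
{
    private static readonly TimeSpan s_debounceDelay = TimeSpan.FromMilliseconds(750);
    private readonly ConcurrentDictionary<string, CancellationTokenSource> _pending = new();

    public void QueueAnalysis(string filePath, Func<string, CancellationToken, Task> analyzeAsync)
    {
        CancellationTokenSource cts = new();
        CancellationToken token = cts.Token;

        // Cancel whatever was pending for this file before queuing the new run.
        if (_pending.TryGetValue(filePath, out CancellationTokenSource old))
        {
            old.Cancel();
        }
        _pending[filePath] = cts;

        _ = Task.Run(async () =>
        {
            // Debounce: another request for this file within the window
            // cancels this delay, so the stale analysis never starts.
            await Task.Delay(s_debounceDelay, token);
            await analyzeAsync(filePath, token);
        }, token);
    }
}
```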
Interactive test / repro
Set up a new VSCode workspace with the following files:

`.vscode\settings.json`:

```json
{
    "powershell.developer.editorServicesLogLevel": "Trace",
    "powershell.trace.server": "verbose",
    "powershell.scriptAnalysis.settingsPath": ".\\PSScriptAnalyzerSettings.psd1"
}
```

`PSScriptAnalyzerSettings.psd1`

`SlowRule.psm1`

Compile from commit `548704c` to have PSSA timing information in the debug logs. Open `SlowRule.psm1` and type e.g. "12345" in the comment, leaving about a second (importantly, more than the 750ms debounce delay) between each keypress. Notice how, according to the logs, each script analysis run takes longer and longer. This also shows that after you stop typing, it takes a long time to analyze the code you are currently seeing on the screen.

Compiling from the tip of the branch and repeating the same test results in:

- Runtimes are now roughly the same, because analysis jobs are not stuck waiting for a PS runspace.
- Only two script analysis runs happened, because the "3", "4" and "5" keypresses had the chance to cancel the previous request that was still waiting for the "1" keypress analysis to finish.