
Migrate supabase import to JSR #596

Open
jon-bell wants to merge 5 commits into staging from migrate-edge-function-imports

Conversation


@jon-bell jon-bell commented Jan 25, 2026

Summary by CodeRabbit

  • New Features

    • End-to-end testing support with special test-token handling and test-repo identification.
    • Branch-protection audit & remediation script for GitHub repositories.
    • Expanded test utilities and a new end-to-end test validating submission feedback flows.
  • Refactor

    • Unified module resolution for server functions to a single resolver.
    • Centralized E2E token validation into a shared implementation.
  • Bug Fixes

    • Database migration adding a function to fix error-pin creation and matching logic.



coderabbitai bot commented Jan 25, 2026

Walkthrough

Centralizes E2E OIDC token validation in the shared GitHub wrapper, migrates many Supabase imports from esm.sh to jsr: specifiers, removes several deno.json import mappings, adds a branch-protection audit/fix script, extends E2E test utilities and an end-to-end feedback test, and adds a DB migration to fix error_pin foreign-key/logic.

Changes

Cohort / File(s) Summary
Shared module imports
supabase/functions/_shared/ChimeWrapper.ts, supabase/functions/_shared/EnrollmentUtils.ts, supabase/functions/_shared/GitHubSyncHelpers.ts, supabase/functions/_shared/HandlerUtils.ts, supabase/functions/_shared/InvitationUtils.ts
Swapped @supabase/supabase-js imports from esm.sh URLs to jsr:@supabase/supabase-js@2; no logic changes.
GitHub E2E token handling
supabase/functions/_shared/GitHubWrapper.ts
Added END_TO_END_REPO_PREFIX, local END_TO_END_SECRET, and exported validateOIDCTokenOrAllowE2E(token) which bypasses normal OIDC validation for specially-prefixed E2E repo tokens and otherwise delegates to standard validation.
Autograder token updates
supabase/functions/autograder-create-submission/index.ts, supabase/functions/autograder-submit-feedback/index.ts
Replaced local OIDC decoding/secret logic with imports validateOIDCTokenOrAllowE2E and END_TO_END_REPO_PREFIX from shared GitHubWrapper; updated imports accordingly.
Mass import migration (core functions & scripts)
many supabase/functions/* files (assignments, autograders, calendar, enrollments, github-, gradebook-, live-meeting-, metrics, notification-queue-processor, submission-serve-artifact, user-fetch-azure-profile, scripts/, etc.)
Uniformly replaced @supabase/supabase-js imports from esm.sh to jsr: specifiers; behavior unchanged.
Deno manifest cleanup
supabase/functions/*/deno.json (course-import-sis, discord-async-worker, github-async-worker, gradebook-column-recalculate, invitation-create, metrics, user-fetch-azure-profile)
Removed @supabase/supabase-js entries from deno.json import maps (files now empty or reduced).
Gradebook expression utilities
supabase/functions/gradebook-column-recalculate/expression/DependencySource.ts
Added extractScalarValues helper; changed min to accept unknown[] and added max counterpart to support mixed inputs (numbers, student objects, arrays).
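A minimal sketch of what such a helper might look like, assuming the student objects carry a numeric `value` field (the field name and object shape are assumptions; the actual implementation lives in DependencySource.ts):

```typescript
// Hypothetical sketch: flatten mixed inputs (numbers, nested arrays,
// objects carrying a numeric `value`) into plain numbers for min/max.
function extractScalarValues(inputs: unknown[]): number[] {
  const out: number[] = [];
  for (const item of inputs) {
    if (typeof item === "number" && Number.isFinite(item)) {
      out.push(item);
    } else if (Array.isArray(item)) {
      out.push(...extractScalarValues(item)); // recurse into nested arrays
    } else if (item && typeof item === "object" && "value" in item) {
      const v = (item as { value?: unknown }).value;
      if (typeof v === "number" && Number.isFinite(v)) out.push(v);
    }
  }
  return out;
}

function min(args: unknown[]): number {
  return Math.min(...extractScalarValues(args));
}

function max(args: unknown[]): number {
  return Math.max(...extractScalarValues(args));
}

console.log(min([3, [1, 2], { value: 0.5 }])); // 0.5
console.log(max([3, [1, 2], { value: 0.5 }])); // 3
```

Non-numeric entries are silently skipped, which matches the stated goal of tolerating mixed inputs without throwing.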
Scripts
supabase/functions/scripts/CheckBranchProtection.ts (new), supabase/functions/scripts/ClearCaches.ts, supabase/functions/scripts/PushChangesToRepoFromHandout.ts, supabase/functions/scripts/TriggerBulkSubmissions.ts
Added new CheckBranchProtection.ts (audit + optional fix for GitHub branch protection with concurrency, rate-limiting, Sentry); other scripts mainly had import spec changes.
E2E test utilities & tests
tests/e2e/TestingUtils.ts, tests/e2e/create-submission.test.tsx
Added GradingScriptResult and GradeResponse types and exported submitFeedbackViaAPI() and createSampleGradingResult(); added an E2E test that posts grading feedback via API and verifies DB rows.
Config change
supabase/config.toml
verify_jwt for functions.github-repo-configure-webhook changed from true to false.
DB migration
supabase/migrations/20260129000000_fix_error_pin_created_by_fkey.sql
Added save_error_pin(p_error_pin jsonb, p_rules jsonb) RETURNS jsonb PL/pgSQL function to fix FK usage, enforce class/permission validations, and populate error_pin_submission_matches.
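As a rough illustration, the new function would be invoked along these lines. The p_error_pin keys are taken from the migration's UPDATE statement; the p_rules payload shape is an assumption for illustration only:

```sql
-- Hypothetical call (rule payload shape is illustrative, not the actual schema):
select save_error_pin(
  '{"assignment_id": 42, "rule_logic": "and", "enabled": true}'::jsonb,
  '[{"pattern": "NullPointerException"}]'::jsonb
);
```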

Sequence Diagram(s)

```mermaid
sequenceDiagram
    participant Client as Client / Test
    participant GitHubWrapper as GitHubWrapper
    participant GitHubAPI as GitHub OIDC / Keys

    Client->>GitHubWrapper: validateOIDCTokenOrAllowE2E(token)
    GitHubWrapper->>GitHubWrapper: decode token header & payload
    GitHubWrapper->>GitHubWrapper: repo startsWith END_TO_END_REPO_PREFIX?
    alt E2E token (prefix match)
        GitHubWrapper->>GitHubWrapper: verify header.kid == END_TO_END_SECRET
        GitHubWrapper-->>Client: return decoded payload
    else Standard token
        GitHubWrapper->>GitHubAPI: fetch/verify OIDC keys and validate token
        GitHubAPI-->>GitHubWrapper: validation result
        GitHubWrapper-->>Client: return decoded payload
    end
```

Estimated code review effort

🎯 3 (Moderate) | ⏱️ ~25 minutes

Possibly related PRs

Suggested labels

backend, devops, testing

Suggested reviewers

  • ironm00n

Poem

A shared gate for tokens, imports realigned,
Branch rules checked, some manifests resigned.
Tests that post feedback, DB fixes applied,
Small tidy changes so the flows coincide.

🚥 Pre-merge checks | ✅ 2 | ❌ 1
❌ Failed checks (1 warning)
| Check name | Status | Explanation | Resolution |
| --- | --- | --- | --- |
| Docstring Coverage | ⚠️ Warning | Docstring coverage is 12.50%, below the required threshold of 80.00%. | Write docstrings for the functions missing them to satisfy the coverage threshold. |
✅ Passed checks (2 passed)
| Check name | Status | Explanation |
| --- | --- | --- |
| Description Check | ✅ Passed | Check skipped - CodeRabbit's high-level summary is enabled. |
| Title check | ✅ Passed | The title accurately and concisely summarizes the main change: migrating Supabase imports from esm.sh CDN URLs to JSR module specifiers across the codebase. |





@coderabbitai coderabbitai bot left a comment


Actionable comments posted: 5

🤖 Fix all issues with AI agents
In `@supabase/functions/_shared/GitHubWrapper.ts`:
- Around line 625-647: The current END_TO_END_SECRET uses a guessable default
which allows bypassing validation; change validateOIDCTokenOrAllowE2E to require
an explicit secret (or explicit opt‑in) instead of defaulting to "not-a-secret":
remove the fallback in END_TO_END_SECRET and read the env var strictly, and in
validateOIDCTokenOrAllowE2E (referencing END_TO_END_REPO_PREFIX and header.kid)
fail closed by throwing a SecurityError if the secret env var is missing/empty
(or if an explicit E2E_ENABLE flag is not set) before accepting any E2E token.
Ensure the function logs or throws a clear error when the secret/opt-in is not
present so E2E bypass cannot be used in production.

In `@supabase/functions/assignment-create-all-repos/index.ts`:
- Line 1: The import for Supabase client currently uses a floating major constraint `jsr:@supabase/supabase-js@2`; change it to a specific pinned version (e.g., `jsr:@supabase/supabase-js@2.x.y`) or reference the project's import map entry so it matches the project's versioning strategy (align with other pins like ^2.4.5); update the import statement at the top of the file (the createClient import) to use the chosen exact version or import map key.

In `@supabase/functions/scripts/CheckBranchProtection.ts`:
- Around line 344-355: The comment says "max 50/min" but the Bottleneck instance
named rateLimiter is configured with reservoir and reservoirRefreshAmount set to
100; update the Bottleneck configuration in CheckBranchProtection.ts (the
rateLimiter creation) to match the comment by changing reservoir and
reservoirRefreshAmount to 50 (or alternatively update the comment to reflect 100
if that is intended) and keep maxConcurrent as-is.

In `@tests/e2e/TestingUtils.ts`:
- Around line 1638-1644: Remove the leftover debug statement that prints stack
traces during tests by deleting the console.trace(data) call inside the
error-handling block that checks if ("error" in data) (the branch that inspects
data.error and throws a new Error using data.error.details); leave the
surrounding logic and throws intact so the function still throws the appropriate
Error when data.error is present.
- Around line 1538-1585: Remove the duplicated GradeResponse type declaration
and instead import the existing GradeResponse type from
supabase/functions/_shared/FunctionTypes.d.ts; keep the local
GradingScriptResult declaration as-is if the expanded feedback shape is
intentionally different, otherwise refactor GradingScriptResult to reference the
shared AutograderFeedback type from FunctionTypes.d.ts (use the type name
AutograderFeedback and GradeResponse to locate the shared definitions and update
the imports at the top of tests/e2e/TestingUtils.ts accordingly).
♻️ Duplicate comments (9)
supabase/functions/scripts/PushChangesToRepoFromHandout.ts (1)

13-13: Same jsr import verification as above.

Please apply the same runtime compatibility check for jsr:@supabase/supabase-js@2 in this script’s execution environment.

supabase/functions/assignment-create-handout-repo/index.ts (1)

1-1: Same jsr import verification as above.

Please apply the same runtime compatibility check for jsr:@supabase/supabase-js@2 here as well.

supabase/functions/enrollments-sync-canvas/index.ts (1)

6-6: Same jsr import verification as above.

Please apply the same runtime compatibility check for jsr:@supabase/supabase-js@2 here as well.

supabase/functions/scripts/ClearCaches.ts (1)

2-2: Same jsr import verification as above.

Please apply the same runtime compatibility check for jsr:@supabase/supabase-js@2 here as well.

supabase/functions/gradebook-column-recalculate/index.ts (1)

2-2: Duplicate: jsr Supabase import verification.

Same verification as earlier regarding jsr resolver support and version pinning.

supabase/functions/gradebook-column-recalculate/GradebookProcessor.ts (1)

1-1: Duplicate: jsr Supabase import verification.

Same verification as earlier regarding jsr resolver support and version pinning.

supabase/functions/_shared/ChimeWrapper.ts (1)

5-7: Duplicate: jsr Supabase import verification.

Same verification as earlier regarding jsr resolver support and version pinning.

supabase/functions/gradebook-column-recalculate/BatchProcessor.ts (1)

1-1: Duplicate: jsr Supabase import verification.

Same verification as earlier regarding jsr resolver support and version pinning.

supabase/functions/metrics/index.ts (1)

2-2: Duplicate: jsr Supabase import verification.

Same verification as earlier regarding jsr resolver support and version pinning.

🧹 Nitpick comments (7)
supabase/functions/discord-async-worker/index.ts (1)

2-2: Consider aligning the PostgREST Json type import with jsr to avoid mixed registries.
If the jsr package (or a local alias) can provide the same type, it’ll reduce dependency variability.

supabase/functions/github-repo-configure-webhook/index.ts (1)

4-4: Optional: align the remaining Supabase-related type import with jsr for consistency.
Since you’ve moved the client to jsr, consider migrating the Json type import too if an equivalent jsr path exists.

supabase/functions/scripts/CheckBranchProtection.ts (4)

35-36: Inconsistent import sources with the JSR migration goal.

The PR objective is to migrate Supabase imports to JSR, but RequestError and Bottleneck are still imported from esm.sh. While these packages may not be available on JSR, it's worth noting this inconsistency.


222-223: Magic delay for ruleset propagation.

The 1-second delay is a reasonable approach for allowing the ruleset to propagate, but this could be flaky in high-latency environments. Consider making this configurable or documenting why this specific value was chosen.
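One way to make the delay configurable, as a sketch; the env-var name and helper are assumptions, not the script's current code:

```typescript
// Hypothetical sketch: read the ruleset-propagation delay from an
// environment value instead of hard-coding 1000ms.
function propagationDelayMs(envValue: string | undefined, fallback = 1000): number {
  const parsed = Number(envValue);
  // Fall back to the default when unset, non-numeric, or negative.
  return Number.isFinite(parsed) && parsed >= 0 ? parsed : fallback;
}

async function waitForRulesetPropagation(envValue?: string): Promise<void> {
  await new Promise((resolve) => setTimeout(resolve, propagationDelayMs(envValue)));
}

console.log(propagationDelayMs(undefined)); // 1000
console.log(propagationDelayMs("250")); // 250
```

Falling back to the current 1-second value keeps existing behavior while letting high-latency environments tune it.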


385-401: Direct mutation of status object.

The code mutates the status object from indexedResults directly. While this works for this CLI script, it's a side effect that could be confusing. Consider creating a new object or being more explicit about the update.


483-486: Consider awaiting the main function call.

The main() function is async but isn't awaited at the entry point. While Deno handles unhandled promise rejections, explicitly awaiting ensures proper error propagation and process exit codes.

Suggested fix

```diff
 // Run the main function
 if (import.meta.main) {
-  main();
+  await main();
 }
```
supabase/functions/autograder-create-submission/index.ts (1)

30-30: Import Json from the local SupabaseTypes.d.ts instead of esm.sh.

The Json type is already exported from SupabaseTypes.d.ts (line 1), which is being used consistently in other functions like autograder-submit-feedback and _shared/FunctionTypes.d.ts. Update this import to align with the migration pattern and remove the esm.sh dependency.

Comment on lines +625 to +647

```typescript
// E2E testing constants and helper
export const END_TO_END_REPO_PREFIX = "pawtograder-playground/test-e2e-student-repo";
const END_TO_END_SECRET = Deno.env.get("END_TO_END_SECRET") || "not-a-secret";

/**
 * Validates an OIDC token, or allows E2E test tokens that use the special prefix.
 * For E2E runs, we don't validate the signature but check that the secret matches.
 */
export async function validateOIDCTokenOrAllowE2E(token: string): Promise<GitHubOIDCToken> {
  const decoded = decode(token);
  const payload = decoded[1] as GitHubOIDCToken;
  if (payload.repository.startsWith(END_TO_END_REPO_PREFIX)) {
    const header = decoded[0] as {
      alg: string;
      typ: string;
      kid: string;
    };
    if (header.kid !== END_TO_END_SECRET) {
      throw new SecurityError("E2E repo provided, but secret is incorrect");
    }
    return payload;
  }
  return await validateOIDCToken(token);
}
```

⚠️ Potential issue | 🟠 Major

Require explicit E2E secret; don’t default to a guessable value.

With the current fallback ("not-a-secret"), any token with the special repo prefix can bypass signature validation if the env var isn’t set. This is a security gap in non‑test environments. Require an explicit secret (or an explicit opt‑in flag) and fail closed when it’s missing.

🔒 Suggested fix (fail closed unless explicitly enabled)

```diff
-export const END_TO_END_REPO_PREFIX = "pawtograder-playground/test-e2e-student-repo";
-const END_TO_END_SECRET = Deno.env.get("END_TO_END_SECRET") || "not-a-secret";
+export const END_TO_END_REPO_PREFIX = "pawtograder-playground/test-e2e-student-repo";
+const END_TO_END_SECRET = Deno.env.get("END_TO_END_SECRET");
+const ALLOW_E2E_TOKENS = Deno.env.get("ALLOW_E2E_TOKENS") === "true";
@@
 export async function validateOIDCTokenOrAllowE2E(token: string): Promise<GitHubOIDCToken> {
   const decoded = decode(token);
   const payload = decoded[1] as GitHubOIDCToken;
   if (payload.repository.startsWith(END_TO_END_REPO_PREFIX)) {
+    if (!ALLOW_E2E_TOKENS || !END_TO_END_SECRET) {
+      throw new SecurityError("E2E token validation is not enabled");
+    }
     const header = decoded[0] as {
       alg: string;
       typ: string;
       kid: string;
     };
     if (header.kid !== END_TO_END_SECRET) {
       throw new SecurityError("E2E repo provided, but secret is incorrect");
     }
     return payload;
   }
   return await validateOIDCToken(token);
 }
```

```diff
@@ -1,4 +1,4 @@
-import { createClient } from "https://esm.sh/@supabase/supabase-js@2";
+import { createClient } from "jsr:@supabase/supabase-js@2";
```

⚠️ Potential issue | 🟡 Minor

🧩 Analysis chain

🌐 Web query:

jsr @supabase/supabase-js package versioning policy deno

💡 Result:

  • JSR's versioning rules (applies to jsr:@supabase/supabase-js in Deno):

    • Every release is a SemVer version (MAJOR.MINOR.PATCH). You must bump the version in the package config before publishing (deno publish / jsr publish). [1]
    • Published versions are immutable (can’t be changed or deleted). [1]
    • Bad releases can be “yanked” (hidden from normal semver resolution / “latest”), but can still be fetched if explicitly pinned. [1]
  • How you typically consume it from Deno (and what that implies for updates):

    • import { createClient } from "jsr:@supabase/supabase-js@2" pins to major v2 and will float to the latest compatible minor/patch within v2 under semver resolution. [2][3]
  • Supabase’s stated runtime-support policy note for Deno (not SemVer, but relevant to “what may change in minors”):

    • Supabase says it may drop EOL Deno versions in a minor release and not treat that as a breaking change, following Deno’s stable/lts channels. [2]

Sources: JSR package version docs [1], @supabase/supabase-js on JSR (including Deno import + support policy) [2], Deno announcement showing @2 import mapping [3].


🌐 Web query:

Supabase Edge Functions jsr: specifier support Deno runtime

💡 Result:

Supabase Edge Functions run on the Deno runtime, so they do support native jsr: import specifiers (and import maps via deno.json). Supabase’s own “Managing dependencies” guide explicitly lists modules published to JSR as supported and shows an example like import path from "jsr:@std/path@1.0.8".[1]

This works because Deno natively supports jsr: specifiers (either directly in code or via deno.json import maps).[2][3] Other non-Deno runtimes typically need JSR’s npm-compat layer instead of native jsr: handling.[3]

Example in an Edge Function:

import path from "jsr:@std/path@1.0.8";

🏁 Script executed:

```shell
# Check if deno.json or deno.lock exists in the repository for version locking
find . -maxdepth 2 -type f \( -name "deno.json" -o -name "deno.lock" \) 2>/dev/null | head -20
```

Repository: pawtograder/platform

Length of output: 46


🏁 Script executed:

```shell
# Check the context of the import and see if there's a deno.json/deno.lock in the functions directory
ls -la supabase/functions/ 2>/dev/null || echo "Directory not found"
```

Repository: pawtograder/platform

Length of output: 3776


🏁 Script executed:

```shell
# Check deno.json to see if there's a structured approach to version pinning
cat supabase/functions/deno.json
```

Repository: pawtograder/platform

Length of output: 354


Use a specific version constraint for the Supabase client import to match the project's versioning strategy.

The import jsr:@supabase/supabase-js@2 uses a major-only constraint that will float to any minor/patch within v2. The project's deno.json uses more specific constraints (e.g., ^2.4.5 for @supabase/functions-js). Align with this pattern by pinning to a specific version like @2.x.y or using the import map in deno.json.
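For example, the pin could live in the existing import map so individual files use a bare specifier; the entry and version below are illustrative (the project pins @supabase/functions-js at ^2.4.5, but the right supabase-js version is for the maintainers to choose):

```jsonc
// supabase/functions/deno.json (illustrative entry, not the actual file)
{
  "imports": {
    "@supabase/supabase-js": "jsr:@supabase/supabase-js@^2.4.5"
  }
}
```

Function code would then import via `import { createClient } from "@supabase/supabase-js";`, keeping the version constraint in one place.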


Comment on lines +344 to +355

```typescript
    // Check each repository in parallel with rate limiting
    console.log("Checking branch protection rulesets...");
    console.log(`Processing ${repositories.length} repositories in parallel (max 50/min)...`);
    console.log("");

    // Create rate limiter: 50 requests per minute
    const rateLimiter = new Bottleneck({
      reservoir: 100, // Number of jobs
      reservoirRefreshAmount: 100, // Refill amount
      reservoirRefreshInterval: 60 * 1000, // Refill every minute (60000ms)
      maxConcurrent: 10 // Max concurrent operations
    });
```

⚠️ Potential issue | 🟡 Minor

Rate limiter configuration doesn't match comment.

The comment on line 346 states "max 50/min", but the actual Bottleneck configuration sets reservoir: 100 and reservoirRefreshAmount: 100, which allows 100 requests per minute.

Suggested fix (aligning the configuration with the documented 50/min limit)

```diff
     // Check each repository in parallel with rate limiting
     console.log("Checking branch protection rulesets...");
     console.log(`Processing ${repositories.length} repositories in parallel (max 50/min)...`);
     console.log("");

     // Create rate limiter: 50 requests per minute
     const rateLimiter = new Bottleneck({
-      reservoir: 100, // Number of jobs
-      reservoirRefreshAmount: 100, // Refill amount
+      reservoir: 50, // Number of jobs
+      reservoirRefreshAmount: 50, // Refill amount
       reservoirRefreshInterval: 60 * 1000, // Refill every minute (60000ms)
       maxConcurrent: 10 // Max concurrent operations
     });
```

Comment on lines +1538 to +1585

```typescript
export type GradingScriptResult = {
  ret_code: number;
  output: string;
  execution_time: number;
  feedback: {
    score?: number;
    max_score?: number;
    output: {
      hidden?: { output: string; output_format?: "text" | "markdown" | "ansi" };
      visible?: { output: string; output_format?: "text" | "markdown" | "ansi" };
      after_due_date?: { output: string; output_format?: "text" | "markdown" | "ansi" };
      after_published?: { output: string; output_format?: "text" | "markdown" | "ansi" };
    };
    lint: {
      status: "pass" | "fail";
      output: string;
      output_format?: "text" | "markdown" | "ansi";
    };
    tests: {
      score?: number;
      max_score?: number;
      name: string;
      name_format?: "text" | "markdown" | "ansi";
      output: string;
      output_format?: "text" | "markdown" | "ansi";
      hidden_output?: string;
      hidden_output_format?: "text" | "markdown" | "ansi";
      part?: string;
      hide_until_released?: boolean;
    }[];
  };
  grader_sha: string;
  action_ref: string;
  action_repository: string;
};

export type GradeResponse = {
  is_ok: boolean;
  message: string;
  details_url: string;
  artifacts?: {
    name: string;
    path: string;
    token: string;
  }[];
  supabase_url: string;
  supabase_anon_key: string;
};
```

⚠️ Potential issue | 🟡 Minor

🧩 Analysis chain

🏁 Script executed:

```shell
# First, locate the FunctionTypes.d.ts file and examine the type definitions
find . -name "FunctionTypes.d.ts" -type f
```

Repository: pawtograder/platform

Length of output: 111


🏁 Script executed:

```shell
# Search for GradingScriptResult type definition in the codebase
rg -n "export type GradingScriptResult" --type=ts
```

Repository: pawtograder/platform

Length of output: 215


🏁 Script executed:

```shell
# Search for GradeResponse type definition in the codebase
rg -n "export type GradeResponse" --type=ts
```

Repository: pawtograder/platform

Length of output: 203


🏁 Script executed:

```shell
# Compare the GradingScriptResult type definitions
echo "=== TestingUtils.ts GradingScriptResult ===" && \
rg -A 35 "export type GradingScriptResult" tests/e2e/TestingUtils.ts && \
echo -e "\n=== FunctionTypes.d.ts GradingScriptResult ===" && \
rg -A 35 "export type GradingScriptResult" supabase/functions/_shared/FunctionTypes.d.ts
```

Repository: pawtograder/platform

Length of output: 2022


🏁 Script executed:

```shell
# Compare the GradeResponse type definitions
echo "=== TestingUtils.ts GradeResponse ===" && \
rg -A 15 "export type GradeResponse" tests/e2e/TestingUtils.ts && \
echo -e "\n=== FunctionTypes.d.ts GradeResponse ===" && \
rg -A 15 "export type GradeResponse" supabase/functions/_shared/FunctionTypes.d.ts
```

Repository: pawtograder/platform

Length of output: 835


🏁 Script executed:

```shell
# Check for console.trace debug statement around line 1640
sed -n '1635,1645p' tests/e2e/TestingUtils.ts
```

Repository: pawtograder/platform

Length of output: 439


🏁 Script executed:

```shell
# Check if TestingUtils.ts imports from FunctionTypes.d.ts
rg -n "import.*FunctionTypes" tests/e2e/TestingUtils.ts
```

Repository: pawtograder/platform

Length of output: 46


🏁 Script executed:

```shell
# Check if GradeResponse is used elsewhere in TestingUtils.ts to understand scope
rg -n "GradeResponse" tests/e2e/TestingUtils.ts | head -20
```

Repository: pawtograder/platform

Length of output: 169


🏁 Script executed:

```shell
# Check what AutograderFeedback is in FunctionTypes.d.ts
rg -B 2 -A 30 "export type AutograderFeedback" supabase/functions/_shared/FunctionTypes.d.ts
```

Repository: pawtograder/platform

Length of output: 856


🏁 Script executed:

```shell
# Check the beginning of TestingUtils.ts to see what's imported
head -50 tests/e2e/TestingUtils.ts
```

Repository: pawtograder/platform

Length of output: 2201


Import GradeResponse from FunctionTypes.d.ts instead of duplicating it.

GradeResponse is already defined in supabase/functions/_shared/FunctionTypes.d.ts (line 87) and is identical to the version here (lines 1574–1585). This can be imported directly to eliminate drift.

GradingScriptResult, however, uses a different structure—the feedback property is expanded inline here rather than referencing the AutograderFeedback type from FunctionTypes.d.ts. If this expanded structure is intentional for test purposes, it should remain local; otherwise, consider aligning with the shared type definition.


Comment on lines +1638 to +1644

```typescript
  if ("error" in data) {
    if (typeof data.error === "object" && data.error && "details" in data.error) {
      console.trace(data);
      throw new Error(String((data.error as { details: string }).details));
    }
    throw new Error(`Failed to submit feedback: ${JSON.stringify(data.error)}`);
  }
```

⚠️ Potential issue | 🟡 Minor

Remove debug console.trace statement.

console.trace(data) at line 1640 will emit noisy stack traces during test runs. This appears to be leftover debugging code.

🐛 Proposed fix

```diff
   if ("error" in data) {
     if (typeof data.error === "object" && data.error && "details" in data.error) {
-      console.trace(data);
       throw new Error(String((data.error as { details: string }).details));
     }
     throw new Error(`Failed to submit feedback: ${JSON.stringify(data.error)}`);
   }
```


argos-ci bot commented Jan 25, 2026

The latest updates on your projects. Learn more about Argos notifications ↗︎

| Build | Status | Details | Updated (UTC) |
| --- | --- | --- | --- |
| default (Inspect) | ⚠️ Changes detected (Review) | 66 changed, 1 failure | Jan 29, 2026, 6:40 PM |


@coderabbitai coderabbitai bot left a comment


Actionable comments posted: 2

🤖 Fix all issues with AI agents
In `@supabase/migrations/20260129000000_fix_error_pin_created_by_fkey.sql`:
- Around line 139-167: Before processing p_rules, add a guard that checks if
jsonb_array_length(p_rules) = 0 and rule_logic = 'and' and handle it explicitly
(either RAISE EXCEPTION with a clear message or short-circuit to skip inserts
and auto-populate). Locate the proc/section that uses p_rules and
v_all_rules_match (the block that deletes from error_pin_rules, loops over
p_rules, and later deletes from error_pin_submission_matches) and insert the
check there so you do not proceed to insert zero rules or compute matches when
no rules were supplied; ensure any early exit prevents the auto-populate step
from running and documents the failure reason.
- Around lines 88-117: the UPDATE can null out fields or error when the JSON
payload omits them; before the UPDATE (inside the branch handling existing
pins), SELECT the current error_pins row into local variables (e.g.,
v_existing_discussion_thread_id, v_existing_assignment_id,
v_existing_class_id_already_set) and then use COALESCE between the parsed
p_error_pin values and those existing variables in the UPDATE statement (keep
the ::bigint/::boolean conversions and preserve the rule_logic default). Still
enforce authorizeforclassgrader(v_existing_class_id) and derive
v_class_id/v_assignment_id as before, but fall back to the selected existing
assignment_id/class_id/discussion_thread_id when the JSON key is missing.

Comment on lines +88 to +117
    -- Insert or update error_pin
    IF (p_error_pin->>'id')::bigint IS NOT NULL THEN
        -- Update existing pin - first verify the pin exists and belongs to a class we're authorized for
        DECLARE
            v_existing_class_id bigint;
        BEGIN
            SELECT class_id INTO v_existing_class_id
            FROM error_pins
            WHERE id = (p_error_pin->>'id')::bigint;

            IF NOT FOUND THEN
                RAISE EXCEPTION 'Error pin not found';
            END IF;

            -- Verify caller is authorized for the existing pin's class (prevent cross-class updates)
            IF NOT authorizeforclassgrader(v_existing_class_id) THEN
                RAISE EXCEPTION 'Permission denied: not authorized for this error pin';
            END IF;
        END;

        -- Update existing pin (only allow changing discussion_thread_id, assignment_id, rule_logic, enabled)
        -- class_id is derived from assignment_id (or provided for class-level), created_by is not updated
        UPDATE error_pins
        SET discussion_thread_id = (p_error_pin->>'discussion_thread_id')::bigint,
            assignment_id = v_assignment_id,
            class_id = v_class_id,
            rule_logic = COALESCE(p_error_pin->>'rule_logic', 'and'),
            enabled = COALESCE((p_error_pin->>'enabled')::boolean, true)
        WHERE id = (p_error_pin->>'id')::bigint
        RETURNING id, assignment_id INTO v_pin_id, v_assignment_id;

⚠️ Potential issue | 🟠 Major

Update path can unintentionally null fields when keys are omitted.
If an update payload omits discussion_thread_id, assignment_id, or class_id, this code can clear those columns (or raise for missing class_id). If callers send partial updates, that’s a regression. Consider defaulting to existing values when fields are absent.

Suggested fix (preserve existing values on update)
@@
-    ELSE
-        -- For class-level pins, class_id must be provided directly
-        v_class_id := (p_error_pin->>'class_id')::bigint;
-        
-        IF v_class_id IS NULL THEN
-            RAISE EXCEPTION 'class_id is required for class-level pins';
-        END IF;
-        
-        -- Verify the class exists
-        IF NOT EXISTS (SELECT 1 FROM classes WHERE id = v_class_id) THEN
-            RAISE EXCEPTION 'Class not found';
-        END IF;
-    END IF;
+    ELSE
+        -- For class-level pins, class_id must be provided directly (on create)
+        v_class_id := (p_error_pin->>'class_id')::bigint;
+        
+        IF v_class_id IS NULL AND (p_error_pin->>'id')::bigint IS NULL THEN
+            RAISE EXCEPTION 'class_id is required for class-level pins';
+        END IF;
+        
+        -- Verify the class exists when provided
+        IF v_class_id IS NOT NULL AND NOT EXISTS (SELECT 1 FROM classes WHERE id = v_class_id) THEN
+            RAISE EXCEPTION 'Class not found';
+        END IF;
+    END IF;
@@
-        DECLARE
-            v_existing_class_id bigint;
-        BEGIN
-            SELECT class_id INTO v_existing_class_id
-            FROM error_pins
-            WHERE id = (p_error_pin->>'id')::bigint;
+        DECLARE
+            v_existing_class_id bigint;
+        BEGIN
+            SELECT * INTO v_pin_record
+            FROM error_pins
+            WHERE id = (p_error_pin->>'id')::bigint;
             
             IF NOT FOUND THEN
                 RAISE EXCEPTION 'Error pin not found';
             END IF;
+            
+            v_existing_class_id := v_pin_record.class_id;
+            IF v_class_id IS NULL THEN
+                v_class_id := v_existing_class_id;
+            END IF;
@@
-        UPDATE error_pins
-        SET discussion_thread_id = (p_error_pin->>'discussion_thread_id')::bigint,
-            assignment_id = v_assignment_id,
-            class_id = v_class_id,
-            rule_logic = COALESCE(p_error_pin->>'rule_logic', 'and'),
-            enabled = COALESCE((p_error_pin->>'enabled')::boolean, true)
+        UPDATE error_pins
+        SET discussion_thread_id = COALESCE((p_error_pin->>'discussion_thread_id')::bigint, v_pin_record.discussion_thread_id),
+            assignment_id = COALESCE(v_assignment_id, v_pin_record.assignment_id),
+            class_id = COALESCE(v_class_id, v_pin_record.class_id),
+            rule_logic = COALESCE(p_error_pin->>'rule_logic', v_pin_record.rule_logic, 'and'),
+            enabled = COALESCE((p_error_pin->>'enabled')::boolean, v_pin_record.enabled, true)
🤖 Prompt for AI Agents
In `@supabase/migrations/20260129000000_fix_error_pin_created_by_fkey.sql` around
lines 88-117, the UPDATE can null out fields or error when the JSON payload
omits them; before the UPDATE (inside the branch handling existing pins),
SELECT the current error_pins row into local variables (e.g.,
v_existing_discussion_thread_id, v_existing_assignment_id,
v_existing_class_id_already_set) and then use COALESCE between the parsed
p_error_pin values and those existing variables in the UPDATE statement (keep
the ::bigint/::boolean conversions and preserve the rule_logic default). Still
enforce authorizeforclassgrader(v_existing_class_id) and derive
v_class_id/v_assignment_id as before, but fall back to the selected existing
assignment_id/class_id/discussion_thread_id when the JSON key is missing.
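
The COALESCE fallback the review suggests can be illustrated outside SQL. This hypothetical TypeScript sketch (the type and function names are illustrative, not from the repository) models the intended semantics: fields omitted from the patch keep the row's existing values.

```typescript
// Illustrative only: models the COALESCE(new, existing, default) pattern
// suggested for the UPDATE statement.
interface ErrorPinRow {
  discussion_thread_id: number | null;
  assignment_id: number | null;
  class_id: number;
  rule_logic: string;
  enabled: boolean;
}

function mergePinUpdate(existing: ErrorPinRow, patch: Partial<ErrorPinRow>): ErrorPinRow {
  // Like SQL COALESCE, a null/undefined patch value falls back to the existing
  // column; note that, as with COALESCE, a patch cannot explicitly clear a field.
  return {
    discussion_thread_id: patch.discussion_thread_id ?? existing.discussion_thread_id,
    assignment_id: patch.assignment_id ?? existing.assignment_id,
    class_id: patch.class_id ?? existing.class_id,
    rule_logic: patch.rule_logic ?? existing.rule_logic ?? "and",
    enabled: patch.enabled ?? existing.enabled ?? true,
  };
}
```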

Comment on lines +139 to +167
    -- Delete old rules
    DELETE FROM error_pin_rules WHERE error_pin_id = v_pin_id;

    -- Insert new rules
    FOR v_rule IN SELECT * FROM jsonb_array_elements(p_rules)
    LOOP
        INSERT INTO error_pin_rules (
            error_pin_id,
            target,
            match_type,
            match_value,
            match_value_max,
            test_name_filter,
            ordinal
        )
        VALUES (
            v_pin_id,
            (v_rule->>'target')::error_pin_rule_target,
            COALESCE(v_rule->>'match_type', 'contains'),
            v_rule->>'match_value',
            v_rule->>'match_value_max',
            v_rule->>'test_name_filter',
            COALESCE((v_rule->>'ordinal')::smallint, 0)
        );
    END LOOP;

    -- Auto-populate: compute matches for all active submissions
    -- Clear existing matches first
    DELETE FROM error_pin_submission_matches WHERE error_pin_id = v_pin_id;

⚠️ Potential issue | 🟠 Major

Guard against empty rule sets to avoid “match‑all” behavior.
With rule_logic = 'and', an empty rules array leaves v_all_rules_match = true, so every submission with a grader result will match. Add a guard before rule processing to fail fast (or explicitly short‑circuit) when no rules are provided.

Suggested guard
+    IF jsonb_typeof(p_rules) IS DISTINCT FROM 'array'
+       OR jsonb_array_length(p_rules) = 0 THEN
+        RAISE EXCEPTION 'At least one rule is required';
+    END IF;
+
     -- Delete old rules
     DELETE FROM error_pin_rules WHERE error_pin_id = v_pin_id;
🤖 Prompt for AI Agents
In `@supabase/migrations/20260129000000_fix_error_pin_created_by_fkey.sql` around
lines 139-167, before processing p_rules, add a guard that checks whether
jsonb_array_length(p_rules) = 0 while rule_logic = 'and', and handle that case
explicitly (either RAISE EXCEPTION with a clear message or short-circuit to
skip the inserts and auto-populate). Locate the proc/section that uses p_rules
and v_all_rules_match (the block that deletes from error_pin_rules, loops over
p_rules, and later deletes from error_pin_submission_matches) and insert the
check there so you do not insert zero rules or compute matches when no rules
were supplied; ensure any early exit prevents the auto-populate step from
running and documents the failure reason.
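
The match-all hazard comes from vacuous truth: an AND over zero predicates is true. A hypothetical TypeScript sketch (names are illustrative, not from the repository) of the same guard:

```typescript
// Illustrative only: Array.prototype.every on an empty array returns true,
// which is exactly why an empty rule set under "and" logic matches everything.
function submissionMatches(ruleResults: boolean[], ruleLogic: "and" | "or"): boolean {
  if (ruleResults.length === 0) {
    // Fail fast, mirroring the suggested RAISE EXCEPTION guard.
    throw new Error("At least one rule is required");
  }
  return ruleLogic === "and"
    ? ruleResults.every((r) => r)
    : ruleResults.some((r) => r);
}
```

Without the guard, `[].every(...)` evaluates to `true`, so every submission with a grader result would be pinned.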
