
Conversation

@marcorudolphflex
Contributor

@marcorudolphflex marcorudolphflex commented Feb 3, 2026

  • Introduces schema-versioned config files (config_version) with a migration registry and auto in-memory upgrades, including optional write-back and forward-compat handling.
  • Adds a tidy3d config upgrade CLI for dry-run diffs, CI checks, and manual schema upgrades.
  • Enforces deprecation metadata (deprecated_in, removed_in, replaced_by) during config load.
  • Updates docs and CHANGELOG to describe versioning behavior and new CLI usage.

Note

Medium Risk
Touches config load/save behavior (auto-migration, atomic writes, forward-compat handling) and CLI commands, which could affect user environments if edge cases slip through. Changes are well-tested but operationally impactful.

Overview
Configuration files are now schema-versioned. Persisted TOML gains a root config_version, and ConfigLoader migrates older files in-memory to CURRENT_CONFIG_VERSION (with optional automatic write-back via TIDY3D_CONFIG_AUTO_MIGRATE).
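Illustratively, a versioned config carries the marker at the TOML root; the `[web]` section appears in the PR's code, but the keys and version value here are placeholders:

```toml
config_version = 2  # root-level schema version; older files are migrated up on load

[web]
apikey = "..."
ssl_verify = true
```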

Forward-compat + deprecation enforcement were added. Newer config_version files are handled in best-effort mode (dropping/normalizing unknown shapes) or raise in strict mode (TIDY3D_CONFIG_FORWARD_COMPAT=strict), and config load now warns/errors based on field metadata (deprecated_in, removed_in, replaced_by).
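The two environment variables named above would be set like this (value spellings such as `0`/`strict` are taken from the code excerpts below; exact semantics live in `tidy3d.config.migrations`):

```shell
# Keep migrated configs in memory only; do not write the upgraded file back.
export TIDY3D_CONFIG_AUTO_MIGRATE=0
# Raise on configs written by a newer tidy3d instead of best-effort parsing.
export TIDY3D_CONFIG_FORWARD_COMPAT=strict
```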

New migration/upgrade APIs and CLI. Introduces tidy3d.config.migrations (+ register_migration export) and adds tidy3d config upgrade for dry-run diffs, CI checks (--check), or applying upgrades; CLI config-related commands were moved out of web/cli/app.py into web/cli/config.py, with docs and tests updated accordingly.
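The registry pattern behind tidy3d.config.migrations can be sketched standalone. The names mirror the excerpts below (register_migration, apply_migrations, _MIGRATIONS), but the two migration steps are hypothetical and the real API operates on tomlkit documents rather than plain dicts:

```python
# Standalone sketch of a versioned-migration registry; NOT the actual tidy3d API.
from collections import defaultdict
from typing import Callable

CURRENT_CONFIG_VERSION = 2  # assumed value for illustration

# Maps from_version -> list of migrators that upgrade to from_version + 1.
_MIGRATIONS: dict[int, list[Callable[[dict], None]]] = defaultdict(list)


def register_migration(from_version: int):
    """Register a function that upgrades a config dict by one schema version."""

    def decorator(fn: Callable[[dict], None]) -> Callable[[dict], None]:
        _MIGRATIONS[from_version].append(fn)
        return fn

    return decorator


def apply_migrations(data: dict, from_version: int, to_version: int) -> None:
    """Run each registered step in order; no-op if already up to date."""
    for version in range(max(from_version, 0), to_version):
        for migrator in _MIGRATIONS[version]:
            migrator(data)


@register_migration(0)
def _v0_to_v1(data: dict) -> None:
    # Hypothetical step: move a flat `apikey` under the `web` table.
    if "apikey" in data:
        data.setdefault("web", {})["apikey"] = data.pop("apikey")


@register_migration(1)
def _v1_to_v2(data: dict) -> None:
    # Hypothetical step: rename `verify_ssl` to `ssl_verify` in the `web` table.
    web = data.get("web", {})
    if "verify_ssl" in web:
        web["ssl_verify"] = web.pop("verify_ssl")


cfg = {"apikey": "abc", "web": {"verify_ssl": True}}
apply_migrations(cfg, 0, CURRENT_CONFIG_VERSION)
print(cfg)  # → {'web': {'apikey': 'abc', 'ssl_verify': True}}
```

In CI, the corresponding check per the PR description would be `tidy3d config upgrade --check`, which exits non-zero when any config file is behind the current schema.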

Written by Cursor Bugbot for commit 0e4477a.

@github-actions
Contributor

github-actions bot commented Feb 3, 2026

Diff Coverage

Diff: origin/develop...HEAD, staged and unstaged changes

  • tidy3d/config/__init__.py (100%)
  • tidy3d/config/loader.py (91.2%): Missing lines 94,126-127,192-193,233
  • tidy3d/config/manager.py (100%)
  • tidy3d/config/migrations.py (71.3%): Missing lines 49-50,52,57-58,61-63,65-66,103-106,118-119,126,128,140,147-150,154-155,183,187-188,190,194-195,199,201-208,210,214-215,226-230,234,241,250,258-259,261-262,290-291,311-312,318-321,345
  • tidy3d/web/__init__.py (100%)
  • tidy3d/web/cli/app.py (100%)
  • tidy3d/web/cli/config.py (68.2%): Missing lines 49-57,108,206-208,223,242-243,253-254,280-281,298-300,314-317,342,351,358-359,363-364,368,383,387-388,425-426,435-437,441,446,458-463,466-467,474-477,479-487,491-492,496-497,501-507,509-511,513,533

Summary

  • Total: 556 lines
  • Missing: 151 lines
  • Coverage: 72%

tidy3d/config/loader.py

Lines 90-98

  90                 )
  91                 return legacy
  92 
  93         if legacy:
! 94             return self._migrate_legacy_payload(legacy)
  95         return {}
  96 
  97     def load_user_profile(self, profile: str) -> dict[str, Any]:
  98         """Load user profile overrides (if any)."""

Lines 122-131

  122         if not data:
  123             if profile_path.exists():
  124                 profile_path.unlink()
  125             self._docs.pop(profile_path, None)
! 126             self._versions.pop(profile_path, None)
! 127             self._pending_writes.pop(profile_path, None)
  128             return
  129         profile_path.parent.mkdir(mode=0o700, parents=True, exist_ok=True)
  130         self._atomic_write(profile_path, data)

Lines 188-197

  188 
  189         for path, document in list(self._pending_writes.items()):
  190             try:
  191                 self._atomic_write_document(path, document)
! 192             except Exception as exc:
! 193                 log.warning(f"Failed to write migrated configuration file '{path}': {exc}")
  194             finally:
  195                 self._pending_writes.pop(path, None)
  196 
  197     def write_document(self, path: Path, document: tomlkit.TOMLDocument) -> None:

Lines 229-237

  229         return self._apply_schema_migrations(path, data, document)
  230 
  231     def _migrate_legacy_payload(self, data: dict[str, Any]) -> dict[str, Any]:
  232         if not data:
! 233             return {}
  234         document = tomlkit.parse(toml.dumps(data))
  235         apply_migrations(document, 0, CURRENT_CONFIG_VERSION)
  236         set_config_version(document, CURRENT_CONFIG_VERSION)
  237         migrated = toml.loads(tomlkit.dumps(document))

tidy3d/config/migrations.py

Lines 45-70

  45     """Return the config version stored in a dict or TOML document."""
  46 
  47     if isinstance(source, tomlkit.TOMLDocument):
  48         raw = source.get(CONFIG_VERSION_KEY)
! 49     elif isinstance(source, dict):
! 50         raw = source.get(CONFIG_VERSION_KEY)
  51     else:
! 52         raw = None
  53 
  54     if raw is None:
  55         return 0
  56     if isinstance(raw, bool):
! 57         log.warning(f"Invalid '{CONFIG_VERSION_KEY}' value {raw!r}; falling back to version 0.")
! 58         return 0
  59     try:
  60         version = int(raw)
! 61     except (TypeError, ValueError):
! 62         log.warning(f"Invalid '{CONFIG_VERSION_KEY}' value {raw!r}; falling back to version 0.")
! 63         return 0
  64     if version < 0:
! 65         log.warning(f"Invalid '{CONFIG_VERSION_KEY}' value {version!r}; falling back to version 0.")
! 66         return 0
  67     return version
  68 
  69 
  70 def set_config_version(document: tomlkit.TOMLDocument, version: int) -> None:

Lines 99-110

   99         return True
  100     value = raw.strip().lower()
  101     if value in {"0", "false", "no", "off"}:
  102         return False
! 103     if value in {"1", "true", "yes", "on"}:
! 104         return True
! 105     log.warning(f"Unrecognized '{AUTO_MIGRATE_ENV}' value {raw!r}; defaulting to auto-migrate.")
! 106     return True
  107 
  108 
  109 def forward_compat_mode() -> str:
  110     """Return the forward-compat behavior for newer config versions."""

Lines 114-123

  114         return FORWARD_COMPAT_BEST_EFFORT
  115     value = raw.strip().lower()
  116     if value in {FORWARD_COMPAT_STRICT, FORWARD_COMPAT_BEST_EFFORT}:
  117         return value
! 118     log.warning(f"Unrecognized '{FORWARD_COMPAT_ENV}' value {raw!r}; defaulting to best-effort.")
! 119     return FORWARD_COMPAT_BEST_EFFORT
  120 
  121 
  122 def apply_migrations(document: tomlkit.TOMLDocument, from_version: int, to_version: int) -> None:
  123     """Apply registered migrations to the document."""

Lines 122-132

  122 def apply_migrations(document: tomlkit.TOMLDocument, from_version: int, to_version: int) -> None:
  123     """Apply registered migrations to the document."""
  124 
  125     if from_version >= to_version:
! 126         return
  127     if from_version < 0:
! 128         from_version = 0
  129     _ensure_migration_chain(to_version)
  130     for version in range(from_version, to_version):
  131         for migrator in _MIGRATIONS.get(version, []):
  132             migrator(document)

Lines 136-144

  136     """Drop unknown keys from a config payload using the registered schemas."""
  137 
  138     sections = get_sections()
  139     if not sections:
! 140         return strip_config_version(data)
  141 
  142     filtered: dict[str, Any] = {}
  143     for key, value in data.items():
  144         if key == CONFIG_VERSION_KEY:

Lines 143-159

  143     for key, value in data.items():
  144         if key == CONFIG_VERSION_KEY:
  145             continue
  146         if key == "plugins":
! 147             filtered_plugins = _filter_plugins(value, sections)
! 148             if filtered_plugins is not None:
! 149                 filtered["plugins"] = filtered_plugins
! 150             continue
  151 
  152         schema = sections.get(key)
  153         if schema is None:
! 154             filtered[key] = value
! 155             continue
  156         if isinstance(value, dict):
  157             filtered[key] = _filter_section_data(schema, value)
  158         else:
  159             log.warning(

Lines 179-219

  179         else:
  180             payload = data.get(name, {})
  181 
  182         if not isinstance(payload, dict):
! 183             payload = {}
  184         check_deprecations(schema, payload, (name,))
  185         try:
  186             schema(**payload)
! 187         except Exception as exc:
! 188             errors.append(exc)
  189     if errors:
! 190         raise errors[0]
  191 
  192 
  193 def _filter_plugins(value: Any, sections: dict[str, type[BaseModel]]) -> Optional[dict[str, Any]]:
! 194     if not isinstance(value, dict):
! 195         log.warning(
  196             "Configuration section 'plugins' should be a table; "
  197             "ignoring non-table value during best-effort parsing."
  198         )
! 199         return None
  200 
! 201     filtered: dict[str, Any] = {}
! 202     for plugin_name, plugin_data in value.items():
! 203         schema = sections.get(f"plugins.{plugin_name}")
! 204         if schema is None:
! 205             filtered[plugin_name] = plugin_data
! 206             continue
! 207         if isinstance(plugin_data, dict):
! 208             filtered[plugin_name] = _filter_section_data(schema, plugin_data)
  209         else:
! 210             log.warning(
  211                 f"Configuration plugin section '{plugin_name}' should be a table; "
  212                 "ignoring non-table value during best-effort parsing."
  213             )
! 214             filtered[plugin_name] = {}
! 215     return filtered
  216 
  217 
  218 def _filter_section_data(schema: type[BaseModel], data: dict[str, Any]) -> dict[str, Any]:
  219     filtered: dict[str, Any] = {}

Lines 222-238

  222             continue
  223         value = data[field_name]
  224         nested_model = _resolve_model_type(field.annotation)
  225         if nested_model is not None:
! 226             if isinstance(value, dict):
! 227                 filtered[field_name] = _filter_section_data(nested_model, value)
! 228                 continue
! 229             if isinstance(value, list):
! 230                 filtered[field_name] = [
  231                     _filter_section_data(nested_model, item) if isinstance(item, dict) else item
  232                     for item in value
  233                 ]
! 234                 continue
  235         filtered[field_name] = value
  236     return filtered
  237 

Lines 237-245

  237 
  238 
  239 def _resolve_model_type(annotation: Any) -> Optional[type[BaseModel]]:
  240     if isinstance(annotation, type) and issubclass(annotation, BaseModel):
! 241         return annotation
  242 
  243     origin = get_origin(annotation)
  244     if origin is None:
  245         return None

Lines 246-254

  246 
  247     for arg in get_args(annotation):
  248         nested = _resolve_model_type(arg)
  249         if nested is not None:
! 250             return nested
  251     return None
  252 
  253 
  254 def _normalize_version(value: Any, path: str, label: str) -> Optional[int]:

Lines 254-266

  254 def _normalize_version(value: Any, path: str, label: str) -> Optional[int]:
  255     if value is None:
  256         return None
  257     if isinstance(value, bool) or not isinstance(value, int):
! 258         log.warning(f"Ignoring invalid {label}={value!r} on '{path}'.")
! 259         return None
  260     if value < 0:
! 261         log.warning(f"Ignoring invalid {label}={value!r} on '{path}'.")
! 262         return None
  263     return value
  264 
  265 
  266 def check_deprecations(

Lines 286-295

  286                 schema_extra.get("removed_in"), field_path, "removed_in"
  287             )
  288             replaced_by = schema_extra.get("replaced_by")
  289             if deprecated_in is not None and removed_in is not None:
! 290                 if removed_in < deprecated_in + 2:
! 291                     log.warning(
  292                         f"Deprecation metadata for '{field_path}' violates the minimum window "
  293                         f"(removed_in={removed_in}, deprecated_in={deprecated_in}).",
  294                         log_once=True,
  295                     )

Lines 307-316

  307 
  308         nested_model = _resolve_model_type(field.annotation)
  309         nested_value = data.get(field_name)
  310         if nested_model is not None:
! 311             if isinstance(nested_value, dict):
! 312                 check_deprecations(
  313                     nested_model,
  314                     nested_value,
  315                     (*prefix, field_name),
  316                     current_version=active_version,

Lines 314-325

  314                     nested_value,
  315                     (*prefix, field_name),
  316                     current_version=active_version,
  317                 )
! 318             elif isinstance(nested_value, list):
! 319                 for item in nested_value:
! 320                     if isinstance(item, dict):
! 321                         check_deprecations(
  322                             nested_model,
  323                             item,
  324                             (*prefix, field_name),
  325                             current_version=active_version,

Lines 341-349

  341 
  342 def _validate_migration_chain(target_version: int) -> None:
  343     for version in range(target_version):
  344         if version not in _MIGRATIONS or not _MIGRATIONS[version]:
! 345             raise RuntimeError(f"Missing config migration step for v{version} -> v{version + 1}.")
  346 
  347 
  348 @register_migration(0)
  349 def _migrate_v0_to_v1(document: tomlkit.TOMLDocument) -> None:

tidy3d/web/cli/config.py

Lines 45-61

  45     str
  46         The description for the config command.
  47     """
  48 
! 49     try:
! 50         apikey = config.web.apikey
! 51     except AttributeError:
! 52         return ""
! 53     if apikey is None:
! 54         return ""
! 55     if hasattr(apikey, "get_secret_value"):
! 56         return apikey.get_secret_value()
! 57     return str(apikey)
  58 
  59 
  60 @click.command()
  61 @click.option("--apikey", prompt=False, help="Tidy3D API key")

Lines 104-112

  104         Whether to verify SSL certificates
  105     enable_caching : bool
  106         Whether to enable result caching
  107     """
! 108     configure_fn(
  109         apikey,
  110         nexus_url,
  111         api_endpoint,
  112         website_endpoint,

Lines 202-212

  202             return
  203 
  204     # Handle API key prompt if not provided and no Nexus-only config
  205     if not apikey and not has_nexus_config:
! 206         current_apikey = get_description()
! 207         message = f"Current API key: [{current_apikey}]\n" if current_apikey else ""
! 208         apikey = click.prompt(f"{message}Please enter your api key", type=str)
  209 
  210     # Build updates dictionary for web section
  211     web_updates = {}

Lines 219-227

  219     if website_endpoint:
  220         web_updates["website_endpoint"] = website_endpoint
  221 
  222     if s3_region is not None:
! 223         web_updates["s3_region"] = s3_region
  224 
  225     if ssl_verify is not None:
  226         web_updates["ssl_verify"] = ssl_verify

Lines 238-247

  238     if apikey:
  239 
  240         def auth(req: requests.Request) -> requests.Request:
  241             """Enrich auth information to request."""
! 242             req.headers[HEADER_APIKEY] = apikey
! 243             return req
  244 
  245         # Determine validation endpoint
  246         validation_endpoint = api_endpoint if api_endpoint else str(config.web.api_endpoint)
  247         validation_ssl = ssl_verify if ssl_verify is not None else config.web.ssl_verify

Lines 249-258

  249         target_url = f"{validation_endpoint.rstrip('/')}/apikey"
  250 
  251         try:
  252             resp = requests.get(target_url, auth=auth, verify=validation_ssl)
! 253         except (requests.exceptions.SSLError, ssl.SSLError):
! 254             resp = requests.get(target_url, auth=auth, verify=False)
  255 
  256         if resp.status_code != 200:
  257             click.echo(
  258                 f"Error: API key validation failed against endpoint: {validation_endpoint}\n"

Lines 276-285

  276                 config.update_section("web", **nexus_updates)
  277                 config.save()
  278         else:
  279             # Non-nexus config: save everything to base config
! 280             config.update_section("web", **web_updates)
! 281             config.save()
  282 
  283         if has_nexus_config:
  284             # Set nexus as the default profile when nexus is configured
  285             config.set_default_profile("nexus")

Lines 294-304

  294                 "\nDefault profile set to 'nexus'. Tidy3D will now use these endpoints by default."
  295             )
  296             click.echo("To switch back to production, run: tidy3d configure --restore-defaults")
  297         else:
! 298             click.echo("Configuration saved successfully.")
! 299     elif not apikey and not has_nexus_config:
! 300         click.echo("No configuration changes to apply.")
  301 
  302 
  303 @click.command(name="config-reset")
  304 @click.option("--yes", is_flag=True, help="Do not prompt before resetting the configuration.")

Lines 310-321

  310 def config_reset(yes: bool, preserve_profiles: bool) -> None:
  311     """Reset tidy3d configuration files to the default annotated state."""
  312 
  313     if not yes:
! 314         message = "Reset configuration to defaults?"
! 315         if not preserve_profiles:
! 316             message += " This will delete user profiles."
! 317         click.confirm(message, abort=True)
  318 
  319     manager = get_manager()
  320     manager.reset_to_defaults(include_profiles=not preserve_profiles)
  321     click.echo("Configuration reset to defaults.")

Lines 338-346

  338     if profiles:
  339         for profile in profiles:
  340             path = profiles_dir / f"{profile}.toml"
  341             if not path.exists():
! 342                 raise click.ClickException(f"Profile '{profile}' not found at '{path}'.")
  343             targets.append(path)
  344         return targets
  345 
  346     base_path = config_dir / "config.toml"

Lines 347-355

  347     if base_path.exists():
  348         targets.append(base_path)
  349 
  350     if profiles_dir.exists():
! 351         targets.extend(sorted(profiles_dir.glob("*.toml")))
  352     return targets
  353 
  354 
  355 def _preview_schema_upgrade(path: Path) -> dict[str, Any]:

Lines 354-372

  354 
  355 def _preview_schema_upgrade(path: Path) -> dict[str, Any]:
  356     try:
  357         text = path.read_text(encoding="utf-8")
! 358     except Exception as exc:
! 359         raise click.ClickException(f"Failed to read '{path}': {exc}") from exc
  360 
  361     try:
  362         document = tomlkit.parse(text)
! 363     except Exception as exc:
! 364         raise click.ClickException(f"Failed to parse '{path}': {exc}") from exc
  365 
  366     version = get_config_version(document)
  367     if version > CURRENT_CONFIG_VERSION:
! 368         return {
  369             "path": path,
  370             "version": version,
  371             "forward": True,
  372             "changed": False,

Lines 379-392

  379         apply_migrations(document, version, CURRENT_CONFIG_VERSION)
  380         set_config_version(document, CURRENT_CONFIG_VERSION)
  381         after = tomlkit.dumps(document)
  382     else:
! 383         after = text
  384 
  385     try:
  386         data = toml.loads(after)
! 387     except Exception as exc:
! 388         raise click.ClickException(f"Failed to decode migrated '{path}': {exc}") from exc
  389 
  390     validate_config_data(strip_config_version(data))
  391     return {
  392         "path": path,

Lines 421-430

  421 
  422     loader = ConfigLoader()
  423     targets = _collect_upgrade_targets(loader.config_dir, profiles)
  424     if not targets:
! 425         click.echo("No configuration files found to upgrade.")
! 426         return
  427 
  428     forward_mode = forward_compat_mode()
  429     changed_paths: list[Path] = []
  430     forward_paths: list[Path] = []

Lines 431-450

  431 
  432     for path in targets:
  433         result = _preview_schema_upgrade(path)
  434         if result["forward"]:
! 435             forward_paths.append(path)
! 436             if forward_mode == FORWARD_COMPAT_STRICT or check:
! 437                 raise click.ClickException(
  438                     f"Configuration file '{path}' targets config_version {result['version']}, "
  439                     f"which is newer than supported version {CURRENT_CONFIG_VERSION}."
  440                 )
! 441             click.echo(
  442                 f"Warning: '{path}' targets config_version {result['version']} "
  443                 f"(current={CURRENT_CONFIG_VERSION}); skipping upgrade.",
  444                 err=True,
  445             )
! 446             continue
  447 
  448         if result["changed"]:
  449             changed_paths.append(path)
  450             if dry_run:

Lines 454-471

  454             elif not check and result["document"] is not None:
  455                 loader.write_document(path, result["document"])
  456 
  457     if check:
! 458         if changed_paths:
! 459             raise click.ClickException("Configuration upgrade required.")
! 460         if forward_paths:
! 461             raise click.ClickException("Configuration files target a newer schema version.")
! 462         click.echo("Configuration files are up to date.")
! 463         return
  464 
  465     if dry_run and not changed_paths and not forward_paths:
! 466         click.echo("Configuration files are already up to date.")
! 467         return
  468 
  469     if not dry_run and changed_paths:
  470         click.echo(f"Upgraded {len(changed_paths)} configuration file(s).")

Lines 470-517

  470         click.echo(f"Upgraded {len(changed_paths)} configuration file(s).")
  471 
  472 
  473 def _run_config_migration(overwrite: bool, delete_legacy: bool) -> None:
! 474     legacy_dir = legacy_config_directory()
! 475     if not legacy_dir.exists():
! 476         click.echo("No legacy configuration directory found at '~/.tidy3d'; nothing to migrate.")
! 477         return
  478 
! 479     canonical_dir = canonical_config_directory()
! 480     try:
! 481         destination = migrate_legacy_config(overwrite=overwrite, remove_legacy=delete_legacy)
! 482     except FileExistsError:
! 483         if delete_legacy:
! 484             try:
! 485                 shutil.rmtree(legacy_dir)
! 486             except OSError as exc:
! 487                 click.echo(
  488                     f"Destination '{canonical_dir}' already exists and the legacy directory "
  489                     f"could not be removed. Error: {exc}"
  490                 )
! 491                 return
! 492             click.echo(
  493                 f"Destination '{canonical_dir}' already exists. "
  494                 "Skipped copying legacy files and removed the legacy '~/.tidy3d' directory."
  495             )
! 496             return
! 497         click.echo(
  498             f"Destination '{canonical_dir}' already exists. "
  499             "Use '--overwrite' to replace the existing files."
  500         )
! 501         return
! 502     except RuntimeError as exc:
! 503         click.echo(str(exc))
! 504         return
! 505     except FileNotFoundError:
! 506         click.echo("No legacy configuration directory found; nothing to migrate.")
! 507         return
  508 
! 509     click.echo(f"Configuration migrated to '{destination}'.")
! 510     if delete_legacy:
! 511         click.echo("The legacy '~/.tidy3d' directory was removed.")
  512     else:
! 513         click.echo(
  514             f"The legacy directory remains at '{legacy_dir}'. "
  515             "Remove it after confirming the new configuration works, or rerun with '--delete-legacy'."
  516         )

Lines 529-537

  529 )
  530 def config_migrate(overwrite: bool, delete_legacy: bool) -> None:
  531     """Copy configuration files from '~/.tidy3d' to the canonical location."""
  532 
! 533     _run_config_migration(overwrite, delete_legacy)
  534 
  535 
  536 @click.group()
  537 def config_group() -> None:

@marcorudolphflex force-pushed the FXC-5275-emcore-0001-schema-versioned-config-migrations-for-tidy-3-d-config branch 2 times, most recently from 2bd65c2 to 4db1b88 on February 9, 2026 at 12:02
migrated = toml.loads(tomlkit.dumps(document))
migrated = strip_config_version(migrated)
self._pending_writes[path] = document
return migrated

Config data lost when tomlkit parsing fails during migration

Medium Severity

In _read_toml, if tomlkit.parse fails but toml.loads succeeds, document becomes an empty TOMLDocument while data contains the actual config. In _apply_schema_migrations, the version is correctly read from data (via document or data fallback), but migrations are then applied to the empty document. The returned migrated dict is derived from this empty document rather than data, causing all configuration values to be silently discarded.

Additional Locations (1)



@cursor cursor bot left a comment


Cursor Bugbot has reviewed your changes and found 2 potential issues.


@marcorudolphflex force-pushed the FXC-5275-emcore-0001-schema-versioned-config-migrations-for-tidy-3-d-config branch from 915c4a5 to 0e4477a on February 10, 2026 at 09:25
Collaborator

@yaugenst-flex yaugenst-flex left a comment


Thanks @marcorudolphflex this is great overall! There are not really any blocking issues (except maybe the toml parsing thing) but I noticed that this introduces a bit of coupling between modules that hopefully could be avoided. While reviewing I also noticed that probably the config module could be further refactored to reduce coupling overall because there is quite a bit, however that seems out of scope for now. I just commented on some low hanging fruit.

Comment on lines 216 to 220
try:
document = tomlkit.parse(text)
except Exception as exc:
log.warning(f"Failed to parse configuration file '{path}': {exc}")
document = tomlkit.document()
Collaborator


If the parsing fails, we set document = tomlkit.document() and continue, and then _apply_schema_migrations() will run migrations against this empty document. This can drop all user config keys, and then when we write back a file that will contain only config_version. So in this case there will be data loss for the user from a failed parse.
Not sure how to handle.. error? Or skip the migration?

Comment on lines 87 to 89
log.warning(
f"Failed to auto-migrate legacy configuration: {exc}. "
"Using legacy data without migration."
Collaborator


Nit: at this point, legacy has already been passed through _migrate_legacy_payload(), so the returned data is migrated.

if check:
if changed_paths:
raise click.ClickException("Configuration upgrade required.")
if forward_paths:
Collaborator


is this unreachable? since forward-version files already raise when check=True?

from tidy3d.log import log

from .loader import ConfigLoader, deep_diff, deep_merge, load_environment_overrides
from .migrations import _resolve_model_type, check_deprecations
Collaborator


Can we avoid importing _resolve_model_type from migrations? This couples the manager to the manager internals. Maybe we can create a helper module like schema_utils?

return value


def check_deprecations(
Collaborator


This is used by config loading and model building, but it lives in the migrations module. Could we move it to a dedicated deprecations.py so migrations.py stays focused on .. migrations 😆

return targets


def _preview_schema_upgrade(path: Path) -> dict[str, Any]:
Collaborator


This seems to mostly reimplement the migration/validation that's already used in runtime loading, so I think a shared helper could reduce drift here?
