Conversation

@amrit110 (Member)
PR Type

Fixes all mypy errors

@amrit110 amrit110 requested review from a-kore and Copilot May 27, 2025 13:49
@amrit110 amrit110 self-assigned this May 27, 2025
Copilot AI (Contributor) left a comment

Pull Request Overview

This PR fixes all mypy errors by refactoring certain import paths, updating type hints, and adding or removing type ignore comments where needed. Key changes include:

  • Adjusting import statements for Trainer and TrainingArguments across training scripts.
  • Updating type hint usage and adding explicit type ignores in model modules and utility functions.
  • Revising tensor initializations and padding methods to improve type safety.
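For context, here is a minimal sketch (not code from this PR) of the error-code-scoped ignore pattern the changes rely on. The function name and values are hypothetical; the point is that `# type: ignore[code]` suppresses only the named mypy error code on that line.

```python
from typing import Any


def untyped_factory(name):  # deliberately unannotated, as an untyped
    # third-party function would be; under strict mypy settings
    # (--disallow-untyped-calls), calls to it are flagged.
    return {"name": name}


# An error-code-scoped ignore silences only [no-untyped-call] here;
# any other mypy error on this line would still be reported.
config: dict[str, Any] = untyped_factory("schnet")  # type: ignore[no-untyped-call]
```

Scoping the ignore to a specific error code keeps the rest of the line's type checking intact, which is why it is preferable to a bare `# type: ignore`.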

Reviewed Changes

Copilot reviewed 10 out of 10 changed files in this pull request and generated 3 comments.

Summary per file:

  • scripts/training/run_atom3d.py: Updated import paths and added type ignores to resolve type-checking issues.
  • scripts/training/pretrain_s2ef.py: Similar import refactors and ignore annotations to fix mypy errors.
  • atomgen/models/tokengt.py: Revised transformer configuration initialization with adjusted type ignores.
  • atomgen/models/schnet.py: Removed unnecessary type ignores and updated config assignment.
  • atomgen/models/modeling_atomformer.py: Changed loss initialization and added type ignores to better align with types.
  • atomgen/models/configuration_atomformer.py: Removed extraneous type ignores in configuration initialization.
  • atomgen/data/utils.py: Added tuple checks and explicit type casts for metric computations.
  • atomgen/data/tokenizer.py: Improved type hints and adjusted pad functions with more specific annotations.
  • atomgen/data/data_collator.py: Refactored tokenizer import and added type ignore annotations on assignments.
Comments suppressed due to low confidence (1)

atomgen/models/schnet.py:137

  • Verify that suppressing type checks for the config_class assignment is necessary; if possible, update type definitions to obviate the need for an ignore comment.
config_class = SchNetConfig  # type: ignore[assignment]


  # Initialize trainer
- trainer = Trainer(
+ trainer = Trainer(  # type: ignore[no-untyped-call]
Copilot AI May 27, 2025

Using 'type: ignore' suppresses type checking for the Trainer instantiation; consider enhancing or contributing improved type stubs for the transformers library to reduce reliance on ignores.

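One way to act on this suggestion, sketched below with a hypothetical stand-in class rather than the real `transformers.Trainer`: wrap the untyped constructor in a small typed factory so that a single ignore covers all call sites.

```python
from typing import Any


class UntypedTrainer:
    # Hypothetical stand-in for a third-party class that ships without
    # type annotations, so strict mypy flags every direct call to it.
    def __init__(self, **kwargs):
        self.kwargs = kwargs


def make_trainer(**kwargs: Any) -> UntypedTrainer:
    # Confine the one unavoidable ignore to this typed wrapper;
    # call sites of make_trainer stay fully type-checked.
    return UntypedTrainer(**kwargs)  # type: ignore[no-untyped-call]


trainer = make_trainer(output_dir="out", num_train_epochs=3)
```

The wrapper names and parameters here are illustrative only; the underlying trade-off is between scattering ignores across scripts and concentrating them behind one annotated boundary.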
loss_coords = loss_fct(coords_pred, labels_coords)

- loss = torch.Tensor(0).to(coords.device)
+ loss = torch.tensor(0.0).to(coords.device)
Copilot AI May 27, 2025

Replacing torch.Tensor(0) with torch.tensor(0.0) fixes a subtle bug as well as a readability issue: torch.Tensor(0) invokes the legacy size-based constructor and produces an empty, zero-element tensor, whereas torch.tensor(0.0) explicitly creates a scalar float tensor, improving correctness and type safety.

Suggested change:
- loss = torch.tensor(0.0).to(coords.device)
+ loss = torch.tensor(0.0, dtype=torch.float32).to(coords.device)

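A short sketch of the distinction (assuming PyTorch is installed; this snippet is illustrative, not code from the PR):

```python
import torch

# torch.Tensor(0) hits the legacy size-based constructor: the argument
# is interpreted as a shape, yielding an *empty* float tensor.
legacy = torch.Tensor(0)
assert legacy.numel() == 0

# torch.tensor(0.0) creates a 0-dim scalar tensor holding the value 0.0,
# which is what a running loss accumulator actually needs.
scalar = torch.tensor(0.0, dtype=torch.float32)
assert scalar.numel() == 1
assert scalar.item() == 0.0
```

Adding anything to the empty `legacy` tensor silently broadcasts to an empty result, which is why the scalar form is the safe choice for accumulating losses.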
):
- encoded_inputs = self.pad_coords(
+ encoded_inputs = self.pad_coords(  # type: ignore[assignment]
      encoded_inputs,
Copilot AI May 27, 2025

[nitpick] Consider reviewing the need for a type ignore on this assignment in pad_coords to ensure that the underlying types are correctly handled and to maintain long-term code safety.

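The usual cause of an `[assignment]` error, sketched here with a hypothetical stand-in for `pad_coords` (not the project's actual code): mypy flags a rebinding whose value type differs from the variable's declared type, and widening the declared type up front removes the need for an ignore.

```python
from typing import Any, Dict


def pad_coords_stub(batch: Dict[str, Any]) -> Dict[str, Any]:
    # Hypothetical helper: returns a padded copy of the input batch.
    padded = dict(batch)
    padded["coords"] = list(padded["coords"]) + [0.0]
    return padded


# mypy's [assignment] error fires when a variable is rebound to a value
# of a different type than it was declared with (e.g. a BatchEncoding
# rebound to a plain dict). Declaring the wider type from the start
# lets the rebinding type-check without any ignore comment.
encoded_inputs: Dict[str, Any] = {"coords": [1.0, 2.0]}
encoded_inputs = pad_coords_stub(encoded_inputs)
```

Whether that widening is preferable to a local ignore depends on how precisely the rest of the code relies on the narrower type.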
@a-kore a-kore merged commit c572343 into main May 27, 2025
7 checks passed
@a-kore a-kore deleted the fix_mypy_errors branch May 27, 2025 18:17