
Conversation

@wk989898
Collaborator

@wk989898 wk989898 commented Jan 9, 2026

What problem does this PR solve?

Issue Number: ref #1087

What is changed and how it works?

  • Bug Fix: Canal JSON Decoder Time Precision: Corrected an issue in the Canal JSON decoder where time-related column values (Date, Datetime, Timestamp) were formatted with the wrong fractional-seconds precision. The fix uses tiTypes.MaxFsp so that no fractional digits are lost (see the first sketch after this list).
  • Enhanced Test Coverage for Canal JSON Codec: Introduced a new comprehensive test file (canal_json_encoder_test.go) covering end-to-end scenarios for DML, DDL, and checkpoint events, including compression, claim check, large message handling (key-only), and partitioned table support.
  • New Tests for Canal JSON Transaction Encoder: Added a new test file (canal_json_txn_encoder_test.go) to validate the Canal JSON transaction event encoder, specifically testing max message bytes and callback functionality.
  • Refinement of Open Protocol Encoder Logic: Modified the Open Protocol encoder (codec.go) to conditionally output 'old' column values based on config.OpenOutputOldValue. It also includes improved null and virtual column checks, corrected flag usage, and refined binary literal handling.
  • Improved Value Encoding in Simple Protocol: The encodeValue function in the Simple Protocol codec (message.go) now handles specific Go and TiDB types explicitly (e.g., int64, uint64, float32, float64, string, []byte with base64 for binary data, types.VectorFloat32) instead of falling back to a generic fmt.Sprintf, ensuring an accurate string representation (see the second sketch after this list).
  • Expanded Open Protocol Codec Tests: New tests (codec_test.go and message_test.go) were added for the Open Protocol codec to verify OnlyOutputUpdatedColumns and DeleteOnlyHandleKeyColumns configurations, as well as robust encoding/decoding of messageRow structures with different data types.
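
The time-precision fix is easiest to see with a standalone illustration. The sketch below uses only the Go standard library and made-up names (`formatWithFsp`, `maxFsp` standing in for `tiTypes.MaxFsp`); it is not the actual decoder code, just a demonstration of why formatting time values with a too-small fractional-seconds precision silently drops digits while the maximum fsp of 6 is lossless:

```go
// Illustration only: formatting a DATETIME/TIMESTAMP-style value with a given
// fractional-seconds precision (fsp). Using an fsp smaller than the value's
// actual precision truncates digits; maxFsp (6, microseconds) never loses data.
package main

import (
	"fmt"
	"strings"
	"time"
)

const maxFsp = 6 // mirrors tiTypes.MaxFsp: the largest fsp MySQL/TiDB supports

// formatWithFsp renders t with exactly fsp fractional digits, the way a
// Canal JSON payload carries time columns as strings.
func formatWithFsp(t time.Time, fsp int) string {
	layout := "2006-01-02 15:04:05"
	if fsp > 0 {
		layout += "." + strings.Repeat("0", fsp)
	}
	return t.Format(layout)
}

func main() {
	t := time.Date(2026, 1, 9, 8, 30, 15, 123456000, time.UTC)

	fmt.Println(formatWithFsp(t, 0))      // 2026-01-09 08:30:15        (digits dropped)
	fmt.Println(formatWithFsp(t, maxFsp)) // 2026-01-09 08:30:15.123456 (lossless)
}
```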

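The per-type encoding for the Simple Protocol can be sketched the same way. The `encodeValue` name follows the PR text, but the signature, the `isBinary` flag, and the fallback branch are simplified assumptions rather than the actual implementation in message.go:

```go
// Rough sketch of per-type string encoding instead of a generic fmt.Sprintf.
package main

import (
	"encoding/base64"
	"fmt"
	"strconv"
)

// encodeValue turns a column value into its string form. isBinary marks a
// []byte column that holds binary data (base64-encoded) rather than text.
func encodeValue(value any, isBinary bool) string {
	switch v := value.(type) {
	case nil:
		return ""
	case int64:
		return strconv.FormatInt(v, 10)
	case uint64:
		return strconv.FormatUint(v, 10)
	case float32:
		return strconv.FormatFloat(float64(v), 'f', -1, 32)
	case float64:
		return strconv.FormatFloat(v, 'f', -1, 64)
	case string:
		return v
	case []byte:
		if isBinary {
			return base64.StdEncoding.EncodeToString(v)
		}
		return string(v)
	default:
		// Fallback for types not listed here; the real code also covers
		// e.g. types.VectorFloat32 explicitly.
		return fmt.Sprintf("%v", v)
	}
}

func main() {
	fmt.Println(encodeValue(int64(-42), false))        // -42
	fmt.Println(encodeValue([]byte{0xDE, 0xAD}, true)) // 3q0=
	fmt.Println(encodeValue([]byte("hello"), false))   // hello
	fmt.Println(encodeValue(3.14, false))              // 3.14
}
```
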
Check List

Tests

  • Unit test
  • Integration test
  • Manual test (add detailed scripts or steps below)
  • No code

Questions

Will it cause performance regression or break compatibility?
Do you need to update user documentation, design documentation or monitoring documentation?

Release note

Please refer to [Release Notes Language Style Guide](https://pingcap.github.io/tidb-dev-guide/contribute-to-tidb/release-notes-style-guide.html) to write a quality release note.

If you don't think this PR needs a release note then fill it with `None`.

@ti-chi-bot ti-chi-bot bot added the release-note Denotes a PR that will be considered when it comes time to generate release notes. label Jan 9, 2026
@ti-chi-bot

ti-chi-bot bot commented Jan 9, 2026

[APPROVALNOTIFIER] This PR is NOT APPROVED

This pull-request has been approved by:
Once this PR has been reviewed and has the lgtm label, please assign 3aceshowhand for approval. For more information see the Code Review Process.
Please ensure that each of them provides their approval before proceeding.

The full list of commands accepted by this bot can be found here.

Needs approval from an approver in each of these files:

Approvers can indicate their approval by writing /approve in a comment
Approvers can cancel approval by writing /approve cancel in a comment

@wk989898
Collaborator Author

wk989898 commented Jan 9, 2026

/test all

@ti-chi-bot ti-chi-bot bot added the size/XXL Denotes a PR that changes 1000+ lines, ignoring generated files. label Jan 9, 2026
@gemini-code-assist

Summary of Changes

Hello @wk989898, I'm Gemini Code Assist! I'm currently reviewing this pull request and will post my feedback shortly. In the meantime, here's a summary to help you and other reviewers quickly get up to speed!

This pull request focuses on improving the robustness and correctness of data encoding across various Kafka output codecs, particularly Canal JSON, Open Protocol, and Simple Protocol. It addresses a specific bug in Canal JSON's time value formatting and significantly expands test coverage for both Canal JSON and Open Protocol. Additionally, it refines the logic for handling old values and column encoding to ensure more accurate and configurable data representation.




@gemini-code-assist gemini-code-assist bot left a comment


Code Review

This pull request introduces a fix for incorrect column value encoding for Kafka output, particularly for time types in the Canal JSON decoder and various data types in the simple protocol encoder. It also includes several important bug fixes and refactorings in the open-protocol codec, such as correcting the retrieval of previous row data and using the correct column flags. A comprehensive suite of new tests has been added for the Canal JSON encoder/decoder, which significantly improves test coverage and is a great addition. The changes are well-implemented and improve the correctness and robustness of the codecs.

require.True(t, ok)

decodedLargeEvent := decoder.NextDMLEvent()


Severity: medium

This assertion is a duplicate of the one on the previous line. It should be removed to improve code clarity and avoid redundancy.

@wk989898
Collaborator Author

/test all

@wk989898
Collaborator Author

/test all

@wk989898
Collaborator Author

/test all

@wk989898
Collaborator Author

/test all

@ti-chi-bot

ti-chi-bot bot commented Jan 15, 2026

@wk989898: The following test failed, say /retest to rerun all failed tests or /retest-required to rerun all mandatory failed tests:

Test name: pull-error-log-review
Commit: 2b23daf
Required: true
Rerun command: /test pull-error-log-review

Full PR test history. Your PR dashboard.


Instructions for interacting with me using PR comments are available here. If you have questions or suggestions related to my behavior, please file an issue against the kubernetes-sigs/prow repository. I understand the commands that are listed here.

