MySQL

Hevo supports the following variants of MySQL as a Source:

Refer to the required MySQL variant for the steps to configure it as a Source in your Hevo Pipeline and start ingesting data.


Enhancements to Log-based Pipelines

In Release 2.17, Hevo enhanced the functionality of log-based Pipelines created with any variant of the MySQL Source. These enhancements include:

  • Handling ENUM values and indexes: ENUM definitions from the Source tables are now correctly interpreted to ensure that accurate values are loaded into the Destination tables (see the ENUM sketch after this list).

  • Handling interrupted transactions: Hevo efficiently handles scenarios where the metadata required to read the table structure from the BinLog is missing. This prevents unnecessary reprocessing of the entire log file and ensures faster, more reliable data replication.

  • Handling JSON and BINARY data types: Improved support for the JSON and BINARY data types ensures reliable ingestion from Source tables that use them.

  • Schema retention for skipped objects: Hevo retains schema definitions for all Source objects, even those initially skipped when the Pipeline was created. If such objects are included post-Pipeline creation, data ingestion begins using their current schema. This prevents schema mismatches.

  • Parsing Data Definition Language (DDL) statements: A more reliable DDL parser is used to read schema changes from the Source database, such as adding, removing, or renaming columns. These changes are then applied to the corresponding tables at the Destination.
    Data ingestion begins only after the Destination table structure is fully aligned with the Source schema recorded in the BinLog. This prevents Pipeline failures caused by schema mismatches.

  • Validating BinLog settings during Test Connection: During Source setup, Hevo checks critical BinLog configurations, including the binary log format, the type of row image recorded in the log file, and the BinLog retention period. This helps identify misconfigurations early in the setup process, ensuring smoother Pipeline creation and preventing ingestion issues (see the validation sketch after this list).
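
MySQL writes ENUM columns to the binary log as 1-based integer indexes rather than as string labels, so a log reader must map each index back to the label declared in the table's ENUM definition. The following Python sketch illustrates that index-to-label mapping; the table and column names are hypothetical, and the snippet is an illustration of the conversion, not Hevo's implementation.

```python
# Illustrative sketch: mapping ENUM indexes from a BinLog row image back to labels.
# Assumes a hypothetical table `orders` with column `status ENUM('new','paid','shipped')`.

# ENUM definition as declared in the Source table (order matters; indexes are 1-based).
ENUM_LABELS = ["new", "paid", "shipped"]

def enum_index_to_label(index: int, labels: list[str]) -> str:
    """Convert a 1-based ENUM index from a BinLog row image to its string label.

    MySQL stores 0 for the empty-string value that results from inserting an
    invalid ENUM member under non-strict SQL modes.
    """
    if index == 0:
        return ""  # invalid/empty ENUM value
    return labels[index - 1]

# A row image read from the BinLog carries the index, not the label.
binlog_row = {"id": 42, "status": 2}

decoded_row = {**binlog_row, "status": enum_index_to_label(binlog_row["status"], ENUM_LABELS)}
print(decoded_row)  # {'id': 42, 'status': 'paid'}
```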
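
As an illustration of the kind of checks performed during Test Connection, the sketch below queries the relevant server variables over a standard MySQL connection. It assumes the PyMySQL client and a MySQL 8.x server; on older versions, retention is controlled by expire_logs_days rather than binlog_expire_logs_seconds. This is a minimal sketch of such checks, not Hevo's actual validation logic, and the connection details are placeholders.

```python
# Minimal sketch: verifying BinLog settings that log-based replication depends on.
# Assumes PyMySQL (pip install pymysql) and a MySQL 8.x server; connection details are placeholders.
import pymysql

conn = pymysql.connect(host="mysql.example.com", user="hevo", password="secret", database="mysql")

def get_variable(cursor, name):
    """Return the value of a global server variable, or None if it is not defined."""
    cursor.execute("SHOW GLOBAL VARIABLES LIKE %s", (name,))
    row = cursor.fetchone()
    return row[1] if row else None

with conn.cursor() as cursor:
    checks = {
        # Row-based logging is required so that every changed row is written to the BinLog.
        "binlog_format": ("ROW", get_variable(cursor, "binlog_format")),
        # FULL row images are needed so that all columns are available for replication.
        "binlog_row_image": ("FULL", get_variable(cursor, "binlog_row_image")),
    }
    for variable, (expected, actual) in checks.items():
        status = "OK" if actual == expected else f"MISCONFIGURED (expected {expected})"
        print(f"{variable} = {actual}: {status}")

    # Retention period: MySQL 8.x uses binlog_expire_logs_seconds (0 means logs never expire).
    retention = get_variable(cursor, "binlog_expire_logs_seconds")
    print(f"binlog_expire_logs_seconds = {retention}")

conn.close()
```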

Migrating Existing Log-based Pipelines

If you are using a MySQL log-based Pipeline created before Release 2.17, you can take advantage of the latest enhancements using one of the following methods:

  • Create a new Pipeline while retaining your existing Destination.

    1. Pause the existing Pipeline.

    2. Create a log-based Pipeline using the same Source and Destination configurations.

    3. Specify the same Destination table prefix as the paused Pipeline.

    4. Wait for the new Pipeline to complete ingesting historical data for all selected Source objects.

    5. Delete the old Pipeline.

  • Contact Hevo Support to migrate your existing Pipeline to use the enhanced BinLog functionality.


Handling the Unsigned Data Type

While converting the data types of the ingested data prior to replication to the Destination, some values may be truncated if they exceed the allowed limit, or the Event may fail if the value does not match the schema. This can happen in a MySQL database with unsigned integer types, since an unsigned integer can hold larger positive values than a signed integer of the same size. For example, an 8-bit unsigned integer holds 256 values, ranging from 0 to 255, while an 8-bit signed integer also holds 256 values but ranges from -128 to 127. Thus, the positive range of the unsigned integer (0 to 255) is greater than the positive range of the signed integer (0 to 127).

Hence, to avoid data loss, Hevo automatically promotes an unsigned integer to a larger signed integer type. For example, an 8-bit unsigned integer is promoted to a 16-bit signed integer, which is large enough to store the entire unsigned range.
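
The sketch below illustrates this kind of promotion with a simple lookup table that maps each unsigned MySQL integer type to the smallest signed type whose range covers it. The mapping is an illustration of the principle described above, not Hevo's exact type-mapping logic, and the Destination type names are assumptions.

```python
# Illustrative sketch: promoting unsigned MySQL integer types to larger signed types
# so that the full unsigned range fits without truncation.
# The target type names are assumptions for illustration, not Hevo's actual mapping.

UNSIGNED_PROMOTIONS = {
    "tinyint unsigned":   "smallint",       # 0..255                    fits in -32768..32767
    "smallint unsigned":  "int",            # 0..65535                  fits in 32-bit signed
    "mediumint unsigned": "int",            # 0..16777215               fits in 32-bit signed
    "int unsigned":       "bigint",         # 0..4294967295             fits in 64-bit signed
    "bigint unsigned":    "decimal(20,0)",  # 0..18446744073709551615   exceeds 64-bit signed
}

def promote(mysql_type: str) -> str:
    """Return the signed type used for an unsigned Source column, or the type unchanged."""
    return UNSIGNED_PROMOTIONS.get(mysql_type.lower(), mysql_type)

print(promote("TINYINT UNSIGNED"))  # smallint
print(promote("INT UNSIGNED"))      # bigint
print(promote("INT"))               # INT (signed types are left as-is)
```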


Last updated on Jul 30, 2025
