Release Version 2.48

The content on this site may have changed or moved since you last viewed it. As a result, some of your bookmarks may become obsolete. Therefore, we recommend accessing the latest content via the Hevo Docs website.

This release note also includes the fixes provided in all the minor releases since 2.47.

For the complete list of features available for early adoption before they are made generally available to all customers, read our Early Access page.

New and Changed Features

Destinations

  • Support for using User-Provided S3 Buckets and Snowflake Integration to Stage Data (Added in Release 2.47.1)

    • Introduced support for configuring Hevo to use your own S3 bucket and Snowflake Storage integration to stage data before loading it into a Snowflake Destination. This is useful if your Snowflake account has organization-level or account-level security controls enabled.

User Experience

  • Capturing User Skipped Events in Activity Logs (Added in Release 2.47.1)

    • Hevo now generates an activity log entry when a user manually skips a failed Event. This makes it easy to distinguish Events skipped manually from those automatically skipped by Hevo during ingestion, giving you complete visibility into Pipeline activity.

Fixes and Improvements

Sources

  • Handling Data Mismatch for Newly Included Objects in Log-based Pipelines (Fixed in Release 2.47.3)

    • Fixed an issue where records were missing at the Destination for objects included in a log-based Pipeline while an ingestion run was already in progress. This issue occurred because the incremental job loaded the object list only at startup and did not pick up newly included objects mid-run, causing all changes to those objects during that window to be missed.

      With this fix, Hevo monitors for changes in the object list during a run. When a newly included object is detected, Hevo restarts the ingestion job to fetch the data without any loss. There may be a brief pause in data ingestion when the job restarts to pick up newly included objects. The duration of the pause depends on the volume of new objects. This fix applies to all new and existing log-based Pipelines created with PostgreSQL, MySQL, and MongoDB as the Source.
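The monitor-and-restart behavior described above can be sketched roughly as follows. This is a simplified illustration, not Hevo's implementation; the callables `get_object_list` and `ingest_object` are hypothetical stand-ins for the Pipeline's object catalog and per-object ingestion step:

```python
def run_ingestion(get_object_list, ingest_object):
    """Minimal sketch: ingest each included object, re-checking the
    object list after every object so inclusions made mid-run are
    picked up by restarting the pass instead of being missed."""
    done = set()
    while True:
        objects = set(get_object_list())
        pending = sorted(objects - done)
        if not pending:
            return done  # no new or remaining objects: run complete
        for obj in pending:
            ingest_object(obj)
            done.add(obj)
            if set(get_object_list()) - objects:
                break  # new object included mid-run: restart the pass
```

The brief pause mentioned above corresponds to the restart of the inner pass: remaining objects wait until the new pass begins, which then includes the newly added objects.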

  • Handling Duplicate Data Ingestion Issue in NetSuite SuiteAnalytics (Fixed in Release 2.47.3)

    • Fixed an issue where the same records were repeatedly ingested during incremental loads, resulting in increased Event quota consumption. This issue occurred because the query window was not advanced after each successful fetch, causing the same time range to be reprocessed continuously.

      With this fix, Hevo now correctly updates the query window after each cycle, ensuring subsequent loads fetch only new records. This fix applies to all new and existing Pipelines.
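The windowing logic behind this fix can be sketched as below. This is an illustrative model only, assuming a time-windowed incremental fetch; `fetch_records` and the one-hour step are hypothetical, not part of Hevo or the NetSuite SuiteAnalytics API:

```python
from datetime import datetime, timedelta

def incremental_load(fetch_records, start, end, step=timedelta(hours=1)):
    """Fetch records in successive query windows. The key line is the
    advancement of window_start after each successful fetch; without
    it, the same time range is reprocessed on every cycle, producing
    duplicate Events."""
    results = []
    window_start = start
    while window_start < end:
        window_end = min(window_start + step, end)
        results.extend(fetch_records(window_start, window_end))
        window_start = window_end  # advance the window: the fix
    return results
```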

  • Handling Historical Ingestion Failure in NetSuite ERP

    • Fixed an issue where the historical load for the Transaction object remained stuck and failed repeatedly during ingestion. This issue occurred when a page in the NetSuite API response contained only custom transaction records, which Hevo does not support. In such a case, the response returned an empty record list, blocking ingestion of all subsequent standard transaction records.

      With this fix, Hevo skips pages with empty record lists and continues ingesting standard transaction records without interruption. Custom transaction records are not supported and will continue to be skipped. Existing Pipelines affected by this issue will resume automatically. No user action is required.
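The pagination behavior after this fix can be sketched as follows. This is a hypothetical model of a paginated fetch, not NetSuite's actual API shape; `fetch_page` is assumed to return a record list and a has-more flag:

```python
def ingest_pages(fetch_page):
    """Paginate through an API response, tolerating pages whose record
    list is empty (e.g. pages holding only unsupported custom
    transaction records) instead of stalling on them."""
    records, page = [], 0
    while True:
        batch, has_more = fetch_page(page)
        records.extend(batch)  # an empty batch is simply skipped
        if not has_more:
            return records
        page += 1
```

The earlier behavior treated an empty record list as a terminal condition, which is why ingestion stalled before reaching the standard transaction records on later pages.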

  • Handling Missing Data for Tasks Object in Asana

    • Fixed an issue where the configured value for the enum_value field was not replicated to the Destination. This issue occurred because Hevo ingested the value for this field but did not parse it during data loading, resulting in missing data.

      With this fix, Hevo now correctly parses and replicates the enum_value field to the Destination. This fix applies to all new and existing Pipelines.

      Note: For existing Pipelines, you must restart the historical load for the Tasks object to replicate the missing data to the Destination.
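The parsing step this fix adds can be sketched roughly as below, assuming Asana's usual custom-field payload shape where `enum_value` is either null or an object carrying the selected option's `name`. The function is an illustrative stand-in, not Hevo code:

```python
def parse_enum_value(field):
    """Flatten a custom-field payload's enum_value into a scalar
    suitable for a Destination column. Returns None when no enum
    option is selected."""
    enum = field.get("enum_value")
    if enum is None:
        return None
    return enum.get("name")
```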

Documentation Updates

The following pages have been created, enhanced, or removed in Release 2.48:

Data Ingestion

Pipelines

Sources

Destinations

Last updated on May 04, 2026
