Release Version 2.10
The content on this site may have changed or moved since you last viewed it. As a result, some of your bookmarks may become obsolete. Therefore, we recommend accessing the latest content via the Hevo Docs website.
To see the list of features and integrations we are working on next, read our Upcoming Features page!
In this Release
New and Changed Features
Sources
- Faster Data Replication from a MongoDB Source

  Enhanced the replication process for MongoDB Pipelines by no longer identifying the data types of Events internally at Hevo, as MongoDB already handles this. This significantly reduces the time required to load Events to the Destination. This change applies to all Pipelines created from Release 2.10 onwards.
- Improved Handling of Missing Records in MongoDB

  Enhanced the MongoDB integration to ingest records even if their timestamp exceeds the current ingestion start time. This feature was earlier available only on request.
- Reduced Events Usage in Salesforce

  Introduced filtering of object columns to ingest only the data required for your Pipeline. This reduces the number of API calls required and the query timeout errors that may occur due to the high volume of data being ingested. This feature applies to all new and existing Pipelines. Contact Hevo Support with the list of columns that you want to ingest through your Pipeline.
- Simplified Setup for Facebook Ads

  Displayed the different custom report Field and Breakdown categories to select from on the Source configuration UI.

  Read Custom Reports.
Fixes and Improvements
Hevo API
- Updated the Rate Limits for Accessing Hevo Public APIs

  Hevo now allows you to make more than 100 API calls without worrying about the rate limit. While the default limit remains 100 requests per minute per user, you can contact Hevo Support to increase the limit as per your requirements.

  Read Rate Limits.
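If you stay on the default limit, you can respect it client-side before each API call. The sketch below is illustrative only and is not part of any Hevo client library; the `RateLimiter` class and its parameters are hypothetical, with the 100-requests-per-minute default taken from the note above.

```python
import time
from collections import deque


class RateLimiter:
    """Client-side sliding-window limiter (illustrative sketch).

    Defaults to Hevo's documented 100 requests per minute per user;
    pass a higher max_calls if Hevo Support has raised your limit.
    """

    def __init__(self, max_calls=100, window_seconds=60, clock=time.monotonic):
        self.max_calls = max_calls
        self.window = window_seconds
        self.clock = clock      # injectable for testing
        self.calls = deque()    # timestamps of recent requests

    def acquire(self):
        """Record a request if allowed; return seconds to wait otherwise (0.0 if allowed)."""
        now = self.clock()
        # Drop timestamps that have aged out of the window.
        while self.calls and now - self.calls[0] >= self.window:
            self.calls.popleft()
        if len(self.calls) < self.max_calls:
            self.calls.append(now)
            return 0.0
        # The oldest recorded call must age out before another request fits.
        return self.window - (now - self.calls[0])
```

Before each API request, call `acquire()` and sleep for the returned number of seconds if it is greater than zero.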
Documentation Updates
The following pages have been created, enhanced, or removed in Release 2.10:
- Data Ingestion
- Data Loading
- Destinations
- Introduction
- Pipelines
  - Pipeline FAQs
- Sources
  - PostgreSQL FAQs
  - REST API FAQs