Known issues and workarounds for Data Bridge
The following are the most critical known issues in the current release of NexJ Data Bridge, with workarounds where available.
DATABRIDGE-718
Using the "equals today" operator in filters on date attributes results in an error.
DATABRIDGE-811
When you create a new Contacts view, add the Full Name and Related Opportunities attributes, enable streaming, and save the view, and then create or delete an opportunity in NexJ CRM, Data Bridge publishes two update messages to Kafka (one for the related contact and one for the related user) instead of a single update message for the related contact.
Workaround:
Add a "Type equals Contact" filter to the view.
DATABRIDGE-822
When you create a view on the Activities subject area with streaming enabled in Data Bridge, create a task with a follow-up action item in NexJ CRM, and then delete the main task, Data Bridge should publish a "delete" event for the task. Currently, it publishes an "update" event instead.
DATABRIDGE-974
When the object count is refreshed in the Preview tab for a Data Bridge view that contains associated objects (for example, Addresses for a Company), and the Changes (primary or associated fields) option is selected in the Publishing Options tab, the displayed count may be higher than the actual number of objects that would be published. This defect affects only the preview functionality, not the accuracy of the published data.
DATABRIDGE-1150
When a value picker filter is added for an association attribute (for example, Contacts > Tier) and then edited more than once, the filter becomes uneditable. To continue editing the filter, save the view and then edit it again.
DATABRIDGE-1565
In some cases, if an error occurs during event streaming of NexJ CRM updates to a Kafka topic, and the update was triggered by an association attribute included in the view, the "Retry failed updates" function may not work as expected.
DATABRIDGE-1628
Updates to some properties of a data source (for example, Channel Name or Topic Name) do not take effect after the data source has been enabled to receive streaming updates through a Generic View.
Workaround:
Remove the data source and then re-add it to the view configuration.
DATABRIDGE-1630
In the Create/Edit Data Source dialog, if you save a data source configuration in which the channel name refers to a channel that does not exist, an unexpected error message is displayed.
DATABRIDGE-1965
If a Generic Publishing Process is configured to import updates from a Kafka topic and the Data Bridge Kafka Consumer is later restarted, the streaming of inbound updates may be impacted.
Workaround:
Create a new consumer group: first, edit the Generic Publishing Process to disable streaming and save your changes; then, edit the process again and re-enable streaming.
DATABRIDGE-1996
When a snapshot that contains currency or percentage fields is published to a Kafka Avro or Kafka JSON publishing target, these field values are exported as string values.
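As an illustrative sketch only (the payload structure and field names below are assumptions, not an actual Data Bridge message), currency and percentage values affected by this issue appear quoted as strings in the published message rather than as numbers:

```json
{
  "opportunityId": "OPP-1001",
  "dealValue": "1250000.00",
  "winProbability": "0.65"
}
```

Consumers that expect numeric types for these fields may need to parse the string values after import.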
DATABRIDGE-1998
Imported Avro schemas that contain attributes with union types (for example, "type": ["null", "string"]) cannot be used in Transformations.
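For reference, a union-typed field in an Avro schema looks like the following (the record and field names are illustrative, not taken from a Data Bridge schema); a schema containing such a field currently cannot be used in a Transformation:

```json
{
  "type": "record",
  "name": "Contact",
  "fields": [
    { "name": "id", "type": "string" },
    { "name": "middleName", "type": ["null", "string"], "default": null }
  ]
}
```

The ["null", "string"] union is the standard Avro pattern for optional (nullable) fields, so schemas generated by many upstream systems are likely to include it.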
DATABRIDGE-2009
The f:cast function fails to convert a literal string representation of a timestamp to a timestamp value.