

This is an IBM Automation portal for Integration products. To view all of your ideas submitted to IBM, create and manage groups of Ideas, or create an idea explicitly set to be either visible by all (public) or visible only to you and IBM (private), use the IBM Unified Ideas Portal (https://ideas.ibm.com).


Shape the future of IBM!

We invite you to shape the future of IBM, including product roadmaps, by submitting ideas that matter to you the most. Here's how it works:

Search existing ideas

Start by searching and reviewing ideas and requests to enhance a product or service. Take a look at ideas others have posted, and add a comment, vote, or subscribe to updates on them if they matter to you. If you can't find what you are looking for, post your own idea.

Post your ideas
  1. Post an idea.

  2. Get feedback from the IBM team and other customers to refine your idea.

  3. Follow the idea through the IBM Ideas process.


Specific links you will want to bookmark for future use

Welcome to the IBM Ideas Portal (https://www.ibm.com/ideas) - Use this site to find out additional information and details about the IBM Ideas process and statuses.

IBM Unified Ideas Portal (https://ideas.ibm.com) - Use this site to view all of your ideas, create new ideas for any IBM product, or search for ideas across all of IBM.

ideasibm@us.ibm.com - Use this email to suggest enhancements to the Ideas process or request help from IBM for submitting your Ideas.


Status Not under consideration
Workspace App Connect
Created by Guest
Created on May 22, 2024

Improved Kafka Avro support

Our specific use case relates to facilitating interaction with serialised Avro on Confluent, but any kind of improvement in that area would be welcome. At the moment, almost everything we do requires custom logic in Java compute nodes, turning ACE into a dumb wrapper for Java code, which is less than desirable.

What we would like to see:

  1. Support for Avro schemas in a similar vein to what is available for XSD. Having a direct link to the schema registry would be nice, but honestly, it is less important.

  2. Based on those schemas, conversion from/to either a new Avro parser or just the regular JSON one, and the ability to link a topic to the current schema so the runtime knows how to serialise/deserialise the message tree. Some support for magic bytes might be required here, but seeing as different vendors have different solutions for them, we would settle for a setting that denotes how many bytes have to be trimmed from the front.

  3. More fine-grained control over the commit when acting as a consumer. The node currently commits on propagation, so you have to manually build in transactionality through other means.

  4. On the schema lifecycle, we don't expect a fully managed solution. Honestly, it can already be worked around with a byte-stream reader that interprets the magic bytes and routes accordingly, but a slightly more out-of-the-box way of working might be useful. For example, if the consumer node could propagate some information about the magic bytes in the local environment or message properties, that would make life easier.

  5. With XSD, generation of Java classes with JAXB is provided out of the box and the runtime comes with Jackson, so something similar for Avro would be greatly appreciated.

We assume most of these are already on the radar, but regardless, these are some of the features we could really use.
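For context on the magic-bytes point: Confluent's wire format prefixes each Avro-encoded record value with a 1-byte magic byte (0x00) followed by a 4-byte big-endian schema registry ID, with the raw Avro binary payload after that. The sketch below shows the kind of trimming logic currently hand-written in a Java compute node; the class and method names are illustrative, not part of any ACE or Confluent API.

```java
import java.nio.ByteBuffer;
import java.util.Arrays;

// Minimal sketch, assuming the Confluent wire format:
//   byte 0      : magic byte, always 0x00
//   bytes 1..4  : schema registry ID, big-endian int
//   bytes 5..   : Avro binary payload
// Names here are hypothetical, for illustration only.
public class ConfluentFraming {
    public static final int HEADER_LENGTH = 5; // magic byte + schema ID

    // Returns the schema registry ID from the header, or throws
    // if the record does not start with the expected magic byte.
    public static int schemaId(byte[] record) {
        if (record.length < HEADER_LENGTH || record[0] != 0x00) {
            throw new IllegalArgumentException("Not Confluent-framed Avro");
        }
        return ByteBuffer.wrap(record, 1, 4).getInt();
    }

    // Returns the raw Avro payload with the 5-byte header trimmed off,
    // ready to hand to an Avro decoder.
    public static byte[] avroPayload(byte[] record) {
        schemaId(record); // validate the framing first
        return Arrays.copyOfRange(record, HEADER_LENGTH, record.length);
    }
}
```

A configurable "bytes to trim" setting, as suggested above, would generalise this to non-Confluent registries whose framing differs only in header length.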

Idea priority Medium
  • Admin
    Ben Thompson
    Nov 19, 2024

    Thank you for taking the time to raise this idea. We agree with the concept. Our current thinking is that the JSON logical tree structure within the message flow will be sufficient to give the flow developer a structure to interact with, and that Avro serialisation/deserialisation can then be handled as a step exposed through the Kafka node properties and Kafka policy. We will track this through idea number 663 (which pre-existed and carries a larger number of votes), so we are closing this one as a duplicate. Meanwhile, the status of idea 663 has moved to Planned for Future Release.