Batching Webhooks

Wondering if a batching operation for webhooks came up in discovery while you were designing the feature. It’s not uncommon for an event service to get logjammed with multiple events of the same topology, e.g., a user goes to a page, selects 100 tasks, and changes their status. You would expect all of those webhooks to fire at roughly the same time. A batching attribute on the webhook configuration would certainly clean up the event log. Another bonus of batch attributes and operations would be simplifying serverless integrations and lowering overall cold start overhead. I’d gladly wait a few extra seconds for my webhook to process a batch update on 100 tasks rather than pay the cold start cost of 100 separate Lambda invocations. Complexity also becomes an issue: when you flood Lambda with that many requests, you will probably need provisioned concurrency and an SQS queue to absorb the burst. At that point the democratization of serverless is lost, and the time, effort, and cost to implement skyrocket. Certainly, I can see scenarios where the jobs would need to stay separate because of Lambda compute time limits. I may have jumped ahead of the curve with this discussion, but I’m hoping you can catch up with me. This could be the kind of feature that makes webhooks and serverless with SG more accessible to the masses.
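
To put the cold start argument in concrete terms, here is a rough sketch of a single Lambda invocation handling one hypothetical batched payload instead of 100 separate invocations. The `events` array shape and the `update_task_status` helper are made up for illustration; SG webhooks don’t deliver batches today.

```python
import json

def update_task_status(task_id, new_status):
    # Placeholder for whatever a single event currently triggers
    # (database write, downstream API call, cache refresh, etc.).
    print(f"Task {task_id} -> {new_status}")

def handler(event, context):
    """AWS Lambda entry point for a hypothetical batched webhook delivery.

    One cold start is paid for the whole batch instead of one per event.
    """
    body = json.loads(event["body"])        # API Gateway proxy integration
    events = body.get("events", [])         # assumed batched payload shape
    for evt in events:
        update_task_status(evt["entity_id"], evt["new_value"])
    return {"statusCode": 200,
            "body": json.dumps({"processed": len(events)})}
```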


Hi Romey,

That would be very useful. I will ask our experts if there is anything they can share.

Loney


Thinking about this a bit more. If SG is unable to introduce batching operations, maybe I can configure our AWS webhook broker with a Step Function to gather all the requests and process them in bulk rather than one at a time. However, my concern is that I use the session_uuid to update the website when changes happen. I fear that if I batch process all the events, I would lose the ability to auto-update the page. Thoughts?
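
Sketching it out, the batching step doesn’t necessarily have to throw away session_uuid: if my broker groups events itself, it can carry the set of contributing sessions alongside each bulk operation. This is only a rough sketch under my own assumptions about the event fields, not anything SG provides:

```python
from collections import defaultdict

def aggregate(events):
    """Group single webhook events for bulk processing while keeping every
    contributing session_uuid, so per-session page refreshes stay possible.

    `events` is assumed to be a list of dicts with entity_type, entity_id,
    attribute_name, new_value, and session_uuid fields (hypothetical names).
    """
    batches = defaultdict(list)   # (type, field, value) -> entity ids
    sessions = defaultdict(set)   # (type, field, value) -> session uuids
    for evt in events:
        key = (evt["entity_type"], evt["attribute_name"], evt["new_value"])
        batches[key].append(evt["entity_id"])
        sessions[key].add(evt["session_uuid"])
    return batches, sessions

# Usage sketch: one bulk update per group, then notify every session that
# contributed an event to that group (bulk_update / notify_sessions are
# placeholders for our own broker code).
# for key, entity_ids in batches.items():
#     bulk_update(key, entity_ids)
#     notify_sessions(sessions[key])
```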

Hi Romey,

Wanted to say thanks for requesting this, and for following up with more detail recently.
We’re still in the vetting phase, but we’ve made some promising progress on offering batched event payloads.
A quick outline of some current considerations:

  • Each webhook would have an additional configuration checkbox to “Allow batched deliveries”.
  • Format for the batched delivery payload JSON structure, accounting for payload size limitations (a rough, non-final sketch of one possible shape follows this list).
  • Limit for events per delivery payload.
  • Possible increase to the HTTP request timeout (currently 6 seconds for all HTTP requests), depending on the number of events included.
  • Throttling algorithm accommodations for mix of batched & non-batched webhook configurations.
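
To give a rough idea of the kind of structure we’re exploring for that payload format, here is one possible shape, shown as a Python dict for readability; nothing here is final, and the field names are placeholders rather than a committed schema.

```python
# One possible (not final) shape for a batched delivery; field names are
# placeholders, not a committed schema.
batched_delivery = {
    "timestamp": "2021-05-04T17:22:41Z",
    "batch": True,
    "event_count": 2,
    "events": [
        {
            "event_type": "Shotgun_Task_Change",
            "entity_type": "Task",
            "entity_id": 1101,
            "attribute_name": "sg_status_list",
            "new_value": "ip",
            "session_uuid": "aaaaaaaa-bbbb-cccc-dddd-eeeeeeeeeeee",
        },
        {
            "event_type": "Shotgun_Task_Change",
            "entity_type": "Task",
            "entity_id": 1102,
            "attribute_name": "sg_status_list",
            "new_value": "ip",
            "session_uuid": "aaaaaaaa-bbbb-cccc-dddd-eeeeeeeeeeee",
        },
    ],
}
```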

During this phase, we’d welcome perspectives from anyone in the community, right here in this thread.

Hi Zoe,

Thanks for the follow-up. It felt like my request had gone into the abyss. I’d certainly like to test this first hand. I’m guessing the current functionality is not in a releasable state, but per your notes it sounds like it’s on track. It would be very helpful if you could share some example responses and/or payloads here so we can better visualize what the resulting payload would look like. Having the flexibility to set the size of the chunks would be great. Of note, we use a notification platform called Sentry.io for error detection and notifications from AWS, and it’s pretty awesome. It would be great if webhooks let us plug in our own notification service like Sentry, which seems particularly relevant here: what happens if we accidentally make the batch size or number of events too large? Would the failure be on SG’s side when it sends the batch, or on the AWS side?
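
To make the Sentry idea concrete, here is roughly the kind of guardrail I have in mind on our side, written as a simplified sketch rather than our actual code; the batch limit, payload shape, and process() helper are placeholders:

```python
import json
import sentry_sdk

sentry_sdk.init(dsn="https://examplePublicKey@o0.ingest.sentry.io/0")  # placeholder DSN

MAX_EVENTS_PER_BATCH = 100  # our own guardrail, not an SG limit

def process(events):
    # Placeholder for the real bulk update work.
    for evt in events:
        pass

def handler(event, context):
    try:
        body = json.loads(event["body"])
        events = body.get("events", [])
        if len(events) > MAX_EVENTS_PER_BATCH:
            raise ValueError(f"Batch of {len(events)} events exceeds our limit")
        process(events)
        return {"statusCode": 200, "body": "ok"}
    except Exception as exc:
        sentry_sdk.capture_exception(exc)  # surfaces oversized or malformed batches
        return {"statusCode": 500, "body": "error"}
```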

Thank you for sharing progress on the board. We are eager for this functionality and need it to assist with large event-based operations.

Romey
