Hello.
EventBridge offers a feature called EventBridge Scheduler for scheduled execution, and a feature called EventBridge Rules for starting processing when a specific event occurs.
For example, with EventBridge Scheduler you can invoke a Lambda function periodically to implement simple batch processing, and with EventBridge Rules you can configure a notification to be sent whenever an event matches a defined rule.
https://docs.aws.amazon.com/scheduler/latest/UserGuide/what-is-scheduler.html
https://docs.aws.amazon.com/eventbridge/latest/userguide/eb-rules.html
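As a minimal sketch of the Scheduler side, the periodic-Lambda setup boils down to a schedule expression plus a target. The helper below only builds the parameter dict you would pass to boto3's `scheduler.create_schedule`; the ARNs are placeholders, not real resources, and no AWS call is made:

```python
# Sketch: build the parameters for EventBridge Scheduler's create_schedule call.
# The Lambda and IAM role ARNs below are hypothetical placeholders.

def build_schedule_params(name: str, cron: str, lambda_arn: str, role_arn: str) -> dict:
    """Return a create_schedule parameter dict for a periodic Lambda invocation."""
    return {
        "Name": name,
        "ScheduleExpression": f"cron({cron})",  # e.g. "0 2 * * ? *" = 02:00 UTC daily
        "FlexibleTimeWindow": {"Mode": "OFF"},  # fire at the exact scheduled time
        "Target": {
            "Arn": lambda_arn,
            "RoleArn": role_arn,  # role Scheduler assumes to invoke the target
        },
    }

params = build_schedule_params(
    "nightly-batch",
    "0 2 * * ? *",
    "arn:aws:lambda:us-east-1:123456789012:function:nightly-batch",
    "arn:aws:iam::123456789012:role/scheduler-invoke-role",
)
# A real deployment would then call: boto3.client("scheduler").create_schedule(**params)
```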
Kinesis Data Firehose is often used for things like ingesting logs; I often use it to deliver logs from CloudWatch Logs to S3.
Since Kinesis Data Firehose itself supports simple transformation processing, it is also possible to convert the incoming data before saving it to S3.
https://docs.aws.amazon.com/firehose/latest/dev/record-format-conversion.html
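Beyond the built-in record-format conversion, Firehose can also invoke a transformation Lambda on each batch of records. A minimal sketch of such a handler follows Firehose's records/recordId/data contract (base64 in, base64 out); the uppercase transform is purely illustrative:

```python
import base64

def handler(event, context):
    """Firehose data-transformation Lambda: decode, transform, re-encode each record."""
    output = []
    for record in event["records"]:
        payload = base64.b64decode(record["data"]).decode("utf-8")
        transformed = payload.upper()  # illustrative transform only
        output.append({
            "recordId": record["recordId"],        # must echo the incoming recordId
            "result": "Ok",                        # or "Dropped" / "ProcessingFailed"
            "data": base64.b64encode(transformed.encode("utf-8")).decode("utf-8"),
        })
    return {"records": output}
```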
KDF: used to easily capture, transform, and load streaming data into S3, OpenSearch Service, and Splunk. Amazon Kinesis Data Firehose only supports Amazon S3, Amazon Redshift, Amazon OpenSearch Service, and HTTP endpoints as destinations.
EventBridge: Schedule — cron jobs (scheduled scripts); event rules — react to a service doing something. A rule runs when an event matches the defined event pattern, and EventBridge sends the event to the specified target.
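To make the rule idea concrete, here is a toy matcher covering only the simplest form of EventBridge event patterns, where each pattern field lists acceptable values (the richer operators such as prefix, numeric, and anything-but are omitted):

```python
def matches(pattern: dict, event: dict) -> bool:
    """Simplified EventBridge-style matching: every pattern key must be present
    in the event, and the event's value must appear in the pattern's list of
    candidates. Nested dicts are matched recursively."""
    for key, allowed in pattern.items():
        if key not in event:
            return False
        if isinstance(allowed, dict):
            if not isinstance(event[key], dict) or not matches(allowed, event[key]):
                return False
        elif event[key] not in allowed:
            return False
    return True

pattern = {"source": ["aws.ec2"], "detail": {"state": ["terminated"]}}
event = {"source": "aws.ec2", "detail": {"state": "terminated"}}
# matches(pattern, event) -> True
```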
I believe it's useful to look at what the services offer to choose between them.
Amazon Data Firehose allows you to send real-time data to destinations such as Amazon S3, Amazon Redshift, HTTP endpoints including ISV solutions, and other services. The service also provides some functionality to transform and partition data inline using the JQ engine, or with a custom Lambda function for other data types and formats. The data stream can come from Kinesis Data Streams, AWS IoT, Amazon MSK, and some others.
Amazon EventBridge at its core connects (event-driven) applications together by providing an event bus for messages. This includes features to route events based on data fields to many destinations (called targets) using rules. EventBridge also includes Pipes, which take events from one source to one target (like Data Firehose) with more advanced transformations. Pipes are most commonly used to enrich incoming data before sending it to a bus to be delivered to many targets. EventBridge sources can be many AWS services that emit events (creations, deletions, updates, etc.), custom applications, or SaaS applications using the SDK. The targets can also be other applications, AWS services, and more. There are many more features to EventBridge that I haven't covered, like schema discovery, registries, and schedulers, but this should give you a basic outline.
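As a small illustration of the enrichment step in a Pipe: when the enrichment is a Lambda function, it receives the batch of events as a list and returns the list that will be forwarded to the target. The added field here is purely illustrative:

```python
def enrich(events, context=None):
    """EventBridge Pipes enrichment Lambda sketch: receives a list of events,
    returns the (possibly modified) list forwarded on to the Pipe's target."""
    enriched = []
    for e in events:
        e = dict(e)  # copy so the input batch is not mutated
        e["enrichedBy"] = "pipes-enrichment-demo"  # illustrative added field
        enriched.append(e)
    return enriched
```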
Kinesis is better suited to streaming and processing large amounts of data (think click-streams, logs, and similar), whereas EventBridge focuses on integrating applications and routing events. This re:Invent talk will give you some patterns.