What is event-driven architecture?
Let’s say you fancy some pizza. You call up your local pizzeria and give them your order for your favourite quattro stagioni. Unfortunately they don’t have artichokes, but you can live with that and place the order anyway – but there are 20 pizza orders ahead of yours, and just one delivery bike. About an hour later you get your stone-cold pizza. Disastro!
Let’s fix this with event-driven architecture.
To get things moving faster, we’ll replace the slow delivery bike with an infinite stream of Formula 1 racing cars. The racing cars deliver near instantly, but best of all they also deliver artichokes to the pizzeria. So when you make your order, they get a resupply of fresh artichokes at the same time, and you get all quattro stagioni on a piping hot pizza. And all the other customers get their deliveries at the same time. Everybody’s happy.
How does it work, and where is it useful?
Event-driven architecture breaks everything down into separate microservices that create constant streams of data events (orders) that can be received and processed (delivered) much faster. This model is perfect for modern data applications because services can plug into the stream and access whatever information they need asynchronously, whenever they need it.
EDA is commonly used in industries such as retail and eCommerce, gaming, social media, IoT, and many others – anywhere where a flexible, responsive, scalable and powerful architecture will be of benefit. Smart, successful businesses everywhere are turning to event-driven architecture to build systems that can handle the high demands of modern data, and are ready for future expansion and growth.
Building blocks of event-driven architecture
Events, at their core, are simply records of things that happen, collected into particular topics. Examples might be an online order, a user ID creation, an action in a video game, a shipping update, or an IoT sensor reading. When an event occurs it is emitted: a producer creates a record of it and sends it to the stream, where it’s logged by an event broker (e.g., Aiven for Apache Kafka).
Events might (or might not) be processed in the stream, before they are picked up by interested consumers.
The producer of an event has no knowledge of potential consumers; events are decoupled and asynchronous – they are produced and logged irrespective of whether or not consumers are interested in them.
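As an illustration, the decoupling described above can be modelled in a few lines of Python. This is a minimal in-process sketch, not a real broker: producers simply append events to a per-topic log, and each consumer reads independently at its own pace by tracking its own offset – broadly how a broker like Apache Kafka behaves.

```python
from collections import defaultdict

class EventBroker:
    """Toy event broker: an append-only log per topic (a stand-in for Kafka)."""
    def __init__(self):
        self.topics = defaultdict(list)

    def produce(self, topic, event):
        # Producers just append; they know nothing about consumers.
        self.topics[topic].append(event)

class Consumer:
    """Each consumer keeps its own offset, so it reads independently."""
    def __init__(self, broker, topic):
        self.broker, self.topic, self.offset = broker, topic, 0

    def poll(self):
        events = self.broker.topics[self.topic][self.offset:]
        self.offset += len(events)
        return events

broker = EventBroker()
broker.produce("orders", {"order_id": 1, "pizza": "quattro stagioni"})

kitchen = Consumer(broker, "orders")
billing = Consumer(broker, "orders")
# Both consumers see the same event, whenever they choose to read it.
print(kitchen.poll())   # [{'order_id': 1, 'pizza': 'quattro stagioni'}]
print(billing.poll())   # [{'order_id': 1, 'pizza': 'quattro stagioni'}]
```

The producer never calls a consumer directly: the event is logged whether anyone reads it or not, and new consumers can join later and still see it.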
Why event-driven architecture matters
Building better systems by breaking things apart
Event-driven architecture isn’t just for streaming data. It brings added efficiency, flexibility, and security to your systems, making them more maintainable, scalable, and robust.
Monoliths vs. microservices
In traditional monolithic architecture, one database might handle everything, meaning a small failure can bring the entire system down.
With event-driven architecture, instead of one massive database, everything is broken into microservices – smaller independent pieces that communicate with each other and pass information around as needed.
By decoupling the producers and consumers of events, events are always available in the stream for any microservices that need them.
Real-time event stream processing
Event-driven architecture is perfect for processing and analytics.
Because events are produced as a stream, multiple different consumers can quickly access that information and analyse it, process it, and react to it in real time – or whenever they need to. Consumers can also produce processed data events for consumers further downstream.
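To illustrate the last point, here is a hedged sketch of a processing stage: it consumes raw events and emits derived events for downstream consumers. The event shapes and the `large_order` flag are invented for the example.

```python
# Hypothetical raw order events, as they might appear on a stream.
raw_orders = [
    {"order_id": 1, "total": 24.50},
    {"order_id": 2, "total": 9.90},
]

def enrich(event):
    """A processing stage: consume a raw event, emit a derived one downstream."""
    return {**event, "large_order": event["total"] > 20}

# Each processed event becomes a new event for downstream consumers,
# while the original raw events remain untouched in the stream.
processed = [enrich(e) for e in raw_orders]
print(processed[0]["large_order"])  # True
```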
It’s also possible to turn static data from more traditional databases into streamed data using Kafka connectors or Change Data Capture (CDC) – so you can get the best of both worlds.
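For example, a CDC source connector for Kafka Connect (here Debezium for PostgreSQL) is typically configured with a JSON payload posted to the Connect REST API. The sketch below expresses that payload as a Python dict; all hostnames, credentials, and table names are placeholders, and exact property names vary by connector version.

```python
# A Debezium PostgreSQL source connector definition for Kafka Connect,
# expressed as the JSON payload you would POST to the Connect REST API.
# Everything below is a placeholder example, not a working deployment --
# check the Debezium documentation for your connector version.
connector_config = {
    "name": "orders-cdc-source",
    "config": {
        "connector.class": "io.debezium.connector.postgresql.PostgresConnector",
        "database.hostname": "db.example.com",
        "database.port": "5432",
        "database.user": "replicator",
        "database.password": "<secret>",
        "database.dbname": "shop",
        "topic.prefix": "shop",               # change events land on topics like shop.public.orders
        "table.include.list": "public.orders",
    },
}
```

Once the connector is running, every insert, update, and delete in the watched tables becomes an event on the stream, just like natively produced events.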
Scalability, efficiency and flexibility
The distributed nature of event-driven architecture helps you build elastic, scalable applications and microservices using your favorite languages and CLIs.
Asynchronicity lets services get on with their work no matter what’s happening elsewhere in the system. During high loads, you can add parallel consumers to take up the strain.
The event stream is the beating heart – the single source of truth for your microservices and applications; a solid, reliable foundation for whatever you build on top.
When JobCloud decided to update their legacy systems to event-driven architecture and Apache Kafka, there was only one choice.
“We’re really, really happy with the approach that we took, and that we chose Aiven as the provider.”
– Lead Software Developer at JobCloud
Read the full case study
Aiven for Event-Driven Architecture – open source tools for better applications
Aiven offers a range of open source tools that you can combine to create a stable, secure, and responsive event-driven architecture for any application. Our fully managed services let you build flexible, scalable microservices that integrate easily, using familiar toolsets.
Scalability – increase (or decrease) your storage, switch providers and regions, and add services at will – and only pay for what you use.
Hassle-free infrastructure – offload your infrastructure concerns to Aiven and focus on implementing your EDA and developing apps.
Familiar toolset – build your EDA solution using the open source tools and languages you already use.
Secure and stable – with 99.99% uptime, full compliance, and backups built in, your systems – and your users’ data – are safe.
Building event-driven architecture – with Aiven
Aiven’s full range of open source tools and services help you build your ideal event-driven architecture. By putting Aiven for Apache Kafka at the core, you can implement a real-time, event streaming microservice platform that’s familiar, flexible, and future-proof.
Apache Kafka forms the beating heart of your event stream, with integrated features such as Kafka REST, Schema Registry, ACLs, and Terraform and Kubernetes support.
Event sourcing / sink connections
Apache Kafka Connect
With over 20 open source connectors for Apache Kafka Connect, you can bring data into Kafka from numerous popular DBs (including via CDC), or deliver it to OpenSearch and many other sinks.
Apache Kafka MirrorMaker 2
Easily replicate data from one cluster to another, adding a layer of resilience, flexibility, performance and reliability to your event-driven architecture.
Store, search and analyse
- Aiven for M3
- Aiven for Apache Cassandra
- Aiven for OpenSearch
- Aiven for PostgreSQL
- Aiven for MySQL
- Aiven for Redis
- Aiven for InfluxDB
Our supported connectors allow you to deploy PostgreSQL, MySQL or Cassandra as sources, and OpenSearch, Cassandra, InfluxDB and Redis as sinks. You can also connect to OpenSearch Dashboards to visualize your data.
Aiven services for microservices
The beauty of building your event-driven architecture with Aiven is that all your favorite services are already included. It’s easy to integrate everything with your existing systems, and you can even bring your own data along for the ride.
Event-driven architecture for retail
Retail and eCommerce are examples of industries where multiple microservices interact collaboratively as part of a single business workflow.
Some services (orders) create events, and others (payments, shipping) consume them. But the consumers can also process data and create a new event for handling down the line (e.g. sales recommendations).
Asynchronicity ensures that no single service controls the entire process; events can be handled as and when needed by independent services, resulting in a faster, more resilient architecture.
Events can be associated with each other through the use of topics – for example, orders, payments, or shipping. In this way it might be possible to combine multiple orders or shipments, and reduce costs.
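As a hedged sketch of that association, the snippet below groups hypothetical events from the `orders`, `payments`, and `shipping` topics by a shared `order_id` key – the kind of correlation that makes combining shipments possible. All event shapes here are invented for illustration.

```python
from collections import defaultdict

# Hypothetical events from three topics, keyed by order_id so that
# related events can be associated with each other.
events = [
    ("orders",   {"order_id": 1, "item": "pizza"}),
    ("payments", {"order_id": 1, "amount": 12.0}),
    ("shipping", {"order_id": 1, "carrier": "bike"}),
    ("orders",   {"order_id": 2, "item": "pasta"}),
]

# Group events by key across topics -- e.g. to combine multiple
# orders or shipments for the same customer and reduce costs.
by_order = defaultdict(dict)
for topic, event in events:
    by_order[event["order_id"]][topic] = event

print(sorted(by_order[1]))  # ['orders', 'payments', 'shipping']
```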
Maintenance or updating of services is simplified because individual microservices can be updated or modified independently, without taking the entire system offline.
Fraudulent transaction detection and alerting
The power of event-driven architecture lies in event processing – the ability to pick data from the stream on the fly, process it, store it for later, or deliver real-time analytics. These processes are essential for the real-time detection of fraudulent activities.
Aiven for Apache Kafka forms the core of the system, creating an event stream of all customer banking transactions.
Kafka Streams applications grab customer details from PostgreSQL, and identify potential fraudulent activities in real time.
OpenSearch stores fraudulent transactions, M3 creates a transaction history, and notification apps alert customers at risk.
OpenSearch Dashboards is used to visualize fraudulent transactions and help identify patterns for further investigation.
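The detection step can be sketched in plain Python. This is an illustrative analogue of the Kafka Streams stage described above, not actual Kafka Streams code (which runs on the JVM): each transaction from the stream is joined with the customer’s profile, and transactions outside that profile are flagged. All data and rules here are invented.

```python
# A plain-Python analogue of the Kafka Streams step described above:
# join each transaction with customer details, then flag transactions
# that fall outside that customer's profile. All data here is invented.
customers = {  # stand-in for customer details held in PostgreSQL
    "alice": {"home_country": "FI", "typical_max": 500.0},
}

transactions = [  # stand-in for the Kafka transaction stream
    {"customer": "alice", "amount": 40.0,   "country": "FI"},
    {"customer": "alice", "amount": 9000.0, "country": "XX"},
]

def is_suspicious(tx, profile):
    # Flag unusually large amounts or transactions from unexpected countries.
    return tx["amount"] > profile["typical_max"] or tx["country"] != profile["home_country"]

# These flagged events would be stored (OpenSearch) and trigger alerts.
alerts = [tx for tx in transactions if is_suspicious(tx, customers[tx["customer"]])]
print(len(alerts))  # 1
```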
Bring your own account
We’ve made it easy for customers using AWS, GCP or Azure to bring their existing accounts over to Aiven.
Integrations with Kafka, PostgreSQL, MySQL, OpenSearch and many others for full open source flexibility
High availability included as part of the open source feature set – perfect for EDA applications
Comprehensive selection of major cloud providers and regions – ready for multi-cloud deployments
Like everything Aiven does, open source is the beating heart. No vendor lock-in.
End-to-end encryption, dedicated VMs, full compliance certifications
Add servers or storage, or migrate to a different provider, at the push of a button with zero downtime
Keep an eye on your systems with next level observability solutions using Aiven’s services or your own tools
World class support
Customers love our expert 24/7 support, available 365 days a year. We’re there for you when you need us