Event-Driven Architecture in Manufacturing Quality Control
Event-driven software architecture has long been used in software development, particularly in finance, where processing real-time information from multiple sources is critical for gaining a competitive advantage or preventing fraud. The manufacturing industry still relies primarily on monolithic software architecture; however, it is rapidly catching up.
This article discusses how event-driven architecture (EDA) can address the issues manufacturers face today with measurement data collection, and how to prepare for a tenfold increase in data in the coming years.
What is Event-Driven Architecture and How Does It Differ from Monolithic Architecture?
Event-driven architecture is a way of designing software systems in which components communicate through events rather than through direct function calls or requests.
In other words, system components either announce that an event has happened (like a button click) or listen for those events and perform an action when one arrives.
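The idea can be sketched with a minimal in-process event bus. The class and method names below are illustrative, not from any specific library; in a real system the bus would be a message broker such as Kafka.

```python
from collections import defaultdict

class EventBus:
    """Toy event bus: components publish events; subscribers react.

    No component ever calls another directly -- they only know
    about event types, which keeps them loosely coupled.
    """

    def __init__(self):
        self._subscribers = defaultdict(list)

    def subscribe(self, event_type, handler):
        self._subscribers[event_type].append(handler)

    def publish(self, event_type, payload):
        for handler in self._subscribers[event_type]:
            handler(payload)

# One component announces an event; another reacts to it.
bus = EventBus()
received = []
bus.subscribe("button.clicked", lambda payload: received.append(payload))
bus.publish("button.clicked", {"button_id": "save"})
```

Note that the publisher has no reference to the subscriber: swapping the subscriber out, or adding a second one, requires no change to the publishing component.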
In contrast, monolithic software architecture typically follows a request-response pattern, where one component makes a request to another component to perform a specific action, resulting in a tightly coupled relationship that makes it difficult to scale and modify the system.
How Does Event-Driven Architecture Improve Scalability?
In monolithic architecture, all components are tightly integrated, making it difficult to scale specific parts of the system as data volume increases. With event-driven architecture, adding more “horsepower” is as simple as adding more worker services that listen for events and scale independently.
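The effect of adding workers can be illustrated with a simplified sketch (not the Kafka API itself): events are spread across a configurable number of workers, roughly the way a Kafka consumer group spreads partitions across its members. Producers are untouched when the worker count changes.

```python
def assign(events, num_workers):
    """Round-robin events across workers; each worker gets its own share.

    This is a simplification of how a consumer group balances
    partitions -- the point is that throughput scales by adding
    workers, without changing the producing side.
    """
    buckets = [[] for _ in range(num_workers)]
    for i, event in enumerate(events):
        buckets[i % num_workers].append(event)
    return buckets

events = [f"measurement-{i}" for i in range(10)]
two_workers = assign(events, 2)   # each of 2 workers handles 5 events
five_workers = assign(events, 5)  # each of 5 workers handles 2 events
```

Doubling the workers halves each worker's load; the rest of the system does not need to know or care how many workers are running.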
What Tools Can Help Make a System More Event-Driven?
An opinionated choice: Apache Kafka. It is a free, open-source event streaming platform that acts as a messaging hub for transmitting and receiving messages between microservices. It has become very popular in recent years, and many Fortune 500 companies rely on it.
Check out this video by one of the co-creators of Apache Kafka:
A Real-World Example of Event-Driven Architecture in Manufacturing Quality Control
Consider a manufacturing company that uses metrology software to perform quality control measurements. The measurements are regularly exported to an XML file format. A microservice monitors the location where these XML files are stored; when a new file arrives, the microservice processes it and sends the measurement data to Apache Kafka. This microservice acts as a producer of events. Another microservice, acting as a consumer of events, listens to the events produced by the first microservice, processes the measurement data, and saves it in a database for further analysis.
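The producer's core job can be sketched as turning a measurement file into event payloads. The XML layout below is hypothetical (the real GOM Inspect export schema differs); a real service would send each JSON payload to Kafka, for example with a Kafka client library's producer, rather than collecting them in a list.

```python
import json
import xml.etree.ElementTree as ET

# Hypothetical measurement export -- element and attribute names
# are illustrative, not the actual GOM Inspect schema.
SAMPLE_XML = """
<measurements part="bracket-42">
  <measurement name="hole_diameter" nominal="10.0" actual="10.02"/>
  <measurement name="flatness" nominal="0.0" actual="0.03"/>
</measurements>
"""

def xml_to_events(xml_text):
    """Parse one export file into a list of per-measurement events."""
    root = ET.fromstring(xml_text)
    part = root.get("part")
    events = []
    for m in root.findall("measurement"):
        events.append({
            "part": part,
            "name": m.get("name"),
            "nominal": float(m.get("nominal")),
            "actual": float(m.get("actual")),
        })
    return events

# In the real producer, each payload would be published to a Kafka
# topic instead of appended to a list.
payloads = [json.dumps(e) for e in xml_to_events(SAMPLE_XML)]
```

Emitting one event per measurement, rather than one per file, lets downstream consumers process, aggregate, or store individual measurements independently.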
As the number of measurement devices increases, the system can easily scale to handle the increased data volume by adding more instances of the microservice that processes the XML files. If the database experiences a high volume of writes, another microservice can be added to perform calculations and caching.
Each microservice operates independently, so adding more instances does not affect the performance or scalability of the other microservices in the system.
Are There Any Microservices I Can Use to Get Started?
Lucky you! We have developed open-source microservices to extract features, characteristics, and measurements from XML files generated by GOM Inspect. These include a Kafka producer that parses the XML and sends it to Kafka, and a consumer service that saves the data to the database. You can find these microservices on GitHub at https://github.com/KensoBI/gomxml-kafka.
In summary, event-driven architecture with Apache Kafka and microservices is an effective solution for collecting and managing quality control measurements. This approach offers a scalable, flexible, and real-time data processing solution, empowering businesses to effectively handle growth as the volume of measurements increases over time.