In 2006, SoapUI was developed with a singular goal: create a simple, open‑source SOAP API testing tool.
Since then, developers have contributed code and provided valuable feedback to help SmartBear transform SoapUI into ReadyAPI, the most powerful API testing platform on the market.
From our humble beginnings, we have been driven to turn ReadyAPI into a testing platform that supports every specification and protocol, so that no tester or developer gets left behind when it comes to delivering quality applications: from legacy SOAP services, to microservices powered by mainstream REST APIs, to cutting-edge IoT use cases leveraging MQTT. ReadyAPI will ensure quality, performance, and security in all of your APIs, regardless of type.
With that vision in mind, ReadyAPI is proud to expand our support of Event-Driven Architectures (EDA) with our launch of Apache Kafka testing, and a new protocol-agnostic testing experience. So, let’s dive into some of the reasons why we support EDA and what’s new in ReadyAPI.
Why Event-Driven Architecture
Traditionally, most systems operate in what you could think of as the data-centric model, where the data is the source of truth. The era of big data made us realize that while having lots of data is useful, systems tend to slow down as data accumulates. Over the past few years, there has been a movement from focusing on data at rest (service-oriented architecture) to focusing on events (event-driven architecture).
Organizations are now moving away from accumulating data in data lakes and toward focusing on data in flight, tracking it as it moves from place to place. The shift to event-driven architecture means moving from a data-centric model to an event-centric model. In the event-driven model, data is still important, but events become the most important component.
Events are the fundamental force that drives this ecosystem; at every level, an organization must act and react to internal and external events.
An event is a change of state in some key business system. For instance, somebody buys a product, someone checks in for a flight, or a bus arrives late. If you think about it, events are happening everywhere, constantly, no matter the industry.
The value of events is that a sequence of related events represents behavior (e.g., an item was added and then removed from a shopping cart, an error recurs every 24 hours, or users always click through a site in a particular order). A sequence of related events is commonly called a stream. Streams can come from an IoT device, a user visiting a website, changes to a database, or many other sources.
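To make the shopping-cart example concrete, here is a minimal sketch in plain Python that scans a stream of events for the added-then-removed pattern. The event schema and action names are illustrative assumptions, not from any particular system:

```python
# A stream is just an ordered sequence of events (hypothetical schema).
events = [
    {"user": "alice", "action": "add_to_cart", "item": "book"},
    {"user": "alice", "action": "remove_from_cart", "item": "book"},
    {"user": "bob", "action": "add_to_cart", "item": "pen"},
]

def added_then_removed(stream, user, item):
    """Return True if the user added the item and later removed it."""
    add_idx = remove_idx = None
    for i, event in enumerate(stream):
        if event["user"] == user and event["item"] == item:
            if event["action"] == "add_to_cart" and add_idx is None:
                add_idx = i
            elif event["action"] == "remove_from_cart":
                remove_idx = i
    return add_idx is not None and remove_idx is not None and add_idx < remove_idx

print(added_then_removed(events, "alice", "book"))  # alice added then removed the book
print(added_then_removed(events, "bob", "pen"))     # bob never removed the pen
```

The behavior (abandoning an item) only emerges from the *sequence* of events, which is exactly why streams, not individual records, are the unit of interest.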
With event-driven architecture, when an event notification is sent, the system captures the fact that a change in state has occurred and delivers it to whichever applications request it, whenever they request it. The application that receives the message can respond immediately or wait until it is ready to act.
Applications built around an event-driven architecture are more agile, scalable, contextual, and responsive, which is why this architectural approach has exploded in popularity.
Check out this event-driven architecture demo to learn more.
Why Apache Kafka
Now, to support this new streaming-data paradigm, additional technologies are needed. One of the most popular tools for working with streaming data is Apache Kafka, an open-source distributed event-streaming platform used by thousands of companies. Think high-performance data pipelines, streaming analytics, data integration, and mission-critical applications. Kafka lets you build real-time streaming applications that react to streams: performing real-time analytics, aggregating and joining real-time data flows, and handling complex event processing.
Kafka is popular because it is easy to set up and use, which is why many large companies that handle a lot of data rely on it. So how does it work?
When a client application sends a request to a RESTful, GraphQL, SOAP, or other synchronous API, it waits for a response. With asynchronous APIs, the client does not wait for a response, and the server may never send one. Such APIs implement the event-driven architecture.
A typical process looks like this:
- A publisher (or producer, in Kafka terms) posts an event (sends a request) to a broker. An event is a piece of data to transmit: it can be a JSON object containing information about a transaction, details of a newly created object in a database, and so on. To separate one type of event from another, a producer sends each event to one of the channels (or topics) defined within the broker.
- If another application needs events related to some topic, it subscribes to that channel. Once a publisher posts an event to the channel, the subscriber (or consumer) receives the event and can work with it as its business logic requires.
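The publish/subscribe flow above can be sketched with a toy in-memory broker. This is not Kafka itself (no partitions, persistence, or network layer), just a minimal illustration of topics, producers, and consumers; all names are illustrative:

```python
from collections import defaultdict

class ToyBroker:
    """Minimal in-memory stand-in for a broker with named topics."""

    def __init__(self):
        self.subscribers = defaultdict(list)  # topic -> list of consumer callbacks

    def subscribe(self, topic, callback):
        """A consumer registers interest in a topic."""
        self.subscribers[topic].append(callback)

    def publish(self, topic, event):
        """A producer posts an event; every subscriber to the topic receives it."""
        for callback in self.subscribers[topic]:
            callback(event)

broker = ToyBroker()
received = []
broker.subscribe("orders", received.append)               # consumer of "orders"
broker.publish("orders", {"order_id": 1, "total": 9.99})  # producer posts to "orders"
broker.publish("logins", {"user": "alice"})               # different topic, not delivered
print(received)  # only the "orders" event arrived
```

Note that the producer never learns who, if anyone, consumed the event; that decoupling is the point of the broker.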
Here’s an example: imagine you have a web store. When a user logs in, a client application posts a login event. When a user creates an order, the client application posts another event with the order details. After that, the client application is done with the event. At some later time, an internal application can get the order data from the broker and handle it, for example by storing it in a database or sending you a confirmation email.
So How Does ReadyAPI Support Apache Kafka Testing?
By making it easy! The current set of testing tools for Kafka Streams is cumbersome, complicated, and time-consuming. They require what can seem like endless coding. With ReadyAPI, you can create test assertions to verify your Kafka endpoints with an easy point-and-click interface.
All ReadyAPI assertions have been updated to support the addition of Assertion Groups. Assertion Groups enable users to add complex assertion logic without any scripting or code.
Add/Import Kafka APIs
Add your Kafka Brokers and Topics to ReadyAPI via our protocol-agnostic creation tool. Teams using AsyncAPI specifications to document their Kafka services can now add these services to ReadyAPI with a single click.
Create Kafka Tests
Our protocol-agnostic testing experience makes it easy to launch Kafka tests. Produce and consume Kafka messages, headers, partitions, and keys to a broker and topic of your choice. Populate your published message with dynamic data from databases, scripts, or other APIs. Validate your received messages and metadata against your suite of assertions.
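As a rough analogy to the assertion step above, the sketch below validates a consumed message’s metadata and payload in plain Python. The record shape, field names, and checks are assumptions for illustration only, not ReadyAPI’s internal representation:

```python
import json

# A consumed Kafka-style record (hypothetical shape).
record = {
    "topic": "orders",
    "key": "order-42",
    "headers": {"content-type": "application/json"},
    "value": json.dumps({"order_id": 42, "status": "created"}),
}

def validate_record(record):
    """Check metadata and payload, similar in spirit to a suite of assertions."""
    assert record["topic"] == "orders", "message arrived on the wrong topic"
    assert record["headers"]["content-type"] == "application/json"
    payload = json.loads(record["value"])          # parse the message body
    assert payload["status"] == "created"          # assert on a payload field
    return payload

payload = validate_record(record)
print(payload["order_id"])
```

In ReadyAPI the equivalent checks are configured through the point-and-click assertion UI rather than written as code.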
In a Phrase: Operational Simplicity
Kafka is open source and used by thousands of companies every day, with more on the way. With ReadyAPI, SmartBear remains committed to improving testing productivity and accelerating the delivery of high-quality APIs, regardless of type.
With the natural addition of Apache Kafka support, we continue our commitment to delivering the most robust protocol-agnostic API testing tool on the market.
To learn more, check out our Apache Kafka Testing Demonstration or visit our documentation page.