Spring REST Docs versus SpringFox Swagger for API documentation

Recently, I have come across some articles and mentions of Spring REST Docs, where it has been presented as a better alternative to traditional Swagger docs. Until now I had always used Swagger for building API documentation, so I decided to try Spring REST Docs. You can even find some references to Swagger on the main page of that Spring project (https://spring.io/projects/spring-restdocs), for example: “This approach frees you from the limitations of the documentation produced by tools like Swagger”. Are you interested in building API documentation using Spring REST Docs? Let’s take a closer look at that project!

The first difference in comparison to Swagger is a test-driven approach to generating API documentation. Thanks to that, Spring REST Docs ensures that the generated documentation always accurately matches the actual behavior of the API. When using the SpringFox Swagger library you just need to enable it for the project and provide some configuration to make it work according to your expectations. I have already described the usage of Swagger 2 for automatically building API documentation for Spring Boot based applications in two of my previous articles.

Those articles describe in detail how to use SpringFox Swagger in a Spring Boot application to automatically generate API documentation based on the source code. Here I’ll give you only a short introduction to that technology, so that the differences between using Swagger 2 and Spring REST Docs are easy to see.

1. Using Swagger2 with Spring Boot

To enable the SpringFox library for your application you need to include the following dependencies in pom.xml.

<dependency>
    <groupId>io.springfox</groupId>
    <artifactId>springfox-swagger2</artifactId>
    <version>2.9.2</version>
</dependency>
<dependency>
    <groupId>io.springfox</groupId>
    <artifactId>springfox-swagger-ui</artifactId>
    <version>2.9.2</version>
</dependency>

Then you should annotate the main or configuration class with @EnableSwagger2. You can also customize the behaviour of the SpringFox library by declaring a Docket bean.

@Bean
public Docket swaggerEmployeeApi() {
	return new Docket(DocumentationType.SWAGGER_2)
		.select()
			.apis(RequestHandlerSelectors.basePackage("pl.piomin.services.employee.controller"))
			.paths(PathSelectors.any())
		.build()
		.apiInfo(new ApiInfoBuilder().version("1.0").title("Employee API").description("Documentation Employee API v1.0").build());
}

Now, after running the application, the documentation is available under the /v2/api-docs context path. You can also display it in your web browser using the Swagger UI available at /swagger-ui.html.
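
For example, assuming the application runs locally on the default 8080 port, you can fetch the raw JSON specification with curl:

curl http://localhost:8080/v2/api-docs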

Looks easy, doesn’t it? Let’s see how to do the same with Spring REST Docs.

2. Using Asciidoctor with Spring Boot

There are some other differences between Spring REST Docs and SpringFox Swagger. By default, Spring REST Docs uses Asciidoctor, which processes plain text and produces HTML, styled and laid out to suit your needs. If you prefer, Spring REST Docs can also be configured to use Markdown. This really distinguishes it from Swagger, which uses its own notation called the OpenAPI Specification.
Spring REST Docs makes use of snippets produced by tests written with Spring MVC’s test framework, Spring WebFlux’s WebTestClient or REST Assured 3. I’ll show you an example based on Spring MVC.
I suggest you begin by creating a base Asciidoc file. It should be placed in the src/main/asciidoc directory of your application’s source code. I don’t know if you are familiar with the Asciidoctor notation, but it is really intuitive. The sample below shows two important things. First, we display the version of the project taken from pom.xml. Then we include the snippets generated during the JUnit tests by declaring a macro called operation, containing the document name and a list of snippets. We can choose from snippets such as curl-request, http-request, http-response, httpie-request, links, request-body, request-fields, response-body, response-fields or path-parameters. The document name is determined by the name of the test method in our JUnit test class.

= RESTful Employee API Specification
{project-version}
:doctype: book

== Add a new person

A `POST` request is used to add a new person

operation::add-person[snippets='http-request,request-fields,http-response']

== Find a person by id

A `GET` request is used to find a person by id

operation::find-person-by-id[snippets='http-request,path-parameters,http-response,response-fields']

The source code fragment with the Asciidoc notation above is just a template. We would like to generate an HTML file that prettily displays all our automatically generated stuff. To achieve that we should enable the asciidoctor-maven-plugin in the project’s pom.xml. In order to display the Maven project version we need to pass it to the Asciidoc plugin’s configuration attributes. We also need to add the spring-restdocs-asciidoctor dependency to that plugin.

<plugin>
	<groupId>org.asciidoctor</groupId>
	<artifactId>asciidoctor-maven-plugin</artifactId>
	<version>1.5.6</version>
	<executions>
		<execution>
			<id>generate-docs</id>
			<phase>prepare-package</phase>
			<goals>
				<goal>process-asciidoc</goal>
			</goals>
			<configuration>
				<backend>html</backend>
				<doctype>book</doctype>
				<attributes>
					<project-version>${project.version}</project-version>
				</attributes>
			</configuration>
		</execution>
	</executions>
	<dependencies>
		<dependency>
			<groupId>org.springframework.restdocs</groupId>
			<artifactId>spring-restdocs-asciidoctor</artifactId>
			<version>2.0.0.RELEASE</version>
		</dependency>
	</dependencies>
</plugin>

OK, the documentation is automatically generated during the Maven build from our api.adoc file located inside the src/main/asciidoc directory. But we still need to develop the JUnit API tests that generate the required snippets. Let’s do that in the next step.

3. Generating snippets for Spring MVC

First, we should enable Spring REST Docs for our project. To do that we have to include the following dependency.

<dependency>
	<groupId>org.springframework.restdocs</groupId>
	<artifactId>spring-restdocs-mockmvc</artifactId>
	<scope>test</scope>
</dependency>

Now, all we need to do is implement the JUnit tests. Spring Boot provides the @AutoConfigureRestDocs annotation, which allows you to leverage Spring REST Docs in your tests.
In fact, we need to prepare a standard Spring MVC test using the MockMvc bean. I also mocked some methods implemented by EmployeeRepository. Then I used some static methods provided by Spring REST Docs with support for generating documentation of request and response payloads. The first of those methods is document("{method-name}/", ...), which is responsible for generating snippets under the directory target/generated-snippets/{method-name}, where method-name is the name of the test method formatted in kebab-case. I have described all the JSON fields in the requests and responses using the requestFields(...) and responseFields(...) methods.

@RunWith(SpringRunner.class)
@WebMvcTest(EmployeeController.class)
@AutoConfigureRestDocs
public class EmployeeControllerTest {

	@MockBean
	EmployeeRepository repository;
	@Autowired
	MockMvc mockMvc;
	
	private ObjectMapper mapper = new ObjectMapper();

	@Before
	public void setUp() {
		Employee e = new Employee(1L, 1L, "John Smith", 33, "Developer");
		e.setId(1L);
		when(repository.add(Mockito.any(Employee.class))).thenReturn(e);
		when(repository.findById(1L)).thenReturn(e);
	}

	@Test
	public void addPerson() throws JsonProcessingException, Exception {
		Employee employee = new Employee(1L, 1L, "John Smith", 33, "Developer");
		mockMvc.perform(post("/").contentType(MediaType.APPLICATION_JSON).content(mapper.writeValueAsString(employee)))
			.andExpect(status().isOk())
			.andDo(document("{method-name}/", requestFields(
				fieldWithPath("id").description("Employee id").ignored(),
				fieldWithPath("organizationId").description("Employee's organization id"),
				fieldWithPath("departmentId").description("Employee's department id"),
				fieldWithPath("name").description("Employee's name"),
				fieldWithPath("age").description("Employee's age"),
				fieldWithPath("position").description("Employee's position inside organization")
			)));
	}
	
	@Test
	public void findPersonById() throws JsonProcessingException, Exception {
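		// note: get(...) below must be statically imported from RestDocumentationRequestBuilders
		// (not MockMvcRequestBuilders), otherwise pathParameters(...) cannot be documented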
		this.mockMvc.perform(get("/{id}", 1).accept(MediaType.APPLICATION_JSON))
			.andExpect(status().isOk())
			.andDo(document("{method-name}/", responseFields(
				fieldWithPath("id").description("Employee id"),
				fieldWithPath("organizationId").description("Employee's organization id"),
				fieldWithPath("departmentId").description("Employee's department id"),
				fieldWithPath("name").description("Employee's name"),
				fieldWithPath("age").description("Employee's age"),
				fieldWithPath("position").description("Employee's position inside organization")
			), pathParameters(parameterWithName("id").description("Employee id"))));
	}

}
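
For reference, the listing above relies on a number of static imports that are easy to get wrong. Assuming the standard Spring REST Docs, Spring MVC Test and Mockito packages, they look like this:

import static org.mockito.Mockito.when;
import static org.springframework.restdocs.mockmvc.MockMvcRestDocumentation.document;
import static org.springframework.restdocs.mockmvc.RestDocumentationRequestBuilders.get;
import static org.springframework.restdocs.mockmvc.RestDocumentationRequestBuilders.post;
import static org.springframework.restdocs.payload.PayloadDocumentation.fieldWithPath;
import static org.springframework.restdocs.payload.PayloadDocumentation.requestFields;
import static org.springframework.restdocs.payload.PayloadDocumentation.responseFields;
import static org.springframework.restdocs.request.RequestDocumentation.parameterWithName;
import static org.springframework.restdocs.request.RequestDocumentation.pathParameters;
import static org.springframework.test.web.servlet.result.MockMvcResultMatchers.status;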

If you would like to customize some settings of Spring REST Docs, you should provide a @TestConfiguration class inside the JUnit test class. In the following code fragment you can see an example of such customization. I overrode the default snippet output directory from index to a test-method-specific name, and forced pretty-printing of the sample requests and responses (every parameter on a separate line) using the prettyPrint option.

@TestConfiguration
static class CustomizationConfiguration implements RestDocsMockMvcConfigurationCustomizer {

	@Override
	public void customize(MockMvcRestDocumentationConfigurer configurer) {
		configurer.operationPreprocessors()
			.withRequestDefaults(prettyPrint())
			.withResponseDefaults(prettyPrint());
	}
	
	@Bean
	public RestDocumentationResultHandler restDocumentation() {
		return MockMvcRestDocumentation.document("{method-name}");
	}
}

Now, if you execute mvn clean install on your project, the generated snippets appear under the target/generated-snippets directory.
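A rough sketch of that structure for the two tests implemented above (each test method gets its own kebab-cased directory; default snippets such as curl-request.adoc are generated alongside the ones listed):

target/generated-snippets/
├── add-person/
│   ├── http-request.adoc
│   ├── request-fields.adoc
│   └── http-response.adoc
└── find-person-by-id/
    ├── http-request.adoc
    ├── path-parameters.adoc
    ├── response-fields.adoc
    └── http-response.adoc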

4. Viewing and publishing API docs

Once we have successfully built our project, the documentation has been generated. We can open the HTML file available at target/generated-docs/api.html, which provides the full documentation of our API.

You may also want to publish it inside your application’s fat JAR file. If you configure the maven-resources-plugin as in the example visible below, it will be available under the /static/docs directory inside the JAR.

<plugin>
	<artifactId>maven-resources-plugin</artifactId>
	<executions>
		<execution>
			<id>copy-resources</id>
			<phase>prepare-package</phase>
			<goals>
				<goal>copy-resources</goal>
			</goals>
			<configuration>
				<outputDirectory>
					${project.build.outputDirectory}/static/docs
				</outputDirectory>
				<resources>
					<resource>
						<directory>
							${project.build.directory}/generated-docs
						</directory>
					</resource>
				</resources>
			</configuration>
		</execution>
	</executions>
</plugin>
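
Once the fat JAR is running, Spring Boot serves the content of classpath:/static as static content, so the generated documentation should be reachable at a URL like the one below (host and port are assumptions):

curl http://localhost:8080/docs/api.html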

Conclusion

That’s all I wanted to show in this article. The sample service generating documentation using Spring REST Docs is available on GitHub in the repository https://github.com/piomin/sample-spring-microservices-new/tree/rest-api-docs/employee-service. I’m not sure that Swagger and Spring REST Docs should be treated as competing solutions. I use Swagger for quickly testing an API on a running application or exposing a specification that can be used for automated generation of client code. Spring REST Docs, on the other hand, is used for generating documentation that can be published somewhere, and “is accurate, concise, and well-structured. This documentation then allows your users to get the information they need with a minimum of fuss”. I think there is no obstacle to using Spring REST Docs and SpringFox Swagger together in your project in order to provide the most valuable documentation of the API exposed by the application.

Visualizing Jenkins Pipeline Results in Grafana

This time I’m describing a slightly lighter topic in comparison to some previous posts. Personally, I think Grafana is a very cool tool for visualizing any time series data. As it turns out, it is quite easy to store and visualize Jenkins build results with the InfluxDB plugin.

1. Starting Docker containers

Let’s begin by starting the needed Docker containers with Grafana, InfluxDB and Jenkins.

docker run -d --name grafana -p 3000:3000 grafana/grafana
docker run -d --name influxdb -p 8086:8086 influxdb
docker run -d --name jenkins -p 38080:8080 -p 50000:50000 jenkins

Then you can run a client container linked to the InfluxDB container. Using this container you can create a new database with the command CREATE DATABASE grafana.

docker run --rm --link=influxdb -it influxdb influx -host influxdb
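
Inside the influx client prompt, creating and verifying the new database is a matter of two statements:

> CREATE DATABASE grafana
> SHOW DATABASES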

2. Configuring Jenkins

After starting Jenkins you need to install some plugins. For this sample these are, at minimum, the Pipeline, Git, JUnit, JaCoCo and InfluxDB plugins used by the pipeline below.

If you are interested in more details about Jenkins configuration and Continuous Delivery, take a look at my previous article on that topic, How to setup Continuous Delivery environment.

In the Manage Jenkins -> Configure System section add a new InfluxDB target, pointing it at the InfluxDB instance started above (the database created earlier is named grafana).


3. Building pipeline in Jenkins

With the Jenkins Pipeline Plugin we build pipelines using Groovy syntax. In the first step (1) we check out the project from GitHub and then build it with Maven (2). Then we publish the JUnit and JaCoCo reports (3) and finally send the whole report to InfluxDB (4).

node {
	def mvnHome
	try {
		stage('Checkout') { //(1)
			git 'https://github.com/piomin/sample-code-for-ci.git'
			mvnHome = tool 'maven3'
		}
		stage('Build') { //(2)
			dir('service-1') {
				sh "'${mvnHome}/bin/mvn' -Dmaven.test.failure.ignore clean package"
			}
		}
		stage('Tests') { //(3)
			junit '**/target/surefire-reports/TEST-*.xml'
			archive 'target/*.jar'
			step([$class: 'JacocoPublisher', execPattern: '**/target/jacoco.exec'])
		}
		stage('Report') { //(4)
			if (currentBuild.currentResult == 'UNSTABLE') {
				currentBuild.result = "UNSTABLE"
			} else {
				currentBuild.result = "SUCCESS"
			}
			step([$class: 'InfluxDbPublisher', customData: null, customDataMap: null, customPrefix: null, target: 'grafana'])
		}
	} catch (Exception e) {
		currentBuild.result = "FAILURE"
		step([$class: 'InfluxDbPublisher', customData: null, customDataMap: null, customPrefix: null, target: 'grafana'])
	}
}

I defined three pipelines, one per module from the sample.


4. Building services

Add the jacoco-maven-plugin to your pom.xml to enable code coverage reporting.

<plugin>
	<groupId>org.jacoco</groupId>
	<artifactId>jacoco-maven-plugin</artifactId>
	<version>0.7.9</version>
	<executions>
		<execution>
			<id>default-prepare-agent</id>
			<goals>
				<goal>prepare-agent</goal>
			</goals>
		</execution>
		<execution>
			<id>default-report</id>
			<phase>prepare-package</phase>
			<goals>
				<goal>report</goal>
			</goals>
		</execution>
	</executions>
</plugin>

The sample application source code is available on GitHub. It consists of three simple modules, which do not do anything important, but only have the JUnit tests needed for build result visualization.

5. Configuring Grafana

First, configure a Grafana data source pointing at your InfluxDB Docker container instance.


With the InfluxDB plugin we can report metrics generated by JUnit, Cobertura, JaCoCo, Robot Framework and the Performance plugin. In the sample application I’ll show you the reports from JUnit and JaCoCo. Let’s configure our graphs in Grafana. I defined a graph with pipeline build time data, with the results grouped by project name.


Here are two graphs: the first illustrates every pipeline’s build time in milliseconds, and the second the percentage of test code coverage. For test coverage we need to select from the jacoco_data measurement instead of jenkins_data and then choose the jacoco_method_coverage_rate field.
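
The graphs are backed by plain InfluxQL queries. A sketch of the two selects described above may look like the fragment below (the build_time field and project_name naming are assumptions based on the Jenkins InfluxDB plugin’s default schema, and service-1 is one of the sample modules):

SELECT "build_time" FROM "jenkins_data" WHERE "project_name" = 'service-1' AND $timeFilter
SELECT "jacoco_method_coverage_rate" FROM "jacoco_data" WHERE "project_name" = 'service-1' AND $timeFilter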


For more details about visualizing metrics with Grafana and InfluxDB you can refer to my previous article Custom metrics visualization with Grafana and InfluxDB.

Testing REST API with Hoverfly

Hoverfly is an open source API simulation tool for automated tests. It is written in Go, but it also has native support for Java and can be run inside a JUnit test. Hoverfly can be used for testing REST APIs, but it can also be useful for testing calls between microservices. Two running modes are available: simulating and capturing. In simulating mode we just simulate interaction with another service by creating response sources; in capturing mode requests are made to the real service as normal, but they are intercepted and recorded by Hoverfly.

In one of my previous articles, Testing Java Microservices, I described a competing testing tool – Spring Cloud Contract. In this article about Hoverfly I will use the same sample application based on Spring Boot, which I created for the needs of that previous article. The source code is available on GitHub in the hoverfly branch. We have some microservices which interact with each other, and based on this sample I’m going to show how to use Hoverfly for component testing.

To enable testing with Hoverfly we have to include the following dependency in the pom.xml file.

<dependency>
	<groupId>io.specto</groupId>
	<artifactId>hoverfly-java</artifactId>
	<version>0.8.0</version>
	<scope>test</scope>
</dependency>

Hoverfly can be easily integrated with JUnit. We can orchestrate it using a JUnit @ClassRule. As I mentioned before, we can switch between two different modes. In the code fragment below I decided to use the mixed strategy inCaptureOrSimulationMode, where the Hoverfly rule is started in capture mode if the simulation file does not exist and in simulate mode if the file does exist. The default location of the output JSON file is src/test/resources/hoverfly. By calling printSimulationData on the HoverflyRule we print all simulation data to the console.

@RunWith(SpringRunner.class)
@SpringBootTest(classes = { Application.class }, webEnvironment = WebEnvironment.DEFINED_PORT)
@FixMethodOrder(MethodSorters.NAME_ASCENDING)
public class AccountApiFullTest {

	protected Logger logger = Logger.getLogger(AccountApiFullTest.class.getName());

	@Autowired
	TestRestTemplate template;

	@ClassRule
	public static HoverflyRule hoverflyRule = HoverflyRule
			.inCaptureOrSimulationMode("account.json", HoverflyConfig.configs().proxyLocalHost()).printSimulationData();

	@Test
	public void addAccountTest() {
		Account a = new Account("1234567890", 1000, "1");
		ResponseEntity<Account> r = template.postForEntity("/accounts", a, Account.class);
		Assert.assertNotNull(r.getBody().getId());
		logger.info("New account: " + r.getBody().getId());
	}

	@Test
	public void findAccountByNumberTest() {
		Account a = template.getForObject("/accounts/number/{number}", Account.class, "1234567890");
		Assert.assertNotNull(a);
		logger.info("Found account: " + a.getId());
	}

	@Test
	public void findAccountByCustomerTest() {
		Account[] a = template.getForObject("/accounts/customer/{customer}", Account[].class, "1");
		Assert.assertTrue(a.length > 0);
		logger.info("Found accounts: " + a);
	}

}

Now, let’s run our JUnit test class twice. During the first run all requests are forwarded to the Spring @RestController, which connects to the embedded Mongo database. At the same time all requests and responses are recorded by Hoverfly and saved in the account.json file, a fragment of which is visible below. During the second run all data is loaded from the source file; there is no interaction with AccountController.

  "request" : {
	"path" : {
	  "exactMatch" : "/accounts/number/1234567890"
	},
	"method" : {
	  "exactMatch" : "GET"
	},
	"destination" : {
	  "exactMatch" : "localhost:2222"
	},
	"scheme" : {
	  "exactMatch" : "http"
	},
	"query" : {
	  "exactMatch" : ""
	},
	"body" : {
	  "exactMatch" : ""
	}
  },
  "response" : {
	"status" : 200,
	"body" : "{\"id\":\"5980642bc96045216447023b\",\"number\":\"1234567890\",\"balance\":1000,\"customerId\":\"1\"}",
	"encodedBody" : false,
	"templated" : false,
	"headers" : {
	  "Content-Type" : [ "application/json;charset=UTF-8" ],
	  "Date" : [ "Tue, 01 Aug 2017 11:21:15 GMT" ],
	  "Hoverfly" : [ "Was-Here" ]
	}
  }

Now, let’s take a look at the customer-service tests. Inside GET /customers/{id} we invoke the method GET /accounts/customer/{customerId} from account-service. This method is simulated by Hoverfly with a success response, as you can see below.

@RunWith(SpringRunner.class)
@SpringBootTest(webEnvironment = WebEnvironment.DEFINED_PORT)
@FixMethodOrder(MethodSorters.NAME_ASCENDING)
public class CustomerControllerTest {

	@Autowired
	TestRestTemplate template;

	@ClassRule
	public static HoverflyRule hoverflyRule = HoverflyRule
			.inSimulationMode(dsl(service("account-service:2222").get(startsWith("/accounts/customer/"))
					.willReturn(success("[{\"id\":\"1\",\"number\":\"1234567890\"}]", "application/json"))))
			.printSimulationData();

	@Test
	public void addCustomerTest() {
		Customer c = new Customer("1234567890", "Jan Testowy", CustomerType.INDIVIDUAL);
		c = template.postForObject("/customers", c, Customer.class);
	}

	@Test
	public void findCustomerWithAccounts() {
		Customer c = template.getForObject("/customers/pesel/{pesel}", Customer.class, "1234567890");
		Customer cc = template.getForObject("/customers/{id}", Customer.class, c.getId());
		Assert.assertTrue(cc.getAccounts().size() > 0);
	}
}

To run this test successfully we should override some properties in src/test/resources/application.yml. Eureka discovery for the Ribbon client should be disabled, and the same goes for Hystrix in the @FeignClient. The Ribbon listOfServers property should have the same value as the service address inside the HoverflyRule.

eureka:
  client:
    enabled: false

ribbon:
  eureka:
    enabled: false
  listOfServers: account-service:2222

feign:
  hystrix:
    enabled: false

Here’s the @FeignClient interface used for invoking the API method from account-service.

@FeignClient("account-service")
public interface AccountClient {

	@RequestMapping(method = RequestMethod.GET, value = "/accounts/customer/{customerId}", consumes = {MediaType.APPLICATION_JSON_VALUE})
	List<Account> getAccounts(@PathVariable("customerId") String customerId);

}

When using simulation mode there is no need to start @SpringBootTest. Hoverfly also has some interesting capabilities like response templating, for example based on a path parameter, as in the fragment below.

public class AccountApiTest {

	TestRestTemplate template = new TestRestTemplate();

	@ClassRule
	public static HoverflyRule hoverflyRule = HoverflyRule.inSimulationMode(dsl(service("http://account-service")
			.post("/accounts").anyBody().willReturn(success("{\"id\":\"1\"}", "application/json"))
			.get(startsWith("/accounts/")).willReturn(success("{\"id\":\"{{Request.Path.[1]}}\",\"number\":\"123456789\"}", "application/json"))));

	@Test
	public void addAccountTest() {
		Account a = new Account("1234567890", 1000, "1");
		ResponseEntity<Account> r = template.postForEntity("http://account-service/accounts", a, Account.class);
		System.out.println(r.getBody().getId());
	}

	@Test
	public void findAccountByIdTest() {
		Account a = template.getForObject("http://account-service/accounts/{id}", Account.class, new Random().nextInt(10));
		Assert.assertNotNull(a.getId());
	}

}

We can simulate a fixed method delay using the DSL. A delay can be set for all requests or for a particular HTTP method. Our delayed @ClassRule for CustomerControllerTest will now look like the fragment below.

	@ClassRule
	public static HoverflyRule hoverflyRule = HoverflyRule
			.inSimulationMode(dsl(service("account-service:2222").andDelay(3000, TimeUnit.MILLISECONDS).forMethod("GET").get(startsWith("/accounts/customer/"))
			.willReturn(success("[{\"id\":\"1\",\"number\":\"1234567890\"}]", "application/json"))));

And now you can add the ReadTimeout property to your Ribbon client configuration and run the JUnit test again. You should receive the following exception: java.net.SocketTimeoutException: Read timed out.

ribbon:
  eureka:
    enabled: false
  ReadTimeout: 1000
  listOfServers: account-service:2222

Conclusion

In this post I showed you the most typical usage of the Hoverfly library in microservices tests. However, this library is not dedicated to microservice testing, as opposed to Spring Cloud Contract, which I described previously. For example, there is no mechanism for sharing test stubs between different microservices like in Spring Cloud Contract (@AutoConfigureStubRunner). But there is an interesting feature for delaying responses, thanks to which we can simulate timeouts for the Ribbon client or a Hystrix fallback.

Testing Java Microservices

While developing a new application we should never forget about testing. This seems to be particularly important when working with microservices, as microservices testing requires a different approach than designing tests for monolithic applications. As far as monolithic testing is concerned, the main focus is put on unit testing and, in most cases, integration tests with the database layer. In the case of microservices, the most important tests concern the interactions between those microservices. Although every microservice is developed and released independently, a change in one of them can affect all services interacting with it. Interaction between them is realized by messages, usually sent via REST or AMQP protocols.

We can distinguish five different layers of microservices tests. The first three of them are the same as for monolithic applications.

Unit tests – we test the smallest pieces of code, for example a single method or component, and mock every call to other methods or components. There are many popular frameworks that support unit testing in Java, like JUnit and TestNG, plus Mockito for mocking. The main task of this type of testing is to confirm that the implementation meets the requirements.

Integration tests – we test interaction and communication between components based on their interfaces, with external services mocked out.

End-to-end tests – also known as functional tests. The main goal of these tests is to verify that the system meets the external requirements. It means that we should design test scenarios which exercise all the microservices taking part in that process.

Contract tests – tests at the boundary of an external service verifying that it meets the contract expected by a consuming service.

Component tests – limit the scope of the exercised software to a portion of the system under test, manipulating the system through internal code interfaces and using test doubles to isolate the code under test from other components.

Let’s look at the component architecture of one sample microservice (customer service); it is similar for all the other sample microservices described in this post. The customer service interacts with a Mongo database and stores all customers there. Mapping between objects and the database is realized with Spring Data’s @Document. We also use a @Repository component as a DAO for the Customer entity. Communication with other microservices is realized by Feign REST clients. The customer service collects all of a customer’s accounts and products from external microservices. The @Repository and Feign clients are injected into the @Controller, which is exposed externally as a REST resource.


In this article I’ll show you contract and component tests for the sample microservices architecture. For our tests we use an embedded in-memory Mongo database and RESTful stubs generated with the Spring Cloud Contract framework.


Now, let’s take a look at the big picture. We have four microservices interacting with each other. Spring Cloud Contract uses WireMock in the background for recording and matching requests and responses. For testing purposes, Eureka discovery needs to be disabled on all microservices.


The sample application source code is available on GitHub. All microservices are based on the Spring Boot and Spring Cloud (Eureka, Zuul, Feign, Ribbon) frameworks. Interaction with the Mongo database is realized with the Spring Data MongoDB library (the spring-boot-starter-data-mongodb dependency in pom.xml). The DAO is really simple: it extends the MongoRepository CRUD interface. The @Repository and @Feign clients are injected into CustomerController.

public interface CustomerRepository extends MongoRepository<Customer, String> {

	public Customer findByPesel(String pesel);
	public Customer findById(String id);

}

Here’s the full controller code.

@RestController
public class CustomerController {

	@Autowired
	private AccountClient accountClient;
	@Autowired
	private ProductClient productClient;

	@Autowired
	CustomerRepository repository;

	protected Logger logger = Logger.getLogger(CustomerController.class.getName());

	@RequestMapping(value = "/customers/pesel/{pesel}", method = RequestMethod.GET)
	public Customer findByPesel(@PathVariable("pesel") String pesel) {
		logger.info(String.format("Customer.findByPesel(%s)", pesel));
		return repository.findByPesel(pesel);
	}

	@RequestMapping(value = "/customers", method = RequestMethod.GET)
	public List<Customer> findAll() {
		logger.info("Customer.findAll()");
		return repository.findAll();
	}

	@RequestMapping(value = "/customers/{id}", method = RequestMethod.GET)
	public Customer findById(@PathVariable("id") String id) {
		logger.info(String.format("Customer.findById(%s)", id));
		Customer customer = repository.findById(id);
		List<Account> accounts =  accountClient.getAccounts(id);
		logger.info(String.format("Customer.findById(): %s", accounts));
		customer.setAccounts(accounts);
		return customer;
	}

	@RequestMapping(value = "/customers/withProducts/{id}", method = RequestMethod.GET)
	public Customer findWithProductsById(@PathVariable("id") String id) {
		logger.info(String.format("Customer.findWithProductsById(%s)", id));
		Customer customer = repository.findById(id);
		List<Product> products =  productClient.getProducts(id);
		logger.info(String.format("Customer.findWithProductsById(): %s", products));
		customer.setProducts(products);
		return customer;
	}

	@RequestMapping(value = "/customers", method = RequestMethod.POST)
	public Customer add(@RequestBody Customer customer) {
		logger.info(String.format("Customer.add(%s)", customer));
		return repository.save(customer);
	}

	@RequestMapping(value = "/customers", method = RequestMethod.PUT)
	public Customer update(@RequestBody Customer customer) {
		logger.info(String.format("Customer.update(%s)", customer));
		return repository.save(customer);
	}

}

To replace the external Mongo database with an embedded in-memory instance during automated tests, we only have to add the following dependency to pom.xml.

<dependency>
	<groupId>de.flapdoodle.embed</groupId>
	<artifactId>de.flapdoodle.embed.mongo</artifactId>
	<scope>test</scope>
</dependency>

If we use different addresses or connection credentials, the application settings should also be overridden in src/test/resources. Here’s the application.yml file for testing. At the bottom there is the configuration for disabling Eureka discovery.

server:
  port: ${PORT:3333}

spring:
  application:
    name: customer-service
  data:
    mongodb:
      host: localhost
      port: 27017

logging:
  level:
    org.springframework.cloud.contract: TRACE

eureka:
  client:
    enabled: false

The in-memory MongoDB instance is started automatically during the Spring Boot JUnit test. The next step is to add the Spring Cloud Contract dependencies.

<dependency>
	<groupId>org.springframework.cloud</groupId>
	<artifactId>spring-cloud-starter-contract-stub-runner</artifactId>
	<scope>test</scope>
</dependency>
<dependency>
	<groupId>org.springframework.cloud</groupId>
	<artifactId>spring-cloud-starter-contract-verifier</artifactId>
	<scope>test</scope>
</dependency>

To enable automated test generation by Spring Cloud Contract we also have to add the following plugin to pom.xml.

<plugin>
	<groupId>org.springframework.cloud</groupId>
	<artifactId>spring-cloud-contract-maven-plugin</artifactId>
	<version>1.1.0.RELEASE</version>
	<extensions>true</extensions>
	<configuration>
			<packageWithBaseClasses>pl.piomin.microservices.advanced.customer.api</packageWithBaseClasses>
	</configuration>
</plugin>

The packageWithBaseClasses property defines the package where the base classes extended by the generated test classes are stored. Here’s the base test class for the account service tests. In our sample architecture the account service is only a producer; it does not consume any services.

@RunWith(SpringRunner.class)
@SpringBootTest(classes = {Application.class})
public class ApiScenario1Base {

	@Autowired
	private WebApplicationContext context;

	@Before
	public void setup() {
		RestAssuredMockMvc.webAppContextSetup(context);
	}

}

As opposed to the account service, the customer service consumes some services for collecting a customer’s accounts and products. That’s why the base test class for the customer service needs to define the stub artifacts data.

@RunWith(SpringRunner.class)
@SpringBootTest(classes = {Application.class})
@AutoConfigureStubRunner(ids = {"pl.piomin:account-service:+:stubs:2222"}, workOffline = true)
public class ApiScenario1Base {

	@Autowired
	private WebApplicationContext context;

	@Before
	public void setup() {
		RestAssuredMockMvc.webAppContextSetup(context);
	}

}
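
A short note on the ids format used above: it follows the groupId:artifactId:version:classifier:port convention, so pl.piomin:account-service:+:stubs:2222 means that the newest (+) version of the account-service stubs will be downloaded and served by a stub server started on port 2222.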

Test classes are generated on the basis of contracts defined in src/main/resources/contracts. Such contracts can be implemented using the Groovy DSL. Here’s a sample contract for adding a new account.

org.springframework.cloud.contract.spec.Contract.make {
  request {
    method 'POST'
    url '/accounts'
    body([
      id: "1234567890",
      number: "12345678909",
      balance: 1234,
      customerId: "123456789"
    ])
    headers {
      contentType('application/json')
    }
  }
  response {
    status 200
    body([
      id: "1234567890",
      number: "12345678909",
      balance: 1234,
      customerId: "123456789"
    ])
    headers {
      contentType('application/json')
    }
  }
}

Test classes are generated under the target/generated-test-sources directory. Here’s the generated class for the contract above.

@FixMethodOrder(MethodSorters.NAME_ASCENDING)
public class Scenario1Test extends ApiScenario1Base {

	@Test
	public void validate_1_postAccount() throws Exception {
		// given:
			MockMvcRequestSpecification request = given()
					.header("Content-Type", "application/json")
					.body("{\"id\":\"1234567890\",\"number\":\"12345678909\",\"balance\":1234,\"customerId\":\"123456789\"}");

		// when:
			ResponseOptions response = given().spec(request)
					.post("/accounts");

		// then:
			assertThat(response.statusCode()).isEqualTo(200);
			assertThat(response.header("Content-Type")).matches("application/json.*");
		// and:
			DocumentContext parsedJson = JsonPath.parse(response.getBody().asString());
			assertThatJson(parsedJson).field("id").isEqualTo("1234567890");
			assertThatJson(parsedJson).field("number").isEqualTo("12345678909");
			assertThatJson(parsedJson).field("balance").isEqualTo(1234);
			assertThatJson(parsedJson).field("customerId").isEqualTo("123456789");
	}

	@Test
	public void validate_2_postAccount() throws Exception {
		// given:
			MockMvcRequestSpecification request = given()
					.header("Content-Type", "application/json")
					.body("{\"id\":\"1234567891\",\"number\":\"12345678910\",\"balance\":4675,\"customerId\":\"123456780\"}");

		// when:
			ResponseOptions response = given().spec(request)
					.post("/accounts");

		// then:
			assertThat(response.statusCode()).isEqualTo(200);
			assertThat(response.header("Content-Type")).matches("application/json.*");
		// and:
			DocumentContext parsedJson = JsonPath.parse(response.getBody().asString());
			assertThatJson(parsedJson).field("id").isEqualTo("1234567891");
			assertThatJson(parsedJson).field("customerId").isEqualTo("123456780");
			assertThatJson(parsedJson).field("number").isEqualTo("12345678910");
			assertThatJson(parsedJson).field("balance").isEqualTo(4675);
	}

	@Test
	public void validate_3_getAccounts() throws Exception {
		// given:
			MockMvcRequestSpecification request = given();

		// when:
			ResponseOptions response = given().spec(request)
					.get("/accounts");

		// then:
			assertThat(response.statusCode()).isEqualTo(200);
			assertThat(response.header("Content-Type")).matches("application/json.*");
		// and:
			DocumentContext parsedJson = JsonPath.parse(response.getBody().asString());
			assertThatJson(parsedJson).array().contains("balance").isEqualTo(1234);
			assertThatJson(parsedJson).array().contains("customerId").isEqualTo("123456789");
			assertThatJson(parsedJson).array().contains("id").matches("[0-9]{10}");
			assertThatJson(parsedJson).array().contains("number").isEqualTo("12345678909");
	}

}

In the generated class there are three JUnit tests, because I used the scenario mechanism available in Spring Cloud Contract. There are three Groovy files inside the scenario1 directory, laid out as in the sketch below, and the number in every file’s prefix defines the test order. The second scenario has only one definition file and is also used in the customer service (the findById API method). The third scenario has four definition files and is used in the transfer service (the execute API method).

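On disk the scenario definitions might be organized roughly like this (the scenario1 file names are inferred from the generated test method names; the remaining ones are purely illustrative):

src/main/resources/contracts/
├── scenario1/
│   ├── 1_postAccount.groovy
│   ├── 2_postAccount.groovy
│   └── 3_getAccounts.groovy
├── scenario2/
│   └── 1_getAccountsByCustomer.groovy
└── scenario3/
    └── (four definition files prefixed 1_ through 4_)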

As I mentioned before, interaction between microservices is realized with @FeignClient. WireMock, used by Spring Cloud Contract, records the request/response defined in scenario2 inside the account service. The recorded interaction is then used by the @FeignClient during tests instead of calling the real service, which is not available.

@FeignClient("account-service")
public interface AccountClient {

	@RequestMapping(method = RequestMethod.GET, value = "/accounts/customer/{customerId}", consumes = {MediaType.APPLICATION_JSON_VALUE})
	List<Account> getAccounts(@PathVariable("customerId") String customerId);

}

All the tests are generated and run during the Maven build, for example with the mvn clean install command. If you are interested in more details and features of Spring Cloud Contract, you can read about them in the project documentation.

Finally, we can define a Continuous Integration pipeline for our microservices. Each of them should be built independently. More about the Continuous Integration / Continuous Delivery environment can be read in one of my previous posts, How to setup Continuous Delivery environment. Here’s a sample pipeline created with the Jenkins Pipeline Plugin for the account service. In the Checkout stage we update our working copy to the newest version from the repository. In the Build stage we start by reading the project version set inside pom.xml, and then we build the application using the mvn clean install command. Finally, we record the unit test results using the junit pipeline method. The same pipelines can be configured for all other microservices.

In the described sample all microservices are placed in the same Git repository with one Maven version for simplicity. But we can imagine that every microservice lives in a different repository with an independent version in its pom.xml. Tests will always be run with the newest version of the stubs, which is set with the + in this fragment of the base test class: @AutoConfigureStubRunner(ids = {"pl.piomin:account-service:+:stubs:2222"}, workOffline = true).

node {

    withMaven(maven: 'Maven') {

        stage ('Checkout') {
            git url: 'https://github.com/piomin/sample-spring-microservices-advanced.git', credentialsId: 'github-piomin', branch: 'testing'
        }

        stage ('Build') {
            def pom = readMavenPom file: 'pom.xml'
            def version = pom.version.replace("-SNAPSHOT", ".${currentBuild.number}")
            env.pom_version = version
            print 'Build version: ' + version
            currentBuild.description = "v${version}"

            dir('account-service') {
                bat "mvn clean install -Dmaven.test.failure.ignore=true"
            }

            junit '**/target/surefire-reports/TEST-*.xml'
        }

    }

}
