
Contract Testing

What is contract testing, and why do we need it?


The distributed nature of microservices, combined with the sheer number of services that live within an application, makes it much harder for developers to perform the integration tests that were a straightforward, routine part of monolithic app development.


Contract testing is a technique for testing an integration point by checking each application in isolation to ensure the messages it sends or receives conform to a shared understanding that is documented in a "contract".


Consumer-driven contract testing is a type of contract testing that ensures that a provider is compatible with the expectations that the consumer has of it. This involves checking that the provider accepts the expected requests, and that it returns the expected responses.
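To make this concrete, here is a minimal sketch of a consumer-side contract test using the pact-js library with Jest. The service names ("OrderWeb", "OrderApi"), the /orders/1 endpoint, and the response body are hypothetical, chosen only for illustration; the test exercises the consumer's HTTP client against Pact's mock server rather than the real provider.

```typescript
// Consumer-side contract test (Jest + pact-js). A minimal sketch, assuming
// hypothetical services "OrderWeb" (consumer) and "OrderApi" (provider).
import * as path from "path";
import axios from "axios";
import { Pact } from "@pact-foundation/pact";

const provider = new Pact({
  consumer: "OrderWeb",
  provider: "OrderApi",
  port: 8989, // the mock HTTP server that stands in for OrderApi
  dir: path.resolve(process.cwd(), "pacts"), // where the contract file is written
});

describe("OrderApi contract", () => {
  beforeAll(() => provider.setup());    // start the mock server
  afterEach(() => provider.verify());   // fail if an expected request never arrived
  afterAll(() => provider.finalize());  // write the pact (contract) file

  it("returns order 1", async () => {
    // Register the interaction the consumer expects
    await provider.addInteraction({
      state: "order 1 exists",
      uponReceiving: "a request for order 1",
      withRequest: { method: "GET", path: "/orders/1" },
      willRespondWith: {
        status: 200,
        headers: { "Content-Type": "application/json" },
        body: { id: 1, status: "PENDING" },
      },
    });

    // Exercise the consumer's client code against the mock server
    const res = await axios.get("http://localhost:8989/orders/1");
    expect(res.data.status).toBe("PENDING");
  });
});
```

Running a test like this writes a JSON pact file (for example, pacts/OrderWeb-OrderApi.json) recording the consumer and provider names and each expected interaction. That file is the contract that gets handed to the provider team.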



Consumer-driven contract testing provides the following benefits:


  • Contract tests provide a mechanism to explicitly verify that a provider microservice meets the needs a consumer has documented in its contract.
  • The consumer's expected HTTP requests and responses are replayed against a mock HTTP server that stands in for the service provider. These recorded interactions are then used to generate the contract between the service consumer and the service provider.
  • If the provider returns something unexpected, Pact marks the interaction as a failure, and the contract test fails.
  • A consumer's contract test is made available to the provider to verify. By testing each side of an integration point using a simulated version of the other microservice, we get two sets of tests: one that defines the interactions required, and another (playback) that verifies the provider is meeting the needs defined in the consumer contract (see the provider-verification sketch after this list).
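As a rough illustration of the "playback" half, here is a provider-side sketch using pact-js's Verifier. The provider URL and the pact file path are assumptions carried over from the consumer sketch above; in practice the contract is often fetched from a shared location rather than a local file.

```typescript
// Provider-side verification (pact-js). A sketch, assuming the provider is
// already running locally at http://localhost:8080 and that the consumer test
// above produced pacts/OrderWeb-OrderApi.json.
import * as path from "path";
import { Verifier } from "@pact-foundation/pact";

new Verifier({
  provider: "OrderApi",
  providerBaseUrl: "http://localhost:8080",
  pactUrls: [path.resolve(process.cwd(), "pacts", "OrderWeb-OrderApi.json")],
})
  .verifyProvider()
  .then(() => console.log("Pact verification complete"))
  .catch((err) => {
    console.error("Pact verification failed", err);
    process.exit(1);
  });
```

The verifier replays every interaction recorded in the contract against the running provider and fails if any response deviates from what the consumer expects.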


Contract testing for microservices is not without challenges. 


The main challenge is that you'll need to run many individual tests to evaluate all microservices within your application. In this sense, contract testing is trickier to orchestrate than end-to-end integration testing. Integration testing only requires one testing environment and one set of tests. Contract testing requires teams to run a long series of tests, each for a different pair of microservices.


Another major limitation of contract testing revolves around the use of mocks. Although a mock is a reliable way of mimicking microservice behavior without running the microservice itself, there's no guarantee that the real microservice will behave the same way in production. Because integration tests spin up and exercise real service instances, they aren't subject to this limitation.


Finally, teams must be sure to keep contracts up to date. The contract testing process needs to loop in every development team so that contract requirements are updated whenever a microservice's functionality or behavior changes. Otherwise, tests will run against contracts that no longer accurately reflect the desired behavior of a microservice.
