Is there a way to add Pact interactions into a JSON file and add them to consumer test code in Java - pact

We have a requirement to run a Pact standalone server, keep interactions (request/expected-response pairs) in JSON files stored in a directory, and simply add them to consumer tests to generate a pact file. I would like to know if this is possible in Pact using Java. If yes, could you please also provide an example?

I think the question is about generating contracts from JSON files.
See this note about generating contracts: https://docs.pact.io/faq/#can-i-generate-my-pact-file-from-something-like-swagger.
You can definitely read JSON documents in Java and convert them to the matching DSL.
It's easier in languages like JS, where JSON is more native, but ultimately you need to be careful about things like getting the matching rules right, otherwise you'll create very brittle contracts that are hard to verify on the provider side.
See also https://docs.pact.io/consumer, which gives advice on writing good consumer tests.
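As a minimal sketch of that idea, the snippet below reads a hypothetical interaction file with Jackson and maps it onto the pact-jvm consumer DSL. The file layout (description, path, method, status, body fields) is an assumption made up for illustration - Pact defines no standard input format for this - and package names vary between pact-jvm versions.

```java
// A minimal sketch, assuming pact-jvm consumer support and Jackson are on the
// classpath. The interactions JSON layout used here is hypothetical.
import au.com.dius.pact.consumer.dsl.PactDslWithProvider;
import au.com.dius.pact.core.model.RequestResponsePact; // pact-jvm 4.x package
import com.fasterxml.jackson.databind.JsonNode;
import com.fasterxml.jackson.databind.ObjectMapper;
import java.io.File;

public class JsonFileToPact {

    public static RequestResponsePact fromJsonFile(File interactionFile,
                                                   PactDslWithProvider builder) throws Exception {
        // Parse the stored interaction (request/expected-response pair).
        JsonNode node = new ObjectMapper().readTree(interactionFile);
        return builder
            .uponReceiving(node.get("description").asText())
                .path(node.get("path").asText())
                .method(node.get("method").asText())
            .willRespondWith()
                .status(node.get("status").asInt())
                .body(node.get("body").toString())
            .toPact();
    }
}
```

Note that translating raw JSON this way produces exact-match expectations only; to avoid the brittleness mentioned above you would still need to attach matching rules deliberately.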

Why SAX.writer(XML) and not CGI wrapper(HTML - {&OUT})

I want to understand the difference between the SAX writer and the CGI wrapper. I can't find any getting-started information; any suggested content or video links would be much appreciated. Thanks.
The SAX writer is a set of language statements/elements that allow the creation of well-formed XML documents. This XML is output to the location specified by the SET-OUTPUT-DESTINATION method. Output destinations include streams, which might include a Classic WebSpeed stream (WEB-STREAM).
CGI wrapper is more of an approach, with a bunch of (internal) procedures that let you create a fully-formed HTTP response (and read from an incoming HTTP request). This approach should not be used for new web services, even though it still works. In newer versions of OpenEdge, the PASOE server provides what are known as WebHandlers, which replace the CGI wrapper approach.
The {&OUT} 'syntax' is really just a preprocessor directive that expands to something like PUT UNFORMATTED STREAM WEB-STREAM - you can see this if you compile your programs with the PREPROCESS option, or use the equivalent command in PDSOE (a right-click option).

Query tool for RocksDB?

I have just started taking a look at RocksDB and was able to build a small Spring Boot-based app to perform basic CRUD operations on it. However, I was wondering if there is a UI tool that can be used to query or browse the data in RocksDB.
I am not sure if this is a valid question, but is there something like pgAdmin for Postgres, or a client utility that can be used to browse through the data in this DB?
Thanks, HK.
No, there isn't.
The reason is that there is no way a GUI client can know how to deserialize your data format.
I might save my values as pure strings, you might have them as JSON bytes, and someone else might use protobuf - how would the GUI client know how to deserialize the data and show it in the UI?
Or would it just show the raw bytes, which is unlikely to be useful?
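To make that point concrete, here is a minimal sketch (assuming the org.rocksdb:rocksdbjni dependency and a scratch directory) showing that RocksDB's Java API only ever deals in byte arrays, so the meaning of those bytes lives entirely in the application:

```java
// A minimal sketch: RocksDB stores and returns opaque byte arrays, so the
// serialization format is entirely up to the caller.
import org.rocksdb.Options;
import org.rocksdb.RocksDB;

public class BytesOnly {
    public static void main(String[] args) throws Exception {
        RocksDB.loadLibrary();
        try (Options opts = new Options().setCreateIfMissing(true);
             RocksDB db = RocksDB.open(opts, "/tmp/rocksdb-demo")) {
            // One application might store plain UTF-8 strings...
            db.put("city:1".getBytes(), "Madrid".getBytes());
            // ...another might store JSON, protobuf, or anything else.
            db.put("city:2".getBytes(), "{\"name\":\"Paris\"}".getBytes());
            // A generic GUI would only ever see byte[] - it cannot know
            // which format to decode.
            byte[] raw = db.get("city:2".getBytes());
            System.out.println(new String(raw));
        }
    }
}
```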

Pact. How to test a REST GET with automatically generated ID in the URL

I want to test a REST service that returns the detail of a given entity identified by a UUID, i.e. my consumer pact has an interaction requesting a GET like this:
/cities/123e4567-e89b-12d3-a456-426655440000
So I need this specific record to exist in the database for the pact verifier to find it. In other projects I've achieved this by executing an SQL INSERT in the state setup, but in this case I'd prefer to use the microservice's JPA utilities for accessing the DB, because the data model is quite complex, and using these utilities would save me much effort and make the test much more maintainable.
The problem is that these utilities do not allow specifying the identifier when you create a new record (they assign an automatic ID). So after creating the entity (in the state setup) I'd like to tell the pact verifier to use the generated ID rather than the one specified by the consumer pact.
As far as I know, Pact matching techniques are not useful here because I need the microservice to receive this specific ID. Is there any way for the verifier to be aware of the correct ID to use in the call to the service?
You have two options here:
Option 1 - Find a way to use the UUID from the pact file
This option (in my opinion) would be the better one, because you are using well-known values for your verification. With JPA, I think you may be able to disable the auto-generation of the ID. And if you are using Hibernate as the JPA provider, it may not generate an ID if you have provided one (i.e. setting the ID on the entity to the one from the pact file before saving it). This is what I have done recently.
Using a generator (as mentioned by Beth) would be a good mechanism for this problem, but there is currently no way to tell a generator to use a specific value. They generate random ones on the fly.
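A minimal sketch of that first approach, assuming a @State callback as provided by pact-jvm's JUnit provider support (the annotation's package varies by version) and a hypothetical City entity with a CityRepository:

```java
// A minimal sketch; City and CityRepository are hypothetical, and @State lives
// in au.com.dius.pact.provider.junitsupport in recent pact-jvm versions.
import au.com.dius.pact.provider.junitsupport.State;
import java.util.UUID;

public class CityProviderStates {

    private final CityRepository repository; // hypothetical JPA repository

    public CityProviderStates(CityRepository repository) {
        this.repository = repository;
    }

    @State("a city with id 123e4567-e89b-12d3-a456-426655440000 exists")
    public void cityExists() {
        City city = new City(); // hypothetical entity
        // Pre-assign the UUID from the pact file instead of letting the
        // provider generate one; with Hibernate this can suppress ID
        // generation, depending on the configured strategy.
        city.setId(UUID.fromString("123e4567-e89b-12d3-a456-426655440000"));
        city.setName("Example City");
        repository.save(city);
    }
}
```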
Option 2 - Replace the ID in the URL
Depending on how you run the verification, you could use a request filter to change the UUID in the URL to the one which was created during the provider state callback. However, I feel this is a potentially bad thing to do, because you could change the request in a way that weakens the contract. You will not be verifying that your provider adheres to what the consumer specified.
If you choose this option, be careful to only change the UUID portion of the URL and nothing else.
For information on request filters, have a look at 'Gradle - Modifying the requests before they are sent' and 'JUnit - Modifying the requests before they are sent' in the Pact-JVM readmes.
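For the JUnit route, a request filter looks roughly like the sketch below (assuming pact-jvm's @TargetRequestFilter hook, whose package varies by version, and Apache HttpClient request objects; generatedId stands for whatever your state callback captured). Note that it rewrites only the trailing UUID segment:

```java
// A minimal sketch of option 2; the regex touches only a trailing UUID so the
// rest of the consumer-specified request stays intact.
import au.com.dius.pact.provider.junit.target.TargetRequestFilter;
import java.net.URI;
import java.util.UUID;
import org.apache.http.HttpRequest;
import org.apache.http.client.methods.HttpRequestBase;

public class VerificationRequestFilter {

    private UUID generatedId; // captured during the provider state setup

    @TargetRequestFilter
    public void replaceUuid(HttpRequest request) {
        if (request instanceof HttpRequestBase) {
            HttpRequestBase base = (HttpRequestBase) request;
            String rewritten = base.getURI().toString()
                .replaceAll("[0-9a-fA-F-]{36}$", generatedId.toString());
            base.setURI(URI.create(rewritten));
        }
    }
}
```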
Unfortunately not. The provider-side verifier takes this information from the pact file itself and so can't know to send anything else.
The best option is to use provider states to manage the injection of the specific record prior to this test case (or to just have the correct record in there in the first place).
You use the JPA libraries during the provider state setup to modify the UUID in the record to the one you're expecting.
If you are using pact-jvm on both the consumer and provider sides, I believe you may be able to use 'generators', but you'll need to look up the documentation on that as I haven't used them.

Is JSON still used in applications

I wanted to know whether JSON is still used in live applications. I am creating a service and want to understand if I should output data using JSON too.
What is the latest standard now?
JSON is very popular, and there is no sign that this is changing.
I am creating a service and want to understand if I should output data using JSON too?
You really need to ask the potential customers of the service that question. Or at least, give us some hint as to what the service is and what clients are likely to use it.
What is the latest standard now?
There is no official standard for JSON. In theory, JSON is a subset of ECMAScript (aka JavaScript), so the relevant ECMAScript standard would be normative.
In practice, JSON is implemented in many languages independently of ECMAScript. The description on the JSON.org website and IETF RFC 4627 are probably the most relevant to someone implementing JSON for themselves, but neither of these sources has the authority of a standard. If you want JSON libraries, the JSON.org site is a good place to start looking.
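If you do decide to output JSON, a library does the heavy lifting. A minimal sketch (using Jackson, which is an assumption - the answer above only points at json.org for library suggestions):

```java
// A minimal sketch of serializing a response to JSON with Jackson
// (com.fasterxml.jackson.core:jackson-databind). Requires Java 9+ for Map.of.
import com.fasterxml.jackson.databind.ObjectMapper;
import java.util.Map;

public class JsonOutput {
    public static void main(String[] args) throws Exception {
        ObjectMapper mapper = new ObjectMapper();
        String json = mapper.writeValueAsString(
            Map.of("service", "demo", "status", "ok"));
        System.out.println(json); // e.g. {"status":"ok","service":"demo"}
    }
}
```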
Yes, JSON is still very popular. Even the Google web services API returns search output in JSON.
Take a look at this example:
http://zamples.com/JspExplorer/samples/google.jsp
Overwhelmingly yes. For me, JSON is the transport format of choice for AJAX requests and inter-application data sharing. To date, there are 1271 questions about JSON on SO.

BizTalk custom adaptor

I am not sure if I ask the right question, but this is the scenario I am trying to run:
Multiple files (an XML file and a few related files, "attachments") have to get into BizTalk as a single message. I have looked into existing adapters and don't see that done with the existing ones. To be more accurate, the files are taken from the file system. They do not arrive at the same time, but one at a time, and their order is not guaranteed. The XML (content) file is the one that knows which attachments (which other files) it must have.
We are looking into BizTalk 2009, and I was wondering whether that would be the responsibility of a custom adapter or something else, and where I could look for samples.
Thanks.
It is probably possible to do what you want using a custom adapter, though I'd recommend against it. You can achieve what you require using orchestration.
What you are looking for is likely a convoy, or at the least some use of correlation.
In BizTalk a convoy is a messaging pattern (as opposed to a BizTalk feature) that allows groups of messages to be processed by a single orchestration.
You essentially use correlation on a receive port to group messages together in either a parallel (what you probably want) or sequential fashion.
There is an article [here](http://msdn.microsoft.com/en-us/library/ms942189(BTS.10\).aspx) by Stephen W. Thomas about convoys (it is for BT 2004 but the concepts still hold), and there is a lot of additional information on the web and in books (Professional BizTalk Server 2006 has a subsection on them).
Without more details on your scenario it is hard to know exactly how the convoy would be built but below are two approaches to look at (also, I've not had a chance to properly use BT2009 so there may be extended support for correlation scenarios that help you out).
Flexible Correlation
If you don't know anything about the files listed in the context XML you will probably need a pattern like the one described by Charles Young in this post.
Non-uniform sequential convoy
If you do have a little bit of info beforehand, one way might be as follows (basically a non-uniform sequential convoy):
This makes the assumption that there is some way of linking all the files together so you can correlate them.
Create a single orchestration that subscribes to your inbound receive port (which contains the file receive location).
This orchestration will have a single activation receive shape that is set up for your content file.
Once the orchestration is started by a content file, a second correlated receive shape starts picking up the messages that match that content file. (This second receive could possibly be in a loop to allow for variable numbers of files.)
You then pack them all together into a single outbound file of your design and send them out once the full number of files has been received.
It seems to me a better approach would be to implement the above requirements with a combination of a custom pipeline component and/or a custom adapter. I assume you do not really need to manipulate the incoming files - except for the content XML file - or that you couldn't, since they are in binary format. This calls for a custom pipeline component.
What you can do is develop a custom BizTalk adapter to interact with the file system and to implement the listening and looping logic. Next you can develop a custom pipeline component to create a single BizTalk message perhaps with base64 data type in it for binary data. Additionally you could also promote messages right in this component to enable orchestration subscriptions.
Orchestrations are more suited to implementing business workflow scenarios where the messages are already in XML format. That does not appear to be the case here. In any case, I think at the very least a custom pipeline component would be needed.
David's answer is the correct answer.
Even in cases where you know absolutely nothing about the contents of the expected attachments, you surely know their names and locations. Therefore you can use the Flexible Correlation linked to in David's answer, like this:
The key to the solution is to correlate on the built-in BTS.ReceivedFileName property.
First, create a custom receive pipeline, with a custom pipeline component that promotes the BTS.ReceivedFileName context property of the received messages. This simple custom component is fairly easy to write, but you can make it straightforward by using third-party frameworks such as (shameless plug here) my PipelineComponentBase class or the excellent BizTalk Server Pipeline Component Wizard.
Now for the easy part:
Attachments are received in a specific location, designated by its path on the filesystem.
Create a receive location that listens to an alternate location, used only to control when files are actually swallowed by BizTalk.
In your orchestration, create a correlation type with the BTS.ReceivedFileName property and a correlation set based on this correlation type.
When you want to receive binary attachments, send a dummy message with the BTS.ReceivedFileName context property set to the filename of the binary attachment, but with the path matching the alternate location; the one used by the receive location. Initialize the correlation on the send shape.
Use an expression shape to copy the binary file from its original location to the one used by the receive location.
Finally, use a receive shape bound to the receive port that contains the receive location whose custom receive pipeline will promote the BTS.ReceivedFileName property.
Notice that you actually need to send a message in order to initialize the correlation. It does not actually matter what message you send. What I'd do is send the message through a send pipeline that contains an empty pipeline component - that is, a pipeline component that reads the message but returns null (so that the message vanishes into thin air before it reaches the adapter). A more elaborate solution would be to use a null adapter - that is, an adapter that reads the message but does not do anything with it.
These two solutions avoid having many files accumulate in a temporary location somewhere, just for the sake of initializing a correlation!
