PACT consumer-driven contracts: setting up test data in the provider

I am executing some tests where, if the consumer sends some ID or text that does not exist in the provider's database, I want the provider tests to do the following:
1. Receive the pact file, with the information about what needs to be set up first.
2. Run my function, which inserts the missing data into the DB.
3. Make the API calls, which produce the actual response.
Now I want to know which field the consumer should use to let the provider know that some prerequisite or pre-setup is needed before the actual API call.
I saw a sample with setUp: InsertEventsIntoDatabase, but it does not say how to find the input supplied by the consumer.

[TestMethod]
public void Ensure_OfferApi_HonoursPact_WithDeal_ForSendingLatestSoftOffer()
{
    // Arrange
    var outputter = new CustomOutputter();
    var config = new PactVerifierConfig();
    config.ReportOutputters.Add(outputter);
    IPactVerifier pactVerifier = new PactVerifier(() => { InsertEventsIntoDatabase(); }, () => { }, config);
    pactVerifier
        .ProviderState(
            "Given the Offer Exist in Offer System I WANT TO See Latest SoftOffer",
            setUp: InsertEventsIntoDatabase); // in case you want to insert something

    // Act / Assert
    using (var client = new HttpClient { BaseAddress = new Uri("http://localhost:9999") })
    {
        pactVerifier
            .ServiceProvider("Offer API", client)
            .HonoursPactWith("Consumer")
            .PactUri(@"C:\TOSS\TestSample\log\deal-offer.json")
            .Verify();
    }

    // Verify that the verification log is also sent to additional reporters defined in the config
    Assert.IsNotNull(outputter.Output);
}
Let's say the setup function is InsertEventsIntoDatabase, and I want to insert whatever events the consumer provides via the pact file, so that I don't need to update this code whenever the consumer changes the input.
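For reference, the providerState string is the field the consumer uses to signal prerequisites. Since the provider test already knows the pact file's path, one option is to parse that file inside the setup function and seed the database from whatever the consumer sent. A hedged sketch, assuming the Pact v2 JSON layout (interactions with providerState and request), Json.NET for parsing, and a hypothetical SeedEvent helper:

using System.IO;
using Newtonsoft.Json.Linq;

private void InsertEventsIntoDatabase()
{
    // Parse the same pact file the verifier replays (path matches PactUri above)
    var pact = JObject.Parse(File.ReadAllText(@"C:\TOSS\TestSample\log\deal-offer.json"));

    foreach (var interaction in pact["interactions"])
    {
        // The consumer signals prerequisites via the providerState string
        var providerState = (string)interaction["providerState"];
        if (providerState == null) continue;

        // Whatever the consumer supplied in the request body is available here,
        // so the data to insert can be read instead of hard-coded
        var body = interaction["request"]?["body"];
        if (body != null)
        {
            SeedEvent(body); // hypothetical helper that inserts the event into the DB
        }
    }
}

This way the setup stays in sync with the consumer's input without code changes on the provider side.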


Getting started querying sync data from Realm

I'm attempting to get a list of items from a MongoDB Atlas instance via Realm into my Xamarin Forms application. I'm pretty sure I've followed the instructions correctly, however I can't seem to see any values come in.
I'm using Realm and Realm.Fody version: 10.2.0
I've uploaded my data with the following script
mongoimport --uri mongodb+srv://[user]:[pass]@cluster0.[wxyz].mongodb.net/[company] --collection products --type json --file products --jsonArray
1057 document(s) imported successfully. 0 document(s) failed to import.
When I go to MongoDB Atlas, I can see the imported documents in the products collection.
In my App.xaml.cs constructor, I create a realm app and log in with anonymous credentials
public const string MasterDataPartitionKey = "master_data";
ctor()
{
    var app = Realms.Sync.App.Create("my-app-id");
    app.LogInAsync(Credentials.Anonymous()).ContinueWith(task => User = task.Result);
}
After this, in my first ViewModel constructor (it's Rx, but it isn't doing anything fancy):
ctor()
{
    _listProductsCommand = ReactiveCommand.Create<Realm, IEnumerable<Product>>(ExecuteLoadProducts);
    _listProductsCommand.ThrownExceptions.Subscribe(x => { /* checking for errors here */ }).DisposeWith(TrashBin);

    Initialize
        .ObserveOn(RxApp.MainThreadScheduler)
        // note: MasterDataPartitionKey is the value I've set on every single record's `_partitionKey` property
        .Select(_ => new SyncConfiguration(MasterDataPartitionKey, App.User))
        .SelectMany(Realm.GetInstanceAsync)
        .Do(_ => { }, ex => { /* checking for errors here */ })
        .ObserveOn(RxApp.MainThreadScheduler)
        .InvokeCommand(this, x => x._listProductsCommand)
        .DisposeWith(TrashBin);
}
And later, a simple query
private static IEnumerable<Product> ExecuteLoadProducts(Realm realm)
{
    var data = realm.All<ProductDto>();
    var productDtos = data.ToList();
    var products = productDtos.Select(x => x.Map());
    return products;
}
Expected:
productDtos (and products) should have a count of 1057
Actual:
productDtos (and products) have a count of 0
I've looked in my Realm logs and I can see the connection and sync activity. My anonymous authentication is turned on, and I've given all of my collections read access (in fact, developer mode is on).
Here's a snippet of the DTO I'm trying to pull down.
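The original snippet didn't survive here, so this is a minimal sketch of what such a DTO could look like, assuming a ProductDto with the `_partitionKey` property mentioned above and a Map() method producing the Product domain model:

using MongoDB.Bson;
using Realms;

public class ProductDto : RealmObject
{
    [PrimaryKey, MapTo("_id")]
    public ObjectId Id { get; set; }

    [MapTo("_partitionKey")]
    public string PartitionKey { get; set; }

    [MapTo("name")]
    public string Name { get; set; }

    // Maps the Realm DTO to the domain model consumed by the view model
    public Product Map() => new Product { Name = Name };
}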
I feel as though I must be missing something simple. Can anyone see anything obvious that could be sending me sideways?

Trying to set Custom-Time on a file using Firebase

I'm trying to set the Custom-Time attribute in Firebase on the front end. Everything else is possible to set, like contentDisposition, custom metadata, etc.; I just can't find any way, or any info, about setting Custom-Time.
You can see it referenced here: https://cloud.google.com/storage/docs/metadata#custom-time
You can set the custom time on the file manually in the Cloud Storage console, but even when you do, and you load the file in Firebase on the front end, it's missing from the returned object! (which makes me feel like it's not possible to achieve this)
var storage = this.$firebase.app().storage("gs://my-files");
var storage2 = storage.ref().child(this.file);

// Tried this
var md = {
    customTime: this.$firebase.firestore.FieldValue.serverTimestamp()
};

// & tried this
var md = {
    'Custom-Time': this.$firebase.firestore.FieldValue.serverTimestamp()
};

storage2.updateMetadata(md).then((metadata) => {
    console.log(metadata);
}).catch((err) => {
    console.log(err);
});
The reason I ask is that I'm trying to push back the lifecycle delete date (which will be based on the custom time) every time the file is loaded. Does anyone know the answer, or an alternative way of doing it?
Thanks in advance
The CustomTime metadata cannot be updated using the Firebase JavaScript SDK, since it is not included in the file metadata properties list in the documentation. So even if you specify it as customTime: or Custom-Time:, the updateMetadata() method does not perform any changes.
As a better practice, I suggest you set the CustomTime metadata from the cloud console and modify the CustomTimeBefore lifecycle condition from the back end each time you load the file, using the addLifecycleRule method of the GCP Node.js client.
// Imports the Google Cloud client library
const {Storage} = require('@google-cloud/storage');

// Creates a client
const storage = new Storage();

// References your Google Cloud Storage bucket
const myBucket = storage.bucket('my_bucket');

//-
// Delete objects that have a customTime before 2021-05-25.
//-
myBucket.addLifecycleRule({
    action: 'delete',
    condition: {
        customTimeBefore: new Date('2021-05-25')
    }
}, function(err, apiResponse) {});
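If you also want to set CustomTime itself from the back end rather than the console, the same Node.js client accepts it through setMetadata. A minimal sketch (the bucket and file names are placeholders, and per the GCS docs the value is an RFC 3339 timestamp that can only be moved forward once set):

const {Storage} = require('@google-cloud/storage');
const storage = new Storage();

// Sets the customTime metadata field on an existing object
storage
    .bucket('my_bucket')
    .file('my-file.txt') // placeholder object name
    .setMetadata({customTime: new Date().toISOString()})
    .then(([metadata]) => console.log(metadata.customTime))
    .catch(console.error);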

Ingest from storage with persistDetails = true does not save the ingest status result

I'm implementing a program to migrate a large amount of data to ADX, based on the "ingest from storage" feature of ADX, and I need to check the status of each ingestion request when the request finishes, but I'm facing an issue.
Based on the MS documentation here, if I set persistDetails = true, for example with the command below, it should save the ingestion status, but currently this setting does not seem to work (with or without it):
.ingest async into table MigrateTable
(
    h'correct blob url link'
)
with (
    jsonMappingReference = 'table_mapping',
    format = 'json',
    persistDetails = true
)
The above command returns an OperationId, and when I use it to check the ingestion status after the ingest task finishes, I always get this error message:
Error An admin command cannot be executed due to an invalid state: State='Operation 'DataIngestPull' does not persist its operation results' clientRequestId: KustoWebV2;
Can someone clarify the root cause of this for me? To me it seems like a bug in ADX.
1. Ingesting data directly against the Data Engine, by running .ingest commands, is usually not recommended, compared to using Queued Ingestion (motivation included in the link). Using Kusto's ingestion client library allows you to track the ingestion status. Some tools/services already do that for you, and you can consider using them directly, e.g. LightIngest or Azure Data Factory.
2. If you don't follow option 1, you can still look up the state/status of your command, using the operation ID you get when using the async keyword, by using .show operations (see the sketch after this list).
3. You can also use the client request ID to filter the result set of .show commands to view the state/status of your command.
4. If you're interested in looking specifically at failures, .show ingestion failures is also available to you.
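For reference, a quick sketch of those lookups in Kusto (the operation ID below is a placeholder):

// Check the state/status of an async command by its operation ID
.show operations 00000000-0000-0000-0000-000000000000

// Inspect recent ingestion failures
.show ingestion failures
| where FailedOn > ago(1h)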
The persistDetails option you specified in your .ingest command actually has no effect - as mentioned in the docs:
Not all control commands persist their results, and those that do usually do so by default on asynchronous executions only (using the async keyword). Please search the documentation for the specific command and check if it does (see, for example data export).
============ Update: sample code following the suggestion from Yoni ============
It turns out another member of my team had messed up the access rights in ADX; after fixing that, everything works fine.
I just have one concern related to PartiallySucceeded that needs clarification from @yoni or someone with better knowledge of it:
try
{
    var ingestProps = new KustoQueuedIngestionProperties(model.DatabaseName, model.IngestTableName)
    {
        ReportLevel = IngestionReportLevel.FailuresAndSuccesses,
        ReportMethod = IngestionReportMethod.Table,
        FlushImmediately = true,
        JSONMappingReference = model.IngestMappingName,
        AdditionalProperties = new Dictionary<string, string>
        {
            { "jsonMappingReference", $"{model.IngestMappingName}" },
            { "format", "json" }
        }
    };

    var sourceId = Guid.NewGuid();
    var clientResult = await IngestClient.IngestFromStorageAsync(model.FileBlobUrl, ingestProps, new StorageSourceOptions
    {
        DeleteSourceOnSuccess = true,
        SourceId = sourceId
    });

    // Poll the status table until the ingestion leaves the Pending state
    var ingestionStatus = clientResult.GetIngestionStatusBySourceId(sourceId);
    while (ingestionStatus.Status == Status.Pending)
    {
        await Task.Delay(WaitingInterval);
        ingestionStatus = clientResult.GetIngestionStatusBySourceId(sourceId);
    }

    if (ingestionStatus.Status == Status.Succeeded)
    {
        return true;
    }

    LogUtils.TraceError(_logger, $"Error when ingesting blob file events, error: {ingestionStatus.ErrorCode.FastGetDescription()}");
    return false;
}
catch (Exception e)
{
    // the exception is swallowed here; the method just reports failure
    return false;
}

How do I add middleware to the Rebus message processing pipeline, before and after sending a message, and before and after handling a message?

I need this to simplify the implementation of the following typical, routine operations:
I would like to capture the user's context before sending the message and restore the user context before the message is handled, similar to how it was done in the following legacy example: https://github.com/rebus-org/RebusSamples/tree/master/old/UserContextHeaders
I would like to validate and deduplicate messages before handling them, and log the results after the message is handled.
As the question author correctly figured out, the Rebus.Events package provides readable and accessible ways of hooking into before/after messages are sent/received.
If that is sufficient, I would definitely go with that.
However, if e.g. you want to WRAP the entire processing of a single message inside a try/finally (which I recommend when you restore the sending user's identity to process a message), you probably want to look at the native extension mechanism, which is based on decorators.
You can read the wiki page about extensibility to learn how to extend Rebus by decorating its pipelines.
For example, to do something with the current claims principal before and after handling a message, you can implement an "incoming pipeline step" like this:
[StepDocumentation("Write a nice description here")]
class MyIncomingStep : IIncomingStep
{
    public async Task Process(IncomingStepContext context, Func<Task> next)
    {
        var originalPrincipal = ClaimsPrincipal.Current;
        try
        {
            // establish user identity here
            ClaimsPrincipal.Current = ...

            // handle message
            await next();
        }
        finally
        {
            ClaimsPrincipal.Current = originalPrincipal;
        }
    }
}
and then you can decorate Rebus' IPipeline with a "step injector", declaratively stating where in the pipeline you want the step to be inserted:
.Options(o => {
    o.Decorate<IPipeline>(c =>
    {
        var pipeline = c.Get<IPipeline>();
        var stepToInject = new MyIncomingStep();

        return new PipelineStepInjector(pipeline)
            .OnReceive(stepToInject, PipelineRelativePosition.Before, typeof(DispatchIncomingMessageStep));
    });
})
and then, to make things pretty, you can wrap the code above inside an extension method for OptionsConfigurer, making for a much prettier configuration syntax:
.Options(o => {
    o.RestoreClaimsPrincipalWhenHandlingMessages();
})
or whatever you think it should be called :)
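Such an extension method is just a thin wrapper around the decoration shown above; a minimal sketch (the class and method names are whatever you choose):

using Rebus.Config;
using Rebus.Pipeline;
using Rebus.Pipeline.Receive;

public static class ClaimsPrincipalOptionsExtensions
{
    // Hides the IPipeline decoration behind a friendlier configuration call
    public static void RestoreClaimsPrincipalWhenHandlingMessages(this OptionsConfigurer configurer)
    {
        configurer.Decorate<IPipeline>(c =>
        {
            var pipeline = c.Get<IPipeline>();
            var stepToInject = new MyIncomingStep();

            return new PipelineStepInjector(pipeline)
                .OnReceive(stepToInject, PipelineRelativePosition.Before, typeof(DispatchIncomingMessageStep));
        });
    }
}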
Everything works in an analogous fashion when sending messages, you just want to
//....
return new PipelineStepInjector(pipeline)
    .OnSend(stepToInject, PipelineRelativePosition.Before, typeof(SerializeOutgoingMessageStep));
instead.

Protractor + CucumberJS MySQL Query

Currently my automation framework uses Protractor with CucumberJS. We use chai-as-promised as an assertion library, and I have recently come across a need to run direct MySQL queries against a database.
How would I structure a step definition so that I can run a query and use the query results within the same step? My current struggles are with the asynchronous way Protractor runs, which causes me to perform the query after the step requiring the query results has already happened, and also with the scope in which to pass the JSON object that is created as a result of the query.
this.loginWithMysqlUser = function(uname) {
    var mysql = require('mysql');
    var connection = mysql.createConnection({
        host     : 'localhost',
        user     : '*******',
        password : '*******',
        database : '*******'
    });

    connection.connect();
    connection.query('SELECT * FROM prot_users WHERE username = ?', [uname], function(err, rows) {
        if (err) throw err;
        mysqlUser = {
            username: rows[0].username,
            password: rows[0].password
        };
    });
    connection.end();

    // Note: the query callback runs asynchronously, so mysqlUser may still be
    // undefined when the line below executes -- this is the problem described above
    loginpage.login(mysqlUser);
};
This function resides on the loginpage declaration.
So typically your cucumber test script would look like:
Feature: As an admin I would like to check if a customer has an account

    Scenario: Check that customer name is registered in DB
        Given that I am logged in as admin
        And I wish to check that customer (foo) is registered
        Then I expect following details from DB query:
            | username | password | database |
            | foo      | bar      | mysql    |
with step definitions for:
Given(/^that I am logged in as admin$/, function(callback){
    // ..
    // logic goes here
    // ..
});

And(/^I wish to check that customer (foo) is registered$/, function(username, callback){
    // create connection with db as described, with supplied username
    // Use a promise to create the mySQL connection and query the DB
    // based on username as described
    // on successful resolution set DBResult to the results
    // for username, password and database
    // on error set DBResult to undefined
});

Then(/^I expect following details from DB query$/, function(data, callback) {
    // extract the values of the input table cells into DBExpect
    var rows = data.hashes();
    var DBExpect = {
        username: rows[0].username,
        password: rows[0].password,
        database: rows[0].database
    };

    // Check equality of the DBResult and DBExpect objects
    expect(DBResult).to.eventually.deep.equal(DBExpect).notify(callback);
});
I ended up containing all of the logic for the login, and the functions that needed to work with the data, within the connection.query callback.
That seemed to work OK, and Protractor was able to be called from within that query function.
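For illustration, a minimal sketch of that approach (a hypothetical step with placeholder credentials; it assumes loginpage.login returns a promise, as Protractor page-object methods typically do):

var mysql = require('mysql');

this.When(/^I log in as database user "([^"]*)"$/, function(uname, callback) {
    var connection = mysql.createConnection({
        host     : 'localhost',   // placeholder credentials
        user     : 'dbuser',
        password : 'dbpass',
        database : 'mydb'
    });

    connection.connect();
    connection.query('SELECT * FROM prot_users WHERE username = ?', [uname], function(err, rows) {
        if (err) { return callback(err); }

        var mysqlUser = {
            username: rows[0].username,
            password: rows[0].password
        };

        // Everything that depends on the query result stays inside this callback,
        // so it cannot run before the data is available
        loginpage.login(mysqlUser).then(function() {
            connection.end();
            callback();
        }, callback);
    });
});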
