Pipeline Code for Querying Aurora Serverless in AWS AppSync

Previously it was easy to write queries in the request and response mapping templates using statements. Now the AppSync resolver is connected to a pipeline where the code has to be written. I am unable to figure out how to write the query inside the pipeline function. Could you please help me out?
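For reference, a pipeline function for an Aurora Serverless data source can be sketched roughly as below, assuming the newer APPSYNC_JS runtime, a MySQL-flavored Aurora cluster behind the RDS Data API, and an illustrative posts table (adjust names to your schema):

```javascript
// Sketch of an AppSync pipeline function for an Aurora (MySQL) RDS data source.
// Table and field names here are illustrative assumptions, not from the question.
import { util } from '@aws-appsync/utils';
import { sql, createMySQLStatement, toJsonObject } from '@aws-appsync/utils/rds';

export function request(ctx) {
  const { id } = ctx.args;
  // Values interpolated through the sql tagged template are sent as bound
  // parameters, not concatenated into the query string.
  return createMySQLStatement(sql`SELECT * FROM posts WHERE id = ${id}`);
}

export function response(ctx) {
  const { error, result } = ctx;
  if (error) {
    return util.appendError(error.message, error.type, result);
  }
  // toJsonObject returns one array of row objects per SQL statement;
  // take the first row of the first (only) statement.
  return toJsonObject(result)[0][0];
}
```

For a Postgres-flavored cluster the `createPgStatement` helper plays the same role as `createMySQLStatement`.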

Related

PACT - Handling provider service state and running actual provider with mocked or actual database

I am new to PACT and trying to use pact-net for contract testing for a .net microservice. I understand the concept of consumer test which generates a pact file.
There is the concept of a provider state middleware which is responsible for making sure that the provider's state matches the Given() condition in the generated pact.
I am a bit confused about the following and how to achieve this:
The provider tests are run against the actual service. So we start the provider service before tests are run. My provider service interacts with a database to store and retrieve records. PACT also mentions that all the dependencies of a service should be stubbed.
So do we run the actual provider API against the actual DB?
If we run the API against the actual DB, how do we inject the data into it? Should we be using the provider API's own endpoints to add the Given() data?
If the above is not the correct approach, then what is?
All the basic blog articles I have come across do not explain this and usually have examples with no provider states or states that are just some text files on the file system.
Help appreciated.
I'm going to add to Matt's comment; you have three options:
Do your provider test with a connected environment, but you will have to do some cleanup manually afterwards and make sure your data is always available in your DB and/or that the external APIs are always up and running. Simple to write but can be very hard to maintain.
You mock your API calls but call the real database.
You mock all your external dependencies: the API and the DB calls.
For 2) or 3) you will have to add test routes and inject the provider state middleware in your provider test fixture. Then you can configure provider states that generate in-memory data (solution 3) or run some data initialization (solution 2).
You can find an example here: https://github.com/pact-foundation/pact-net/tree/master/Samples/EventApi/Provider.Api.Web.Tests
The provider tests are run against the actual service
Do you mean against a live environment, or the actual service running locally alongside the unit test? (The former is not recommended, because of (2) above.)
This is one of the exceptions to that rule. You can choose to use a real DB or an in-memory one - whatever is most convenient. It's common to use Docker and similar tools for testing.
In your case, I'd have a specific test-only set of routes that respond to the provider state handler endpoints, that also have access to the repository code and can manipulate state of the system.
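The test-only provider-state route described above can be sketched as a small ASP.NET Core middleware registered only in the test host. This is a minimal sketch, not pact-net's shipped API; the /provider-states path, the ProviderState DTO, and the in-memory store are illustrative assumptions (the pact-net sample linked above shows the full version):

```csharp
using System;
using System.Collections.Generic;
using System.IO;
using System.Text.Json;
using System.Threading.Tasks;
using Microsoft.AspNetCore.Http;

// Body shape the verifier POSTs: { "state": "...", "params": { ... } }
public class ProviderState
{
    public string State { get; set; }
}

// Illustrative in-memory store the state setups can manipulate.
public static class InMemoryOrders
{
    public static readonly List<int> Ids = new();
}

public class ProviderStateMiddleware
{
    private static readonly JsonSerializerOptions Options =
        new() { PropertyNameCaseInsensitive = true };

    private readonly RequestDelegate _next;
    private readonly IDictionary<string, Action> _stateSetups;

    public ProviderStateMiddleware(RequestDelegate next)
    {
        _next = next;
        _stateSetups = new Dictionary<string, Action>
        {
            // The key must match the Given(...) text in the consumer's pact file.
            ["an order with id 42 exists"] = () => InMemoryOrders.Ids.Add(42),
        };
    }

    public async Task InvokeAsync(HttpContext context)
    {
        if (context.Request.Path.StartsWithSegments("/provider-states"))
        {
            using var reader = new StreamReader(context.Request.Body);
            var body = JsonSerializer.Deserialize<ProviderState>(
                await reader.ReadToEndAsync(), Options);
            if (body?.State != null && _stateSetups.TryGetValue(body.State, out var setup))
            {
                setup();  // seed the data the Given() condition expects
            }
            context.Response.StatusCode = StatusCodes.Status200OK;
            return;
        }
        await _next(context);
    }
}
```

The point of the design is that the real request pipeline stays untouched: only the test fixture wires this middleware in front of the actual service.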

Graphql without a server

I'd like to include an embedded graphql processor in my .net app.
I want to execute GraphQL functions via a method and I don't need the server or an endpoint.
Basically an internal GraphQL processor. I'll define the resolvers and the schema and then run queries via a method call. Is this possible?
You may take a look at NReco.GraphQL (which is based on GraphQL.NET but has additional features, like aggregation, filtering, etc.). You can compose the schema and query (or query object) in code and execute it entirely in-process.
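The underlying GraphQL.NET library already supports this server-less, in-process style on its own. A minimal sketch (the hello field is illustrative, and the exact ExecuteAsync extension lives in the GraphQL.SystemTextJson package in recent versions; details vary between GraphQL.NET versions):

```csharp
using System;
using System.Threading.Tasks;
using GraphQL;
using GraphQL.SystemTextJson;
using GraphQL.Types;

public class Query
{
    // Maps the schema field "hello" to this resolver method.
    [GraphQLMetadata("hello")]
    public string GetHello() => "Hello from in-process GraphQL";
}

public static class Program
{
    public static async Task Main()
    {
        // Build the schema from SDL and bind resolvers from the Query class.
        var schema = Schema.For(@"
            type Query {
                hello: String
            }", _ => _.Types.Include<Query>());

        // Execute the query directly against the schema: no server, no endpoint.
        string json = await schema.ExecuteAsync(_ => _.Query = "{ hello }");
        Console.WriteLine(json);
    }
}
```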

Can you use ServiceStack OrmLite's CaptureSqlFilter while still executing the commands?

Using ServiceStack OrmLite https://github.com/ServiceStack/ServiceStack.OrmLite I want to trace certain database calls with CaptureSqlFilter or some similar technique. However, when you use this filter it captures the "intended" SQL but stops the commands from actually being executed. This appears to be by design.
I want to use this or a similar technique to trace the ACTUAL calls made to the DB without stopping them.
Note that I want to do this in the code, I'm using SQL Azure so can't readily use SQL Profiler etc to achieve a similar result.
Thanks.
If you register a logger with Debug enabled, OrmLite will log the SQL to your registered logging provider.
Otherwise you can also enable ServiceStack's built-in Mini Profiler which will provide access to the executed SQL.
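Both options amount to a couple of lines of configuration. A sketch, following the ServiceStack documentation (the Global.asax hooks apply to the classic ASP.NET hosting model):

```csharp
// Option 1: register a debug-enabled logger before the AppHost initializes;
// OrmLite then logs the SQL it actually executes to this provider.
LogManager.LogFactory = new ConsoleLogFactory(debugEnabled: true);

// Option 2: enable ServiceStack's built-in Mini Profiler, e.g. in Global.asax,
// which records the executed SQL per request.
protected void Application_BeginRequest(object src, EventArgs e)
{
    if (Request.IsLocal)
        Profiler.Start();
}

protected void Application_EndRequest(object src, EventArgs e)
{
    Profiler.Stop();
}
```

Unlike CaptureSqlFilter, neither approach intercepts the commands, so they still run against the database - which is exactly what the question asks for.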

Send POST request to Oracle APEX or PLSQL procedure

I'm trying to send a POST request with parameters to Oracle APEX or a PL/SQL procedure in the database. I'm currently exploring these avenues but to no avail. The PL/SQL method seems to involve using a DAD (Database Access Descriptor) to make a PL/SQL procedure web-enabled - however, this sounds too complicated and not really ideal for the environment I am working in, since it requires editing the HTTP server configuration and so on.
I'm not sure if Oracle APEX can accept a POST request and process it. Does anyone know how either APEX could read a POST request, or how a PL/SQL procedure could be called via a URL with POST request parameters passed to it?
Sounds like you need to create a RESTful web service in APEX, assuming your client is able to make an HTTP request, which is kind of a given.
Have a look at this article. It should give you enough information to be able to create a REST API with the appropriate endpoint and POST handler.
http://www.modernapex.co.uk/building-a-todo-app-with-rest/
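The same kind of POST handler can also be defined directly with the ORDS PL/SQL API that backs APEX RESTful Services. A sketch, where the module, path, and todo_items table are illustrative assumptions:

```sql
-- Define a REST module with a POST handler using ORDS.
-- Parameters from the POST body are bound by name (:title); :status_code is
-- one of ORDS's implicit parameters for setting the HTTP response status.
BEGIN
  ORDS.define_module(
    p_module_name => 'todo',
    p_base_path   => '/todo/');

  ORDS.define_template(
    p_module_name => 'todo',
    p_pattern     => 'items');

  ORDS.define_handler(
    p_module_name => 'todo',
    p_pattern     => 'items',
    p_method      => 'POST',
    p_source_type => ORDS.source_type_plsql,
    p_source      => q'[
      BEGIN
        INSERT INTO todo_items (title) VALUES (:title);
        :status_code := 201;
      END;
    ]');
  COMMIT;
END;
/
```

The client can then POST to .../todo/items without any DAD or HTTP-server changes.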

HTTP triggers for Postgres

I'm trying to write a Postgres trigger such that when a configuration table is updated, a backend component is notified and can handle the change. I know that Oracle has the concept of a web/HTTP trigger, where you can execute an HTTP GET from the Oracle instance itself to a URL that can then handle the request at the application layer. I'm wondering if Postgres (v. 9.0.5) has the same feature, or comes with anything similar (and, subsequently, how to set it up/configure it)?
You could call a Python stored procedure with PL/Python from your trigger and make your HTTP GET request using Python's standard library.
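A sketch of that approach, assuming an illustrative configuration table and backend URL. On 9.0 the language is installed with "createlang plpythonu" (CREATE EXTENSION arrived in 9.1) and uses Python 2, hence urllib2:

```sql
-- PL/Python trigger function that pings the backend over HTTP.
CREATE OR REPLACE FUNCTION notify_backend() RETURNS trigger AS $$
    import urllib2
    try:
        # Keep the timeout short: this call blocks the transaction while it runs.
        urllib2.urlopen('http://backend.example.com/config-changed', timeout=2)
    except Exception:
        pass  # don't abort the UPDATE just because the backend is unreachable
    return None
$$ LANGUAGE plpythonu;

-- Fire once per UPDATE statement on the configuration table.
CREATE TRIGGER config_changed
    AFTER UPDATE ON configuration
    FOR EACH STATEMENT
    EXECUTE PROCEDURE notify_backend();
```

Note the trade-off: the HTTP call runs inside the transaction, so a slow or down backend slows every configuration update unless you keep the timeout tight and swallow errors as above.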
