Set up environment variables imported from Swagger in Paw

I receive an API definition in Swagger format with dynamic variables:
"/tour/{tour_id}/": {
// ...
},
The Swagger importer extension imports these variables, and each looks like an environment variable: Paw generates a random value for it on each request, but it seems unable to create an actual environment variable for it.
I plan to receive the API from backend developers as Swagger export files and don't want to modify the requests, because they will be overwritten with every import. It would be great to import the API as-is from the backend and just edit the environment variables.

The Swagger / OpenAPI specification does not say that parameters sharing the same name are the same parameter, and that makes sense, as the constraints on each parameter could differ depending on the operation being performed. Therefore, Paw cannot merge such parameters into an environment variable.
Swagger does define a root-level field named parameters, which can be used to share parameters between requests, and the Swagger Importer could move these into environment variables, but that's as far as it can go given the Swagger spec.
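For reference, a minimal sketch of what such shared, root-level parameters look like in a Swagger 2.0 document (the tourId definition name is just illustrative):
"parameters": {
    "tourId": {
        "name": "tour_id",
        "in": "path",
        "required": true,
        "type": "string"
    }
},
"paths": {
    "/tour/{tour_id}/": {
        "get": {
            "parameters": [
                { "$ref": "#/parameters/tourId" }
            ]
        }
    }
}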
Moving shared parameters into environment variables would be nice to have, but that feature is not on the roadmap for next month, although it will probably be introduced later on (I created an issue on GitHub to track it).

Related

How to properly mock API results for testing with NextJS and MSW?

First of all, if I should ask this question elsewhere please let me know, as I'm not sure where it belongs.
I have a working application with NextJS and MSW, where I can capture the server-side API requests with MSW and return mocked JSON results.
The issue I have is that there are probably about 15 API calls that I need to mock in order to test my application properly.
While I could call each one of these manually, copy and paste the results into a file, and then just return that data from the file when I capture the API call, this is not a very scalable solution, especially if the back-end API changes.
Question: What are your best methods for automating the generation of these results?
In the past I have created JSON files with all of the URL paths and query parameters explicitly listed; I would parse this file and query every endpoint, then use template files to re-populate my fixtures directory with all of the mocked responses. However, this was also very cumbersome.
(For reference, the API has a somewhat similar structure to this one: https://api.gouv.fr/documentation/api-geo, where there are multiple endpoints for fetching data, and each endpoint supports a number of different query parameters to tweak the call.)

How to access dependency injection container in Symfony 4 without actual injection?

I've got a project written in Symfony 4 (can update to the latest version if needed). In it I have a situation similar to this:
There is a controller which sends requests to an external system. It goes through records in the DB and sends a request for every row. To do that there is a MagicApiConnector class which connects to the external system, and for every request there is an XxxRequest class (like FooRequest, BarRequest, etc.).
So, something like this in general:
foreach ( $allRows as $row ) {
$request = new FooRequest($row['a'], $row['b']);
$connector->send($request);
}
Now in order to do all the parameter filling magic, the requests need to access a service which is defined in Symfony's DI. The controller itself neither knows nor cares about this service, but the requests need it.
How can my request classes access this service? I don't want to set it as a dependency of the controller - I could, but it kinda seems awkward, as the controller really doesn't care about it and would only pass it through. It's an implementation detail of the request, and I feel like it shouldn't burden the users of the request with this boilerplate requirement.
Then again, sometimes you need to make a sacrifice in the name of the greater good, so perhaps this is one of those cases? It feels like I'm "going against the grain" and haven't grasped some ideological concept.
Added: OK, the full gory details, no simplification.
This all is happening in the context of two homebrew systems. Let's call them OldApp and NewApp. Both are APIs and NewApp is calling into the OldApp. The APIs are simple REST/JSON style. OldApp is not built on Symfony (mostly even doesn't use a framework), the NewApp is. My question is about NewApp.
The authentication for OldApp APIs comes in three different flavors and might get more in the future if needed (it's not yet dead!). Different API calls use different authentication methods; sometimes even the same API call can be used with different methods (depending on who is calling it). All these authentication methods are also homebrew. One uses POST fields, another uses custom HTTP headers; I don't remember about the third.
Now, NewApp is being called by an Android app which is distributed to many users. Android app actually uses both NewApp and OldApp. When it calls NewApp it passes along extra HTTP headers with authentication data for OldApp (method 1). Thus NewApp can impersonate the Android app user for OldApp. In addition, NewApp also needs to use a special command of OldApp that users themselves cannot call (a question of privilege). Therefore it uses a different authentication mechanism (method 2) for that command. The parameters for that command are stored in local configuration (environment variables).
Before me, a colleague had created the scheme of an APIConnector and an APICommand, where you get the connector as a dependency and create command instances as needed. The connector actually performs the HTTP request; the commands tell it what POST fields and what headers to send. I wish to keep this scheme.
But now how do the different authentication mechanisms fit into this? Each command should be able to pass what it needs to the connector; and the mechanisms should be reusable for multiple commands. But one needs access to the incoming request, the other needs access to configuration parameters. And neither is instantiated through DI. How to do this elegantly?
This sounds like a job for factories.
function action(MyRequestFactory $requestFactory)
{
    foreach ($allRows as $row) {
        $request = $requestFactory->createFoo($row['a'], $row['b']);
        $connector->send($request);
    }
}
The factory itself is a service, injected into the controller as part of the normal Symfony design. Whatever additional services are needed will be injected into the factory. The factory, in turn, can provide whatever services the individual requests happen to need as it creates them.
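A minimal sketch of what such a factory could look like, assuming a hypothetical ParameterFiller service that the requests need (all class and method names here are illustrative, not from the question):
class MyRequestFactory
{
    private $parameterFiller;

    // Symfony injects the service into the factory; the controller never sees it.
    public function __construct(ParameterFiller $parameterFiller)
    {
        $this->parameterFiller = $parameterFiller;
    }

    public function createFoo($a, $b)
    {
        // The factory hands the service to the request it builds,
        // so the controller only has to call createFoo().
        return new FooRequest($a, $b, $this->parameterFiller);
    }
}
With Symfony 4's default autowiring, registering MyRequestFactory as a service is all the extra wiring the controller needs.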

Using the built-in REST endpoint to invoke a module in a MarkLogic database

I am using the built-in REST endpoint in MarkLogic that allows me to call modules stored in the modules database:
http://localhost:8000/LATEST/invoke?data-urlencode=module=/modules/module.xqy&database=databasename&data-urlencode=vars='{"word1":"hello","word2":"world"}'
Does it also provide any option to directly call a function present within a library module?
The vars option allows us to pass external parameters to the invoked module, but it seems to only allow primitive values for those external variables.
How can we use the vars option to pass XML data to the invoked module, so that it can be accessed through an external variable defined within the module?
Any suggestion would be appreciated.
Note: I am using Postman for testing the REST API.
Many thanks.
Since your goal is to get to a library function, consider creating a REST extension instead of using /invoke with a main module. A REST extension can implement your choice of HTTP verbs and accept input in whatever form you'd like. The extension can then convert those inputs to function parameters and call the function.
For more information about REST extensions, see Extending the REST API, which includes an example XQuery extension.

Adding correlation id to automatically generated telemetry with App Insights

I'm very new to Application Insights, and I'm thinking of using it for a set of services I plan on implementing with ASP.NET Web API. I was able to get the basic telemetry up and running very easily (right-clicking on a project in VS, Add Application Insights), but then I hit a block. I plan to have a correlation id set in the request headers for calls to downstream services, and I would like to tag all the telemetry related to one outside call with the same correlation id.
So far I've found that there is a way to configure a TelemetryInitializer, but if I understood correctly, this is run before I get to access the request, meaning I can't check if there is a correlation id that I should attach.
So I guess there might be two ways to solve this: 1) if I can somehow actually get access to the request headers before the initializer, that would obviously solve the problem, or 2) somehow get hold of the TelemetryClient instance that is used to report the automatically generated telemetry.
Perhaps the last resort would be to turn off all of the automatic stuff and do all of it manually, when I could of course control what properties are set on the TelemetryClient. But this would be quite a lot more work, so I'd prefer to find some other solution.
You were right that you should use a TelemetryInitializer. All TelemetryInitializers are called when the Track method is called on any telemetry item. Auto-generated request telemetry is "tracked" on request OnEnd, so you should have all your custom headers available at that time.
Please also have a look at OperationId - this is part of the standard context managed by App Insights and is used exactly for the purpose of correlating requests with downstream execution. It is created and passed along automatically, including for traces (if you use trackTrace).
Moreover, we have built-in support in our UX for easily seeing all telemetry for a particular operation - it can be found under Search -> Details -> Related Items -> All telemetry for this operation.

Symfony 2 - How can I share data between controllers

I need to be able to make some request data from one controller available in another controller. I can make a service to set the data in one controller, but when the other controller fires and I get the service, a new instance of the service is created. Is there any way I can make this data static and share it between two controllers?
The same basic things you would do whenever you need information to be available in PHP from a new request:
Store it in the session. Symfony2 has a great session component for this. Ideal for fleeting data that only needs to persist while the user is navigating (see the sketch after this list).
Store it in the database. Symfony2 supports Doctrine, which makes this very easy. Ideal for permanent storage.
Optionally:
Store it on the filesystem. Not recommended unless it's actually a file, but possible as well.
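For the session option mentioned above, a minimal sketch of what the two controller actions might look like (the shared_data key and action names are made up):
// In the action handling the first request: stash the data in the session.
public function firstAction(Request $request)
{
    $request->getSession()->set('shared_data', array('foo' => 'bar'));
    // ...
}

// In the action handling a later request: read it back.
public function secondAction(Request $request)
{
    $data = $request->getSession()->get('shared_data');
    // ...
}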
In the end, rather than using the session to store data, I created two separate routes to the same controller action. I added an optional argument in the controller action, with a default value only specified in one of the routes. I can then test for that argument's value when the controller runs. In the Twig template that calls this controller action, the path can be generated using either one of these routes, depending on a variable already available.
Bit of a workaround, but problem solved!
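For illustration, the two-routes trick described above might look roughly like this with annotation routing via SensioFrameworkExtraBundle (route paths, route names and the special argument are invented for the sketch):
/**
 * @Route("/report", name="report_default")
 * @Route("/report/special", name="report_special", defaults={"special" = true})
 */
public function reportAction($special = false)
{
    if ($special) {
        // Behaviour for the second route.
    }
    // ...
}
The Twig template can then generate the URL with path('report_default') or path('report_special'), depending on the variable it already has.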
