Setting a timeout for a JavaScript-based service callout - Apigee

We have a callout to another API within a JavaScript policy:
// 'url' and 'data' are defined elsewhere in the script
var headers = {
  'key': context.getVariable("request.header.key")
};
var myRequest = new Request(url, "POST", headers, data);
var exchange = httpClient.send(myRequest);
exchange.waitForComplete();
var calloutResponse = exchange.getResponse();
context.setVariable("calloutstatus", calloutResponse.status.code);
context.setVariable("calloutresponse", calloutResponse.content);
Sometimes this callout takes an exceptionally long time, and we would like to be able to set a timeout limit for it (as one can for a target endpoint) and have calloutResponse.status.code come back as a 503.
Is there a value that can be set on either httpClient or Request to do this? I have looked through the Apigee documentation as well as here and can't find anything.

In the XML policy that calls the script, there's a timeLimit property that defaults to 200 ms. You can change this at will.
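For reference, this is roughly what that looks like on the policy element itself; the policy name, resource name and the 5000 ms value below are placeholders, not taken from the question:
<!-- Sketch: raising the JavaScript policy's execution limit to 5 seconds -->
<Javascript name="JS-ServiceCallout" timeLimit="5000">
  <ResourceURL>jsc://callout.js</ResourceURL>
</Javascript>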

OData without HTTP in .NET Core

I would like to use OData without HTTP.
Scenario: a "Load Balancer" OData-capable service receives requests from clients and puts them into a queue (in serialized form); "background" workers then pick messages up from the queue and process each request (get data from data storage and provide a response).
It seems that this functionality either is not exposed in the MS OData libraries, or is so simple and obvious (though not for me) that nobody bothers to highlight it in the docs.
I picture it something like this (pseudo-code):
var model = GetEdmModel();
var processor = GetProcessor(); // something like ODataController in AspNet.oData - contains functions and whatever
var request = GetRequestFromQueueAndParse();
var uriPath = request.Path; // like "/Books"
var queryString = request.QueryString; // like "$filter=price lt 50"
var method = request.Method; // GET or POST or ...
var body = request.Body;
// *** here is what I am looking for ***
var response = SomeMagicODataHelper.ProcessQuery(model, processor, method, uriPath, queryString, body);
ProvideResponseBackToBalancer(response);
Is there something like this provided by the standard MS OData library (preferably), or by a third-party library?
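For the parsing half of that magic helper, one possible starting point is the ODataUriParser from the Microsoft.OData.Core package, which works against an IEdmModel with no HTTP pipeline involved. This is only a sketch under assumptions: the service root URI is made up, 'model' is the IEdmModel from GetEdmModel(), and executing the parsed query against a data store is not shown.
// Sketch: parse the queued request's path and $filter without any HTTP context.
// Requires the Microsoft.OData.Core package.
using Microsoft.OData.UriParser;

var serviceRoot = new Uri("https://placeholder/odata/");             // assumed base address
var fullUri = new Uri(serviceRoot, "Books?$filter=price%20lt%2050"); // path + query taken from the queue

var parser = new ODataUriParser(model, serviceRoot, fullUri);
ODataPath path = parser.ParsePath();        // resolves /Books against the model
FilterClause filter = parser.ParseFilter(); // the parsed $filter expression tree
Turning that parsed tree into a query against the data store (and serializing the response payload) is the part the ASP.NET layer normally does for you, so it would still need to be wired up by hand or via a third-party library.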

GA Enhanced Ecommerce Missing Purchase Events

We are sending the purchase events from the server with code like this:
using (var httpClient = new RestClient())
{
    httpClient.SendAsync(new HttpRequestMessage
    {
        RequestUri = new Uri(url),
        Method = HttpMethod.Get
    });
}
But around 15-20% of the events never get registered in GA.
Google always seems to respond with a GIF and status code 200, so it is hard to tell which events were not processed successfully.
In the beginning we were using the JavaScript API to send the event, but when we switched to server side, we copied the request it was creating and tried to replicate it with HttpClient.
The request sent looks like the following:
https://www.google-analytics.com/collect?v=1&_v=j47&a=817546713&t=event&ni=0&_s=1&
dl=#scheme + host + pathAndQuery#&dp=#path#&dt=#path#&ul=#browser language#&de=#browser encoding#&sd=#bit#&sr=#screen resolution#&vp=#viewable browser area#&cid=#Id taken from the _ga cookie#&je=0&fl=24.0%20r0&ec=Ecommerce&ea=purchase&_u=SCEAAAALI20%25~&jid=&tid=#TrackingId#&gtm=#TagManagerId#&ti=#OrderId#&ta=&
tr=#TotalPrice#&tt=#TotalTax#&ts=#ShippingPrice#&tcc=#VoucherCode#&pa=purchase&cu=#CurrencyCode#&pr1nm=#ProductName#&pr1id=#ProductId#&pr1pr=#ProductPrice#&pr1br=#Brand#&pr1ca=&pr1va=#Variant#&pr1qt=#Quantity#&z=#Randomly generated unique id#
Any ideas about what is wrong or how to debug it are welcome.
You shouldn't do that on the backend; the correct way is to do it on the frontend.
The easiest and correct way is to send the data to your dataLayer and then, in GTM, send an event to GA.
P.S. In your C# code I can see one problem: you are not awaiting the async method. If your method is not async, you can use it like this:
var temp = httpClient.SendAsync(new HttpRequestMessage
{
    RequestUri = new Uri(url),
    Method = HttpMethod.Get
}).Result;
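If the calling code can be made async, a minimal sketch of the awaited version looks like the following (the method and field names here are placeholders, not the poster's code):
// Sketch: reuse one HttpClient and actually await the hit before the caller returns.
// Requires System.Net.Http and System.Threading.Tasks.
private static readonly HttpClient GaClient = new HttpClient();

public static async Task SendHitAsync(string url)
{
    using (var response = await GaClient.SendAsync(new HttpRequestMessage(HttpMethod.Get, url)))
    {
        // /collect answers 200 even for malformed hits, so this only confirms
        // the request reached Google, not that the payload was valid.
        response.EnsureSuccessStatusCode();
    }
}
For tracking down the missing 15-20%, the Measurement Protocol also documents a hit-validation endpoint, https://www.google-analytics.com/debug/collect, which returns a JSON description of whether a hit would be accepted instead of the opaque GIF.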

How to create a persistent header in MVC Web API

I am working on an AngularJS project right now, and on user login I want to set my custom header value with the UserID and name and have it be persistent. Is it possible to persist a header even after a browser restart?
One of my last tries was this code:
var tokenIdentity = new AuthCacheManager().Authenticate(loginName, password);
// HttpContext.Current.Response.AppendHeader("Last-Update", "AuthToken");
response = Request.CreateResponse(HttpStatusCode.OK, tokenIdentity);
response.Content = new StringContent("asdasd", Encoding.GetEncoding("ISO-8859-1"), "text/xml");
response.Headers.Add("AuthToken", tokenIdentity.ToString());
response.Content.Headers.Expires = DateTimeOffset.Now.AddMinutes(10.0);
response.Content.Headers.Add("Content-Length", "{ab:a}");
response.Headers.Add("Connection", "Keep-Alive");
response.Headers.Add("Keep-Alive", "timeout = 20000 max = 100");
return response;
But it didn't work at all.
Does anyone know how to make a header persistent, since what I am doing now only lives for one request?
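Worth noting: HTTP headers only exist on the individual message they travel with, so nothing server-side can make one persist; the client has to store the value (in a cookie, localStorage, etc.) and re-send it with every request, while the server reads it back each time. A rough sketch of the server-side half, assuming the header keeps the AuthToken name used above (the handler class and its registration are illustrative, not from the question):
// Sketch: a Web API message handler that re-reads the custom header on every request.
// Requires System.Net.Http, System.Threading, System.Threading.Tasks and System.Linq.
public class AuthTokenHandler : DelegatingHandler
{
    protected override async Task<HttpResponseMessage> SendAsync(
        HttpRequestMessage request, CancellationToken cancellationToken)
    {
        if (request.Headers.TryGetValues("AuthToken", out var values))
        {
            var token = values.First();
            // look the token up / attach the user identity here
        }
        return await base.SendAsync(request, cancellationToken);
    }
}

// Registration in WebApiConfig.Register:
// config.MessageHandlers.Add(new AuthTokenHandler());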

Flex caching problem

I am querying the server through Flex. The first time it shows the result, but when I insert a new record and query again it still shows only the previous results (the problem occurs in IE but not in Chrome).
You can parameterize your HTTP(?) request, and by setting an always-changing parameter you can make sure that the response never gets read from cache.
In the examples below I use a parameter named nocache for this task.
You can set the nocache parameter in your URL string:
var url:String = "http://data.your.server?nocache=" + new Date().getTime();
Or, if you use a URLRequest, you can set it inside its data member:
//the url from where you get the data
var url:String = "http://data.your.server";
var urlVars:URLVariables = new URLVariables();
urlVars.nocache = new Date().getTime();
//set the other parameters (if any)
//attach the parameter list to your request
var request:URLRequest = new URLRequest(url);
request.data = urlVars;
Update
Here new Date().getTime() returns the system's current time in milliseconds, so you can be sure the request will never be made with the same parameter value twice.
That's because IE caches your request.
Add a random query string parameter to the remote URL you use, like http://myserver.com/fetch_data?random=4234324
(And by random I don't mean use 4234324 all the time; use ActionScript to generate a random number and append it to the URL.)
See this KB article from Adobe.

RIA Services, Forms Authentication and extra cookies

I have a Silverlight 4 RIA Services application with custom Forms Authentication. The custom authentication service works like a charm.
The problem is that I want to serialize the user object into a cookie which is then sent with each subsequent request.
I create the cookie and add it to the response cookie collection, but on the next request the only cookies in the cookie collection are ASPXAUTH and ASPX_SESSIONId; of the custom cookie there is not a trace.
This is the cookie management class:
public class CookieManager : ISessionManager
{
    public object this[string key]
    {
        get
        {
            var context = getCurrentContext();
            var cookie = context.Request.Cookies[key];
            if (cookie == null) return null;
            return deserialize(cookie.Value);
        }
        set
        {
            var context = getCurrentContext();
            string cookieValue = serialize(value);
            HttpCookie cookie = new HttpCookie(key, cookieValue);
            cookie.Expires = DateTime.Now.AddDays(10000);
            cookie.HttpOnly = false;
            context.Response.Cookies.Remove(key);
            context.Response.Cookies.Add(cookie);
        }
    }

    public void Abandon()
    {
        var context = getCurrentContext();
        context.Response.Cookies.Clear();
    }

    public void Clear()
    {
        Abandon();
    }

    private HttpContext getCurrentContext()
    {
        return HttpContext.Current;
    }

    private string serialize(object value)
    {
        MemoryStream stream = new MemoryStream();
        BinaryFormatter formatter = new BinaryFormatter();
        formatter.Context = new StreamingContext(StreamingContextStates.Clone);
        formatter.Serialize(stream, value);
        StreamReader reader = new StreamReader(stream);
        stream.Position = 0;
        string result = reader.ReadToEnd();
        reader.Close();
        stream.Close();
        return HttpUtility.UrlEncodeUnicode(result);
    }

    public object deserialize(string value)
    {
        value = HttpUtility.UrlDecode(value);
        MemoryStream stream = new MemoryStream();
        StreamWriter writer = new StreamWriter(stream);
        writer.Write(value);
        BinaryFormatter formatter = new BinaryFormatter();
        return formatter.Deserialize(stream);
    }
}
It reads and saves cookies.
Now my problem is this:
What do I need to enable in Silverlight or in the ASP.NET (WCF) application for extra cookies to be sent with each request alongside the authentication cookie?
EDIT:
I've inspected the HTTP request/response stack, and those extra cookies are sent from the server with the WCF RIA Services response but are not returned by the next service call from the client.
If I understand your edit above correctly, you've already inspected the HTTP requests and found the desired cookie present in the HTTP Set-Cookie header of the response, but missing in the Cookie header of the next request. Is this correct? If not, please clarify.
If so, the problem sounds like one of three things:
1. The client is not successfully saving the cookie, due to many possible reasons, including:
   - the cookie is not properly formatted (unlikely)
   - the cookie is too long
   - a client- or server-side policy (e.g. P3P) is preventing persistent cookies from being saved.
2. The client is saving the cookie OK, but is not sending it back, even without Silverlight. This could be caused by, for example, a security issue where the hostname of the first request differs from that of the second.
3. The client is saving the cookie and can send it back over regular HTML pages, but not via HTTP requests sent by Silverlight.
To see if #1 is the problem, look (using your browser's ability to view cookies) at the cookies saved by your browser for that site. Is the expected cookie saved? If it is, then you can eliminate #1 as the problem. If it's not saved, start looking into why the browser is not accepting it.
To see if #2 is the problem, try creating a server-side page with no Silverlight on it, just a simple HTML page. When you visit that page with your browser, is the cookie sent as expected? If yes, then #2 is not your problem.
If #1 and #2 are not the problem, that leaves #3. Silverlight's HTTP handling is complicated, not least because you have to choose between having HTTP client requests handled by the browser or by Silverlight. Read the Silverlight cookies documentation carefully and see if any of the info therein will help you figure out the problem. Consider trying to use the "Client HTTP" setting, or if you're already using this, consider switching back to the "browser HTTP" setting and see if your problem goes away. Note that the Client HTTP setting apparently has a problem with losing new cookies after an HTTP redirect. See this thread for more info. There's a workaround discussed in that thread: using CookieContainer.
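If you do end up on the client HTTP stack and trying that CookieContainer workaround, it looks roughly like the sketch below. This is a sketch only, not RIA Services-specific wiring; the class name and the idea of one shared container are assumptions.
// Sketch: route requests through Silverlight's client HTTP stack and share one CookieContainer.
using System;
using System.Net;
using System.Net.Browser;

public static class ClientHttpSetup
{
    // One shared container so cookies set by one response are re-sent on later requests.
    public static readonly CookieContainer Cookies = new CookieContainer();

    public static void Register()
    {
        // Route http/https traffic through the client stack instead of the browser stack.
        WebRequest.RegisterPrefix("http://", WebRequestCreator.ClientHttp);
        WebRequest.RegisterPrefix("https://", WebRequestCreator.ClientHttp);
    }

    public static HttpWebRequest CreateRequest(Uri uri)
    {
        var request = (HttpWebRequest)WebRequest.Create(uri);
        request.CookieContainer = Cookies; // only available on the client HTTP stack
        return request;
    }
}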
BTW, could you edit your question to include all the HTTP headers of the request and the subsequent request? This may help diagnosis.
