Forcing TTLs in Varnish 6.2

I've been looking at the latest version of Varnish (6.2) and am having problems with the removal of return(miss) from vcl_hit.
Our use case is that we want to cache things for a set amount of time, then force Varnish to retrieve new content. In previous versions the following has worked fine:
sub vcl_hit {
    if (obj.ttl >= 0s) {
        return (deliver);
    } else {
        return (miss);
    }
}
However, in 6.2 return(miss) has been removed, and we want to force content to always be refreshed correctly.
I looked at return(pass), but the documentation suggests the response will not be cached, which is not what we want.
return(fetch) has not been an option for some time, and I'm struggling to find an alternative: return(restart), as suggested in the docs, will just loop back to the same place.
Should I be looking elsewhere, e.g. trying to disable grace/saint mode?

vcl_hit is the wrong subroutine for a handful of reasons, the main one being that you are using a complicated way, with side effects, to do something trivial. Just do:
sub vcl_backend_response {
    # set the TTL
    set beresp.ttl = 5m;
    # after the TTL expires, grace kicks in, during which
    # content is revalidated asynchronously
    set beresp.grace = 2h;
    # after grace, keep kicks in, during which
    # content is revalidated synchronously
    set beresp.keep = 3d;
}
Your snippet is equivalent to setting grace and keep to 0s.
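To make that equivalence explicit, the original must-revalidate behaviour can be written entirely in vcl_backend_response; a minimal sketch (the 5m TTL is illustrative, not from the question):

```vcl
sub vcl_backend_response {
    # Serve from cache only while the TTL lasts, never serve stale:
    # zero grace and keep reproduce the old vcl_hit snippet's effect.
    set beresp.ttl = 5m;
    set beresp.grace = 0s;
    set beresp.keep = 0s;
}
```

With grace and keep at 0s, the first request after the TTL expires triggers a synchronous backend fetch, which is exactly what the old return(miss) forced.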

Related

Caching a resource in Varnish-Cache after specific number of requests

Is there any functionality like proxy_cache_min_uses of Nginx in Varnish-Cache that caches a resource after a specific number of requests to that resource?
Here is a similar solution in Varnish-Cache Plus (based on slimhazard's comment on this issue):
import vsthrottle;

sub vcl_recv {
    if (req.url ~ "^/min/use/me" && vsthrottle.is_denied(req.url, 50, 2h, 1h)) {
        # If the URL was requested more than 50 times during the last two hours,
        # then go to cache lookup for the next hour.
        return (hash);
    } else {
        # Otherwise bypass the cache
        return (pass);
    }
}
Is there any similar solution that could be used in Varnish-Cache itself?
Not in Varnish Cache core itself, but you can achieve this with a VMOD, like this counter VMOD.
It lets you increment a counter each time a resource is requested, then check its value and apply whatever caching logic you require.

ASP.NET Session variable expires much faster

I set a session variable at login:
HttpContext.Current.Session["user_key"] = res; // some string, e.g. "asd"
HttpContext.Current.Session.Timeout = 60;
Just in case, I also have:
<system.web>
  <sessionState timeout="60"></sessionState>
</system.web>
Then I need to check for the user and get some data for their ID on pretty much every page, in every Page_Load:
if (HttpContext.Current.Session["user_key"] != null)
{
    sesvar = (string)HttpContext.Current.Session["user_key"];
}
else
{
    HttpContext.Current.Response.Redirect("/login/");
}
This works for the most part, but it is definitely not 60 minutes. I get "kicked" (redirected to login) every now and then and can't figure out why.
The project is worked on and maintained through Dreamweaver. Being a Web Site project, it is not compiled in any way and runs live on an IIS server.
It turned out to be a function in our database, running every hour, which "cleaned" the login table where it shouldn't have.

How do I prevent hotlinking but allow Google in nginx?

I want to setup a reverse proxy for serving images stored in S3.
I don't want to allow access to images if the referrer is not example.com,
but I want to allow multiple crawlers, for example Googlebot, Bingbot etc. (based on User-Agent), to access the images.
I also want to allow my Android app to access the images (based on a custom header, say X-Application: ExampleApp).
How do I configure nginx to do this?
This comes down to using three ifs, which is not going to work due to nginx's if limitations.
What you can do instead is two things: set up map blocks to handle the three tests (setting true/false values), then use Lua to combine the three test values and a single if (or pure Lua) in the location block to allow/deny access.
map $http_referer $usestring1 {
    default 0;
    ~google 1;
}
map $http_user_agent $usestring2 {
    default 0;
    ~google 1;
}
etc....
location / {
    content_by_lua '
        local s = ngx.var.usestring1;
        local t = ngx.var.usestring2;
        if s + t == 2 then return ngx.exit(503); end;
    ';
    etc...........
}
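A fuller sketch of the three-map approach, assuming OpenResty (or nginx with the lua-nginx-module); the variable names, match patterns, and the S3 upstream URL are all illustrative, not a tested configuration:

```nginx
# One map per test; each sets a flag to "1" when its condition holds.
map $http_referer $ref_ok {
    default                  0;
    "~example\.com"          1;   # referrer contains example.com
}
map $http_user_agent $bot_ok {
    default                  0;
    "~*(googlebot|bingbot)"  1;   # known crawlers by User-Agent
}
map $http_x_application $app_ok {
    default                  0;
    "ExampleApp"             1;   # the Android app's custom header
}

server {
    location /images/ {
        access_by_lua_block {
            -- allow the request if any one of the three tests passed
            if ngx.var.ref_ok == "1"
               or ngx.var.bot_ok == "1"
               or ngx.var.app_ok == "1" then
                return
            end
            ngx.exit(ngx.HTTP_FORBIDDEN)
        }
        proxy_pass https://example-bucket.s3.amazonaws.com/;
    }
}
```

Using access_by_lua_block for the combined check keeps the allow/deny decision in one place and avoids chaining ifs in the location block.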

IE response.redirect

I ran into an extremely odd issue with IE today: IE fails every time I try to do a Response.Redirect more than ten times! Of course, the page works fine in Firefox and Chrome. Has anyone else experienced something like this?
Here are some code snippets to make sure I am not doing anything blatantly wrong...
Loop
if ( iDomain < UBound(aDomain) ) then
    Response.Redirect "/home/login/a_logout.asp?site=" & strSite & "&domain=" & iDomain+1 & "&l=" & ilogout & "&s=" & sSid
end if
Array
Dim aDomain(10)
aDomain(0) = ".x.com"
aDomain(1) = "www.x.com"
aDomain(2) = "w1.x.com"
aDomain(3) = "w2.x.com"
aDomain(4) = "x.com"
aDomain(5) = "w3.corporate.x.com"
'aDomain(5) = "w4.x.com"
aDomain(6) = "w5.x.com"
aDomain(7) = "w6.x.com"
'aDomain(8) = ""
'aDomain(9) = "w8.x.com"
aDomain(8) = "w9.x.com"
aDomain(9) = "w10.x.com"
Removed context sensitive data.
Let me know if you need any other info. Thanks!
This is default browser behaviour to prevent a user from being looped back to the same page infinitely.
IE8's limit is 10 requests to the same page; Chrome and Firefox are, I believe, 20.
And no, a different query string doesn't constitute a new page, as I found out myself.
I would highly suggest that you change this. Redirecting multiple times is a pretty bad idea.
Instead, just run whatever code is being run by your a_logout page locally. I'm assuming you're clearing several cookies. Go ahead and resend all of the appropriate cookies with blank data and an expiry time of yesterday.
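A rough sketch of that approach in classic ASP, expiring a cookie for every domain in one response instead of chaining one redirect per domain. The cookie name "session_id" is an assumption; substitute whatever a_logout.asp actually clears:

```asp
<%
' Hypothetical sketch: emit one Set-Cookie header per domain in the array,
' with blank data and a past expiry date, so the browser drops each cookie.
' Response.AddHeader appends headers, so multiple Set-Cookie lines are kept.
Dim i
For i = 0 To UBound(aDomain) - 1
    Response.AddHeader "Set-Cookie", "session_id=; Domain=" & aDomain(i) & _
        "; Path=/; Expires=Thu, 01 Jan 1970 00:00:00 GMT"
Next
%>
```

This works here because all the entries are subdomains of x.com; a single response can only set cookies for domains it is allowed to scope to.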
Redirecting so often is blatantly wrong. The ideal maximum number of redirects is 1. In practice it can be a lot easier to do certain tasks if you allow for more than that, but anywhere more than 5 redirects happen should be considered a bug (more than 1 on the same server or more than 3 that crosses to another server should be considered sub-optimal, but not urgent to fix).
Browsers can't depend upon servers never doing anything blatantly wrong, so after a few goes they give up to save the user from the server. Sometimes user-agents don't protect themselves in this way (not serious browsers, but it's an easy mistake to make writing a simple piece of HTTP client code). It isn't pretty.
To demonstrate just how bad this can be, consider a case where the handler for /somePath/?id=1 redirects to /somePath/?id=2 which redirects to /somePath/?id=3 and so on. For all the server knows, you've just got a more obscure version of that, and will never stop redirecting.

Create a timed cache in Drupal

I am looking for more detailed information on how I can get the following caching behavior in Drupal 7.
I want a block that renders information I'm retrieving from an external service. As the block is rendered for many users, I do not want to continually request data from that service, but instead cache the result. However, this data changes relatively frequently, so I'd like to retrieve the latest data every 5 or 10 minutes, then cache it again.
Does anyone know how to achieve such caching behavior without writing too much of the code oneself? I also haven't found much in terms of good documentation on how to use caching in Drupal (7), so any pointers on that are appreciated as well.
Keep in mind that cache_get() does not actually check if an item is expired or not. So you need to use:
if (($cache = cache_get('your_cache_key')) && $cache->expire >= REQUEST_TIME) {
    return $cache->data;
}
Also make sure to use the REQUEST_TIME constant rather than time() in D7.
The functions cache_set() and cache_get() are what you are looking for. cache_set() has an expire argument.
You can use them basically like this:
<?php
if ($cached_data = cache_get('your_cache_key')) {
    // Return from cache.
    return $cached_data->data;
}
// No or outdated cache entry, refresh data.
$data = _your_module_get_data_from_external_service();
// Save data in cache with 5min expiration time.
cache_set('your_cache_key', $data, 'cache', time() + 60 * 5);
return $data;
?>
Note: You can also use a different cache bin (see documentation links) but you need to create a corresponding cache table yourself as part of your schema.
I think this should be $cache->expire, not $cache->expires. I didn't have luck with the example above when setting REQUEST_TIME + 300 in cache_set(), since $cache->expires (which doesn't exist) always compares as less than REQUEST_TIME. This works for me:
if (($cache = cache_get('your_cache_key', 'cache')) && (REQUEST_TIME < $cache->expire)) {
    return $cache->data;
}
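Putting the thread's pieces together, a consolidated sketch of the helper; the cache key, the fetch function name, and the 5-minute lifetime are illustrative (the fetch function stands in for whatever calls your external service):

```php
<?php
function _your_module_get_cached_data() {
  // Only trust the cache entry while it is still fresh:
  // cache_get() also returns expired items (they are purged on cron),
  // so the expire timestamp must be checked explicitly.
  if (($cache = cache_get('your_cache_key', 'cache')) && REQUEST_TIME < $cache->expire) {
    return $cache->data;
  }
  // No entry, or an expired one: refresh from the external service.
  $data = _your_module_get_data_from_external_service();
  // Cache for 5 minutes, using REQUEST_TIME rather than time() in D7.
  cache_set('your_cache_key', $data, 'cache', REQUEST_TIME + 300);
  return $data;
}
```

Every block render then calls this one helper, so at most one external request is made per 5-minute window.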
