RAdwords Error in 1:ncol(data) : argument of length 0 - r

Using RAdwords, I can connect and get a token, list reports and list metrics. But when I try to pull data, I get the error shown below the code. Thanks for the help!
body <- statement(select = c('KeywordText', 'Clicks', 'Cost', 'Ctr'),
                  report = "KEYWORDS_PERFORMANCE_REPORT",
                  where  = "Clicks > 100",
                  start  = "20150101",
                  end    = "20150301")
data <- getData(clientCustomerId = '949-xxx-xxxx', google_auth = google_auth,
                statement = body, transformation = TRUE)
* upload completely sent off: 130 out of 130 bytes
HTTP/1.1 400 Bad Request
Content-Type: text/xml
Date: Fri, 06 Mar 2015 18:28:30 GMT
Expires: Fri, 06 Mar 2015 18:28:30 GMT
Cache-Control: private, max-age=0
X-Content-Type-Options: nosniff
X-Frame-Options: SAMEORIGIN
X-XSS-Protection: 1; mode=block
Server: GSE
Accept-Ranges: none
Vary: Accept-Encoding
Transfer-Encoding: chunked
Connection #0 to host adwords.google.com left intact
Error in 1:ncol(data) : argument of length 0

Update:
There is an update of the package on my github repository which might solve your problem:
https://github.com/jburkhardt/RAdwords
Could you please reinstall the package from github and see if your bug gets fixed?
Austin, referring to the R output you sent me per mail, your problem is:
> data <- getData(clientCustomerId='xxx-xxx-xxxx', google_auth=google_auth, statement=body)
* Hostname was NOT found in DNS cache
Please make sure that you use the Adwords Account Id. The MCC Id will not work!
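For reference, a minimal sketch of the corrected call, reusing the setup above ('123-456-7890' is a placeholder for the ten-digit AdWords client account ID):
# Sketch only: clientCustomerId must be the AdWords *client* account ID
# (format XXX-XXX-XXXX), not the MCC ID; '123-456-7890' is a placeholder.
data <- getData(clientCustomerId = '123-456-7890',
                google_auth = google_auth,
                statement = body,
                transformation = TRUE)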
This is not an error or bug in the RAdwords package per se. Rather, the problem has to do with the curl settings on your system.
See here for similar problems:
Curl Hostname was NOT found in DNS cache error
https://bbs.archlinux.org/viewtopic.php?id=175433


Varnish returns MISS after successful HIT

My application uses Varnish 3.0.2. I am facing a weird problem here: sometimes a page is served from Varnish with a HIT, but immediately afterwards it returns a MISS.
I was under the impression that once a page is served from the cache, it will continue to be until the TTL expires. Am I wrong in understanding that?
Here are the two response headers for both the scenario:
HIT
HTTP/1.1 200 OK
Server: Apache/2.4.16 (Unix) mod_auth_kerb/5.4 PHP/5.3.29
X-Powered-By: PHP/5.3.29
X-Drupal-Cache: MISS
Content-Language: en
Vary: Accept-Encoding
Content-Encoding: gzip
Content-Type: text/html; charset=utf-8
cache-control: max-age=86400, public
X-Cookie-Debug: Request cookie:
X-Request-URL: /org/31633421?unit=31633421
Content-Length: 11986
Accept-Ranges: bytes
Date: Wed, 24 Apr 2019 14:26:43 GMT
X-Varnish: 330015711 330015651
Via: 1.1 varnish
Connection: keep-alive
X-Varnish-Cache: HIT
X-Varnish-Cache-Hits: 1
X-Varnish-Age: 188
X-Varnish-Leg: 128.87.225.172
X-Varnish-Cache-Version: 3.0.2
MISS
HTTP/1.1 200 OK
Server: Apache/2.4.16 (Unix) mod_auth_kerb/5.4 PHP/5.3.29
X-Powered-By: PHP/5.3.29
X-Drupal-Cache: MISS
Expires: Sun, 19 Nov 1978 05:00:00 GMT
Cache-Control: public, max-age=300
Content-Language: en
Vary: Accept-Encoding
Content-Encoding: gzip
Content-Type: text/html; charset=utf-8
X-Cookie-Debug: Request cookie: _gat_UA-15166137-36=1
X-Request-URL: /org/31633421?unit=31633421
Content-Length: 11978
Accept-Ranges: bytes
Date: Wed, 24 Apr 2019 14:23:52 GMT
X-Varnish: 1900997574
Via: 1.1 varnish
Connection: keep-alive
X-Varnish-Cache: MISS
X-Varnish-Age: 0
X-Varnish-Leg: 128.87.225.158
X-Varnish-Cache-Version: 3.0.2
I have tried increasing the TTL value and removing all the cookies (including Google Analytics), but it still behaves erratically.
Any idea why?
Update
Seems like this is happening because of the following Google Tag Manager JS code included in my view template:
<script>(function(w,d,s,l,i){w[l]=w[l]||[];w[l].push({'gtm.start':
new Date().getTime(),event:'gtm.js'});var f=d.getElementsByTagName(s)[0],
j=d.createElement(s),dl=l!='dataLayer'?'&l='+l:'';j.async=true;j.src=
'https://www.googletagmanager.com/gtm.js?id='+i+dl;f.parentNode.insertBefore(j,f);
})(window,document,'script','dataLayer','GTM-XXX');</script>
Turns out it WAS actually a problem in the VCL configuration with the regex I was using. I had not considered the non-alpha characters of the Google Analytics cookie. Modified the regex to _[_\-\.\=a-zA-Z0-9] and everything is fine again!
Hope this helps somebody.
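For illustration, a minimal Varnish 3 vcl_recv sketch of that idea, built around the character class above (the exact pattern is an assumption and needs adjusting to the cookies your site actually uses):
sub vcl_recv {
    if (req.http.Cookie) {
        # Strip Google Analytics / Tag Manager cookies; their names start with "_"
        # and may contain "-", ".", "=" and alphanumerics (e.g. _gat_UA-15166137-36).
        set req.http.Cookie = regsuball(req.http.Cookie,
            "(^|; ) *_[_\-\.=a-zA-Z0-9]+=[^;]*;?", "\1");
        # If only tracking cookies were sent, drop the header so the request stays cacheable.
        if (req.http.Cookie ~ "^ *$") {
            remove req.http.Cookie;
        }
    }
}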
My guess is that the responses come from two different Varnish servers, based on the two response headers:
X-Varnish-Leg: 128.87.225.172
and
X-Varnish-Leg: 128.87.225.158
+1 for Ronald. Also, please consider upgrading to the latest Varnish 6: years have passed since Varnish 3, many bugs have been fixed and improvements made in the meantime, and furthermore v3 is end-of-life.

Nexus Remote User Token authorization when deploying

I have a question regarding the Nexus RUT (Remote User Token) capability. After setting it up, having the HTTP header read, adding this user name to security.xml and mapping the role to this user, I am able to authorize in the Sonatype Nexus GUI.
The question is: how can I authorize when trying to deploy an artifact to a Nexus repository using, let's say, Maven?
curl -I http://localhost:8080/nexus/service/local/status
returns
HTTP/1.1 401 Unauthorized
Date: Thu, 11 Jun 2015 10:41:59 GMT
Server: Nexus/2.11.3-01
X-Frame-Options: SAMEORIGIN
X-Content-Type-Options: nosniff
WWW-Authenticate: BASIC realm="Sonatype Nexus Repository Manager API"
Content-Length: 0
but
curl -I -H "X-Forwarded-User: admin" http://localhost:8080/nexus/content/
returns
HTTP/1.1 200 OK
Date: Thu, 11 Jun 2015 10:44:35 GMT
Server: Nexus/2.11.3-01
X-Frame-Options: SAMEORIGIN
X-Content-Type-Options: nosniff
Accept-Ranges: bytes
Last-Modified: Thu, 11 Jun 2015 10:44:35 GMT
Content-Length: 0
Using credentials in the Maven project and trying to deploy the site following this tutorial, I am getting the error
Uploading: .//project-summary.html to https://my.site.com/nexus/content/sites/site/
[WARNING] Required credentials not available for BASIC <any realm>#federation-sts.site.com:443
[WARNING] Preemptive authentication requested but no default credentials available
https://federation-sts.site.com/adfs/ls/?SAMLRequest=fZJdT4Mw........... - Status code: 200
Transfer finished. 5671 bytes copied in 0.404 seconds
Transfer error: java.io.IOException: Unable to create collection: https://my.site.com/nexus/; status code = 302
I would really appreciate your help, thanks!
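For reference, when deploying with Maven the BASIC credentials in the warning above are normally taken from a <server> entry in ~/.m2/settings.xml whose id matches the site/repository id declared in the POM's <distributionManagement>. A minimal sketch (the id and credentials are placeholders):
<settings>
  <servers>
    <server>
      <!-- must match the <distributionManagement> id in the pom.xml -->
      <id>nexus-site</id>
      <username>deployment-user</username>
      <password>deployment-password</password>
    </server>
  </servers>
</settings>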

Is the HTTP version (1.0 or 1.1) defined by the web server? How does the HTTP protocol version get decided?

I have a quick question. I've read RFC 2616 section 14.23 about the Host header in advance, but I still don't understand what should be changed in httpd.conf or the configuration file of a web server. Please correct me if I'm wrong.
Look at the following two HTTP GETs I made against an Apache server. The first one is a GET using HTTP/1.0, the other one a GET using HTTP/1.1. See the output:
HTTP/1.0 200 OK
Date: Thu, 24 Oct 2013 03:46:22 GMT
Server: Apache/1.3.41 (Unix) mod_gzip/1.3.26.1a PHP/5.2.9 mod_throttle/3.1.2 mod_psoft_traffic/0.2 mod_ssl/2.8.31 OpenSSL/0.9.8b
Vary: *
Last-Modified: Fri, 10 Aug 2012 20:22:30 GMT
ETag: "17c815b-3b-50256d86"
Accept-Ranges: bytes
Content-Length: 59
Connection: close
Content-Type: text/html
<html>
<body>
<center>webli7</center>
</body>
</html>
HTTP/1.1 400 Bad Request
Date: Thu, 24 Oct 2013 04:04:40 GMT
Server: Apache/1.3.41 (Unix) mod_gzip/1.3.26.1a PHP/5.2.9 mod_throttle/3.1.2 mod_psoft_traffic/0.2 mod_ssl/2.8.31 OpenSSL/0.9.8b
Connection: close
Transfer-Encoding: chunked
Content-Type: text/html; charset=iso-8859-1
16e
The HTTP protocol version is decided dynamically, not through configuration files. The client sends a request specifying the highest protocol version that it supports. The server must then respond with either the version requested by the client, or any earlier version that it prefers.
Since Apache does support HTTP/1.1, it will therefore normally match exactly the version provided by the client.
There is a flag that you can set in Apache's config to force Apache to use HTTP/1.0 in certain situations, even though the browser requested HTTP/1.1. This is used to work around bugs in the HTTP/1.1 handling of some very old browsers. Today, you should not need to play with this flag.
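As a sketch of that mechanism (the BrowserMatch pattern is just an example of the sort of entries that shipped in older default Apache configs):
# mod_setenvif: downgrade the request to HTTP/1.0 and send an HTTP/1.0
# status line for a client known to mishandle HTTP/1.1.
BrowserMatch "RealPlayer 4\.0" downgrade-1.0 force-response-1.0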
As for your error, I would suggest that you make sure that your GET does provide the Host: header. This header is required in HTTP/1.1, yet optional in HTTP/1.0, and omitting it would certainly result in a 400 error.
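A quick way to check this by hand, assuming netcat is available (example.com is a placeholder host):
# HTTP/1.1 requires a Host header; omitting it is a typical cause of the
# 400 Bad Request shown above.
printf 'GET / HTTP/1.1\r\nHost: example.com\r\nConnection: close\r\n\r\n' | nc example.com 80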

Uploading a file to the Google Docs API, getting error 504

I'm working on a Delphi API for Google Docs and having a hard time getting the upload to work. I'm following Google's development guide here, and from what I understand it looks like the process should go like this:
1. Make a POST request to this URL: https://docs.google.com/feeds/upload/create-session/default/private/full/?access_token=my_access_token&v=3&convert=false with these headers: X-Upload-Content-Type and X-Upload-Content-Length
2. Get a 200 OK response with the next upload location stored in the Location header
3. Make a PUT request to the Location header URL, with the Content-Type header set to whatever I had X-Upload-Content-Type set to in step 1, the Content-Range header set to something like bytes 0-524287/2097152, and the first 512 KB of data in the body
4. Get a 308 Resume Incomplete response that has the next upload location in the Location header
5. Go back to step 3 until all bytes are uploaded, at which point I will receive a 201 Created response with the XML data describing the file I uploaded
Everything up to and including step 3 works fine. It is at step 4 that things start to go wrong.
The one thing that confuses me the most is that the response in step 4 doesn't contain a Location header. I figured that meant I should just send the next request to the same URL, but that causes me to get a 504 error. I tried the entire process with Fiddler just to see if it was the Delphi code, a lack of understanding on my part, or something that Google is doing.
Here's the requests and responses I sent and received using fiddler:
POST https://docs.google.com/feeds/upload/create-session/default/private/full/?access_token=my_access_token&v=3&convert=false HTTP/1.1
Content-Type: application/x-www-form-urlencoded
X-Upload-Content-Type: application/octet-stream
X-Upload-Content-Length: 2097152
Content-Length: 0
Host: docs.google.com
HTTP/1.1 200 OK
Server: HTTP Upload Server Built on May 16 2012 12:03:24 (1337195004)
Location: https://docs.google.com/feeds/upload/create-session/default/private/full/?access_token=my_access_token&v=3&convert=false&upload_id=AEnB2Ur9-9VxMSI6kaFzbybY2qiyzK6kVoKzcZ6Yo02H8Ni4FlQFl_N06DdjZXzp3vSjOPH3CEb_4vDlKZp7VlC0hxpkypzlKg
Date: Tue, 22 May 2012 16:53:27 GMT
Pragma: no-cache
Expires: Fri, 01 Jan 1990 00:00:00 GMT
Cache-Control: no-cache, no-store, must-revalidate
Content-Length: 0
Content-Type: text/html
PUT https://docs.google.com/feeds/upload/create-session/default/private/full/?access_token=my_access_token&v=3&convert=false&upload_id=AEnB2Ur9-9VxMSI6kaFzbybY2qiyzK6kVoKzcZ6Yo02H8Ni4FlQFl_N06DdjZXzp3vSjOPH3CEb_4vDlKZp7VlC0hxpkypzlKg HTTP/1.1
Content-Type: application/octet-stream
Content-Length: 524288
Content-Range: bytes 0-524287/2097152
Host: docs.google.com
[first 512kb of data here]
HTTP/1.1 308 Resume Incomplete
Server: HTTP Upload Server Built on May 16 2012 12:03:24 (1337195004)
Range: bytes=0-524287
X-Range-MD5: bd9d4ee7afa24b7da0e685f05b5f1f44
Date: Tue, 22 May 2012 16:54:29 GMT
Pragma: no-cache
Expires: Fri, 01 Jan 1990 00:00:00 GMT
Cache-Control: no-cache, no-store, must-revalidate
Content-Length: 0
Content-Type: text/html
PUT https://docs.google.com/feeds/upload/create-session/default/private/full/?access_token=my_access_token&v=3&convert=false&upload_id=AEnB2Ur9-9VxMSI6kaFzbybY2qiyzK6kVoKzcZ6Yo02H8Ni4FlQFl_N06DdjZXzp3vSjOPH3CEb_4vDlKZp7VlC0hxpkypzlKg HTTP/1.1
Content-Type: application/octet-stream
Content-Length: 524288
Content-Range: bytes 524288-1048575/2097152
Host: docs.google.com
[next 512kb of data]
HTTP/1.1 504 Fiddler - Send Failure
Content-Type: text/html; charset=UTF-8
Connection: close
Timestamp: 10:54:14.056
The only thing I was able to establish is that it is not just the Delphi code that is wrong, and since I don't think it's Google, I'm going to have to conclude that I don't understand something that should be happening. What am I missing?
Edit
I was able to get the upload working. I'm not entirely sure what I did differently, but the documentation is a little misleading, at least to me. When you send a PUT request, you don't get a new location; you just continue to upload to the same one. Also, when you finish the upload, the 201 response doesn't contain the actual XML data; instead, it has a Location header that points to where you can grab the XML data from. Not a huge deal, but a little confusing.
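In other words, each chunk is PUT to the same upload session URL until the final 201. A rough curl sketch of one iteration (the URL, offsets and chunk file name are placeholders):
# Second 512 KB chunk of a 2 MB file, sent to the same session URL as before;
# curl sets Content-Length from the chunk file automatically.
curl -X PUT "$UPLOAD_SESSION_URL" \
     -H "Content-Type: application/octet-stream" \
     -H "Content-Range: bytes 524288-1048575/2097152" \
     --data-binary @chunk2.bin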
It seems like the 504 error is returned by Fiddler itself; these two links should help:
https://urda.com/blog/2010/09/28/iis-services-504s-and-fiddler/
https://urda.com/blog/2010/09/30/follow-up-iis-services-504s-and-fiddler/

Chrome MULTIPLE_CONTENT_LENGTH error

If I access my page directly, I get:
$ wget http://localhost:8010/ --save-headers -O -
--2010-10-29 18:30:24-- http://localhost:8010/
Resolving localhost (localhost)... 127.0.0.1
Connecting to localhost (localhost)|127.0.0.1|:8010... connected.
HTTP request sent, awaiting response... 200 OK
Length: 950 [text/html]
Saving to: `STDOUT'
HTTP/1.1 200 OK
Server: gunicorn/0.11.1
Date: Fri, 29 Oct 2010 16:30:24 GMT
Connection: keep-alive
Vary: Accept-Language, Cookie, Accept-Encoding
Content-Length: 950
Content-Type: text/html; charset=utf-8
Content-Language: en-us
If I access it via the cache:
$ wget http://localhost:8000/ --save-headers -O -
--2010-10-29 18:30:31-- http://localhost:8000/
Resolving localhost (localhost)... 127.0.0.1
Connecting to localhost (localhost)|127.0.0.1|:8000... connected.
HTTP request sent, awaiting response... 200 OK
Length: 950 [text/html]
Saving to: `STDOUT'
HTTP/1.1 200 OK
Server: gunicorn/0.11.1
Vary: Accept-Language, Cookie, Accept-Encoding
Content-Type: text/html; charset=utf-8
Content-Language: en-us
Content-Length: 950
Date: Fri, 29 Oct 2010 16:30:31 GMT
X-Varnish: 818233557
Age: 0
Via: 1.1 varnish
Connection: keep-alive
When I open the latter in Chromium (8.0.552.18 (0)), I get this error:
Error 346 (net::ERR_RESPONSE_HEADERS_MULTIPLE_CONTENT_LENGTH): Unknown error.
I only see three extra headers; which one should I remove to make it display in Chrome?
EDIT: I eventually got rid of this problem, but I can't remember how, and I don't have access to that system anymore. I'm starting a bounty; maybe somebody will explain to me what was going on here.
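For anyone debugging the same thing, a quick way to compare the two header sets and spot duplicated headers (ports as in the question):
# -s: quiet, -D -: dump response headers to stdout, -o /dev/null: discard body
curl -sD - -o /dev/null http://localhost:8010/   # directly from gunicorn
curl -sD - -o /dev/null http://localhost:8000/   # through the Varnish cache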
Check out this version of the Chromium source. It looks like if you do not specify Transfer-Encoding and you include multiple Content-Length headers, it will throw this very error. Later revisions added a check that the content length values must actually be different for the error to be thrown; it seems it was added as a security precaution.
You would probably never have seen this error with a newer version of Chromium.
You might try disabling the DNS prefetching in the Chromium settings. Go to Preferences > Under the Hood and un-check "Use DNS pre-fetching to improve page load times".
