Out of nowhere I'm getting a 404 error when the browser requests jquery.min.map.
The funny thing is that I never added this file to my solution.
Can anyone explain how to get rid of this error?
I have no idea where this file is being referenced, since I never added a reference to it.
Request URL:http://localhost:22773/Scripts/jquery.min.map
Request Method:GET
Status Code:404 Not Found
Request Headers
Accept:*/*
Accept-Encoding:gzip,deflate,sdch
Accept-Language:en-US,en;q=0.8
Connection:keep-alive
Host:localhost:22773
Referer:http://localhost:22773/Manager/ControlPanel.aspx
User-Agent:Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/29.0.1547.62 Safari/537.36
Response Headers
Cache-Control:private
Content-Length:4992
Content-Type:text/html; charset=utf-8
Date:Tue, 10 Sep 2013 17:37:40 GMT
Server:Microsoft-IIS/8.0
X-Powered-By:ASP.NET
X-SourceFiles:=?UTF-8?B?YzpcdXNlcnNcYWRtaW5pc3RyYXRvclxkb2N1bWVudHNcdmlzdWFsIHN0dWRpbyAyMDEyXFByb2plY3RzXEFsdW1DbG91ZFxBbHVtQ2xvdWRcU2NyaXB0c1xqcXVlcnkubWluLm1hcA==?=
Source maps are like favicons: something browsers will load on their own in certain circumstances.
Typically, JavaScript is minified on production servers, which makes debugging it there difficult.
A source map lets the browser map minified JavaScript back to its original source. It's up to developers to include one on their websites or not.
In Chrome, you have to activate this functionality for the browser to attempt to download the original, non-minified version of a minified script. Client-side debugging then becomes easier.
Basically, you can't get rid of this error other than by providing the source map.
Anyways, see: http://www.html5rocks.com/en/tutorials/developertools/sourcemaps/
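For reference, the browser only requests the map because the minified file ends with a special comment pointing at it. jQuery builds of that era shipped with a line like this at the very bottom (the exact comment syntax, //@ versus //#, varies by version):
//@ sourceMappingURL=jquery.min.map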
I just fixed this in my own app.
Files that you copy from a CDN often have the source-map reference line at the bottom. For axios.min.js, it's
//# sourceMappingURL=axios.min.map
Just remove that line and you won't get that error. Better still, use the version they provide for local loading.
I came across this when developing something without reliable internet access, so I needed the local version. Removing that line solved the problem.
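For completeness, pointing the page at the local copy is just a plain script tag (the path here is illustrative):
<script src="/scripts/axios.min.js"></script>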
I'm trying to get JSON from a web API into R, from the following website:
http://wbes.srldc.in/Report/GetCurrentDayFullScheduleMaxRev?regionid=4&ScheduleDate=31-05-2020
When I try it in a browser, I get a proper response.
However, when I try it in R using any method, the fetched data is a different HTML page without any JSON in it.
I'm just starting with R, without any programming background. Please help.
I feel your pain; it seems the website you're trying to access checks the User-Agent header to deter scraping. I'd set a common user agent in httr (Chrome or Firefox is fine) and perform a single GET request. This should get you through to scraping the MaxRevision value:
library(httr)
# The endpoint that works in a browser but not with R's default user agent
url <- "http://wbes.srldc.in/Report/GetCurrentDayFullScheduleMaxRev?regionid=4&ScheduleDate=31-05-2020"
# Impersonate a regular desktop browser
ua <- "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/74.0.3729.169 Safari/537.36"
res <- GET(url, user_agent(ua))
content(res)
#> $MaxRevision
#> [1] 216
Created on 2020-06-01 by the reprex package (v0.3.0)
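If you want the full payload as an R object rather than httr's parsed list, one option is to parse the response text yourself with jsonlite (a sketch, assuming the jsonlite package is installed):
library(jsonlite)
# Grab the raw response body as text, then let jsonlite build the structure
txt <- content(res, as = "text", encoding = "UTF-8")
data <- fromJSON(txt)
str(data, max.level = 1)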
I couldn't find any function in Qt 5 to determine which Chromium version is used by QtWebEngine.
I don't want to hard-code the Chromium version in my code, because I update my application frequently and the Chromium version usually changes with each Qt release. Also, Qt is backward-compatible, so it's possible to update Qt without updating my application.
There is no direct API, but looking at the source code you can see that the Chromium version is used to build the default user agent:
std::string ContentBrowserClientQt::getUserAgent()
{
// Mention the Chromium version we're based on to get passed stupid UA-string-based feature detection (several WebRTC demos need this)
return content::BuildUserAgentFromProduct("QtWebEngine/" QTWEBENGINECORE_VERSION_STR " Chrome/" CHROMIUM_VERSION);
}
So it can be extracted from that data:
QString version;
QString user_agent = QWebEngineProfile::defaultProfile()->httpUserAgent();
// The default UA looks like "... QtWebEngine/5.14.2 Chrome/77.0.3865.129 Safari/537.36",
// so find the token that starts with "Chrome/" and keep what follows it
for (const QString &text : user_agent.split(" ")) {
    if (text.startsWith(QStringLiteral("Chrome/"))) {
        version = text.mid(QStringLiteral("Chrome/").length());
    }
}
qDebug().noquote() << "Qt version:" << QT_VERSION_STR << "chromium version:" << version;
Output:
Qt version: 5.14.2 chromium version: 77.0.3865.129
You can use the runJavaScript method to show a dialog box with the navigator.userAgent JavaScript variable, which includes the Chromium version:
QWebEngineView webView;
webView.page()->runJavaScript("alert(navigator.userAgent)");
In my case, the alert box says the following:
Mozilla/5.0 (Windows NT 6.2; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) QtWebEngine/5.15.1 Chrome/80.0.3987.163 Safari/537.36
The interesting part for finding the Chromium version is Chrome/80.0.3987.163, which means that the Chromium version the Qt Web Engine is using is 80.0.3987.163.
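If you'd rather have the string in C++ than in an alert box, runJavaScript also accepts a result callback; here's a minimal sketch (the token parsing is mine, not part of the Qt API):
// Ask the page for navigator.userAgent and receive it asynchronously
webView.page()->runJavaScript(
    QStringLiteral("navigator.userAgent"),
    [](const QVariant &result) {
        const QString ua = result.toString();
        // Pick out what follows "Chrome/", e.g. "80.0.3987.163"
        for (const QString &token : ua.split(' ')) {
            if (token.startsWith(QLatin1String("Chrome/")))
                qDebug() << "Chromium version:" << token.mid(7);
        }
    });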
If you run Qt with the devtools enabled, you can get the Chrome version inside the devtools like this:
navigator.appVersion.match(/.*Chrome\/([0-9\.]+)/)[1]
This solution comes from this Stack Overflow answer.
I found this wonderful code on GitHub (https://github.com/rpodcast/nhl_analysis/blob/master/web-scraping/hockey-reference-boxscore-scratch.R). As I am new to R and more familiar with Matlab, my goal was just to use the code to get the data I want. I copied the code from the GitHub repo and imported every package I could.
After executing the code in RStudio, I get this problem:
table.stats <- readHTMLTable(full.url, header=FALSE)
Error: failed to load external entity "http://www.hockey-reference.com/boxscores/199511210BOS.html"
I tried to solve the problem with other Q&As from here, but wasn't able to. I tried rewriting it using the httr package instead of RCurl, but that doesn't work either.
I really appreciate your help.
The code you're using was last updated 7 years ago, and websites frequently change their HTML design, so old scraping code isn't guaranteed to keep working.
Use the following instead.
library(rvest)
library(httr)
# Impersonate a regular browser; the site rejects R's default user agent
ua <- user_agent("Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/76.0.3809.132 Safari/537.36")
url <- 'https://www.hockey-reference.com/boxscores/199511210BOS.html'
session <- html_session(url, ua)
# Pull every <table> on the page into a list of data frames
session %>%
  html_nodes("table") %>%
  html_table()
I'm running some tests with PhantomJS / CasperJS on Ubuntu against Google Analytics, and I'm having problems getting GA to correctly recognize the language settings I'm sending in my HTTP request headers.
No matter what I put in my Accept-Language header, I end up with GA classifying the language as "c".
I'm sure my Accept-Language headers are correct; here's an example:
ACCEPT-ENCODING:gzip, deflate
CONNECTION:Keep-Alive
ACCEPT-LANGUAGE:en-US
USER-AGENT:Mozilla/5.0 (iPhone; CPU iPhone OS 11_0 like Mac OS X) AppleWebKit/604.1.38 (KHTML, like Gecko) Version/11.0 Mobile/15A372 Safari/604.1
ACCEPT:text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8
After long hours of trial and error, I found out that C was in fact the default setting for the LANG environment variable in Ubuntu itself:
LANG=C.UTF-8
I can in fact influence the Google Analytics classification by altering my environment variables with the following command from the command line:
export LC_ALL="en_US.UTF-8"
It does not work if I only set LC_LANG or LANGUAGE via export. I am not sure why.
But how do I control this setting from inside PhantomJS / CasperJS? I can't and don't want to change my environment variables for each PhantomJS run from the CLI; I test multiple languages at once in big numbers.
Has anyone experienced this and can help?
I managed to find a hack-ish solution to this problem. I simply use the following command from the CLI:
$ LC_ALL=en-gb phantomjs script.js
and that passes the Accept-Language on to Google Analytics correctly.
I think there's a problem with CasperJS request headers not being passed on to PhantomJS correctly.
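If you need to cover many languages in one batch, the same trick wraps neatly in a shell loop; a rough sketch (the locale list and script name are placeholders):
#!/bin/sh
# Run the same PhantomJS/CasperJS script once per locale
for lang in en-gb en-us de-de fr-fr; do
    LC_ALL="$lang" phantomjs script.js
done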
UPDATED: Took everyone's advice and decided plone.app.registry and 4.1.1 were not the issue. The question is: what is? And where can I find the error logs with the binary installer?
symptom: can't add content types (under Add New... folders, pages, news items, etc.). It hangs on save; more specifically, portal_factory is unable to validate and move the content into the ZODB.
I had the same issue using both the unified (4.1) and binary (4.1) installers.
environment: MacBook, OS X 10.6 Snow Leopard, 32-bit
When I run buildout and start the instance, I see no errors:
2012-05-08 18:13:34 INFO ZServer HTTP server started at Tue May 8 18:13:34 2012
Hostname: 0.0.0.0
Port: 8080
2012-05-08 18:14:01 WARNING ZODB.FileStorage Ignoring index for /Applications/Plone/zinstance/var/filestorage/Data.fs
2012-05-08 18:14:27 INFO Zope Ready to handle requests
When I create a new site in Plone, Terminal says: http://pastie.org/3882025
Line 23: 2012-05-08 18:16:01 INFO GenericSetup.plone.app.registry Cannot find registry
That's not an error; it's what happens whenever you start up an instance with a new Data.fs file. If there's no Data.fs.index, or the .index file is inconsistent with the Data.fs, the existing index is ignored and rebuilt. It means absolutely nothing on a new install.
There must be more information than this in the log.
Fixed this issue by following this post: http://plone.293351.n2.nabble.com/Add-new-Plone-site-creates-site-with-JS-problems-4-1-4-tt7547774.html#a7555663
Basically, I had to go to the JavaScript registry, save, empty the cache, and restart the browser; tested in Chrome only.