I apologize in advance: I am writing this through a translator, as my English is very poor.
I ran into the following problem: I need to validate CSS files. For this I decided to use the NPM package w3c-css. At first it worked, but then it started returning "connect ETIMEDOUT". While investigating, I noticed that the validator also stopped working for me through the browser.
Sniffer log at start of my script: link (<10 rep :( )
My code:
var gulp = require('gulp');
var through2 = require('through2');
var w3c_css = require('w3c-css');

gulp.task('css', function() {
  return gulp.src('dev/sass/*.scss')
    .pipe(through2.obj(function(file, enc, cb) {
      w3c_css.validate({text: file.contents.toString('utf8')}, function(err, data) {
        if (err) {
          // an error happened
          console.error(err);
        } else {
          // validation errors
          console.log('validation errors', data.errors);
          // validation warnings
          console.log('validation warnings', data.warnings);
        }
      });
      cb(null, file);
    }))
    .pipe(gulp.dest('build/'));
});
What is the reason? Is it some mistake on my side, or have I been blocked for sending requests too frequently (and that won't change)? Is there maybe some other way to check CSS files?
Thanks!
From the "About" page of the CSS validation service of the W3C:
Can I build an application upon this validator? Is there an API?
Yes, and yes. The CSS Validator has a (RESTful) SOAP interface which should make it reasonably easy to build applications (Web or otherwise) upon it. Good manners and respectful usage of shared resources are of course customary: make sure your applications sleep() between calls to the validator, or install and run your own instance of the validator.
So yes, it seems you have been banned.
I don't know how to make a gulp task run only every so often. You could set up a local instance of the CSS Validator web service and edit the w3c-css package to point to your own server.
Make sure that your script will sleep for at least 1 second between requests.
From the manual:
Note: If you wish to call the validator programmatically for a batch
of documents, please make sure that your script will sleep for at
least 1 second between requests. The CSS Validation service is a free,
public service for all, your respect is appreciated. thanks.
To validate multiple links, use async + setTimeout or any related way to pause between the requests:
'use strict';

var async = require('async');
var validator = require('w3c-css');

var hrefs = [
  'http://google.com',
  'https://developer.mozilla.org',
  'http://www.microsoft.com/'];

async.eachSeries(hrefs, function(href, next) {
  validator.validate(href, function(err, data) {
    // { process err, data.errors & data.warnings }
    // sleep for 1.5 seconds between the requests
    setTimeout(function() { next(err); }, 1500);
  });
}, function(err) {
  if (err) {
    console.log('Failed to process an url', err);
  } else {
    console.log('All urls have been processed successfully');
  }
});
EDIT: To mitigate this issue, I have added some comments and an example, and placed the setTimeout right into the gulp-w3c-css plugin.
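If you would rather keep your original through2-based task instead of switching to the plugin, the same idea can be applied inline: hold each file back for a moment before passing it downstream, so the validator sees at most one request per second. This is only a sketch of that idea (the 1.5-second delay and the task layout are my assumptions, not part of gulp-w3c-css):

var gulp = require('gulp');
var through2 = require('through2');
var w3c_css = require('w3c-css');

gulp.task('css', function() {
  return gulp.src('dev/sass/*.scss')
    .pipe(through2.obj(function(file, enc, cb) {
      w3c_css.validate({text: file.contents.toString('utf8')}, function(err, data) {
        if (err) {
          console.error(err);
        } else {
          console.log('validation errors', data.errors);
          console.log('validation warnings', data.warnings);
        }
        // Only release the file (and thus allow the next validation request)
        // after a pause, to respect the 1-request-per-second policy.
        setTimeout(function() { cb(null, file); }, 1500);
      });
    }))
    .pipe(gulp.dest('build/'));
});

Because through2 waits for the callback before handing over the next file, the delay effectively spaces out the requests.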
Hello, and thank you for your help.
Sadly, support over at CF does not think they need to help me.
I am learning to use Workers, and have written a simple HTML injector just to see it working on my site.
This is the full worker code I have:
async function handleRequest(req) {
  const res = await fetch(req)
  const contentType = res.headers.get("Content-Type")
  console.log('contentType: ', contentType)

  // If the response is HTML, it can be transformed with
  // HTMLRewriter -- otherwise, it should pass through
  if (contentType.startsWith("text/html")) {
    return rewriter.transform(res)
  } else {
    return res
  }
}

class UserElementHandler {
  async element(element) {
    element.before("<div class='contbox'><img src='https://coverme.co.il/wp-content/uploads/2020/01/covermeLOGO-01-1024x183.png' style='width:200px;margin:20px;'><h1>testing inserting</h1></div>", {html: true});
    // fill in user info using response
  }
}

const rewriter = new HTMLRewriter()
  .on("h1", new UserElementHandler())

addEventListener("fetch", event => {
  event.respondWith(handleRequest(event.request))
})
It just uses element.before to inject some HTML.
In the worker preview pane I can see it, but on the live site: nothing.
This is the active URL: https://coverme.co.il/product/%D7%A0%D7%A8-%D7%91%D7%99%D7%A0%D7%95%D7%A0%D7%99-tuberosejasmine/
These are the 4 routes I have set up to try to catch this, with and without encoding the letters:
coverme.co.il/product/נר-בינוני-tuberosejasmine/
*.coverme.co.il/product/נר-בינוני-tuberosejasmine/*
https://coverme.co.il/product/%D7%A0%D7%A8-%D7%91%D7%99%D7%A0%D7%95%D7%A0%D7%99-tuberosejasmine/
*.coverme.co.il/product/%D7%A0%D7%A8-%D7%91%D7%99%D7%A0%D7%95%D7%A0%D7%99-tuberosejasmine/*
Thanks in advance!
I believe the problem here is that you've configured your routes to match "נר-בינוני" unescaped, but the browser will actually percent-encode the URL before sending to the server, therefore the route matching actually operates on percent-escaped URLs. So the actual URL is https://coverme.co.il/product/%D7%A0%D7%A8-%D7%91%D7%99%D7%A0%D7%95%D7%A0%D7%99-tuberosejasmine/, and this does not match your route because %D7%A0%D7%A8-%D7%91%D7%99%D7%A0%D7%95%D7%A0%D7%99 is not considered to be the same as נר-בינוני.
EDIT: Unfortunately, using percent-encoding in your route pattern won't fix the problem either, due to a known bug. It's just not possible to match non-ASCII characters in a Workers route today. We intend to fix this, but it's hard because some sites are accidentally dependent on the broken behavior, so the fix would break them.
What you can potentially do instead is match against coverme.co.il/product/*, and then, inside your worker, check if the path also has נר-בינוני-tuberosejasmine. If it does not, then your fetch event handler should simply return without calling event.respondWith(). This will trigger "default handling" of the request, meaning it will pass through and be sent to your origin server like normal. (Note that you will still be billed for a Workers request, though.)
So, something like this:
addEventListener("fetch", event => {
if (event.request.url.includes(
"coverme.co.il/product/נר-בינוני-tuberosejasmine/")) {
event.respondWith(handle(event.request));
} else {
return; // not a match, use default pass-through handling
}
})
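One caveat to the snippet above (this part is my assumption, not something confirmed in the answer): inside the worker, event.request.url may already carry the percent-encoded form of the path, in which case an includes() check against the unescaped Hebrew string would never match. Decoding the URL first sidesteps that question; a rough sketch, reusing the handleRequest function from the question:

addEventListener("fetch", event => {
  // Decode the path so the comparison works whether the runtime reports it
  // escaped or unescaped (assumes the path is valid UTF-8).
  const path = decodeURIComponent(new URL(event.request.url).pathname);
  if (path.includes("/product/נר-בינוני-tuberosejasmine/")) {
    event.respondWith(handleRequest(event.request));
  }
  // otherwise: no respondWith() call, so default pass-through handling applies
})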
When I get the geocode back from Google's API, I'm trying to save it into my database. I've been trying to use the code below to just insert a Test document, with no luck. I think it has something to do with Meteor being asynchronous. If I run the insert function before the googleMapsClient.geocode function, it works fine. Can someone show me what I'm doing wrong?
Meteor.methods({
  'myTestFunction'() {
    googleMapsClient.geocode({
      address: 'test address'
    }, function(err, response) {
      if (!err) {
        Test.insert({test: 'test name'});
      }
    });
  }
});
I see now where you got the idea to run the NPM library on the client side, but that is not what you really want here. You should be getting some errors on the server side of your Meteor instance when you run the initial piece of code you gave us. The problem is that the Google library's callback runs outside of a Fiber, which prevents you from using Meteor's collection methods there. The easiest thing you can do is wrap the function with Meteor.wrapAsync, so it would look something like this:
try {
  var wrappedGeocode = Meteor.wrapAsync(googleMapsClient.geocode);
  var results = wrappedGeocode({ address : "testAddress" });
  console.log("results ", results);
  Test.insert({ test : results });
} catch (err) {
  throw new Meteor.Error('error code', 'error message');
}
You can find more info by looking at this thread; there are other threads dealing with the same issue as well.
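One extra caveat, which is an assumption on my part about the Google Maps client rather than something from the thread: if geocode relies on this internally, wrapping the bare function reference will lose that binding. Meteor.wrapAsync accepts a context as its second argument, so a safer variant of the wrapping line would be:

// Pass googleMapsClient as the context so `this` inside geocode still points
// at the client object (only matters if the library relies on `this`).
var wrappedGeocode = Meteor.wrapAsync(googleMapsClient.geocode, googleMapsClient);
var results = wrappedGeocode({ address: 'test address' });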
You should run the googleMapsClient.geocode() function on the client side, and the Test.insert() function on the server side (via a method). Try this:
Server side
Meteor.methods({
  'insertIntoTest'(json) {
    Test.insert({results: json.results});
  }
});
Client side
googleMapsClient.geocode({
  address: 'test address'
}, function(err, response) {
  if (!err) {
    Meteor.call('insertIntoTest', response.json);
  }
});
Meteor methods should be available on both the server and client sides, so make sure your method is accessible to the server, via proper importing in /server/main.js or proper folder structuring.
(If a method contains secret logic that must only run on the server, that logic should be kept out of the version of the method that runs on both the server and the client, though.)
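For example, here is a minimal sketch of that isolation, reusing the insertIntoTest method and Test collection from the answer above (the Meteor.isServer guard is just one way to do it):

// Shared code: the client gets a stub for latency compensation, but the part
// that must stay secret or authoritative only executes on the server.
Meteor.methods({
  'insertIntoTest'(json) {
    if (Meteor.isServer) {
      // server-only work (API keys, extra validation, the real insert)
      Test.insert({results: json.results});
    }
  }
});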
My application is not spiderable, either locally or in production.
When I go to http://localhost:3000/?_escaped_fragment_=, the following error appears (phantom is killed after 15 seconds):
spiderable: phantomjs failed: { [Error: Command failed: ] killed: true, code: null, signal: 'SIGTERM' }
It seems that many other people have had this problem:
https://github.com/gadicc/meteor-phantomjs/issues/1
https://groups.google.com/forum/#!msg/meteor-talk/Lnm9HFs4MgM/YKDMR80fVecJ
https://groups.google.com/forum/#!topic/meteor-talk/7ZbidddRGo4
The thing is, I am not using observatory or select2, and all my publications return a cursor. My guess is that the problem comes from minification. I just read in this thread that someone managed to display "SyntaxError: Parse error". How can I find out more about what is going wrong with PhantomJS and which file is causing the problem?
This happens when spiderable is waiting for subscriptions that fail to return any data and end up timing out, as mentioned in some of the threads you linked.
Make sure that all of your publish functions are either returning a cursor, a (possibly empty) list of cursors, or sending this.ready().
Meteor APM may be useful in determining which publications aren't returning.
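To illustrate the kind of publication that can hang spiderable, here is a small sketch (the Items collection and the userId condition are made up for the example):

Meteor.publish('items', function () {
  if (this.userId) {
    // Returning a cursor marks the subscription ready automatically.
    return Items.find({owner: this.userId});
  }
  // Without this line the subscription would never become ready for
  // logged-out clients, and spiderable would wait until PhantomJS is killed.
  this.ready();
});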
If you want to know more about what is wrong with PhantomJS, you might try this code:
// Put your URL below, no "?_escaped_fragment_=" necessary
var url = "http://your-url.com/";
var page = require('webpage').create();
page.open(url);

setInterval(function() {
  var ready = page.evaluate(function () {
    if (typeof Meteor !== 'undefined'
        && typeof(Meteor.status) !== 'undefined'
        && Meteor.status().connected) {
      Deps.flush();
      return DDP._allSubscriptionsReady();
    }
    return false;
  });

  if (ready) {
    var out = page.content;
    out = out.replace(/<script[^>]+>(.|\n|\r)*?<\/script\s*>/ig, '');
    out = out.replace('<meta name=\"fragment\" content=\"!\">', '');
    console.log(out);
    phantom.exit();
  }
}, 100);
To use it locally, install PhantomJS, then, outside your app, create a file phantomtest.js with the code above and run phantomjs phantomtest.js.
Another thing you can try is running UglifyJS over the minified JS file to catch syntax errors, as Payner35 did.
My problem was coming from SSL. You can find a complete overview of what I did here.
Edit the spiderable source and add --ignore-ssl-errors=yes to the phantomjs command line; it will work.
I'm working on a simple app based on Meteor and Meteor Streams.
The aim is simple:
one user clicks a button to create a room
other users join the room
those users emit streams with a simple message
the creator listens for those messages and then displays them
In fact, the messages from the other users are sent (they show up in the server log), but the creator doesn't receive them.
If I reload the creator's page, it then gets the messages sent by the other users.
I don't really understand why it doesn't work the first time.
I use meteor-router for my routing system.
The code can be seen here:
https://github.com/Rebolon/MeetingTimeCost/tree/feature/pokerVoteProtection
The client-side code is available in client/views/poker/* and client/helpers; the server-side stream code is in server/pokerStreams.js.
The application can be tested here: http://meetingtimecost.meteor.com
The creator must be logged in.
If you have any idea, any help is welcome.
Thanks
OK, after doing some debugging, I now understand what is wrong in my code.
It's simple, in fact: the problem comes from the fact that I forgot to bind the Stream.on event handler inside Deps.autorun.
As a result, that part of the code was not managed by reactivity, so it was never re-run automatically when the Session changed.
The solution is easy with Meteor: just wrap that part of the code inside Deps.autorun:
Meteor.startup(function () {
  Deps.autorun(function funcReloadStreamListeningOnNewRoom () {
    PokerStream.on(Session.get('currentRoom') + ':currentRoom:vote', function (vote) {
      // updates are only allowed while voting is open
      if (Session.get('pokerVoteStatus') === 'voting') {
        // findOne returns the document itself (not a cursor),
        // so its _id can be used in the update selector
        var voteFound = Vote.findOne({subscriptionId: this.subscriptionId});
        if (!voteFound) {
          Vote.insert({value: vote, userId: this.userId, subscriptionId: this.subscriptionId});
        } else {
          Vote.update({_id: voteFound._id}, {$set: {value: vote}});
        }
      }
    });
  });
});
So it was not a Meteor Streams problem, just my own mistake.
I hope this helps people understand that outside of Templates and Collections, you need to wrap your code inside Deps if you want reactivity.
I've just spent a few hours reading SO answers such as Meteor: Calling an asynchronous function inside a Meteor.method and returning the result.
Unfortunately, I still didn't manage to use fibers, or futures for that matter.
I'm trying to do something fairly simple (I think!).
When creating a user, I want to add a variable to the user object, based on the result of an asynchronous method. Imagine my async method calls a 3rd-party DB server called BANK, which could take several seconds to return.
Accounts.onCreateUser(function(options, user) {
  var Fiber = Npm.require("fibers");
  Fiber(function() {
    BANK.getBalance(function(err, theBalance) {
      if (err) return console.log(err);
      _.extend(user, {
        balance: theBalance
      });
    });
  }).run();
  return user;
});
So what happens above is that the BANK method is called, but by the time it returns, the code has already moved on and _.extend is never invoked.
I tried placing the return call inside the Fiber; that only made things worse: it never returned user. Well, it did, but 3 seconds too late, so by then everything downstream had already bailed out.
Thank you for any help!
Answering my own question which hopefully will help some people in the future. This is based on the excellent advice of Avital Oliver and David Glasser to have a look at Mike Bannister's meteor-async.md. You can read it here: https://gist.github.com/possibilities/3443021
Accounts.onCreateUser(function(options, user) {
  _.extend(user, {
    balance: getBalance()
  });
  return user;
});

function getBalance() {
  var Future = Npm.require("fibers/future");
  var fut = new Future();
  BANK.getBalance(function(err, bal) {
    if (err) return console.log(err);
    fut.return(bal);
  });
  return fut.wait();
}
I believe there's an even better way to handle this, which is to wrap the BANK API in Futures directly within the npm package itself, as per this example (from Avital Oliver): https://github.com/avital/meteor-xml2js-npm-demo/blob/master/xml2js-demo.js
I hope it helps!
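For reference, a minimal sketch of that approach using Future.wrap from fibers/future (BANK.getBalance is the hypothetical API from the question, and the bind call is only needed if it relies on this):

var Future = Npm.require('fibers/future');

// Future.wrap turns a node-style callback API into a function returning a Future.
var getBalanceFuture = Future.wrap(BANK.getBalance.bind(BANK));

Accounts.onCreateUser(function(options, user) {
  // wait() blocks only this fiber, not the whole server.
  user.balance = getBalanceFuture().wait();
  return user;
});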
Use this.unblock() in your server-side code.
From the Meteor 1.0 documentation: "Allow subsequent method from this client to begin running in a new fiber. On the server, methods from a given client run one at a time. The N+1th invocation from a client won't start until the Nth invocation returns. However, you can change this by calling this.unblock. This will allow the N+1th invocation to start running in a new fiber."
Meteor.methods({
  checkTwitter: function (userId) {
    check(userId, String);
    this.unblock();
    try {
      var result = HTTP.call("GET", "http://api.twitter.com/xyz",
        {params: {user: userId}});
      return true;
    } catch (e) {
      // Got a network error, time-out or HTTP error in the 400 or 500 range.
      return false;
    }
  }
});
Method calls use the synchronous style on the server side (see 'sync call' here: http://docs.meteor.com/#meteor_call), which is where this onCreateUser code runs, so you should be able to do something like:
Accounts.onCreateUser(function(options, user) {
  user.balance = Meteor.call('getBankBalance', params);
  return user;
});
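Note that 'getBankBalance' is still a hypothetical method name here; it would itself need to turn the BANK callback into a synchronous-style return on the server, for example with Meteor.wrapAsync (params omitted for brevity):

Meteor.methods({
  'getBankBalance': function () {
    // wrapAsync lets the node-style BANK.getBalance(callback) API be called
    // synchronously inside a method; the second argument preserves `this`.
    var getBalanceSync = Meteor.wrapAsync(BANK.getBalance, BANK);
    return getBalanceSync();
  }
});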
Thank you so much, that works! This solution is better for Meteor projects, because the Fibers module is installed by default. mrt add npm has a method for this too: Meteor.sync. For any Node.js project, there is another module based on Fibers called Fibrous.
Reference: https://github.com/goodeggs/fibrous