I'm trying to work out the best way to get Redux Offline to debounce server requests.
Currently, when the server is busy, it saves the requests up in a queue and then sends all of them. I'd like it to keep only the last one, but still use the same Save action to update the Redux store.
My code in the action is:
export class SaveSession extends OfflineAction<SessionTypes> {
  public readonly type = SessionTypes.SAVE_SESSION;

  constructor(public session: ISession) {
    super();
    this.meta = {
      offline: {
        effect: {
          url: `${process.env.REACT_APP_API}/api/session/save`,
          body: JSON.stringify(session),
          method: 'POST',
          headers: authHeader()
        },
        commit: new SaveSessionSuccess(),
        rollback: new SaveSessionError()
      }
    };
  }
}
I'm looking through the documentation but I can't see anything around debouncing server requests.
Is this possible?
Have a look at this link from the documentation.
It seems like you will be interested in overriding the enqueue function to adjust the queueing logic to suit your needs: some kind of smart queue.
They say they came up with the smart-queue idea to serve exactly this purpose: check this. Take a look and roll out a mechanism that matches your requirement.
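To make that concrete, here is a minimal sketch of a deduplicating queue. It assumes the enqueue/dequeue/peek queue interface exposed through redux-offline's default config; the import paths, the SessionTypes import, and the reducer are placeholders, so check them against the version you're using:

import { createStore } from 'redux';
import { offline } from '@redux-offline/redux-offline';
import offlineConfig from '@redux-offline/redux-offline/lib/defaults';
import { rootReducer } from './reducers'; // placeholder for your own reducer
import { SessionTypes } from './session-types'; // placeholder import path

// Sketch only: keep at most one SAVE_SESSION action in the outbox, so that
// only the latest save is sent when the queue is flushed.
const dedupingQueue = {
  ...offlineConfig.queue,
  enqueue(outbox, action) {
    if (action.type === SessionTypes.SAVE_SESSION) {
      // Drop any previously queued saves, then append the newest one.
      return [...outbox.filter(a => a.type !== SessionTypes.SAVE_SESSION), action];
    }
    return [...outbox, action];
  },
};

const store = createStore(
  rootReducer,
  offline({ ...offlineConfig, queue: dedupingQueue })
);

With something like this in place, the same SaveSession action still drives the optimistic store update, but the outbox only ever holds the most recent save.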
Is it possible to stub meteor methods and publications in cypress tests?
In the docs (https://docs.cypress.io/guides/getting-started/testing-your-app#Stubbing-the-server) it says:
This means that instead of resetting the database, or seeding it with
the state we want, you can force the server to respond with whatever
you want it to. (...) and even test all of the edge cases, without needing a server.
But I cannot find more details about that. All I can find is that, when not using the virtual user in the tests to fill the database, it is possible to make API calls against the app, like so:
cy.request('POST', '/test/seed/user', { username: 'jane.lane' })
  .its('body')
  .as('currentUser')
In my opinion that is not a "stub". It is a method to "seed" the database.
Is it possible to tell Cypress to answer a Meteor method call in the client code, like
Meteor.call('getUser', { username: 'jane.lane' }, callbackFunction);
with some data, just as it would be returned in production?
I can only show an example using sinon to stub Meteor method calls:
const stub = sinon.stub(Meteor, 'call')
stub.callsFake(function (name, obj, callback) {
  if (name === 'getUser' && obj.username === 'jane.lane') {
    setTimeout(function () {
      // Meteor method callbacks receive (error, result)
      callback(null, { /* your fake data here */ })
    })
  }
})
That would of course be a client-side stub. You could also simply override your Meteor method for this one test.
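For completeness, here is a hedged sketch of the same idea from inside a Cypress test, using cy.stub on the application's window. It assumes your app exposes Meteor on the window object, and the fake user object is purely illustrative:

// Stub Meteor.call on the app under test; Meteor callbacks receive (error, result).
cy.visit('/');
cy.window().then((win) => {
  cy.stub(win.Meteor, 'call').callsFake((name, obj, callback) => {
    if (name === 'getUser' && obj.username === 'jane.lane') {
      callback(null, { username: 'jane.lane', profile: { name: 'Jane Lane' } });
    }
  });
});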
I have an app where users can create posts. No login or user account is needed. They submit content with a form as a POST request, which goes to my API endpoint. I also have some other API endpoints which fetch data.
My goal is to protect the API endpoints completely, except for some specific sites that are allowed to call the API (I want to accomplish this by storing a domain name and a secure string in my database and checking whether they are valid when the API is called). This seems fine to me. But I also need to make sure that my own application can still call the API endpoints, and that is my big problem: I have no idea how to implement this and I haven't found anything good.
So the API endpoints should only be accessible for:
the Next.js application itself, e.g. when somebody submits a post
some other selected domains, which get credentials that are saved in my database.
Hopefully somebody has an idea.
I thought I might accomplish it with env vars: read them in getInitialProps, reuse the value in my POST request (on the client side it can't be read), and have it readable again in my API endpoint. Sadly it doesn't work as expected, so I hope you have a smart idea/code example for how to get this working without any account/login strategy, because in my case it's not needed.
index.js
import Head from 'next/head'
import Image from 'next/image'
import styles from '../styles/Home.module.css'

export default function Home(props) {
  async function post() {
    console.log(process.env.MYSECRET)
    const response = await fetch('/api/hello', {
      method: 'POST',
      body: JSON.stringify(process.env.MYSECRET),
    })
    if (!response.ok) {
      console.log(response.statusText)
    }
    console.log(JSON.stringify(response))
    return await response.json().then(s => {
      console.log(s)
    })
  }

  return (
    <div className={styles.container}>
      <button onClick={post}>Press me</button>
    </div>
  )
}

export async function getStaticProps(context) {
  const myvar = process.env.MYSECRET
  return {
    props: { myvar },
  }
}
api
// Next.js API route support: https://nextjs.org/docs/api-routes/introduction
export default function handler(req, res) {
  const mysecret = req.body
  res.status(200).json({ name: mysecret })
}
From what I understand, you want to create an API without user authentication and protect it from requests that are not coming from your client application.
First of all, a warning: unless you only authorize requests coming from certain IPs (and be careful, IP spoofing can bypass that protection), this will not be possible. If you set up an API key that is shared by all clients, reverse engineering or sniffing HTTP requests will reveal that key and allow anyone to impersonate your application.
To my knowledge, there is no way to counter this apart from setting up a user authentication system.
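If you still want a shared-secret check for server-to-server callers (with the caveat above that any key shipped to a browser can be extracted), a minimal sketch of an API route guard could look like the following. API_KEYS and the x-api-key header are made-up names for illustration, not a Next.js convention:

// pages/api/hello.js - sketch only, not a substitute for real authentication.
// The secret stays hidden only for callers that run on a server
// (your own API routes, getServerSideProps, or your partners' backends).
export default function handler(req, res) {
  const key = req.headers['x-api-key'];
  const allowedKeys = (process.env.API_KEYS || '').split(',');

  if (!key || !allowedKeys.includes(key)) {
    return res.status(401).json({ error: 'Unauthorized' });
  }

  res.status(200).json({ ok: true });
}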
I'm playing around with HttpServer; and was adding support for serving static files (I'm aware of Shelf; I'm doing this as a learning exercise). I have a list of handlers that are given the opportunity to handle the request in sequence (stopping at the first that handles it):
const handlers = const [
  handleStaticRequest
];

handleRequest(HttpRequest request) {
  // Run through all handlers; and if none handle the request, 404
  if (!handlers.any((h) => h(request))) {
    request.response.statusCode = HttpStatus.NOT_FOUND;
    request.response.headers.contentType = new ContentType("text", "html");
    request.response.write('<h1>404 File Not Found</h1>');
    request.response.close();
  }
}
However, as I implemented the static file handler, I realised that I couldn't return true/false directly (which is required by the handleRequest code above, to signal if the request is handled) unless I use file.existsSync().
In something like ASP.NET, I wouldn't think twice about a blocking call in a request because it's threaded; however in Dart, it seems like it would be a bottleneck if every request is blocking every other request for the duration of IO hits like this.
So, I decided to have a look in Shelf, to see how that handled this; but disappointingly, that appears to do the same (in fact, it does several synchronous filesystem hits).
Am I overestimating the impact of this; or is this a bad idea for a Dart web service? I'm not writing Facebook; but I'd still like to learn to write things in the most efficient way.
If this is considered bad, is there a built-in way of doing "execute these futures sequentially until the first one returns a match for this condition"? I can see Future.forEach, but that doesn't have the ability to bail out early. I guess "Future.any" is probably what it'd be called if it existed (but it doesn't)?
Using Shelf is the right approach here.
But there is still a trade-off between sync and async within the static handler package.
Blocking on I/O obviously limits concurrency, but there is a non-zero cost to injecting Future into a code path.
I will dig in a bit to get a better answer here.
After doing some investigation, it does not seem that adding async I/O in shelf_static improves performance, except for the bit that's already async: reading file contents.
return new Response.ok(file.openRead(), headers: headers);
The actual reading of file contents is done by passing a Stream to the response. This ensures that the bulk of the slow I/O happens in a non-blocking way. This is key.
In the meantime, you may want to look at Future.forEach for an easy way to invoke an arbitrary number of async methods.
There are a lot of good questions in your post (perhaps we should split them out into individual SO questions?).
To answer the post title's question, the best practice for servers is to use the async methods.
For command-line utilities and simple scripts, the sync methods are perfectly fine.
I think it becomes a problem if you do file access that is blocking for a long time (reading/writing/searching big files locally or over the network).
I can't imagine file.existsSync() doing much damage. If you are already in async code it's easy to stay async but if you have to go async just for the sake of not using file.existsSync() I would consider this premature optimization.
A little off-topic, but it solved the problem I was trying to solve when I found the discussion on this question. I was not able to achieve async operation in the handler with io.serve, so I used dart:io for dynamic pages and shelf_io's handleRequest for static files:
import 'dart:io';
import 'dart:async';
import 'package:path/path.dart' show join, dirname;
import 'package:shelf/shelf_io.dart' as io;
import 'package:shelf_static/shelf_static.dart';
import 'package:sqljocky/sqljocky.dart';

void main() {
  HttpServer
      .bind(InternetAddress.ANY_IP_V4, 9999)
      .then((server) {
    server.listen((HttpRequest request) {
      String path = request.requestedUri.path;
      if (path == "/db") {
        // Dynamic page: query the database asynchronously.
        var pool = new ConnectionPool(
            host: 'localhost', port: 3306, user: 'root', db: 'db', max: 5);
        var result = pool.query("select * from myTable");
        result.then((Results data) {
          data.first.then((Row row) {
            request.response.write(row.toString());
            request.response.close();
          });
        });
      } else {
        // Static files: delegate to shelf_static via shelf_io's handleRequest.
        String pathToBuild =
            join(dirname(Platform.script.toFilePath()), '..', 'build/web');
        var handler =
            createStaticHandler(pathToBuild, defaultDocument: 'index.html');
        io.handleRequest(request, handler);
      }
    });
  });
}
Many months later I've found out how to create that Stream... (still a little off-topic)
shelf.Response _echoRequest(shelf.Request request) {
  StreamController controller = new StreamController();
  Stream<List<int>> out = controller.stream;

  new Future.delayed(const Duration(seconds: 1)).then((_) {
    controller.add(const Utf8Codec().encode("hello"));
    controller.close();
  });

  return new shelf.Response.ok(out);
}
Using SignalR (1.0.0-alpha2), I want to know if it is possible to add client functions after a connection has been started.
Say I create my connection and grab the proxy. Then I add some server-fired client functions to the hub to do a few things. Then I start my connection. I then want to add some more server-fired client functions to my hub object. Is this possible?
var myHub = $.connection.myHub;

myHub.SomeClientFunction = function () {
  alert("serverside called 'Clients.SomeClientFunction()'");
};

$.connection.hub.start()
  .done(function () {
    myHub.SomeNewClientFunction = function () {
      alert("serverside called 'Clients.SomeNewClientFunction()'");
    };
  });
This example is not realistic, but I basically want to send my 'myHub' variable to a different object after the hub is started to subscribe to new events that the original code did not care for.
Real-life example: a dashboard with a number of different hub events (new site visits, chat messages, site errors). I 'subscribe' after the connection has started and then pass my hub proxy to all of my different UI components to handle their specific 'message types'. Should I create separate hubs for these, or should I be able to add more server-fired client functions on the fly?
Yes you can. Use the .on method.
Example:
myHub.on('somethingNew', function () {
  alert("This was called after the connection started!");
});
If you want to remove it later on use the .off method.
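For example, a small sketch that keeps a reference to the handler so the same function can be detached again:

// Register a named handler so it can be removed with .off later.
var onSomethingNew = function () {
  alert("This was called after the connection started!");
};

myHub.on('somethingNew', onSomethingNew);

// ...later, when the subscribing component goes away:
myHub.off('somethingNew', onSomethingNew);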
I have the exact same situation. You might want to consider adding another layer of abstraction if you're trying to call it from multiple places.
Here's a preliminary version of what I've come up with (typescript).
I'll start with the usage. SignalRManager is my 'manager' class that abstracts my debuggingHub hub. I have a client method fooChanged that is triggered on the server.
Somewhere in the module that is using SignalR I just call the start method, which does not restart the connection if it is already started.
// ensure signalR is started
SignalRManager.start().done(() =>
{
    $.connection.debuggingHub.server.init();
});
Your 'module' simply registers its callback through the manager class and whenever the SignalR client method is triggered your handler is called.
// handler for foo changed
SignalRManager.onFooChanged((guid: string) =>
{
    if (this.currentSession().guid == guid)
    {
        alert('changed');
    }
});
This is a simple version of SignalRManager that uses jQuery $.Callbacks to pass on the request to as many modules as you have. Of course you could use any mechanism you wanted, but this seems to be the simplest.
module RR
{
    export class SignalRManager
    {
        // the original promise returned when calling hub.start
        static _start: JQueryPromise<any>;

        private static _fooChangedCallback = $.Callbacks();

        // add a callback for the 'fooChanged' client event
        static onFooChanged(callback: (guid: string) => any)
        {
            SignalRManager._fooChangedCallback.add(callback);
        }

        static start(): JQueryPromise<any>
        {
            if (!SignalRManager._start)
            {
                // callback for fooChanged
                $.connection.debuggingHub.client.fooChanged = (guid: string) =>
                {
                    console.log('foo Changed ' + guid);
                    SignalRManager._fooChangedCallback.fire(guid);
                };

                // start hub and save the promise returned
                SignalRManager._start = $.connection.hub.start().done(() =>
                {
                    console.log('Signal R initialized');
                });
            }
            return SignalRManager._start;
        }
    }
}
Note: there may be extra work involved to handle disconnections or connections lost.
I am building a registration form (passport-local as authentication, forms as form helper).
Because the registration route only handles GET and POST, I would like to do the whole handling in one function.
In other words, I am searching for something like:
exports.register = function(req, res) {
  if (req.isPost) {
    // do form handling
  }
  res.render('user/registration.html.swig', { form: form.toHTML() });
};
The answer was quite easy
exports.register = function(req, res) {
  if (req.method == "POST") {
    // do form handling
  }
  res.render('user/registration.html.swig', { form: form.toHTML() });
};
But I searched a long time for this approach in the Express guide.
In the end, it was the Node documentation that had the detailed information:
http://nodejs.org/api/http.html#http_http_request_options_callback
Nowadays you can use an npm package, "method-override", which provides a middleware layer that overrides the req.method property.
Basically your client can send a POST request with a modified method, to something like /registration/passportID?_method=PUT.
The ?_method=XXXXX portion is how the middleware identifies that this is an undercover PUT request.
The flow is that the client sends a POST request with data to your server side, and the middleware translates the request and runs the corresponding "app.put..." route.
I think this is a reasonable compromise. For more info: method-override
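For reference, a minimal sketch of wiring method-override up in Express; the route and port are illustrative:

const express = require('express');
const methodOverride = require('method-override');

const app = express();

// Look for a ?_method=XXXX query parameter and rewrite req.method before routing.
app.use(methodOverride('_method'));

// A POST to /registration/123?_method=PUT is now routed to this PUT handler.
app.put('/registration/:passportID', function(req, res) {
  res.send('Handled as PUT for ' + req.params.passportID);
});

app.listen(3000);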