So I'm building a web app, and I want to emulate a network failure in the browser to see if the client-side JavaScript handles it gracefully. I know I can just disconnect my network connection, but that also disconnects my email, Pandora, Skype - all things that are marginally vital to my non-productivity. Is there an easy way to kill network communication for just one tab in either of these browsers? Or (I'm on Linux) can I block a single PID from network communication while still allowing the rest (even if it's the same program) through?
Edit: Shoot, I just realized that I'm working on localhost, and that may not apply for what I'm asking for.
Does File -> Work Offline work for you? It should be in the Firefox menu.
You could always use invalid proxy settings! I recall some plugins that let you easily switch proxy profiles, so you could even have a profile for "dead proxy" and enable it whenever you want no Internet.
Turns out there are more sophisticated options: a dedicated site blocker for Chrome. That way you could still use the other sites that help your non-productivity while blocking the desired one!
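Another angle, if the goal is just to see how the page's own JavaScript copes: fake the failure from inside the tab itself by stubbing out fetch from the dev-tools console. This is only a minimal sketch, assuming the app talks to the network via fetch; XMLHttpRequest, WebSockets, image loads, etc. would need similar treatment. It also sidesteps the localhost concern from the edit, since no real networking is involved.

```typescript
// Paste into the dev-tools console of the tab under test.
// Sketch only: covers fetch; XHR, WebSockets, <img> loads are untouched.
const realFetch = window.fetch.bind(window);
let offline = true; // set to false to "plug the cable back in"

window.fetch = (input, init) => {
  if (offline) {
    // A dropped connection surfaces to callers as a rejected promise
    // carrying a TypeError, so mimic that.
    return Promise.reject(new TypeError("Failed to fetch"));
  }
  return realFetch(input, init);
};
```

Since the override lives only in that tab's page context, every other tab (and your email, Pandora, and Skype) keeps its connection.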
Sorry if this is a dumb question that's already been asked, but I don't even know what terms to best search for.
I have a situation where a cloud app would deliver a SPA (single page app) to a client web browser. Multiple clients would connect at once and would all work within the same network. An example would be an app a business uses to work together - all within the same physical space (all on the same network).
A concern is that the internet connection could be spotty. I know I can store the client changes locally and then push them all to the server once the connection is restored. The problem, however, is that some of the clients (display systems) will need to show up-to-date data from other clients (mobile input systems); even the internet going down for a minute or two would be unacceptable.
My current line of thinking is that the local network would need some kind of "ThinServer" that all the clients would connect to. This ThinServer would then work as a proxy for the main cloud server. If the internet breaks then the ThinServer would take over the job of syncing data. Since all the clients would be full SPAs the only thing moving around would be the data - so the ThinServer would really just need to sync DB info (it probably wouldn't need to host the full SPA - though, that wouldn't be a bad thing).
However, a full dedicated server is obviously a big hurdle for most companies to set up.
So the question is, is there any kind of tech that would allow a web page to act as a web server? Could a business be instructed to go to thinserver.coolapp.com in a browser on any one of their machines? This "webpage" would then say, "All clients in this network should connect to 192.168.1.74:2000" (which would be the IP:port of the machine running this page). All the clients would then connect to this new "server" and that server would act as a data coordinator if the internet ever went down.
In other words, I really don't like the idea of a complicated server setup. A simple URL to start the service would be all that is needed.
I suppose the only option might have to be a binary program that would need to be installed? It's not an ideal solution - but perhaps the only one? If so, are there any programs out there that are single-click web servers? I've tried MAMP, LAMP, etc., but all of them are designed for the developer. Any others that are more streamlined?
Thanks for any ideas!
There are a couple of fundamental ways you can approach this. The first is to host a server in a browser as you suggest. Some example projects:
http://www.peer-server.com
https://addons.mozilla.org/en-US/firefox/addon/browser-server/
Another is to use WebRTC peer-to-peer communication to allow the browsers to share information between each other (you could have them all share data, or have one act as a 'master', etc., depending on the architecture you want). It's likely not going to be that different under the skin, but your application design may be better suited to a more 'peer to peer' model or a more 'client server' one depending on what you need. An example 'peer to peer' project:
https://developer.mozilla.org/en-US/docs/Web/Guide/API/WebRTC/Peer-to-peer_communications_with_WebRTC
I have not used any of the above personally, but I would say, from using similar browser extension mechanisms in the past, that you need to check the browser requirements before you decide if they can do what you want. The first one above is Chrome-based (I believe) and the second one is Firefox. The peer-to-peer one contains a list of compatible browser functions, but is effectively Firefox and Chrome based also (see the table in the link). If you are in an environment where you can dictate the browser type, plugins, etc., then this may be OK for you.
The concept (peer-to-peer web servers) is definitely very interesting, and it is great if you have the time to explore it. However, if you have an immediate business requirement, a simple on-site server based approach may actually be more reliable, support a wider variety of browsers and be easier to maintain (as the skills required are quite commonly available).
BTW, I should have said - 'WebRTC' is probably a good search term for you, in answer to the first line of your question.
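To give a flavour of what the WebRTC route looks like in code: once two browsers on the LAN have exchanged session descriptions over some signalling channel (your cloud server while it is still up, for instance), they can pass data directly over an RTCDataChannel. A rough sketch of the offering side only; sendToPeer and onSignal are hypothetical stand-ins for whatever signalling you end up with:

```typescript
// Offerer's half of a WebRTC data channel; the answerer does the mirror image.
// sendToPeer/onSignal are hypothetical signalling helpers, not real APIs.
declare function sendToPeer(msg: unknown): void;
declare function onSignal(handler: (msg: any) => void): void;

async function connectToPeer(): Promise<void> {
  // Same LAN, so no STUN/TURN servers are configured here.
  const pc = new RTCPeerConnection({ iceServers: [] });
  const channel = pc.createDataChannel("sync");

  channel.onopen = () => channel.send(JSON.stringify({ hello: "peer" }));
  channel.onmessage = (e) => console.log("peer says:", e.data);

  // Trickle ICE candidates to the other side as they are discovered.
  pc.onicecandidate = (e) => {
    if (e.candidate) sendToPeer({ candidate: e.candidate });
  };

  onSignal(async (msg) => {
    if (msg.answer) await pc.setRemoteDescription(msg.answer);
    if (msg.candidate) await pc.addIceCandidate(msg.candidate);
  });

  const offer = await pc.createOffer();
  await pc.setLocalDescription(offer);
  sendToPeer({ offer });
}
```

The catch, as your question anticipates, is that signalling still has to happen somewhere, so the peers need a way to find each other before the internet drops.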
httprelay.io vs. WebRTC
Pros:
Simple to use (see the sketch after this list)
Fast
Supported by all browsers and HTTP clients
Can be used over an unstable network
Open source and cross-platform
Cons:
Need to run a server instance
No data streaming is supported (yet)
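For a sense of what "simple to use" means here: as far as I can tell from the project's README, the relay pairs up a blocking GET with a POST on the same URL. The /link/<id> path and the demo host below are my reading of those docs, not verified, so check them before relying on this:

```typescript
// Assumed endpoint shape: https://demo.httprelay.io/link/<channel-id>
// (taken from my reading of the README -- verify against current docs).
const url = "https://demo.httprelay.io/link/my-app-channel";

// Consumer: the GET blocks until some producer POSTs to the same URL.
async function receive(): Promise<string> {
  const res = await fetch(url);
  return res.text();
}

// Producer: the POST completes once a consumer picks the payload up.
async function send(data: string): Promise<void> {
  await fetch(url, { method: "POST", body: data });
}
```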
I am developing a browser extension. The extension works on external websites we have no control over.
I would like to be able to test the extension. One of the major problems I'm facing is displaying a website 'as-is' locally.
Is it possible to display a website 'as-is' locally?
I want to be able to serve the website exactly as-is locally for testing. This means I want to simulate the exact same HTTP data, including iframe ads, etc.
Is there an easy way to do this?
More info:
I'd like my system to act as closely to the remote website as possible. I'd like to run a fetch command, for example, after which I could go to the site in my browser (with the internet off) and get the exact same thing I would otherwise (including content that doesn't come from a single domain: Google ads, etc.).
I don't mind using a virtual machine if this helps.
I figured this would be quite a useful thing for testing, especially when I have a bug I need to reliably reproduce on sites that have many random factors (which ads show, etc.).
As was already mentioned, caching proxies should do the trick for you (BTW, this is the simplest solution). There are quite a lot of different implementations, so you just need to spend some time selecting a proper one (in my experience, Squid is a good choice). Anyway, I would like to highlight two other interesting options:
Option 1: Betamax
Betamax is a tool for mocking external HTTP resources such as web services and REST APIs in your tests. The project was inspired by the VCR library for Ruby. Betamax aims to solve these problems by intercepting HTTP connections initiated by your application and replaying previously recorded responses.
Betamax comes in two flavors. The first is an HTTP and HTTPS proxy that can intercept traffic made in any way that respects Java’s http.proxyHost and http.proxyPort system properties. The second is a simple wrapper for Apache HttpClient.
BTW, Betamax has a very interesting feature for you:
Betamax is a testing tool and not a spec-compliant HTTP proxy. It ignores any and all headers that would normally be used to prevent a proxy caching or storing HTTP traffic.
Option 2: Wireshark and replay proxy
Grab all the traffic you are interested in using Wireshark and replay it. I would say it is not that hard to implement the required replaying tool yourself, but there is an available solution called replayproxy:
Replayproxy parses HTTP streams from .pcap files, opens a TCP socket on port 3128 and listens as an HTTP proxy, using the extracted HTTP responses as a cache while refusing all requests for unknown URLs.
Such an approach provides you with full control and a bit-for-bit precise simulation.
I don't know if there is an easy way, but there is a way.
You can set up a local webserver, something like IIS, Apache, or minihttpd.
Then you can grab the website contents using wget (its --mirror and --page-requisites options pull down a full local copy). And many browsers have an option for "save whole web page" that will grab everything, like images.
Ads will most likely come from remote sites, so you may have to manually edit those lines in the HTML to either not reference the actual ad-servers, or set up a mock ad yourself (like a banner image).
Then you can navigate your browser to http://localhost to visit your local website, assuming port 80 which is the default.
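If setting up IIS or Apache feels heavy for a throwaway test, the same job can be done with a few lines of Node. A minimal sketch (no MIME types, no path sanitisation, so for local testing only); ./mirror is just an assumed name for the wget output directory:

```typescript
// serve.ts -- toy static server for a wget-mirrored site.
// Run e.g.: npx ts-node serve.ts ./mirror
// (port 80 needs admin rights; change to 8080 otherwise)
import * as http from "http";
import * as fs from "fs";
import * as path from "path";

const root = process.argv[2] ?? ".";

http.createServer((req, res) => {
  // Map the URL path onto the mirrored directory, defaulting to index.html.
  const urlPath = decodeURIComponent((req.url ?? "/").split("?")[0]);
  let file = path.join(root, urlPath);
  if (urlPath.endsWith("/")) file = path.join(file, "index.html");

  fs.readFile(file, (err, data) => {
    if (err) {
      res.writeHead(404);
      res.end("not found");
    } else {
      res.end(data); // Content-Type sniffing omitted for brevity
    }
  });
}).listen(80, () => console.log(`serving ${root} on http://localhost`));
```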
Hope this helps!
I assume you want to serve a remote site that's not under your control. In that case you can use a proxy server and have that server cache every response aggressively. However, this has its limits: first, you will have to visit every page you intend to use through this proxy (with a browser, for example); second, you will not be able to emulate form processing.
Alternatively, you could use a spider to download all the content of a certain website. Depending on the spider software, it may even be able to follow JavaScript-built links. You can then use a webserver to serve that content.
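To make the aggressive-caching idea concrete, here is a toy forward proxy in Node: anything it has fetched once is replayed from memory afterwards, freshness headers ignored. It is a sketch of the technique only (plain HTTP; HTTPS would need CONNECT handling, which is omitted):

```typescript
// Toy caching forward proxy -- point the browser's HTTP proxy at localhost:3128.
// Ignores request methods and cache-control headers on purpose; testing only.
import * as http from "http";

const cache = new Map<string, { headers: http.IncomingHttpHeaders; body: Buffer }>();

http.createServer((clientReq, clientRes) => {
  // A forward proxy receives absolute URLs, so req.url works as a cache key.
  const key = clientReq.url ?? "";
  const hit = cache.get(key);
  if (hit) {
    clientRes.writeHead(200, hit.headers); // replay the stored response
    clientRes.end(hit.body);
    return;
  }
  const upstream = http.request(key, { method: clientReq.method }, (upRes) => {
    const chunks: Buffer[] = [];
    upRes.on("data", (c) => chunks.push(c));
    upRes.on("end", () => {
      const body = Buffer.concat(chunks);
      cache.set(key, { headers: upRes.headers, body });
      clientRes.writeHead(upRes.statusCode ?? 200, upRes.headers);
      clientRes.end(body);
    });
  });
  upstream.on("error", () => { clientRes.writeHead(502); clientRes.end(); });
  clientReq.pipe(upstream);
}).listen(3128);
```

Browse the site once with the proxy configured and the connection up; after that the cached copies are served even with the network gone.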
The service http://www.json-gen.com provides mocks for HTML, JSON and XML via REST. That way, you can test your frontend separately from the backend.
I am using MonoTouch to develop an app which will connect to remote devices on a network. These devices have data which can be accessed through HTTP queries.
If I provide a valid IP address to a controller the app works perfectly, however it hangs for a long time if the controller is not on the network. For this reason I thought it would be good to use the Reachability.cs class which can be found here:
https://github.com/xamarin/monotouch-samples/blob/master/ReachabilitySample/reachability.cs
Instead of using google.com as the host, I am using the IP address of the controller. I have read that there is a bug with this class which causes it to not like having "http" at the beginning of the URL. Having now tried numerous things to get this working I am out of ideas.
Does anyone have any suggestions? Perhaps I am reinventing the wheel here.
Having now tried numerous things to get this working I am out of ideas.
From your question it's not clear what issue you're having with the Reachability class. Maybe you could edit it and add more details, e.g. what you tried so far and how it fails: never works, throws/crashes, inconsistent results...
Does anyone have any suggestions?
If your main issue is that the UI of your application blocks, then you could (and should anyway) do your connection and data transfer asynchronously (or on a separate thread) and, once completed, update your UI (from the main thread).
E.g. using WebClient.DownloadDataAsync
I'm coding an extension for a customer. One of the requirements is that the extension also works offline, because internet services are not that reliable; my customer's business can't stop, but it can deal with "stale" data, which is a fair tradeoff I guess.
Therefore, I want to build some kind of distributed cache into the extension to synchronize local data among the N nodes that will be connected running the same application, and thus synchronize with the real database hosted on the internet.
In order to achieve that, I imagined that I would need to send a network broadcast and listen for incoming broadcasts; every node that starts running my application would then broadcast its IP address and become available as a new node for the distributed cache. Failover is very important here.
I googled the possibilities I initially thought of, but none of them will work, I guess. The first was to do it just with HTTP; the second was to use Google Native Client to write C++ code that could run network code and thus do the broadcast, but it has limitations. Right now I'm considering Java applets, but I don't really know whether they have limitations related to networking, or whether Chrome extensions have any limitations with Java applets.
Any ideas on how to do it? Using some of the stuff I suggested or another approach?
You could create an NPAPI extension, which would not be restricted by Chrome at all.
I have run into a difficult situation.
I do not want to base my development on an emulator, so I want to be able to have my phone (Android) connect to my local PC, to make sure what I am developing comes out the way I want it to.
Issue #1 - I need to be able to connect over my local network, not the internet; I can't have my PC internet-facing.
Issue #2 - No WiFi allowed at my work, security issues.
Issue #3 - I can't publish this to an internet-facing site, since the procedure to get it onto one takes a few days per publish and would slow my development to a crawl.
What I'm looking for is a way to get my phone to connect to my local PC, maybe via USB or Bluetooth, while still having access to my local IIS. Does anyone have any idea how to accomplish this?
Any hack you use is going to take your environment away from the "reality" you seek.
I would really encourage you to ask your employer to give you the proper means to do your duties. For instance:
Getting you a cheap VPS on the Internet you can push updates to yourself
Setting up an encrypted AP separated from the rest of the company network.
Those two options are extremely cheap and would let you do what you need.
Can't you just use an Android emulator?
This is a generic problem that most mobile developers face, especially when there are big server interactions and I am afraid, in this case, there are no perfect answers. There are only workarounds.
I can see that you have tried most of the workarounds. I will suggest one more.
If your plan is to test client-side code / UI, then mock up your server on any cloud-based host (e.g. Google App Engine / Amazon EC2) and get your device to access the server code over the cloud.
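The mock does not need to be elaborate either; a canned-response server of a dozen lines is enough for most UI testing. A sketch (the /api/devices route and its payload are made up purely for illustration):

```typescript
// Throwaway mock of the real backend -- serves canned JSON responses only.
import * as http from "http";

// Hypothetical route/payload, purely for illustration.
const routes: Record<string, unknown> = {
  "/api/devices": [{ id: 1, name: "controller-a", online: true }],
};

http.createServer((req, res) => {
  const body = routes[(req.url ?? "").split("?")[0]];
  res.writeHead(body ? 200 : 404, { "Content-Type": "application/json" });
  res.end(JSON.stringify(body ?? { error: "no such route" }));
}).listen(8080, () => console.log("mock backend on :8080"));
```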
Let me know if this approach works.