What kind of server do I have to use? - asynchronous

I want to code a fairly basic application:
A client sends a JSON-formatted string to a server, which asynchronously publishes it.
I mean, all the other clients get the JSON as soon as the server publishes it.
My questions are:
What kind of server should I use? (I basically only know a bit about web services and servlets.)
Where can I host and run the resulting code? I guess it won't be free, of course.
Sorry if the question seems a bit too broad.

Python or Ruby are well suited to building such a server.
A simple solution is to host the app on Heroku.
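To make the publish flow concrete, here is a minimal sketch in Python using the third-party websockets library (pip install websockets). The port and the broadcast-everything policy are assumptions for illustration: every JSON message a client sends is pushed back out to all connected clients.

import asyncio
import json
import websockets  # third-party: pip install websockets

CLIENTS = set()

async def handler(ws):
    # Register the client, then relay every JSON message it sends
    # to all currently connected clients.
    CLIENTS.add(ws)
    try:
        async for message in ws:
            data = json.loads(message)  # validate that it really is JSON
            websockets.broadcast(CLIENTS, json.dumps(data))
    finally:
        CLIENTS.discard(ws)

async def main():
    async with websockets.serve(handler, "0.0.0.0", 8765):
        await asyncio.Future()  # run forever

if __name__ == "__main__":
    asyncio.run(main())

On Heroku you would bind to the port the platform assigns (the PORT environment variable) instead of a hard-coded 8765.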

Related

Streaming to root of domain instead of /LiveApp

Sorry if this is a really dumb question, but I'd like users to stream directly to stream.domain.com instead of stream.domain.com/LiveApp.
Is this an Ant Media thing, or something at the server level?
Thanks
Ant Media Server uses an application layer in its structure, so you need to send a stream with your application name. For example: rtmp://stream.domain.com/LiveApp/stream1
But you may be able to use a reverse proxy for your requirements.

Screen scraping in server side

I am new to screen scraping. When I use a proxy server and track the HTTP transactions, I can see my POST data revealed to me. So my doubts/problems here are:
1) Will the POST data get stored on the server side, or will it be revealed only on the client side?
2) Do we have an option of encrypting the POST data when screen scraping?
3) Is it advisable to use screen scraping for banking applications?
I am using the screen-scraper tool (Enterprise version), which I downloaded from
http://www.screen-scraper.com/download/choose_version.php.
Thanks in advance.
My experience with scraping is that if you aren't doing anything super complex (like logging into a secure website such as an online banking site), then Python has some great libraries that will help you out a lot.
To answer your questions:
1) You may need to be more clear, but this really depends on your server/client architecture.
2) As a matter of fact you do. urllib and urllib2 (built-in Python libraries) can both make requests over HTTPS, which encrypts the POST data in transit; for most applications, this will suffice. (See the sketch after this answer.)
3) I actually have done scraping on online banking sites! I'm not exactly familiar with that tool, but I would recommend using something a little different from a scraper. Selenium, which is a "web driver", allows you to simulate the use of a browser, meaning anything the browser does in the background to validate the session is automatically taken care of. The main problem I ran into while trying to scrape the banking site was the loss of important session data.
Selenium - https://pypi.python.org/pypi/selenium
Other libraries you may find useful are: urllib, urllib2, and Mechanize
I hope I was somewhat helpful!
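To illustrate point 2: the encryption comes from requesting an https:// URL, not from a special encryption function. A minimal Python 3 sketch (urllib.request is the modern successor to urllib2; the endpoint and form fields here are hypothetical):

import urllib.parse
import urllib.request

# Hypothetical endpoint; any https:// URL gives you TLS encryption in transit.
URL = "https://example.com/login"

form = urllib.parse.urlencode({"user": "alice", "pin": "1234"}).encode()
request = urllib.request.Request(URL, data=form, method="POST")

with urllib.request.urlopen(request) as response:
    print(response.status, response.read()[:200])

A proxy sitting between you and the site then sees only the encrypted tunnel, not the form fields (unless you have installed the proxy's own certificate).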
I've used screen-scraper to scrape banking sites before. It hits the site just like your browser does: if the site uses encryption, the connection from screen-scraper to the site will be encrypted too.
If you have a client page sending data to screen-scraper, you probably should encrypt that as well. I generally just make the connection via SSH.
1) What do you mean by server side? Your proxy server or the screen-scraper software? Either of them can read/store your information.
2) If you are connecting through HTTPS, then your software should warn you about a malicious proxy server: https://security.stackexchange.com/questions/8145/does-https-prevent-man-in-the-middle-attacks-by-proxy-server
3) I don't think they have a logger they can read, but if you are concerned you can try to write your own scraper. There are APIs that let you read HTML easily with jQuery-style syntax, such as https://pypi.python.org/pypi/pyquery, or with XPath: http://net.tutsplus.com/tutorials/javascript-ajax/web-scraping-with-node-js/
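For example, a tiny pyquery sketch (pip install pyquery; the page and selector are made up):

from pyquery import PyQuery

# PyQuery accepts raw HTML, or can fetch a URL directly as shown here.
doc = PyQuery(url="https://example.com/")

# jQuery-style selection: print the text of every top-level heading.
for heading in doc("h1").items():
    print(heading.text())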

asp.net utility to allow remote writeline?

I had a simple utility a couple of years back that I could run on my local machine, and it would monitor a server without the need for remote debugging or for anything to be running on the server.
I could add the equivalent of Console.WriteLine to the code (I can't remember whether it was Console, Debug, or Trace) and could monitor it by running the utility on my desktop machine.
I'm hoping that someone can point me in the right direction.
You can create a WCF or web service that checks what you need and returns XML or JSON,
and have the client invoke this service.
The standard way is to use the Trace.Write method in your code and watch what's going on at
http://myserver/Trace.axd (it will tell you to enable tracing in web.config).
This lets you see much more data than just your string: it shows the complete request context, and it times all the traces so you can see how much time different actions took.
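For reference, the web.config change it asks for looks roughly like this (localOnly="false" is what lets you view Trace.axd from your desktop machine rather than only from the server itself; the requestLimit value is just an example):

<configuration>
  <system.web>
    <trace enabled="true"
           localOnly="false"
           pageOutput="false"
           requestLimit="100" />
  </system.web>
</configuration>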

What about using a microkernel for Node.js + NginX?

I'm not even sure it would easily work, but for an upcoming project I may need to set up a WebSockets-only server. It would not have a database or memcache, or even serve static files; all it would need to do is run some logic and update other clients.
The server may need to support 1 to 300,000 clients simultaneously, so Node.js + NginX makes sense, but maybe not all the other features of a traditional web server (Apache, for example) are necessary...
Something like Minix sounds like it would work...
This may be exactly what you're looking for:
https://github.com/tmpvar/cluster-socket.io
It allows you to handle large numbers of requests across multiple Node processes.
Remember you can always stop into #node.js and ask questions! Make sure to report back with your findings.

Multiple replies from server for one client request

This may be a dumb question, and the title may need to be improved... I think my requirement is pretty simple: I want to send a request for data from a client to a server program, and the server (not the client) should respond right away with something like "Received your request - working on it". The client then does other work. Then, when the server has obtained the data, it should send an asynchronous message (a popup?) saying "I've got your data; click on ... (presumably a URL) to obtain it". I have been assuming that the server could be written in Java and that the client is HTML and JavaScript. I haven't been able to come up with a clean solution - help would be appreciated.
Try the WebSocket approach, using SuperWebSocket for the server side and WebSocket4Net for the client side. It is working perfectly for my current project.
Most of the work involves making the server asynchronous. To do this you must:
1. Have an AJAX call to the server that starts a job and returns a confirmation that the job has been started.
2. Have a page on the server that will return whether or not any jobs are complete for a user.
3. Have an AJAX widget on your client side that pings that page on the server every so often to see if any jobs have been completed, and if so, makes a popup.
This is the only way unless you use Flex data services. A sketch of the pattern is below.
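Here is a minimal sketch of those three server-side pieces in Python with Flask (the question assumes a Java server, so treat this purely as an illustration of the pattern; the route names and the fake ten-second job are made up):

import threading
import time
import uuid

from flask import Flask, jsonify

app = Flask(__name__)
jobs = {}  # job_id -> result URL (None while the job is still running)

def do_work(job_id):
    time.sleep(10)                       # stand-in for the slow data fetch
    jobs[job_id] = "/results/" + job_id  # URL the client should follow

@app.route("/start", methods=["POST"])
def start():
    job_id = str(uuid.uuid4())
    jobs[job_id] = None
    threading.Thread(target=do_work, args=(job_id,), daemon=True).start()
    # Immediate acknowledgement: "Received your request - working on it"
    return jsonify(job_id=job_id, status="working")

@app.route("/poll/<job_id>")
def poll(job_id):
    result = jobs.get(job_id)
    if result is None:
        return jsonify(status="working")
    return jsonify(status="done", url=result)

The client-side widget calls /start once, then hits /poll/<job_id> every few seconds and pops up the returned URL once the status changes to "done".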
Are you trying to do this over the HTTP protocol? It sounds like you're talking about a web application here, but it's not clear from the question. If so, then there are a variety of techniques for accomplishing this using AJAX which collectively go under the name "Comet". Depending on exactly what you're trying to accomplish, a number of different implementations, on both the client and server side, may be appropriate.
For pure Java, I suggest something like JGroups (client and server are both Java).
For HTML, you should use AJAX: there you have a timer that checks every X seconds.
Nowadays you have an alternative technique: WebSockets. These are used for server-to-client communication without polling or AJAX-style delayed responses.