Approach to creating a web server for a game with a web browser client and the possibility of client software?

Basically, the reason for asking this question is to gather some information that I at least couldn't find on Stack Overflow or through Google searches.
The product would be a multiplayer game, played in a browser but also playable through client software that could be written in C#, C++, or any other capable language. The language the desktop client is written in is not what we are discussing here.
This means we should be able to separate the presentation code, the networking code, and the underlying game logic, so that the server handles the game logic while the clients present data from the server, regardless of whether they are the website version or the desktop client.
Because this is a multiplayer game, we have to take the number of connections into account. There would be roughly 8 to 12 connections per game room, and the server could simply be capped at whatever it can handle, so no problems there either.
Here is what I have been thinking, though I'm not sure whether it's even in the right direction at all:
The web interface could be coded in JavaScript and make use of Ajax or Ajax-like technologies. On the server side, PHP could probably be used, which would also allow us to make use of sockets, letting the client software connect to the server.
However, can you silently update PHP-generated pages with JavaScript without the usual page-changing behavior? I'm concerned about the interface flickering when navigating between different pages, and I'm really not that familiar with the different Ajax technologies.
I wish to hear and learn about different approaches to building this kind of program, and I'm sure this page could become a good resource for other people struggling with similar issues.

Your approach is on the correct path. The first and only web page should serve the entire game, completely in JavaScript. There is no concept of other 'pages' you have to navigate to; you'd do everything (browser DOM manipulation, game logic, transitions, input/output, etc.) in JavaScript code.
Game state that can be lost you'd maintain on the client, and persistent state you'd maintain on the server. You would indeed access the server through Ajax calls from within the JavaScript game. On the server side you would expose APIs which expect some kind of parameters and return JSON data with the results back to the JavaScript code.
On the desktop you would create the game exactly the same way, except that the language you use is not JavaScript but, for instance, C#, Java, or C++. The game state you would still store on the server, and you would access it with web calls from within the desktop game.
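For instance, a desktop client's web call could look like this minimal C# sketch (the server URL, route, and GameState shape are hypothetical, not part of any real API):

```csharp
using System;
using System.Net.Http;
using System.Net.Http.Json;
using System.Threading.Tasks;

// Hypothetical shape of the JSON the server returns; adjust to your own API.
public record GameState(string RoomId, int PlayerCount, string Phase);

public static class GameClient
{
    private static readonly HttpClient Http = new()
    {
        BaseAddress = new Uri("https://example.com/") // hypothetical game server
    };

    // Fetch the persistent state of one game room from the server API.
    // The browser client would hit the same endpoint with an Ajax call.
    public static async Task<GameState?> GetRoomStateAsync(string roomId)
    {
        return await Http.GetFromJsonAsync<GameState>($"api/rooms/{roomId}");
    }
}
```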

Related

Why do we still use HTTP instead of WebSockets for building Web Applications?

Recently I dived into the topic of WebSockets and built a small application that utilizes them.
Now I wonder why HTTP-based APIs are still being used, or rather, why they are still being proposed.
As far as I can see, there is nothing I can't do with WS that would be possible via HTTP, while the other way around I gain a lot of improvements.
What would be a real-world example of an application that benefits more from an HTTP-powered backend than from a WS one?
@Julian Reschke made good points. The web is document-based: if you want your application to play in the WWW, it has to comply with the rules of the game.
Still, you can create WS-based SPA applications that comply with those rules.
Using the HTML5 history API, you can change the URL shown by the browser without causing navigation. That allows you to have a different URL in your address bar depending on the state of your app, enabling bookmarking and page history. The "ui-router" plugin for AngularJS plays very well here, changing the URL if you change the state programmatically, and vice versa.
You can make your SPA crawlable.
But you still want to use HTTP for some other things, like fetching resources or views and caching them using HTTP cache mechanisms. For example, if you have a big application, you want some large views to be downloaded on demand rather than packing everything into one big main view.
It would be a pain to implement your own caching mechanism for HTML, fetching views and caching them in local storage, for example. Also, by using traditional HTTP requests, those views can be cached in CDNs and other proxy caches.
WebSockets are great for maintaining "connected" semantics, sending data with little latency, and getting data pushed from the server at any time. But traditional HTTP requests are still better for operations that can benefit from distribution mechanisms, like caching, CDNs, and load balancing.
About REST APIs vs. WebSocket APIs (I think your question was actually about this), it is more a matter of convenience than preference. If your API has a high call rate per connection, a WebSocket probably makes more sense. If your API has a low call rate, there is no point in using WebSocket. Remember that a WebSocket connection, although lightweight, means that something is being held on the server (i.e., connection state), which may be a waste of resources if the request rate does not justify it.
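As a minimal sketch of those "connected" semantics in C# (the endpoint URL and message format are hypothetical):

```csharp
using System;
using System.Net.WebSockets;
using System.Text;
using System.Threading;
using System.Threading.Tasks;

public static class WsDemo
{
    public static async Task RunAsync()
    {
        using var ws = new ClientWebSocket();
        // Hypothetical WebSocket endpoint; the connection is held open,
        // which is exactly the server-side state mentioned above.
        await ws.ConnectAsync(new Uri("wss://example.com/game"), CancellationToken.None);

        // Send one message over the open connection.
        var msg = Encoding.UTF8.GetBytes("{\"type\":\"join\",\"room\":\"42\"}");
        await ws.SendAsync(new ArraySegment<byte>(msg),
            WebSocketMessageType.Text, true, CancellationToken.None);

        // Receive data pushed by the server at any time, with no new request.
        var buffer = new byte[4096];
        var result = await ws.ReceiveAsync(new ArraySegment<byte>(buffer), CancellationToken.None);
        Console.WriteLine(Encoding.UTF8.GetString(buffer, 0, result.Count));
    }
}
```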
Bookmarking? Page history? Caching? Visibility to search engines?
HTTP and WebSockets are two Web tools originated to accomplish different tasks.
With HTTP you typically implement the request/response paradigm.
With WebSockets you typically implement an asynchronous real-time messaging paradigm.
There are several applications where you need both paradigms.
You can also try to use WebSockets for request/response, and HTTP for the asynchronous real-time messaging paradigm. While the former makes little sense, the latter is a widespread technique, necessary in all the cases where WebSockets do not work (due to network intermediaries, lack of client support, etc.). If you are interested in this topic, check out this other answer of mine, which tries to clarify the terminology related to these techniques: Is Comet obsolete now with Server-Sent Events and WebSocket?
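A minimal sketch of that HTTP-based fallback (long polling) in C#, assuming a hypothetical endpoint that holds each request open until an event is available:

```csharp
using System;
using System.Net.Http;
using System.Threading.Tasks;

public static class LongPollingClient
{
    private static readonly HttpClient Http = new();

    public static async Task PollAsync()
    {
        while (true)
        {
            // Hypothetical endpoint: the server parks this request until an
            // event occurs (or a ~30 s timeout elapses), then responds.
            var response = await Http.GetAsync("https://example.com/events?wait=30");
            if (response.IsSuccessStatusCode)
            {
                Console.WriteLine(await response.Content.ReadAsStringAsync());
            }
            // Reconnect immediately, emulating server push over plain HTTP.
        }
    }
}
```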

What is a Webhook and why should I care?

The best I could find was this wiki entry.
I thought, "surely there must be more to it than this".
Am I missing something?
From the doc:
What is WebHook?
The concept of a WebHook is simple. A WebHook is an HTTP callback: an
HTTP POST that occurs when something happens; a simple event-notification via HTTP POST.
A web application implementing WebHooks will POST a message to a URL
when certain things happen. When a web application enables users to
register their own URLs, the users can then extend, customize, and
integrate that application with their own custom extensions or even
with other applications around the web. For the user, WebHooks are a
way to receive valuable information when it happens, rather than
continually polling for that data and receiving nothing valuable most
of the time. WebHooks have enormous potential and are limited only by
your imagination! (No, it can't wash the dishes. Yet.)
Why should I care?
As integrated as we perceive the web, most web applications today
operate in silos. With the rise of API's we've seen mashups and some
degree of integration between applications. However, we have not seen
the vision of the programmable web: a web where you as the user can
"pipe" data between apps much like the Unix command line. Some say RSS
is the answer. They are wrong. The heart is in the right place, but
the implementation is wrong. RSS is still useful, but it is not going
to bring us the true programmable web.
We just need a simple way to get data out in real-time to let the user easily do whatever they want with it. That means no polling, no content constraints, and no XML
parsing. That means no RSS. Using HTTP is simpler and easier to use.
PHP is a very popular and accessible programming environment, so it's
likely to be used often for writing hooklets... getting data from a
web POST in PHP is as simple as $_POST['something']. And making the
request to the user script is as simple as making an HTTP request,
something already built-in to most programming environments. In fact,
web hooks are easier to implement than an API.
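To make the callback concrete, here is a minimal sketch of a webhook receiver, the C#/ASP.NET Core equivalent of the PHP hooklet described above (the route and payload are hypothetical):

```csharp
using System;
using System.IO;
using Microsoft.AspNetCore.Builder;
using Microsoft.AspNetCore.Http;

var app = WebApplication.CreateBuilder(args).Build();

// Hypothetical URL the user registers with the source application.
// The source app POSTs to it whenever the event of interest happens.
app.MapPost("/hooks/order-created", async (HttpRequest request) =>
{
    // Read the payload pushed to us; no polling, the data just arrives.
    using var reader = new StreamReader(request.Body);
    string payload = await reader.ReadToEndAsync();
    Console.WriteLine($"Webhook received: {payload}");
    return Results.Ok(); // a 2xx response tells the sender delivery succeeded
});

app.Run();
```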

Check if anyone is currently using an ASP.Net app (site)

I build ASP.NET websites (hosted under IIS 6 usually, often with SQL Server backends and forms authentication).
Clients sometimes ask if I can check whether there are people currently browsing (and/or users currently logged in to) their website at a given moment, usually so they can safely do a deployment (they want a hotfix, for example).
I know the web is basically stateless, so I can't be sure whether someone has closed their browser window, but I imagine there'd be some count of not-yet-timed-out sessions or something, and surely logged-in users...
Is there a standard and/or easy way to check this?
Jakob's answer is correct but does rely on installing and configuring the Membership features.
A crude but simple way of tracking users online would be to store a counter in the Application object. This counter could be incremented/decremented upon their sessions starting and ending. There's an example of this on the MSDN website:
Session-State Events (MSDN Library)
Because the default session timeout is 20 minutes, the accuracy of this method isn't guaranteed (but that applies to any web application, due to the stateless and disconnected nature of HTTP).
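A minimal sketch of that counter in Global.asax.cs (assuming classic ASP.NET with in-process session state; Session_End fires only in InProc mode):

```csharp
// Global.asax.cs (classic ASP.NET, InProc session state assumed)
using System;
using System.Web;

public class Global : HttpApplication
{
    protected void Application_Start(object sender, EventArgs e)
    {
        Application["UsersOnline"] = 0;
    }

    protected void Session_Start(object sender, EventArgs e)
    {
        Application.Lock(); // serialize access to shared application state
        Application["UsersOnline"] = (int)Application["UsersOnline"] + 1;
        Application.UnLock();
    }

    // Fires roughly 20 minutes (the default timeout) after a user's
    // last request, which is why the count is only approximate.
    protected void Session_End(object sender, EventArgs e)
    {
        Application.Lock();
        Application["UsersOnline"] = (int)Application["UsersOnline"] - 1;
        Application.UnLock();
    }
}
```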
I know this is a pretty old question, but I figured I'd chime in. Why not use Google Analytics and view its real-time dashboard? It requires only a minor code modification (a single script import) and will do everything you're looking for...
You may be looking for the Membership.GetNumberOfUsersOnline method, although I'm not sure how reliable it is.
Sessions, suggested by other users, are a basic way of doing this, but they are not very reliable: they work well in some circumstances and poorly in others.
For example, if users are downloading large files, watching videos, or listening to podcasts, they may stay on the same page for hours (unless the requests for the binary data are also tracked by ASP.NET), yet they are still using your website.
Thus, my suggestion is to use the server logs to detect whether the website is currently being used by many people. This gives you the ability to:
See what sort of requests are being made. It's quite easy to tell humans from crawlers, and with some experience, it's also possible to see whether the human is currently doing something critical (such as writing a comment on a website, editing a document, or typing her credit card number to order something) or not (such as browsing).
See who is making those requests. For example, if Google is crawling your website, it is a very bad idea to go offline, unless search ranking doesn't matter to you. On the other hand, if a bot has been trying for two hours to crack your website by making requests to different pages, you can go offline for sure.
Note: if a website has some critical areas (for example, while writing this long answer, I would be angry if Stack Overflow went offline a few seconds before I submitted it), you can also send regular AJAX requests to the server while the user stays on the page. Of course, you must be careful when implementing such a feature, and take into account that it will increase the bandwidth used and will not work if the user has JavaScript disabled.
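The server half of such a heartbeat could be as small as this sketch (a classic ASP.NET handler; the file name, ping interval, and two-minute activity window are all hypothetical choices):

```csharp
// Heartbeat.ashx.cs: the page pings this handler via AJAX every minute or so.
using System;
using System.Collections.Concurrent;
using System.Linq;
using System.Web;
using System.Web.SessionState;

public class Heartbeat : IHttpHandler, IRequiresSessionState
{
    // Session ID -> last time we heard from that session.
    private static readonly ConcurrentDictionary<string, DateTime> LastSeen =
        new ConcurrentDictionary<string, DateTime>();

    public void ProcessRequest(HttpContext context)
    {
        LastSeen[context.Session.SessionID] = DateTime.UtcNow;

        // Anyone seen within the last two minutes counts as currently active.
        int active = LastSeen.Values.Count(t => t > DateTime.UtcNow.AddMinutes(-2));
        context.Response.ContentType = "text/plain";
        context.Response.Write(active);
    }

    public bool IsReusable
    {
        get { return true; }
    }
}
```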
You can run the netstat command and see how many active connections exist to your website's ports.
The default port for HTTP is *:80.
The default port for HTTPS is *:443.

confused about webservice AJAX calls security in ASP.NET

Excuse me if I sound a little stupid, but this has confused me to the core, and I have been searching like crazy on the net with no definitive answer, so I hope someone can shed more light on the matter.
I want to create a portal site, and my client requires that everything be AJAX'ed, so I have been playing with ASP.NET AJAX 4, client-side templating, and web services. The performance is great with JSON results, but my web service code will be public, because anything available to JavaScript is available to anyone. From what I've read, to avoid this I must:
use SSL, but this is a portal site and the front end should not use SSL;
use authentication, which is fine for the back end but not the front end, as login is not required.
After reading a lot, as I have mentioned, I have come across the following pitfalls when using web services with AJAX, and I hope there is a solution, or at least a way to bring more security.
DoS:
I have read some articles suggesting you should throttle using IP detection and block such requests for a while, but here are some of the things I am worried about:
Will it affect search engine crawlers?
Will a hacker be able to bypass this by using a proxy or other means?
Session hijacking:
This is scary; I still don't know how this can happen when you are using ASP.NET Membership. I thought it was a pretty solid membership system!
Will a hacker be able to steal someone's password through this method?
A way to hide or encrypt your code:
I think I am making a fool of myself here, because I mentioned that if it is public to JavaScript then it is public to anyone, but my client does not want people to see the logic and functions behind the code.
Hiding the web service:
With Fiddler, you can see in the raw data the path to, for example, www.mysite.com/toparticles/getTopArticles(10); again, this scares my client. I have tried to disable WSDL and documentation in web.config, but this only blocks direct access to the file and nothing more, or am I wrong? Is there a way to hide the path to the web service?
All in all, my top concerns are preventing the hammering of my web services and hiding my code as much as possible.
Am I too paranoid? On the front end I am mostly going to be pulling data, but I may also give the user the option to save, for example, his widget preferences to the DB, and it is not going to be over SSL, so the above security threats are valid.
I hope someone can also share his experience on this matter.
Thanks in advance.
Any functionality exposed on the web is going to be, well... exposed on the web. Even if you were using pure ASP.NET with postbacks, sniffers could see the traffic and mimic the postbacks; Ajax just takes that to its logical extreme. Web services are (for the most part) just like any other get/post system (RESTful or not).
There are some methods that you can use to secure your webservices from unauthorized access, but in truth these are the exact same things you would do to secure any other site (asp.net, traditional web, etc).
There are lots of articles all over the web about how to secure your website, and these will apply equally well to AJAX, Webservices, etc.
If you are really concerned about your webservices being publicly exposed, you can use your own custom reverse proxy to hide the services inside the customer's network and only expose the proxy to the outside world. You can then secure the services so they are only accessed through the proxy, and provide whatever security on the proxy you feel is relevant. In this way all traffic comes through servers that you specify and is restricted (to a reasonable degree) from prying eyes. In general, though, I think this might be overkill, especially for a portal site.
One thing to talk with your client about is the upsell value of using webservices as a sellable product to integrators. In other words, designing the security into the webservices and using the portal only as an example of how to put it all together. A clear example of this is SharePoint, which is in truth a collection of webservices and processes and the website is really just for convenience, the power of SharePoint is in the interop of the services.
For more specific answers to your security questions, there are plenty of posts here on SO as well as the web covering each of your specific points.
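As a concrete illustration of the IP-based throttling raised in the question, here is a minimal sketch of a fixed-window rate limiter as an ASP.NET HTTP module (the one-minute window and 300-request limit are hypothetical, and a real deployment would also need eviction of stale entries):

```csharp
// RateLimitModule.cs: a crude per-IP throttle, for illustration only.
using System;
using System.Collections.Concurrent;
using System.Web;

public class RateLimitModule : IHttpModule
{
    // Client IP -> (start of current window, request count in that window).
    private static readonly ConcurrentDictionary<string, Tuple<DateTime, int>> Counters =
        new ConcurrentDictionary<string, Tuple<DateTime, int>>();

    public void Init(HttpApplication app)
    {
        app.BeginRequest += (sender, e) =>
        {
            var context = ((HttpApplication)sender).Context;
            string ip = context.Request.UserHostAddress;
            DateTime now = DateTime.UtcNow;

            var entry = Counters.AddOrUpdate(
                ip,
                _ => Tuple.Create(now, 1),
                (_, old) => now - old.Item1 > TimeSpan.FromMinutes(1)
                    ? Tuple.Create(now, 1)              // start a new window
                    : Tuple.Create(old.Item1, old.Item2 + 1));

            if (entry.Item2 > 300) // hypothetical limit: 300 requests/minute
            {
                context.Response.StatusCode = 429; // Too Many Requests
                context.Response.End();
            }
        };
    }

    public void Dispose() { }
}
```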

is Silverlight more friendly to load-balancing than ASP.NET?

I was discussing load balancing with a colleague at lunch. I admit that I know very little about this topic. We were discussing the various ways of maintaining session state in an ASP.NET application, none of which suited the high-performance load balancing he was looking for.
What about Silverlight? says I. As far as I know it is stateless, you've got the app running in the browser and you've got services on the server that feed/process data.
Does Silverlight totally negate the need for Session state management? Is it more friendly to load-balancing? Is it something in between?
I would say that Silverlight is likely to be a little more load-balancer friendly than ASP.NET. You have much more sophisticated mechanisms for maintaining state (such as isolated local storage), and pretty much, you only need to talk to the server when (a) you initially download the application, and then (b) when you make a web service call to retrieve or update data. It's analogous in this sense to an Ajax application written entirely in C#.
To put it another way, so long as either (a) your server-side persistence layer knows who your client is, or (b) you pass in all relevant data on each WCF call, it doesn't matter which web server instance the call goes to. You don't have to muck about with firewall-level persistence to make sure your HTTP call goes back to the right web server.
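For example, a stateless service contract might look like this sketch (the contract and member names are hypothetical; the point is that every call carries all the context it needs):

```csharp
using System.Runtime.Serialization;
using System.ServiceModel;

[ServiceContract]
public interface IGameService
{
    // Stateless: the client identifies itself and passes all relevant
    // data on every call, so any load-balanced instance can serve it.
    [OperationContract]
    ScoreResult SubmitScore(string authToken, string playerId, int score);
}

[DataContract]
public class ScoreResult
{
    [DataMember] public bool Accepted { get; set; }
    [DataMember] public int NewRank { get; set; }
}
```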
I'd say it depends on your application. If it's a banking application, then yes, I want something timing out after 5 minutes and asking for my password again. If it's Facebook, then not so much.
Silverlight depends on XMLHttpRequest like any other Ajax implementation and is therefore capable of maintaining a session, forms authentication, roles, profiles, etc.
The benefit you are getting is obviating virtually all of the traffic: JSON requests are negligible compared to serving pages, and even the .xap can be cached on the client.
I would say you are getting the best of both worlds in regards to your question.
