Handle images using Web API 2 - ASP.NET

I am using Web API 2 for my application. I have a gallery and I need a way to show my images. I've searched a lot and found some articles like these:
http://www.dotnetcurry.com/aspnet/1120/aspnet-webapi-binary-contents-images
https://jamessdixon.wordpress.com/2013/10/01/handling-images-in-webapi/
but they don't seem to offer an efficient approach.
So I've decided to ask this question once again: how can I handle my images in Web API 2?
What's the best approach?

Without a little more info on your use case, it's hard to know what the best way would be. Typically a formatter would be used. Your client would then make two requests: /api/Images to get all your images, then create the proper view for each image (in HTML, an img tag pointing at each image's URL). You need a route that accepts an extension and a formatter that returns the proper content type and binary data. Your controller could do all the binary streaming itself, but a formatter allows better flexibility.
As for BSON, I have never used it. Our API has so many different clients that JSON is universal for our DTOs. For images, we always use the response body for the pure binary stream. This way there is no custom parsing that each client has to do. We can rely completely on the HTTP specs by using the proper content-type, content-length, etc. This also allows for range requests, so we can stream and do partial downloads with resume. I have no doubt BSON will catch on and we'll probably use it in the future, but for now it doesn't fit our use case.
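As a rough sketch of the controller-streaming route described above (the App_Data folder layout and the JPEG-only assumption are mine, not part of the answer), a Web API 2 action returning an image could look like this:
using System.IO;
using System.Net;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Web.Hosting;
using System.Web.Http;

public class ImagesController : ApiController
{
    // GET /api/images/5 - streams the image bytes with the proper content type.
    public HttpResponseMessage Get(int id)
    {
        // Hypothetical storage location; swap in your own image store.
        var path = HostingEnvironment.MapPath("~/App_Data/Images/" + id + ".jpg");
        if (path == null || !File.Exists(path))
            return Request.CreateResponse(HttpStatusCode.NotFound);

        var response = Request.CreateResponse(HttpStatusCode.OK);
        // StreamContent avoids loading the whole file into memory at once.
        response.Content = new StreamContent(File.OpenRead(path));
        response.Content.Headers.ContentType = new MediaTypeHeaderValue("image/jpeg");
        return response;
    }
}
A custom MediaTypeFormatter keyed off an extension route would return the same bytes while keeping the controller free of binary concerns, which is the flexibility the answer above is pointing at.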

Related

multipart/mixed support in Netty

By browsing the source code and playing with some toy examples I came to the conclusion that Netty currently (as of 5.0.0 alpha2) supports only multipart/form-data, but not multipart/mixed, at least not as specified in RFC 1341 (sec. 7.2). It looks like mixed is supported inside a part of multipart/form-data, though.
Is that really the case or am I missing something?
Since I got the very same question, I'll post here what could be the beginning of an answer...
However, the current implementation seems to have 2 limitations:
1) It supports only multipart/form-data. I would like to also be able to use multipart/mixed, which is very similar on the wire (see http://www.w3.org/Protocols/rfc1341/7_2_Multipart.html). I think the encoder/decoder could be extended to understand multipart/mixed and still create the same kinds of HttpDatas.
Yes, the current codec is focused on multipart/form-data. It should be possible to extend it, or propose a new one (probably based on it), to enable support for multipart/mixed.
The current codec was made based on user needs (mine in the beginning, others following). Since no one has yet requested support for multipart/mixed, it was not coded, except for the internal multipart/mixed handling.
The reference is RFC 1867.
As Netty loves contributions, you are more than welcome to propose yours ;-)
2) It seems that it is only possible to use efficient HttpDatas like FileUpload if you are in multipart/form-data. I would like to be able to add a FileUpload to the request and in this way make the contents of the file the body of the request, without making it a multipart request. I think this could be done by extending the standard POST encoder to understand FileUploads.
This could be a bit more complicated, since it has to be done without multipart, which is currently where the FileUpload class lives.
Maybe a good direction would be to switch to ChunkedFile or ChunkedNioFile and combine it with "your" HttpCodec, or in your "HttpHandler" when building the request body, in order to pass the content through the ChunkedFile.
Hoping this helps you in the right direction...

How to detect and possibly drop/sanitize http request parameters/headers to prevent XSS attacks

Recently, we found that some of our Spring MVC-based site's pages that accept query parameters are susceptible to XSS attacks. For example, a URL like http://www.our-site.com/page?s='-(console.log(document.cookie))-'&a=1&fx=326tTDE could result in the injected JS being executed in the context of the rendered page. These pages are all GET-based; no POST requests are supported.
These parameters are written in the markup in numerous places, so doing an HTML encode (in all these places) would be tedious and require more code changes. In some cases, they are also written to cookies.
Ideally, we would want to detect them as early as possible, say inside a Servlet Filter/Spring Interceptor and then for each request parameter, decide if we want to drop it all together, or sanitize it in some way, before it's available to the rest of the application. We would want this decision to be configurable as well, so that the approach to handle a particular request parameter can be modified over time without significant code change.
Now, since these are request parameters that we want to potentially modify, we would probably have to use an approach similar to the one described here, if we go the Filter way. We would potentially want to sanitize HTTP Request Headers similarly too.
So, what would be the most flexible/minimum-overhead way to handle this situation? Would ESAPI be able to both detect and sanitize them in a configurable way? It's not clear from its API what is possible. We would definitely not want to hand-roll regexes to do this. Also, would a Filter be the right place to handle this?
Thanks.

Google Geocoding Recommendation

I am looking into using the Google Maps API to do some geocoding. I want to implement client-side geocoding, to remove the possibility of hitting request limits.
I need to do some fairly complex logic on the result set, and I would prefer to do that in C# as it is an ASP.NET MVC application. However, part of that logic may involve making subsequent follow-up requests, and that again would require JavaScript.
So, my first thought is to make a service in my application to pass the JSON results to, with certain return types triggering the subsequent request. That seems a little convoluted, and I want to know from the community whether this is the best approach and whether there are any libraries/third-party tools that can help handle this situation.
I have an app that does something similar, with the complexity somewhat decoupled by using standardized events (within this app, not a W3C standard or anything):
Client uses native geolocation, SimpleGeo and Google Loader to guess where the user is and AJAX's that to the server.
Server uses client data, MaxMind, and user preferences to decide where to treat the user as being.
Server response is generic event data (as JSON response) that is converted by a generic AJAX response handler into one or more events triggered against the body element.
Depending on the page, various listeners are bound to the events and/or namespaces (see jQuery namespaced events), and they handle the updated location events, e.g., getting different weather data or changing local search results.
Some of those listeners in turn trigger other AJAX requests; the responses to those may also carry generic events to be triggered...
This way there's no sequential code I have to write, i.e., I can add or remove behaviors (simple or complex) without changing anything else. jQuery Events are all I use, really nothing much to it after you decide how you'll pattern things.
Let me know if that's interesting to you and you want me to expand or clarify a part of it.
You may want to try this API:
http://code.google.com/apis/maps/documentation/geocoding/
It's far more REST-like - no JavaScript required - and it may work better with C#.
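To sketch what that server-side call might look like in C# (the address value is illustrative, and you would hand the result to whatever JSON library you prefer):
using System;
using System.Net;

class GeocodeExample
{
    static void Main()
    {
        // Build a request against the JSON output of the geocoding web service.
        var address = Uri.EscapeDataString("1600 Amphitheatre Parkway, Mountain View, CA");
        var url = "http://maps.googleapis.com/maps/api/geocode/json?address=" + address + "&sensor=false";

        using (var client = new WebClient())
        {
            // Returns raw JSON; parse it and run the complex logic in C#.
            var json = client.DownloadString(url);
            Console.WriteLine(json);
        }
    }
}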
In the end I found the best solution was to do as I stated in my question: pass the JSON object to the controller, do the work, then return. It worked pretty well.
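A minimal sketch of that round trip on the MVC side (the GeocodeResult shape, action name, and follow-up rule are hypothetical placeholders, not the poster's actual code):
using System.Web.Mvc;

// Hypothetical shape of the client-side geocoding result posted back to the server.
public class GeocodeResult
{
    public double Lat { get; set; }
    public double Lng { get; set; }
    public string FormattedAddress { get; set; }
}

public class GeocodeController : Controller
{
    [HttpPost]
    public ActionResult Process(GeocodeResult result)
    {
        // Do the complex logic in C#, then tell the client whether a follow-up request is needed.
        var needsFollowUp = string.IsNullOrEmpty(result.FormattedAddress);
        return Json(new { followUp = needsFollowUp });
    }
}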

J2ME HttpConnection: which one is better, GET or POST?

In J2ME, which request method is better, GET or POST? Which one is faster? Which one uses less bandwidth? Which one is supported by most handsets? What are the advantages and disadvantages of each?
Also, see "Is there a limit to the length of a GET request?", which may be relevant if you plan to abuse GET.
Be aware that network operators (certainly in the UK) have caching schemes in place that may affect your traffic.
If you look at what Opera Mini does, they only use HTTP POST in their HTTP mode.
I think this is a great idea for the following reasons:
POSTs are never cached (according to the HTTP spec, at least) - this saves you from operator caching, etc.
It seems some operators do better with POSTs than GETs - this is a feeling I get from what some Nigerian users mention.
Opera probably has the most installations of any J2ME app in the world, and if they do it, it's probably safer.
No problems with HTTP GET limits on query length.
You can use a more flexible data format if you like, one that uses less data (no URL encoding needed on the data, as with GET).
I think it's much cleaner, but it does require some extra work: e.g., if you are using your HTTP web logs to count requests per "?type=blah", you'll have to move that into your site's logic.
If you follow the standards, GET should be used only for data retrieval and POST for adding new items. Which one is faster or slower depends on the server-side handler implementation.

Should I use Request.Params instead of explicitly doing Request.Form?

I have been using Request.Form for all my code. And if I need the query string, I hit that explicitly too. It came up in a code review that I should probably use the Params collection instead.
I thought it was a best practice, to hit the appropriate collection directly. I am looking for some reinforcement to one side or the other of the argument.
It is more secure to use Request.Form. This will prevent users from "experimenting" with posted form parameters simply by changing the URL. Using Request.Form doesn't make this secure for "real hackers", but IMHO it's better to use the Form collection.
By using the properties on the request you are narrowing your retrieval down to the proper collection (which is a good thing for readability and performance). I consider your approach to be a best practice and follow it myself.
I have always used
Request.Form("Param")
or
Request.QueryString("Param")
This is purely down to syntax that I find easier to read. I seriously doubt there is a performance impact.
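For what it's worth, the C# equivalent of hitting each collection explicitly (the parameter names here are just examples):
using System;
using System.Web.UI;

public partial class ExamplePage : Page
{
    protected void Page_Load(object sender, EventArgs e)
    {
        var id = Request.QueryString["id"];   // URL query string only (GET)
        var name = Request.Form["name"];      // posted form body only (POST)

        // Request.Params merges QueryString, Form, Cookies and ServerVariables,
        // so a value in the URL can collide with a posted value of the same name.
        var merged = Request.Params["name"];
    }
}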
The only time I use Request.Params instead of Form or QueryString is if I don't know the method by which the parameters will be passed in.
To put that in context, in 10 years I have used Request.Params in anger only once :)
Kindness,
D
I think it's better to use the Form and QueryString collections explicitly, unless you're deliberately trying to define flexible behavior in your application - for example, a search form where you might want the search parameters definable in a URL or saved in cookies, such as pagination preferences.
I would use Request.Form and Request.QueryString explicitly. The reason is that the two are not interchangeable. The query string is used for HTTP GET requests, and form variables for HTTP POST requests.
GET requests are typically applicable where you are requesting data - e.g., when you do a Google search, the search words are in the query string. POST is for when you are sending data to the web server for processing or storing. So when I say that the two are not interchangeable, I mean that you cannot change the page from using a GET to a POST without breaking functionality.
So IMHO, the implementation of the page can quite clearly reflect the fact that you intend it to be called by a GET or a POST request.
/Pete
