REST URL design - multiple resources in one HTTP call [duplicate]

Possible duplicate: Rails 3 Custom Route that takes multiple ids as a parameter (closed 10 years ago).
From what I understand, a good REST URL for getting a resource would look like this:
/resource/{id}
The problem I have is that I often need to fetch a large number of resources at the same time and do not want to make a separate HTTP call for each one of them.
Is there a neat URL design that would cater for that or is this just not suitable for a REST API?

Based on your response, the answer to your question is to create a new resource that contains that single set of information, e.g.
GET /Customer/1212/RecentPurchases
Creating composite URLs that carry many identifiers in a single URL limits the benefits of caching and adds unnecessary complexity to both the server and the client. When you load a web page that has a bunch of graphics, you don't see
GET /MyPage/image1.jpg;image2.jpg;image3.jpg
It just isn't worth the hassle.
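To make that concrete, here is a minimal sketch (Python/Flask, with a hypothetical in-memory data store) of exposing the set as its own resource instead of packing many IDs into one URL:

from flask import Flask, jsonify

app = Flask(__name__)

# Hypothetical stand-in for a real data store.
RECENT_PURCHASES = {
    "1212": [{"id": "p1", "item": "Coffee"}, {"id": "p2", "item": "Tea"}],
}

@app.route("/Customer/<customer_id>/RecentPurchases")
def recent_purchases(customer_id):
    # One cacheable URL per customer; no composite IDs in the path.
    return jsonify(RECENT_PURCHASES.get(customer_id, []))

if __name__ == "__main__":
    app.run()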

I'd say /resources/foo,bar,baz (the separator may vary depending on the nature of the IDs and your aesthetic preferences: "foo+bar+baz", "foo:bar:baz", etc.). It looks a bit "semantically" neater than foo/bar/baz ("baz of bar of foo"?).
If the resource IDs are numeric, you could even support a range shortcut like /resources/1,3,5-9,12. A sketch of how the server might expand such an ID spec follows below.
Or, if you need to query not for resources with specific IDs but for a group of resources having specific properties, maybe something like /resources/state=complete/size>1GiB/!active/...
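For the numeric case, a small sketch of the server-side expansion of such an ID spec (Python; ranges are assumed to be inclusive and IDs positive integers):

def expand_ids(spec):
    # Expand an ID list like "1,3,5-9,12" into individual numeric IDs.
    ids = []
    for part in spec.split(","):
        if "-" in part:
            start, end = part.split("-", 1)
            ids.extend(range(int(start), int(end) + 1))
        else:
            ids.append(int(part))
    return ids

print(expand_ids("1,3,5-9,12"))  # [1, 3, 5, 6, 7, 8, 9, 12]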

I have used something like this in the past:
/resources/a/d/
and that would return a list of the resources between a and d, something like
<resources>
<resource>a</resource>
<resource>b</resource>
<resource>c</resource>
<resource>d</resource>
</resources>
You could also put more advanced searches into the URL, depending on what the resource actually is.

Maybe you could try something like:
[GET] /purchases/user:123;limit:30;sort_date:DESC
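If you go that route, the matrix-style segment is easy to split server-side; a quick sketch (Python, with the ";" and ":" separators assumed as shown):

def parse_matrix_segment(segment):
    # Turn "user:123;limit:30;sort_date:DESC" into a dict of query parameters.
    params = {}
    for pair in segment.split(";"):
        if not pair:
            continue
        key, _, value = pair.partition(":")
        params[key] = value
    return params

print(parse_matrix_segment("user:123;limit:30;sort_date:DESC"))
# {'user': '123', 'limit': '30', 'sort_date': 'DESC'}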

Related

How to automatize image search with 2 different types of queries?

I need to do some data scraping and it would be very useful for me if there was a way to implement an algorithm that downloads a subset of images matching a certain query, but only within a specific 'root' website (for example all images in www.example.com including all subdirectories such as www.example.com/sub1).
I already know that it might be impossible to find all subdirectories of a root website unless they're listed somewhere. Since I do not know all the subdirectories, I think I should avoid looping over them and extracting all images (with an online image extractor, for instance).
So in my opinion the easiest thing to do is to let Google do most of the work, so that it outputs all (or at least most) of the images contained in any subdirectory of the root, and then apply the query to those results.
The problem is thus divided in 2 parts:
Get all the images from google image search that come from a specific website
Only get the subset of images matching the query. This, I guess, would be possible with some AI recognition (all images labeled as animals, or buildings, and so on).
I know that this is a very broad question, so I do not expect any answers with code.
What I would like to know is:
Do you think it is even possible to do that?
What programs would you suggest using for this purpose (both for the search and for the image recognition)?
If you think this question belongs more to another stack site let me know, I'm trying my best to be compliant with the rules. Thanks.
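For the "let Google do most of the work" part, one option is the Google Custom Search JSON API, which can restrict results to a site and to images. The sketch below (Python, requests) assumes you have an API key and a custom search engine ID; the parameter names reflect that API as I understand it and should be double-checked against its documentation.

import requests

API_KEY = "YOUR_API_KEY"          # hypothetical placeholder
SEARCH_ENGINE_ID = "YOUR_CX_ID"   # hypothetical placeholder

def image_search(query, site):
    # Ask the Custom Search API for images matching `query`, restricted to `site`.
    response = requests.get(
        "https://www.googleapis.com/customsearch/v1",
        params={
            "key": API_KEY,
            "cx": SEARCH_ENGINE_ID,
            "q": query,
            "searchType": "image",
            "siteSearch": site,   # restrict results to the "root" website
        },
        timeout=30,
    )
    response.raise_for_status()
    return [item["link"] for item in response.json().get("items", [])]

print(image_search("animals", "www.example.com"))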

Social Network API design for Like, Comment, and other Interactions?

Suppose I have a website where I showcase people's artwork, recipes, and journals (and this list is likely to grow). I want people to be able to comment, like, flag, etc. all of them.
I'm not looking for a database schema. I am using ASP.net Web API but the platform should be irrelevant. I am passing an Authorization Bearer header to identify the user who wants to like, comment on, flag, etc. another user's work.
I currently have two "schemes" but I'm not sure what the advantages/disadvantages of each one is, so I can't make a good decision.
The first looks like this:
POST api/<entity>/{id}/<action>
For example, api/journal/21793/like, or api/recipe/1005/comment. Of course, each action would have the appropriate body that includes info to store to the back end.
Implementing it this way, though, would require us to write entityCount x interactionTypeCount functions (actions), which is very tedious, though the API call is friendly.
The second API scheme looks like this:
POST api/<action>
For example, api/like, api/comment - and yes, there would be no way to indicate in the URL which object type, and which record of that type, is being targeted. So, if we were to comment on the recipe mentioned, we would call:
POST api/comment
{"id":"1005", "objectType":"recipe", "comment":"This was excellent!"}
This way, we call only 1 API to comment on anything in our system. So, each of the POST api/ functions would require the id and objectType properties in the body, plus whatever other required data as appropriate for the action. (I'm using JSON as an example).
I see that developers would have to know beforehand what our accepted objectType values are in order to post interactions to the right record. I'm not sure if this option is a good idea.
This question might get flagged for being off-topic or open to debate, but I'm hoping someone can tell me more pros and cons of each approach indicated above so I can make a better decision. Better yet, offer a solution that is either totally different or combines aspects of the two approaches above.
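For what it's worth, here is a minimal sketch of the second scheme's generic endpoint, written in Python/Flask only to illustrate the shape rather than your ASP.NET Web API stack; the names, validation, and persistence call are hypothetical.

from flask import Flask, jsonify, request

app = Flask(__name__)

ALLOWED_OBJECT_TYPES = {"artwork", "recipe", "journal"}

@app.route("/api/comment", methods=["POST"])
def comment():
    payload = request.get_json(force=True)
    object_type = payload.get("objectType")
    if object_type not in ALLOWED_OBJECT_TYPES:
        return jsonify({"error": "unknown objectType"}), 400
    # Hypothetical persistence call; a real implementation would also verify
    # that payload["id"] exists and read the user from the Authorization header.
    # save_comment(object_type, payload["id"], payload["comment"])
    return jsonify({"status": "created"}), 201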

Converting words to hashtags when posting to Twitter

I currently use LinqToTwitter to send posts to Twitter. I'd like to convert words in the title of the post to hashtags when it gets fired off as a tweet, so a blog post titled "Firefox is cool" would become "#Firefox is cool http://myshortu.rl/dhsgeh" on Twitter.
So far, the way I see it, I need a database table with the words I want to convert to hashtags. I'd have to parse out the title, compare the words to those in the DB, and add on the pound sign. Is a DB table the best way to do this? Or can I do it with an in-memory collection, or keep the words in web.config? Thanks....
The decision on whether to use a database or file (such as web.config) might depend on whether you want to write code that allows you to maintain the list. e.g. Add, Modify, Remove. If so, then a DB sounds like the easiest option. If the list is small and doesn't change, then adding a delimited list to web.config would work fine.
Since you're using ASP.NET you can't hold it in a memory variable, but you can hold the list in Cache. This can make for some very fast lookups, rather than multiple file or DB queries.
Just to put this into perspective though, it's tough to recommend a proper design in a forum because there might be details that aren't known. So, it's best to take my answer as something that helps think about what the tradeoffs are, rather than a definitive recommendation on what you should do.
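As a rough sketch of the lookup itself (Python; the word list here is hypothetical and could just as well be loaded from a DB, a config file, or the cache):

HASHTAG_WORDS = {"firefox", "chrome", "twitter"}

def hashtagify(title):
    # Prefix any word found in the hashtag list with "#", leaving the rest alone.
    words = []
    for word in title.split():
        if word.lower() in HASHTAG_WORDS:
            words.append("#" + word)
        else:
            words.append(word)
    return " ".join(words)

print(hashtagify("Firefox is cool"))  # "#Firefox is cool"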

Filtering Data in ASP.NET Web Services

I've been using this site for quite a while, usually being able to sort out my questions by browsing through the questions and following tags. However, I've recently come across a question that is rather hard to lookup amongst the great number of questions asked - a question I hope some of you might be able to share your opinion on.
As my problem is a bit hard to fit into the single line of a title, I'll try to give a bit more detail on the problem I've encountered. So, as the title says, I need to filter, or limit, some of the response data my standard ASP.NET SOAP-based web service returns when various web methods are invoked. The web service is used to return data consumed by other systems (a data repository, more or less), where the client today is able to specify a few parameters on how the data should be filtered and gets a full set of data back in return.
Well, easy enough, I thought: just put additional filtering options on the existing web methods that need more filtering applied, make adjustments on the server side, and we are all set to go. Well, unfortunately it turned out to be a bit trickier than that.
The problem I am facing is that I'm working on a web service running in a production environment, which needs to be extended so that additional filters can be applied to existing web methods being invoked, without affecting the calls already being made by the other systems the customer uses through their client stubs. This is where I am a bit troubled, since I can't seem to find the "right solution" for extending the current web service.
Today, the filter is sent as a custom data structure which holds information on which data should be filtered, but I am not sure whether I can simply add more information to this data structure without breaking code at the clients. One of my co-workers suggested that I extend web.config on the server side to hold a section with details on which data should be excluded (filtered out), but I don't find that to be a viable solution in the long run - and I don't trust customers with such an option, since it is likely to go wrong at some point. So the solution I am looking for is a way to apply a "second filter" to the data the client requests, so that instead of a full set of data only a fraction comes back, implemented in such a way that the filter can be easily modified and does not affect the current client calls.
Any suggestions on how I should approach this problem?
Thanks!
Kind regards,
E.
A pretty common practice is to create another instance of the application, or to use part of the URL to signify the version of the endpoint clients are connecting to - perhaps the virtual directory is the date. That way old calls will go to the old API and new calls will come in on the new API.
http://api.example.com/dostuff
vs
http://api.example.com/6-7-2011/dostuff
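To illustrate the URL-versioning idea (shown here as a Python/Flask sketch rather than your ASP.NET setup), the old path keeps its existing behaviour while new clients call the dated path:

from flask import Flask, jsonify

app = Flask(__name__)

@app.route("/dostuff")
def dostuff_v1():
    # Original behaviour, untouched for existing clients.
    return jsonify({"filtered": False})

@app.route("/6-7-2011/dostuff")
def dostuff_v2():
    # New behaviour with the extra filtering applied.
    return jsonify({"filtered": True})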

I want to create an RSS feed that is customizable

I want to create a dropdown of RSS feeds so users can pick and choose the feeds they want, and a custom feed would be created. Is this possible using straight-up HTML and JavaScript, or do I need a server technology? There are 7 separate feeds, so the possible combinations are 7! - far too many for me to individually code into if statements and separate feeds. Is there a program that will generate the possible feeds for me automatically after I update one of them? Then I could just upload the updated XML files.
Right. So I set up my XML files; say I have one for birthdays, one for deaths, and one for midlife crises. So that is three XML files with three separate links for RSS feeds. Now what I want is for people to be able to check off the ones to which they wish to subscribe rather than hitting each one separately. So I would have a form with three checkboxes and a submit button. I could do this with JavaScript by having 6 separate XML feeds, one for each possible combination. But if I have 4 feeds then I need to set up 24 feeds, and 5 would be 120 possible feed combinations.
So the question becomes: is there some software or library that will handle this computation for me and crank out RSS mixes/blends, similar to what some RSS-mixing software seems to do? The problem with the services and software I have seen is that they provide blending for people subscribing to feeds, but not for providers. I can see in my head how easily this could be done programmatically, even though it would spit out a lot of XML and HTML/JavaScript.
I guess another way about it would be for them to sign up for multiple feeds simultaneously but I'm not sure if that can be done.
If I am making no sense I apologize. I have never seen this done so it might not be possible. I am just going to go with the page with a bunch of RSS links.
Thanks for everyone's responses. I appreciate it.
Just because there are 7 options doesn't mean you need to write 7! if statements. You only need to check if each one of the options is set, and output something appropriately.
So, yes, you need to do this server side. And it's not at all difficult.
Where are you stuck, specifically? Your question is missing a few details.
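A server-side sketch of that point (Python/Flask; the feed names and file paths are hypothetical): read the checked feeds from the query string and merge their items, instead of pre-generating every combination.

import xml.etree.ElementTree as ET
from flask import Flask, Response, request

app = Flask(__name__)

FEED_FILES = {
    "birthdays": "birthdays.xml",
    "deaths": "deaths.xml",
    "midlife": "midlife.xml",
}

@app.route("/feed")
def combined_feed():
    # e.g. /feed?include=birthdays&include=deaths
    rss = ET.Element("rss", version="2.0")
    channel = ET.SubElement(rss, "channel")
    ET.SubElement(channel, "title").text = "Custom feed"
    for name in request.args.getlist("include"):
        path = FEED_FILES.get(name)
        if not path:
            continue
        source = ET.parse(path).getroot()
        for item in source.iter("item"):
            channel.append(item)
    return Response(ET.tostring(rss), mimetype="application/rss+xml")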
