Google Translation API - google-translate

Has anyone used the Google Translation API? What is the maximum text length it accepts?

The limit used to be 500 characters; it is now 5,000.

500 characters

At the moment, the throttle limit is 100,000 characters per day. Looks like you can apply to have that limit increased/removed.

I've used it to translate Japanese to English.
I don't believe the 500-character limit is true if you use http://code.google.com/p/jquery-translate/, but one thing that is true is that you're restricted in the number of requests you can make within a certain period of time. They also try to detect whether you're sending a lot of requests within a short period, almost like a mini denial-of-service attack.
So when I did this I wrote a client with a random length sleep between requests. I also ran it on a grid so all the requests didn't come from a single IP address.
I had to translate ~2000 Java messages from a resource bundle from Japanese to English. It worked out pretty nicely, as long as the text was single words. Longer phrases with context came out awkwardly.
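For what it's worth, the throttling part of that approach is easy to reproduce. Below is a minimal sketch of a client loop with a random-length sleep between requests; translate_one is a placeholder for whatever function actually calls the translation service, not part of any Google API.

    import random
    import time

    def translate_all(messages, translate_one):
        """Translate a dict of messages one at a time, pausing between calls.

        translate_one is a placeholder for whatever function actually calls
        the translation service; it is not part of any Google API.
        """
        results = {}
        for key, text in messages.items():
            results[key] = translate_one(text)
            # Random-length sleep so the traffic doesn't look like an evenly
            # spaced, automated flood of requests.
            time.sleep(random.uniform(1.0, 4.0))
        return results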

Please have a look at this link; it gives the correct answer at the bottom of the page.
https://developers.google.com/translate/v2/faq
What is the maximum number of characters per request?
The maximum size of each text to be translated is 5000 characters, not including any HTML tags.

You can send source strings of up to 5,000 characters, but there are a few provisos that are sometimes lost:
You can only send the 5,000 characters via the POST method.
If you use the GET method, you are limited by the 2,000-character length limit on URLs. If a URL is longer than that, Google's servers will just reject it.
Note: the 2,000-character limit includes the path and the rest of the query string, and you must count URL encoding (for instance, every space becomes %20 and every quotation mark %22).
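A rough sketch of the POST variant, using Python's requests library against the v2 REST endpoint. The endpoint URL, parameter names and response shape below are my reading of the v2 docs at the time of writing, so verify them before relying on this; the API key is a placeholder.

    import requests

    API_KEY = "YOUR_API_KEY"   # placeholder
    ENDPOINT = "https://translation.googleapis.com/language/translate/v2"

    def translate_long_text(text, target="en", source=None):
        # Sending the text in the POST body keeps it out of the URL, so the
        # ~2,000-character URL limit never comes into play.
        payload = {"q": text, "target": target, "format": "text", "key": API_KEY}
        if source:
            payload["source"] = source
        resp = requests.post(ENDPOINT, data=payload, timeout=30)
        resp.raise_for_status()
        return resp.json()["data"]["translations"][0]["translatedText"]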

The Cloud Translation API is optimized for translating smaller requests. The recommended maximum length for each request is 5K characters (code points). However, the more characters you include, the higher the response latency. For Cloud Translation - Advanced, the maximum number of code points for a single request is 30K. Cloud Translation - Basic has a maximum request size of 100K bytes.
https://cloud.google.com/translate/quotas
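If your input can exceed the recommended 5K code points, one option is to split it up before sending. A hedged sketch in plain Python (no API calls; len() on a Python 3 string counts code points, which matches how the quota is expressed):

    def chunk_for_translation(text, max_code_points=5000):
        """Split text into pieces of at most max_code_points, breaking at
        newlines where possible and otherwise at the last space that fits."""
        chunks, current = [], ""
        for paragraph in text.splitlines(keepends=True):
            if len(current) + len(paragraph) <= max_code_points:
                current += paragraph
                continue
            if current:
                chunks.append(current)
                current = ""
            while len(paragraph) > max_code_points:
                cut = paragraph.rfind(" ", 0, max_code_points)
                cut = cut if cut > 0 else max_code_points
                chunks.append(paragraph[:cut])
                paragraph = paragraph[cut:]
            current = paragraph
        if current:
            chunks.append(current)
        return chunks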

Related

What is the maximum length of a FCM getToken? [duplicate]

Working with the "new" Firebase Cloud Messaging, I would like to reliably save client device registration_id tokens to the local server database so that the server software can send them push notifications.
What is the smallest size of database field that I should use to save 100% of client registration tokens generated?
I have found two different libraries that use TextField and VarChar(255) but nothing categorically defining the max length. In addition, I would like the server code to do a quick length check when receiving tokens to ensure they "look" right - what would be a good min length and set of characters to check for?
I think this part of FCM is still the same as GCM. Therefore, you should refer to this answer by @TrevorJohns:
The documentation doesn't specify any pattern, therefore any valid string is allowed. The format may change in the future; please do not validate this input against any pattern, as this may cause your app to break if this happens.
As with the "registration_id" field, the upper bound on size is the max size for a cookie, which is 4K (4096 bytes).
Emphasizing the "The format may change in the future" part, I would suggest staying safe and allowing for a length beyond the usual maximum mentioned above, since the format and length of a registration token may also vary.
For the usual length and characters, you can refer to these two answers the latter being much more definitive:
I haven't seen any official information about the format of the GCM registrationId, but I've analyzed our database of such IDs and can draw the following conclusions:
in most cases the length of a registration ID is 162 characters, but there are variations down to 119 characters, and possibly other lengths too;
it consists only of these characters: [0-9a-zA-Z\-\_]*
every regID contains one or both of these "delimiters": - (minus) or _ (underscore)
I am now using Firebase Cloud Messaging instead of GCM.
The length of the registration_id I've got is 152.
I've also got ":" at the very beginning each time like what jamesc mentioned (e.g. bk3RNwTe3H0:CI2k_HHwgIpoDKCIZvvDMExUdFQ3P1).
I make the token as varchar(255) which is working for me.
However, the length of the registration_id has no relationship with the 4K size. You are allowed to send data of any size through the network. Usually, cookies are limited to 4096 bytes, which consists of the name, value, expiry date, etc.
This is a real FCM token:
c2aK9KHmw8E:APA91bF7MY9bNnvGAXgbHN58lyDxc9KnuXNXwsqUs4uV4GyeF06HM1hMm-etu63S_4C-GnEtHAxJPJJC4H__VcIk90A69qQz65toFejxyncceg0_j5xwoFWvPQ5pzKo69rUnuCl1GSSv
As you can see, the length of the token is 152.
I don't think the upper limit for a registration ID is 4K. It should be safe to assume that it is much lower than that.
The upper limit for a notification payload is 4KB (link), and the notification payload includes the token (link). Since the payload also needs to include the title, body, and other data too, the registration ID should be small.
That's what I understand from the docs ¯\_(ツ)_/¯
The last tokens I got were 163-chars long. I think it's safe to assume that they will never exceed 255 chars. Some comments in the other answer reported much higher lengths!
Update
So far, in the 4 months I've been running my app, there are over 100k registration IDs, and every single one of them is 163 chars long. It's very possible that Google keeps the ID length stable so as not to break apps. Hence, I'd suggest:
getting a few registration IDs in your local machine
measuring their length and verifying it's constant (or at least it doesn't change significantly)
picking a safe initial value, slightly higher than the ID length
I think it's unlikely for the length to change now, but I'll keep an eye. Please let me know if you noticed IDs of different lengths in your apps!
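To tie this back to the original question about a quick server-side check: a deliberately loose sanity check, based only on the observations in this thread (152-163-character tokens, the [0-9a-zA-Z_-] alphabet plus a single colon) and not on any official format guarantee, might look like the sketch below. The bounds are assumptions; keep them generous, since Google says the format may change.

    import re

    # Alphabet observed in this thread plus the ':' separator; not an official spec.
    TOKEN_PATTERN = re.compile(r"^[0-9A-Za-z_-]+(:[0-9A-Za-z_-]+)?$")

    def looks_like_fcm_token(token: str) -> bool:
        """Loose sanity check before storing a registration token.

        Rejects only things that are obviously not a token (far too short,
        absurdly long, or containing characters never seen in the wild);
        anything stricter risks breaking if the format changes.
        """
        if not 100 <= len(token) <= 1024:   # assumed bounds, well beyond observed lengths
            return False
        return bool(TOKEN_PATTERN.match(token))

On the storage side the same caution applies: a TEXT column, or a varchar comfortably larger than the observed 163 characters, avoids truncating tokens if they ever grow.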

What is a safe maximum length a segment in a URL path should be?

A lot of people are asking "What is the maximum length a URL can be?" but as far as I can see nobody is asking the question:
What is a safe maximum length a segment in a URL path should be?
I think this question is equally as important.
This question is a general question aimed at supporting as many systems out of the box as possible.
In C#, you can get a list of URL path segments from an incoming request. With security modules installed, what is considered the maximum length a URL path segment can be in this scenario?
I've read on the following page that URL path segments over 260 characters can cause problems in custom ASP.NET modules:
http://www.paraesthesia.com/archive/2011/08/26/long-url-path-segments-in-asp-net-routing-yield-400.aspx/
In web browsers, you type URL path segments regardless of which website you visit; / is a URL path segment which is usually mapped to a homepage. With Internet Explorer, Chrome and Firefox being popular browsers, what is the maximum URL path segment length they support?
I can see from the following resource that the maximum length of a URL path differs for different browsers and the figure is sometimes quite high:
What is apache's maximum url length?
But this is a path and not a path segment.
I'm also aware that when rewriting paths, the underlying file system path length comes into play, and the ballpark figure I can see supported is around 255 characters on a *nix OS.
Other considerations include the maximum length of a URL path segment in a database table. For instance, in MySQL a varchar column can contain up to 255 characters, but is there a case for this: are people storing segments of paths in MySQL tables, or are they storing full URLs in varchar columns? Could this mean 255 characters is too long for a URL path segment?
Is there any W3C specification on how long a URL path segment can be as I can't spot anything?
I did read the W3C specification on URI's but again I didn't spot anything of use:
http://www.ietf.org/rfc/rfc2396.txt
I'm quite baffled that there is no set standard on what a length a URL path segment should be, so maybe I am missing something?
I'm really looking for as much information as possible on what different systems support, and what is considered a safe length for a URL path segment.
Possibly related: What is the maximum length of a URL in different browsers?
In short
According to the HTTP spec, there is no limit to a URL's length. Keep your URLs under 2048 characters; this will ensure the URLs work in all clients & server configurations. Also, search engines like URLs to remain under approximately 2000 characters.
Chrome has a 2MB limit for URLs; IE8 and IE9 have a 2,083-character limit. So everything points to keeping your URLs limited to approximately 2,000 characters.
Also, from a usability point-of-view, URLs that long are not usable/readable by users.
However, the domain name has a maximum length of 255 characters.
So to be on the safe side, the maximum length of a URL segment would be around 1,745 characters, given that your URL consists of a single segment.
There is no such specification limit. There may be implementation limits, but you won't find those in the specifications.
Nit: URIs are defined by the IETF, not the W3C, and the current spec is RFC 3986.
The URL length shouldn't exceed 2K (just common practice). A path segment can be any size.
The domain length shouldn't be more than 255 characters (see RFC 3986). Limits exist only in implementations. Segments attached to URLs are limited only in particular cases (the old common usage).
Nowadays almost all requests go through one file for the rewrite rule, so segment length does not matter.
After that, the segment is kept in a variable that can be very large, so there's basically no limit.
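Pulling the practical numbers from this thread together, a small pre-flight check is easy to write. The 255 and 2,000 figures below are heuristics taken from the answers above, not values from any specification:

    from urllib.parse import urlsplit, unquote

    MAX_SEGMENT_CHARS = 255    # stays under ASP.NET's 260 and typical *nix filename limits
    MAX_URL_CHARS = 2000       # common practical limit for the whole URL

    def url_length_problems(url):
        """Return human-readable problems with the URL's length, if any."""
        problems = []
        if len(url) > MAX_URL_CHARS:
            problems.append(f"URL is {len(url)} chars, over {MAX_URL_CHARS}")
        for segment in filter(None, urlsplit(url).path.split("/")):
            decoded = unquote(segment)
            if len(decoded) > MAX_SEGMENT_CHARS:
                problems.append(f"segment {decoded[:20]!r}... is {len(decoded)} chars")
        return problems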

Elevation service UNKNOWN_ERROR

I'm having difficulty with the Google maps V3 JavaScript elevation service.
According to a Google Groups posting ( https://groups.google.com/forum/#!msg/google-maps-js-api-v3/Z6uh9HwZD_k/G1ur1SJN7fkJ ), it appears that if you use getElevationAlongPath() it compresses and sends the entire path to the Google server as an Ajax GET request and subsamples it on their server. This means that if you have a large number of path segments, the encoded URL exceeds the maximum URL length and the request fails with UNKNOWN_ERROR.
Can anyone confirm that this is a URL length issue?
I've tried doing my own subsampling along the path and sending just the points I want elevation data for as a getElevationForLocations() request. This does seem to be an improvement, but I'm still getting some UNKNOWN_ERROR responses. These occur unpredictably. Sometimes a request with 400 points returns successfully. Other requests will fail with only 300 points passed. I'm guessing that this still a problem with URL length (presuming getElevationForLocations() also sends URL-encoded data to Google).
The documentation says that "you may pass any number of multiple coordinates within an array, as long as you don't exceed the service quotas." This doesn't seem to be the case.
Does anyone have any suggestions for a reliable way to get a large number of elevation data points (500?) from a long path?
Thanks,
Colin
After a bit more digging, this seems to be the situation.
The JavaScript API for elevation uses the HTTP elevation service behind the scenes. The HTTP elevation service docs do say that requests are limited to 2048 characters. However, if you're using the HTTP service directly, you build your own URLs. This means you can check the length before sending. If you use the JavaScript API, the URL is built for you, but the API code doesn't check the URL length before sending.
The call end-point URL and the necessary parameters take up 78 characters leaving 1970 for the encoded points.
This is where it gets messy. The number of characters in an encoded point varies with the size and precision of the lat and lng values. Generally, somewhere between 8 and 12 characters per point. An added complication is that some of the characters used in the path encoding may need URL-encoding - further increasing the number of characters needed per point by an unknown, but potentially significant amount (2 extra characters for each path character in need of URL encoding).
All of these complications mean that it's theoretically possible for a call to result in too long a URL with just 55 points, though that's very, very unlikely. A safe limit is probably 150 points (but this may still fail occasionally). 200 should work most of the time. 250 should be about the maximum.
In reality, from a small number of tests:
- 200 worked every time
- 300 usually works
- 400 sometimes works
The discrepancy between the calculation and the tests suggests that the JavaScript API may be doing some further form of compression, or that I've got something wrong in my calculations.
Your suspicions are correct, this is a URL length issue. If you have Chrome's Developer Tools open when you submit the request you'll see an HTTP 414 (Request-URI Too Large) error. The URL is around 3000 characters which is about 1000 too many (2048 is a common max url length).
Internally the Google Maps API converts all those points to what looks like an encoded polyline, which helps compress that data down, but it's clearly not enough for this really long path. It might be worth splitting the request up into multiple segments when you know you're going to be including more than N points (I'd experiment with N to see what works).
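Since the answer above notes that the JavaScript wrapper sits on top of the HTTP Elevation API, one way to batch is to call the HTTP service directly and keep each request's URL short. A sketch, assuming the documented elevation/json endpoint and a placeholder API key; the batch size of 50 is my own conservative figure, because raw lat,lng pairs take roughly 20 URL-encoded characters each (unlike the polyline encoding the JavaScript API uses), so 50 points keeps the URL comfortably under 2,048 characters:

    import requests

    API_KEY = "YOUR_API_KEY"   # placeholder
    ENDPOINT = "https://maps.googleapis.com/maps/api/elevation/json"

    def elevations(points, batch_size=50):
        """Fetch elevations for a list of (lat, lng) tuples in small batches."""
        results = []
        for start in range(0, len(points), batch_size):
            batch = points[start:start + batch_size]
            locations = "|".join(f"{lat:.5f},{lng:.5f}" for lat, lng in batch)
            resp = requests.get(ENDPOINT,
                                params={"locations": locations, "key": API_KEY},
                                timeout=30)
            resp.raise_for_status()
            data = resp.json()
            if data["status"] != "OK":
                raise RuntimeError("Elevation request failed: " + data["status"])
            results.extend(r["elevation"] for r in data["results"])
        return results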

Questions about maxJsonLength in ASP.NET

Recently, I ran into a problem with my application: the size of the JSON string returned from the server was exceeding the default maxJsonLength. I've done some research and implemented some fixes including a variation of paging. Everything looks great at the moment. However, I still have some questions unanswered.
First of all, the majority of the sources point to this article:
http://geekswithblogs.net/frankw/archive/2008/08/05/how-to-configure-maxjsonlength-in-asp.net-ajax-applications.aspx
1. Why 2,097,152 (2MB)? 2MB is way too much data to be loaded for a web page. (Unless, the user is downloading something, but that's a different story) Even 1MB is too much.
2. Then, the author goes on with an example of a maxJsonLength of 500,000. Why this number? Is this just an example of how to set the property? Some sources state that 500,000 is the limit. Well, it's not, because I tested my application with 2,097,152 (2MB, roughly 4 times the 500,000) and it worked.
3. Some other sources state that 4MB is the limit... So, what is the limit? Is there a limit? Does it have something to do with the limit of the response from the server?
4. Finally, I'd like to get a strong suggestion on a length of JSON string being received from the server. Not the number to which maxJsonLength should be set, but the actual length of the JSON string, kind of "what to strive for".
Thank you in advance.
There is no hard and fast rule here. Your Json length is going to depend on your application and what information you are returning to the client.
If you really want a "rule of thumb", it should be as SMALL as possible to communicate the data that you need.
For max values, the true limitation is again most likely going to depend on browser requirements, but I personally would never go with more than 2MB for a JSON message, simply due to what it would take to send that down.
I understand that the total limit is determined by the lesser of the maxJsonLength that you have mentioned and the HttpRuntimeSection.MaxRequestLength. I am currently testing this and I will get back to you.
Of course, the big issue here is that it is seldom a good idea to return such large amounts of data. Whenever I have a response that starts to exceed about 100KB, I take another look at my overall design and find ways to serve out smaller chunks as they are needed. Even 100KB is high for most pure-data scenarios, by which I mean textual data, not images or scripts.
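The paging idea is framework-agnostic; here is a hedged sketch in Python (rather than ASP.NET, purely to show the shape of the logic) that serves one page at a time and flags any page whose serialized size blows past the ~100KB design threshold mentioned above:

    import json

    MAX_PAYLOAD_BYTES = 100 * 1024   # the ~100KB design threshold discussed above

    def paged_response(items, page, page_size=100):
        """Return one serialized page of results plus paging metadata."""
        start = page * page_size
        body = {
            "page": page,
            "page_size": page_size,
            "total": len(items),
            "items": items[start:start + page_size],
        }
        encoded = json.dumps(body).encode("utf-8")
        if len(encoded) > MAX_PAYLOAD_BYTES:
            # Reduce page_size or trim the fields included per item.
            raise ValueError(f"page is {len(encoded)} bytes; shrink the page")
        return encoded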

What is the optimum limit for URL length? 100, 200+

I have an ASP.Net 3.5 platform and windows 2003 server with all the updates.
There is a limit with .NET in that it cannot handle more than 260 characters. Moreover, if you look it up on the web, you will find that an unpatched IE6 fails to work above 100 characters.
I want to have the rewrite path module to be supported on maximum number of browsers, so I am looking for an acceptable limit to which I can create verbose URL's.
A URL is path + query string, and the linked article only talks about limiting the path. Therefore, if you're using ASP.NET, don't exceed a path of 260 characters. Less than 260 will always work, and ASP.NET has no trouble with long query strings.
http://somewhere.com/directory/filename.aspx?id=1234
                                             ^^^^^^^^ querystring
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ path
Typically the issue is with the browser. Long ago I did tests and recall that many browsers support 4K URLs, except for IE, which limits them to 2083, so for all practical purposes, limit it to 2083. I don't know if IE7 and IE8 have the same limitation, but if you're going for broad compatibility, you need to go for the lowest common denominator.
There is no length limit specified by the W3C, but look here for practical limits
http://www.boutell.com/newfaq/misc/urllength.html
pick your own limit from that.
The default limit in IIS is 16,384 characters
But IE doesn't support more than 2083
This article gives the limits imposed by various browsers. It seems that IE limits the URL to 2083 chars, so you should probably stay under that if any of your users are on IE.
Define "optimum" for your application.
The HTTP standard itself places no hard limit, so it depends on your application:
The HTTP protocol does not place any a priori limit on the length of a URI. Servers MUST be able to handle the URI of any resource they serve, and SHOULD be able to handle URIs of unbounded length if they provide GET-based forms that could generate such URIs. A server SHOULD return 414 (Request-URI Too Long) status if a URI is longer than the server can handle (see section 10.4.15).
Note: Servers ought to be cautious about depending on URI lengths above 255 bytes, because some older client or proxy implementations might not properly support these lengths.
So the question is - what is the limit of your program, or what is the maximum resource identifier size your program needs to perform all its functionality?
Your program should have a natural limit.
If it doesn't, you might as well set it to 16k, as you don't have enough information to define the problem.
-Adam
Short ;-)
The problem is that every web server and every browser has its own idea of how long the maximum is. The RFC for the HTTP protocol gives no maximum length. IE limits a GET to 2,083 characters; the path itself may be at most 2,048 characters. However, this limit is not universal. Firefox claims to support at least up to 65,536, and some people have verified that on some platforms even 100,000 characters work. Safari is above 80,000 (tested). The Apache server, on the other hand, has a limit of 4,000. Microsoft's Internet Information Server has a limit of 16,384 (but it is configurable).
My recommendation is to stay below 2,000 characters in any case. This is not guaranteed to work with every browser in the world (especially not older ones), but it will work with all modern browsers. Further, I recommend using POST wherever possible (e.g. avoid using GET for form submits; if some users want to simulate a form submit via GET, make sure your application supports the desired parameters via POST or GET, but when you submit the page yourself via a button or JS, prefer POST over GET).
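That POST-over-GET advice is easy to automate when you control the client. A sketch using Python's requests library: it builds the GET URL first, checks its full encoded length against the ~2,000-character figure from this thread, and falls back to POST otherwise (the server must, of course, accept the same parameters via POST, as noted above):

    import requests

    MAX_GET_URL = 2000   # conservative cross-browser/server figure from this thread

    def fetch(url, params=None):
        """Use GET while the full URL (path plus encoded query string) stays
        short, otherwise send the parameters in a POST body instead."""
        prepared = requests.Request("GET", url, params=params).prepare()
        if len(prepared.url) <= MAX_GET_URL:
            return requests.get(url, params=params, timeout=30)
        return requests.post(url, data=params, timeout=30)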
I think the RFC says 4096 chars but IE truncates down to 2083 characters. Stay well under that to be safe.
Practically, shorter URLs are friendlier.
More information is needed, but for normal situations I would say try to keep it under 150 for sure. If for nothing else than pure aesthetics, I hate it when someone sends me a GI-NORMOUS link...
Are you passing values through the query string? I assume that is why you asked, correct?
What is "optimum" anyway?
GET requests can be several kB in length, so this is entirely subjective.
I'd say - stay within the address bar length of a maximized 1024x768 window to be user friendly.
If you're trying to get people to remember the URL, I wouldn't go more than 60. Use words if possible, because it's easier to remember "www.example.com/this-is-the-url" than "www.example.com/179264". If you're trying to get the page indexed, you could probably go more. The spiders look for words in the title too, and some people may be more likely to click on the link if the URL looks readable.
When you say "Optimum", I think "Easily Accessible To Users", in which case, I think the shorter the URL, the better. I would think 20-30 characters maximum, in that case.
