Where can I find a public URL that returns a dataset of approx. 3000 rows in JSON format for testing - jsBin

I need to put together a jsBin example that demonstrates a problem I'm having with some UI controls, which doesn't manifest itself with only a few records. I need a dataset of about 3000-5000 rows in JSON format that can be obtained from a URL via an AJAX XHR call. Can someone suggest a website, possibly with government or open-source data, that can be used for such testing?
P.S. It can't just be a download of a zipped file that can be expanded into a JSON text file. I need a JSON XHR response.
P.P.S. Ideally it would have 50-75 distinct values in one of the columns so I could demonstrate a grouping/aggregation issue. Data by US state, or by ZIP code within a state, would be excellent.
P.P.P.S. I've been searching the internet and found this site, and am now trying to figure out how to get JSON instead of XML:
http://www.sba.gov/about-sba-services/7617#city-county-state
All you have to do is this:
http://www.sba.gov/about-sba-services/7617#city-county-state/NY.json
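For whatever URL you end up using, a quick sanity check that it really returns parseable JSON with enough rows can save time before wiring it into jsBin. Here is a minimal sketch in Python using the requests library; the URL below is just the one from this post and may have moved, and note that anything after a # is a fragment that is normally not sent to the server, so if you get HTML back you will need to find the underlying API path instead.

import requests

# URL taken from the post above; swap in whichever candidate endpoint you find.
url = "http://www.sba.gov/about-sba-services/7617#city-county-state/NY.json"

resp = requests.get(url, timeout=30)
resp.raise_for_status()
data = resp.json()                      # raises ValueError if the body is not JSON

rows = data if isinstance(data, list) else list(data.values())
print(len(rows), "rows; first row:", rows[0] if rows else None)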

You can find a lot of open data here: free open data

Have you looked at Freebase? There should be a query to get you that many rows, and they offer JSON responses.
EDIT: There's a similar site, DBpedia. I built this query, which will return JSON and has about 3k rows:
http://dbpedia.org/sparql?default-graph-uri=http%3A%2F%2Fdbpedia.org&query=select+distinct+%3FConcept+where+%7B%5B%5D+a+%3FConcept%7D+LIMIT+3000&format=json%2Fhtml&timeout=30000&debug=on
You can go here and customize the query if you need more data.
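If you want raw JSON rather than the json/html viewer output that the link above asks for, you can request the standard SPARQL JSON results format from the same endpoint. A minimal sketch in Python (requests), reusing the query from the link:

import requests

endpoint = "http://dbpedia.org/sparql"
params = {
    "default-graph-uri": "http://dbpedia.org",
    "query": "select distinct ?Concept where {[] a ?Concept} LIMIT 3000",
    "format": "application/sparql-results+json",  # standard SPARQL JSON results format
}
resp = requests.get(endpoint, params=params, timeout=60)
resp.raise_for_status()

bindings = resp.json()["results"]["bindings"]
print(len(bindings), "rows")
print(bindings[0]["Concept"]["value"])            # e.g. an RDF class URI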

Why not create a page with a loop that generates those records as you desire? It shouldn't be that hard.
Maybe a Java servlet.
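If a servlet feels like overkill, the same loop idea fits in a few lines of Python's standard library: generate N fake records (the field names below are made up) with a state column of ~50 distinct values for the grouping test, and serve them as JSON with a CORS header so jsBin's XHR can read them. You would still need to host it somewhere publicly reachable.

import json
import random
from http.server import BaseHTTPRequestHandler, HTTPServer

STATES = ["AL", "AK", "AZ", "AR", "CA", "CO", "CT", "DE", "FL", "GA",
          "HI", "ID", "IL", "IN", "IA", "KS", "KY", "LA", "ME", "MD"]  # extend to all 50

def make_records(n=3000):
    # One record per row: an id, a state for grouping, and a numeric value to aggregate.
    return [
        {"id": i, "state": random.choice(STATES), "amount": round(random.uniform(10, 500), 2)}
        for i in range(n)
    ]

class JsonHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        body = json.dumps(make_records()).encode("utf-8")
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Access-Control-Allow-Origin", "*")  # allow jsBin's XHR to read it
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

if __name__ == "__main__":
    HTTPServer(("", 8000), JsonHandler).serve_forever()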


Power Automate Flow: Convert json to readable PowerAutomate-Items

In CRM I have a 'Doc_Config' value.
The 'Doc_Config' value gets passed to a Power Automate flow.
With the data I populate a Microsoft Word document. My problem is that, instead of the data, the raw text is filled into the Word document.
Is there a way to convert the raw text so Power Automate recognizes the data I actually want, as if it were presented to the flow directly?
Problem: you have probably copied the path for your objects and pasted the path value into your 'Doc_Config' value. The problem here is the #{...} pattern.
Solution: remove the #{...} pattern from any objects that you refer to by their path, as in the example below:
incorrect:
#{items('Apply_to_each_2')?['productname']}
correct:
items('Apply_to_each_2')?['productname']
Background:
In Power Automate cloud flows, you reference objects that the dynamic content tooling offers. However, sometimes you want to catch objects that the dynamic content tooling cannot see, or does not provide. At these times, you can refer to them by specifying the path for them as in the example below.
items('Apply_to_each_2')?['productname']
You can see the path for an object by hovering over any object that the dynamic content tooling provides.
Another option would be to simply parse the data from your array, as it is already JSON.
I will add three links here to Imgur images as I can't post images directly yet, but the idea is very simple:
1. Append your data to your array variable.
2. Add a Parse JSON action, click Generate from sample, and paste in the JSON you use.
3. The parsed output can now be used in all other steps.
The images (Append your data, Parse your data, Use your data) will clarify a lot.

Scraping BRfares for train fares

I am looking for advice. The following website
http://brfares.com/#home
provides fare information for UK train lines. I would like to use it to build a database of travel costs for season tickets from different locations. I have never done this kind of thing before, but I have experience with Python/Bash scripting and some HTML.
Viewing the source code for a typical query, the actual fare information is not displayed in index.html. Can anyone provide a pointer as to how to go about scraping (a new word for me) the information?
This is the URL for the query: http://brfares.com/querysimple?orig=SUY&dest=0415&rlc=
The response is a JSON object.
First you need to build a lookup table of all destination codes. You can use the following link to do that: http://brfares.com/ac_loc?term=. Do it for every letter of the alphabet and then parse the results into a unique list.
Then take the codes in pairs, execute the JSON query, parse the returned JSON, and feed the data into a database.
Now you can do whatever you want with that database.
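Here is a rough sketch of those steps in Python (since you mention Python experience), using requests and sqlite3. The JSON field names ("code", "name") are guesses, so inspect one real response from each endpoint first, and note that querying every pair of stations is a huge number of requests, so you will want to restrict the origin/destination lists.

import itertools
import json
import sqlite3
import string
import requests

BASE = "http://brfares.com"

def all_location_codes():
    # Step 1: hit the autocomplete endpoint for every letter and dedupe the codes.
    codes = {}
    for letter in string.ascii_lowercase:
        resp = requests.get(f"{BASE}/ac_loc", params={"term": letter}, timeout=30)
        resp.raise_for_status()
        for item in resp.json():                 # assumed: a list of dicts, one per station
            codes[item["code"]] = item["name"]   # assumed keys -- verify against real data
    return codes

def fare_query(orig, dest):
    # Step 2: the same querysimple call as in the question, returning a JSON object.
    resp = requests.get(f"{BASE}/querysimple",
                        params={"orig": orig, "dest": dest, "rlc": ""}, timeout=30)
    resp.raise_for_status()
    return resp.json()

def main():
    db = sqlite3.connect("fares.db")
    db.execute("CREATE TABLE IF NOT EXISTS fares (orig TEXT, dest TEXT, raw_json TEXT)")
    codes = all_location_codes()
    # Step 3: take codes in pairs and store the raw JSON; filter the pairs in practice.
    for orig, dest in itertools.permutations(codes, 2):
        data = fare_query(orig, dest)
        db.execute("INSERT INTO fares VALUES (?, ?, ?)", (orig, dest, json.dumps(data)))
        db.commit()

if __name__ == "__main__":
    main()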

Should I use Wordpress Transient API in this case?

I'm writing a simple Wordpress plugin for work and am wondering if using the Transients API is practical in this case, or if I should seek out another way.
The plugin's purpose is simple. I'm making a call to USZip Web Service (http://www.webservicex.net/uszip.asmx?op=GetInfoByZIP) to retrieve data. Our sales team is using a Lead Intake sheet that the plugin will run on.
I wanted to reduce the number of API calls, so I thought of setting a transient for each zip code as the key and storing the incoming data (city and zip). If the corresponding data for a given zip code already exists, then there's no need to make an API call.
Here are my concerns:
1. After a quick search, I realized that the transient data is stored in the wp_options table, and storing the data would balloon that table in no time. Would this cause a significant performance issue if the db becomes huge?
2. Is it horrible practice to create this many transient keys? It could easily become thousands in a few months' time.
If using Transient is not the best way, could you please help point me in the right direction? Thanks!
P.S. I opted for the Transients API vs the Options API. I know zip codes don't change often, but they sometimes do, so I set an expiration time of 3 months.
A less-inflated solution would be:
1. Store a single option called uszip with a serialized array inside the option.
2. Grab the entire array each time and simply check if the zip code exists.
3. If it doesn't exist, grab the data and save the whole option again.
You should make sure you don't hit the upper bounds of a serialized array in this table (9,000 elements), considering 43,000 zip codes exist in the US. However, you will most likely have a very localized subset of zip codes.
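To make the lookup logic concrete, here is a language-agnostic sketch of the single-blob cache idea (written in Python only for brevity; in the plugin itself this would be get_option()/update_option() around the real USZip call, and fetch_zip_info() below is just a placeholder):

import json
import os

CACHE_FILE = "uszip_cache.json"   # stands in for the single 'uszip' option

def load_cache():
    # Grab the entire array each time.
    if os.path.exists(CACHE_FILE):
        with open(CACHE_FILE) as f:
            return json.load(f)
    return {}

def save_cache(cache):
    # Save the whole blob back in one write.
    with open(CACHE_FILE, "w") as f:
        json.dump(cache, f)

def fetch_zip_info(zip_code):
    # Placeholder for the USZip web service call.
    return {"zip": zip_code, "city": "UNKNOWN"}

def lookup(zip_code):
    cache = load_cache()
    if zip_code not in cache:             # only hit the API on a cache miss
        cache[zip_code] = fetch_zip_info(zip_code)
        save_cache(cache)
    return cache[zip_code]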

How can I create a segment data feature?

I have a task to extend my web application to give users the ability to segment their own data (i.e. choose their own fields and add their criteria using And/Or, etc.), so I'm creating something similar to a query builder tool, but lighter. I'm not worrying about the front end for the moment; I am just trying to focus on how to do this in the back end.
My only thought so far is to store their "Segment" as an XML document (serialized in the DB) which contains all of their columns and criteria and how they map to the database. Then, when the segment is called, a mapping class deserializes this XML document, maps the fields, builds a SQL query from it, and returns the query results. The problem I see with this is that if the database setup changes (likely), I have a serialized XML document which knows nothing about these changes.
Has anyone tackled a similar situation?
I had a similar problem and posted a question on here with what could be a potential solution to your own issue.
Dynamic linq query with multiple/unknown criteria
See how you get on with that.
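Whatever serialization you choose, the part that protects you from schema changes is a single whitelist that maps segment field names to real columns, with user-supplied values passed as bind parameters rather than spliced into the SQL text. Here is a hedged sketch of that idea in Python, with a made-up customers/orders schema:

FIELD_MAP = {
    # The only place that knows how segment fields map onto real columns;
    # a schema change means updating this map, not every stored segment.
    "first_name":  "customers.first_name",
    "state":       "customers.state",
    "total_spend": "orders.total_spend",
}

OPERATORS = {"eq": "=", "ne": "<>", "gt": ">", "lt": "<", "like": "LIKE"}

def build_query(segment):
    """segment = {"logic": "AND", "criteria": [{"field": ..., "op": ..., "value": ...}]}"""
    clauses, params = [], []
    for crit in segment["criteria"]:
        column = FIELD_MAP[crit["field"]]   # KeyError -> unknown or renamed field
        op = OPERATORS[crit["op"]]
        clauses.append(f"{column} {op} ?")
        params.append(crit["value"])        # values stay out of the SQL text
    joiner = f" {segment.get('logic', 'AND')} "
    sql = ("SELECT customers.* FROM customers "
           "JOIN orders ON orders.customer_id = customers.id "
           "WHERE " + joiner.join(clauses))
    return sql, params

# Example: users who live in NY AND spent more than 100.
sql, params = build_query({
    "logic": "AND",
    "criteria": [
        {"field": "state", "op": "eq", "value": "NY"},
        {"field": "total_spend", "op": "gt", "value": 100},
    ],
})
print(sql, params)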

Use of HTTP POST method?

Below is the statement written at http://hateinterview.com/java-script/methods-get-vs-post-in-html-forms/1854.html
By specification, GET is used basically for retrieving data, whereas POST is used for data storing, data updating, ordering a product or even e-mailing.
Whenever I used the GET or POST method, I used them to get the parameters with the getParameter() method of the HTTP request. What I did not get in the above statement is how the POST method is used for data storing or data updating, which we can't achieve with the GET method. I'm looking for a very brief example.
EDIT: Thanks all for your answers, but I am specifically looking for the meaning of data storing and data updating with the POST method, apart from the file uploading stuff.
GET can in theory also store and update data, but it's simply not safe. It's too easy to accidentally store or update data just by bookmarking a URL, following a link, or being indexed by a search bot. POST requests are not bookmarkable/linkable and are not indexed by search bots. Also, the GET query string has a limited length; the safe limit is 255 characters. The POST request body can, however, be as large as 2 GB. Also, uploading files is not possible with GET.
One difference is that the GET data (in the URL, as another answer says) show up on a *nix server as the contents of the QUERY_STRING environment variable, whereas POST data appear on stdin. Regardless of how they are packaged and sent, GET and POST data are identical in format, in my experience.
@Mohit edited his question to add: "Thanks all for your answers, but I am specifically looking for the meaning of data storing and data updating with the POST method, apart from the file uploading stuff."
Read RFC 2616, Hypertext Transfer Protocol -- HTTP/1.1, specifically Sections 9.3 GET and 9.5 POST:
"The GET method means retrieve ... information."
"The POST method is used to request that the origin server accept" information.
To be strictly compliant with RFC 2616, use the GET method to read data from the server. Use the POST method to write data to the server.
The "meaning of data storing, data updating" is exactly that. How could this possibly be any clearer or more explicit?
POST sends the data in the request body, while GET puts the data in the URL.
For example, to upload a file you use POST. Since GET puts the data into the URL, the data is visible to the user and the length is limited.
See, for example, http://www.cs.tut.fi/~jkorpela/forms/methods.html
There are things you cannot do with GET! The first of them is that with POST you can upload files.
See this: Article or this one
