Accessing OpenWeather data elements - json.net

I am building a simple Visual Basic program, for personal use only, that shows some basic weather forecast information. I did some looking around and decided that OpenWeather is my best option for getting the data.
I wrote a simple program, using the Newtonsoft JSON framework, which includes:
Dim myWeatherData As New HttpClient()
Dim response = Await myWeatherData.GetAsync(useUrl)
Dim responseBody = Await response.Content.ReadAsStringAsync()
Dim myJSON = JObject.Parse(responseBody)
This code works fine, as I was able to see the JSON data using JsonConvert.SerializeObject. My question is simply this: using the above code, would you please show me how to get inside of myJSON to access, for example, the predicted temperature two days from now? Thanks.
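A minimal sketch of one way to do this, assuming the URL targets OpenWeather's 5-day/3-hour forecast endpoint (other OpenWeather endpoints return differently shaped JSON). The example is C#, but the same JObject indexing calls work from VB:
using Newtonsoft.Json.Linq;

static double TempTwoDaysOut(JObject myJSON)
{
    // "list" holds one forecast entry per 3-hour step, so two days
    // from now is roughly entry 16 (48 h / 3 h).
    JToken entry = myJSON["list"][16];
    // Units (Kelvin, Celsius, Fahrenheit) depend on the URL's "units" parameter.
    return (double)entry["main"]["temp"];
}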

Related

How to include a JSON query string in a GET request in R?

I'm new to R and coding. I'm trying to get some data by making a GET request in R using the httr and jsonlite packages. I've made some successful API calls and parsed the data into a dataframe, but now need to include a JSON query string in my URL. I've read similar questions and answers on Stack Overflow, but am still stumped as to the best way to go about this in R. Can I simply encode the query string, or do I need to set the query statements separately (and if so, what format is needed)? Below is the URL I need to work with.
Cheers!
liz_URL <- "https://squidle.org/api/annotation_set?q={"order_by":[{"field":"created_at","direction":"desc"}],"filters":[{"name":"usergroups","op":"any","val":{"name":"id","op":"eq","val":49}}]}&results_per_page=100"
PQdata <- GET(liz_URL,add_headers("X-Auth-Token"="XXXXX"))
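The usual fix is to percent-encode the JSON before splicing it into the URL; in R, URLencode(q, reserved = TRUE) from utils does this. For illustration, a sketch of the same request in C# (the squidle.org query and token header are copied from the question):
using System;
using System.Net.Http;
using System.Threading.Tasks;

class QueryStringDemo
{
    static async Task Main()
    {
        // The raw JSON query from the question; percent-encode it so the
        // quotes and braces survive inside the URL.
        string q = "{\"order_by\":[{\"field\":\"created_at\",\"direction\":\"desc\"}],"
                 + "\"filters\":[{\"name\":\"usergroups\",\"op\":\"any\","
                 + "\"val\":{\"name\":\"id\",\"op\":\"eq\",\"val\":49}}]}";
        string url = "https://squidle.org/api/annotation_set?q="
                   + Uri.EscapeDataString(q) + "&results_per_page=100";

        using var client = new HttpClient();
        client.DefaultRequestHeaders.Add("X-Auth-Token", "XXXXX");
        Console.WriteLine(await client.GetStringAsync(url));
    }
}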

How do I read in the body content in .net core 3.0?

I am upgrading my solution from .net core 2.2 to 3.0, and I want to log the request data/body in the "proper" way. I have read in several places that using the PipeReader is preferred over reading directly from the stream.
Prior to .net core 3.0, we used streams and the EnableRewind() method. It looked something like this:
HttpRequest.EnableRewind();
HttpRequest.Body.Position = 0;
var sr = new StreamReader(HttpRequest.Body);
var myData = sr.ReadToEnd();
HttpRequest.Body.Position = 0;
return myData;
I would like to understand how to properly use the PipeReader. E.g., what code is necessary to read the HttpRequest body into a string? I see there are ReadAsync() and TryRead() methods, but I'm not sure how to use them properly. I also see there is an AsStream() method, which I've been able to use to read the stream as I did previously (but without rewind).
I'd love to see any examples on how to do this, because it seems as though working with the pipe requires a great deal of pointer references. Lastly, if I work with the PipeReader.AsStream(), do I need to worry about rewind?
Here's how I do it:
HttpRequest.EnableBuffering();
using var streamReader = new StreamReader(HttpRequest.Body, leaveOpen: true);
string data = await streamReader.ReadToEndAsync();
HttpRequest.Body.Position = 0; // rewind so later readers (e.g. model binding) can re-read the body
EnableBuffering() replaces EnableRewind().
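If you want the PipeReader route the question asks about, here is a sketch of draining HttpRequest.BodyReader into a string, assuming a UTF-8 body. Note that PipeReader itself has no rewind; EnableBuffering() plus resetting Body.Position is still what lets later readers re-read.
using System.IO;
using System.IO.Pipelines;
using System.Text;
using System.Threading.Tasks;

static async Task<string> ReadBodyAsync(PipeReader reader)
{
    using var buffered = new MemoryStream();
    while (true)
    {
        // ReadAsync hands back whatever segments are buffered so far.
        ReadResult result = await reader.ReadAsync();
        foreach (var segment in result.Buffer)
            buffered.Write(segment.Span);
        // AdvanceTo tells the reader we consumed the whole buffer.
        reader.AdvanceTo(result.Buffer.End);
        if (result.IsCompleted)
            break;
    }
    return Encoding.UTF8.GetString(buffered.ToArray());
}

// Usage: string body = await ReadBodyAsync(HttpRequest.BodyReader);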

Writing a function that scrapes a dataset that appears only after typing in values and clicking a button

I am trying to write a function that will take a list of dates and retrieve the dataset as found on https://www.treasurydirect.gov/GA-FI/FedInvest/selectSecurityPriceDate.htm
I am using PROC IML in SAS to execute R-code (since I am more familiar with R).
My problem is within R, and is due to the website.
First, I am aware that there is an API, but this is an exercise I really want to work through, because many sites do not have APIs.
Does anyone know how to retrieve the datasets?
Things I've heard:
Use RSelenium to program the clicking. But RSelenium was recently removed from CRAN, so that isn't an option (even installing it from an archived version is causing issues).
Look at how the XML URL changes as I click the "submit" button in Chrome. However, the Network tab doesn't show any XML for this site, whereas it does on other websites that use different search methods.
I have been looking for a solution all day, but to no avail! Please help.
First, you need to read the terms and conditions and make sure that you are not breaking the rules when scraping.
Next, if there is an API, you should use it so that they can better manage their data usage and operations.
In addition, you should also limit the number of requests you make so as not to overload the server; otherwise it starts to resemble a denial-of-service (DoS) attack.
Finally, if those above conditions are satisfied, you can use the inspector on Chrome to see what HTTP requests are being made when you browse these webpages.
In this particular case, you do not need RSelenium; a simple HTTP POST will do:
library(httr)

resp <- POST(
  "https://www.treasurydirect.gov/GA-FI/FedInvest/selectSecurityPriceDate.htm",
  body = list(
    priceDate.month = 5,
    priceDate.day = 15,
    priceDate.year = 2018,
    submit = "CSV+Format"
  ),
  encode = "form"
)
read.csv(text = rawToChar(resp$content), header = FALSE)
You can perform the same HTTP processing in a SAS session using Proc HTTP. The CSV data does not contain a header row, so perhaps the XML Format is more appropriate. There are a couple of caveats for the treasurydirect site.
Prior to posting a data download request the connection needs some cookies that are assigned during a GET request. Proc HTTP can do this.
The XML contains an extra tag container <bpd> that the SAS XMLV2 library engine can't handle simply. This extra tag can be removed with some DATA step processing.
Sample code for XML:
filename response TEMP;
filename respfilt TEMP;

* Get request sets up fresh session and cookies;
proc http
  clear_cache
  method = "get"
  url = "https://www.treasurydirect.gov/GA-FI/FedInvest/selectSecurityPriceDate.htm"
;
run;

* Post request as performed by XML format button;
* automatically utilizes cookies setup in GET request;
* in= can now directly specify the parameter data to post;
proc http
  method = "post"
  in = 'priceDate.year=2018&priceDate.month=5&priceDate.day=15&submit=XML+Format'
  url = "https://www.treasurydirect.gov/GA-FI/FedInvest/selectSecurityPriceDate.htm"
  out = response
;
run;

* remove bpd tag from the response (the downloaded xml);
data _null_;
  infile response;
  file respfilt;
  input;
  if _infile_ not in: ('<bpd', '</bpd');
  put _infile_;
run;

* copy data collections from xml file to tables in work library;
libname respfilt xmlv2;
proc copy in=respfilt out=work;
run;
Reference material:
REST at Ease with SAS®: How to Use SAS to Get Your REST, Joseph Henry, SAS Institute Inc., Cary, NC.
http://support.sas.com/resources/papers/proceedings16/SAS6363-2016.pdf

What is the basic difference between UploadData and DownloadString in the asp.net WebClient?

I am new to WebClient.
I have seen some examples that POST data to a server. I am wondering which one to use over the other. Can anyone please tell me what to use when?
UploadData:
WebClient client = new WebClient();
byte[] result = client.UploadData(uri, dataBytes);
DownloadString:
WebClient client = new WebClient();
var result = client.DownloadString(someurl);
Suggestions welcome!
The basic difference between the two: UploadData sends input data (a byte array) to the specified URI (the address of the service) and returns the server's response, while DownloadString retrieves data with a plain GET, without sending any input parameters.
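A short sketch of both calls; the URL and payload here are placeholders:
using System.Net;
using System.Text;

var client = new WebClient();

// UploadData: POSTs a request body and returns the response bytes.
client.Headers[HttpRequestHeader.ContentType] = "application/x-www-form-urlencoded";
byte[] payload = Encoding.UTF8.GetBytes("name=test&value=42");
byte[] replyBytes = client.UploadData("https://example.com/api/items", payload);
string reply = Encoding.UTF8.GetString(replyBytes);

// DownloadString: a plain GET with no request body; the response comes back as a string.
string result = client.DownloadString("https://example.com/api/items");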

Using Json.NET to parse result returned by Google Maps API

I am trying to use the Google Maps API's web service to make a web request, get the JSON string, and then get the latitude and longitude I need for the input address.
Everything is fine. I got the json string I need.
Now I am using Json.net to parse the string.
I don't know why, but I simply cannot convert it into a JArray.
Here is the json string
Can anyone teach me how to write the c# code to get the lat and lng in geometry > location?
Thanks
Here is my code and the bug screenshot.
You have a few options when using JSON.NET to parse the JSON.
The best option, IMHO, is to use Serialization to pull the object back into a structured type that you can manipulate as you could any other class. For this you can see serialization in the JSON.NET documentation (I can also post more details if that isn't clear enough).
If all you want is to grab the address, as you listed in your question, you can also use the LINQ feature to pull that information back. You could use code similar to the following to pull it off (the key lies in the SelectToken method to pull back the details you need).
Dim json As Newtonsoft.Json.Linq.JObject
json = Newtonsoft.Json.Linq.JObject.Parse(jsonString)
json.SelectToken("results.formatted_address").ToString()
You can also use all the normal power of Linq to traverse the JSON as you'd expect. See the LINQ documentation as well.
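For the serialization option, a minimal sketch: classes shaped after the geocoding response, filled in one DeserializeObject call (the class names here are illustrative):
using System.Collections.Generic;
using Newtonsoft.Json;

public class GeoResponse
{
    public List<GeoResult> results { get; set; }
    public string status { get; set; }
}

public class GeoResult
{
    public string formatted_address { get; set; }
    public Geometry geometry { get; set; }
}

public class Geometry
{
    public Location location { get; set; }
}

public class Location
{
    public double lat { get; set; }
    public double lng { get; set; }
}

// Usage:
// var geo = JsonConvert.DeserializeObject<GeoResponse>(jsonString);
// double lat = geo.results[0].geometry.location.lat;
// double lng = geo.results[0].geometry.location.lng;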
[I realize this is an old question, but in the off chance it helps someone else...]
The problem here is that json["results"] is a JArray, but you are not querying it like one. You need to use an array index to get the first (and only, in this case) element, then you can access the objects inside it.
string address = json["results"][0]["formatted_address"].Value<string>();
To get the latitude and longitude you can do:
JToken location = json["results"][0]["geometry"]["location"];
double lat = location["lat"].Value<double>();
double lng = location["lng"].Value<double>();
