I am building a C# console application that will run as an automated background process, weekly (or maybe daily), to download page-level and post-level Insights data. I have one Facebook account that manages several Facebook pages, and I need to get data for all of those pages.
I still can't figure out how to get an access token from my console application, or where to send requests for page-level and post-level Insights.
From the UI I can download a CSV file; can I do that through the API too, or will I have to download a JSON string and then parse it?
Thanks for your help
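For reference, a minimal sketch of the request shape, assuming the Graph API /{page-id}/insights/{metric} endpoint; obtaining the page access token in the first place (e.g. via /me/accounts) is a separate step, and the API returns JSON rather than CSV, so any CSV output would come from parsing:

```csharp
// Sketch only: page ID, metric name, and token below are placeholders.
using System;
using System.Net;

public class InsightsFetch
{
    // Build the Graph API insights URL for one page metric
    // (endpoint shape is an assumption; check it against the current API version).
    public static string BuildUrl(string pageId, string metric, string token)
    {
        return "https://graph.facebook.com/" + pageId + "/insights/" + metric +
               "?access_token=" + Uri.EscapeDataString(token);
    }

    public static void Main()
    {
        string url = BuildUrl("123", "page_impressions", "TOKEN");
        Console.WriteLine(url);
        // using (var client = new WebClient())
        //     json = client.DownloadString(url); // JSON string to parse, not CSV
    }
}
```

The actual download would then be an ordinary HTTP GET (WebClient or HttpClient) followed by JSON parsing, e.g. with Json.NET.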
If I wanted to build a scraper that pings each URL on a site and stores the Adobe (or Google) image request, how would I go about it? That is, I just want something that grabs all the parameters in the URL posted to Adobe and writes them to a CSV or something similar. I'm familiar with building simple web scrapers, but how do I capture the URL I see in, for example, Fiddler, the one that contains all the variables being sent to the analytics solution?
If I could do this I could run a script that lists every URL with the corresponding tracking events being fired, which would make QA much more manageable.
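For the CSV step, once a tracking request URL has been captured (from a proxy such as Fiddler), flattening its query string is the easy part; a minimal sketch, with a made-up sample URL:

```javascript
// Turn a captured analytics request URL into a flat object of its query
// parameters, ready to write out as one CSV row. The URL below is invented
// for illustration only.
function trackingParams(url) {
  var query = url.split('?')[1] || '';
  var params = {};
  query.split('&').forEach(function (pair) {
    if (!pair) return;
    var parts = pair.split('=');
    params[decodeURIComponent(parts[0])] =
      decodeURIComponent((parts[1] || '').replace(/\+/g, ' '));
  });
  return params;
}

var p = trackingParams('https://metrics.example.com/b/ss/rsid/1?pageName=Home&events=event1');
console.log(p.pageName); // "Home"
```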
You should be able to query the DOM for the image object created by the tag request. I am more familiar with the IBM Digital Analytics (Coremetrics) platform; there you can find the tag requests by inspecting the array document.cmTagCtl.cTI in the Web Console on a Coremetrics-tagged page. I used this method when building a Selenium WebDriver test case that needed to check for the analytics tags.
I don't have the equivalent for Adobe or GA at the moment, since it depends on the library implementation; I'm trying to do the same as you for GA.
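A sketch of that check, with the document passed in so it can be stubbed outside a browser (the cmTagCtl.cTI property names are as described above and may differ by library version):

```javascript
// Read the Coremetrics tag-request array off the page, if present.
function coremetricsTags(doc) {
  return (doc.cmTagCtl && doc.cmTagCtl.cTI) || [];
}

// In the Web Console on a tagged page you would call: coremetricsTags(document)
console.log(coremetricsTags({ cmTagCtl: { cTI: ['tag1', 'tag2'] } })); // [ 'tag1', 'tag2' ]
```

The same function works from a Selenium WebDriver test by evaluating it with executeScript against the live page.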
Cheers,
Jamie
I have data in a Google Spreadsheet that I would like to add to a data table in Google Maps Engine. Related to this, I have two separate questions:
1. Is there any way to do that with an Apps Script in the spreadsheet?
2. I have looked at the Server to Server Authentication (service accounts) help page a little. Will I need to set that (or some such thing) up to work with my Apps Script, or is the fact that it is my spreadsheet and my map enough to authenticate the script by itself? If I do need to set that up, where can I find sample JavaScript code to accomplish the tasks shown in the sample code on the help page?
Quote:
You can create and populate a new table by uploading data files
through the API
Table Upload
Quote:
To upload a file, send a POST request to the following URL
Upload Table Files
You can make a POST request from Apps Script using UrlFetchApp:
UrlFetchApp.fetch
Scroll down a screen or two in that documentation to see the parameters UrlFetchApp.fetch accepts.
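A sketch of the POST described above, assuming an OAuth token and the upload URL from the "Upload Table Files" page are already in hand (both are placeholders here). The options builder is pure so it can be tried outside Apps Script; UrlFetchApp itself only exists inside it:

```javascript
// Build the options object that UrlFetchApp.fetch expects for the upload.
function buildUploadOptions(csvData, oauthToken) {
  return {
    method: 'post',
    contentType: 'text/csv',
    headers: { Authorization: 'Bearer ' + oauthToken },
    payload: csvData
  };
}

// Only callable inside Apps Script, where UrlFetchApp is defined.
function uploadTableFile(uploadUrl, csvData, oauthToken) {
  return UrlFetchApp.fetch(uploadUrl, buildUploadOptions(csvData, oauthToken));
}

console.log(buildUploadOptions('a,b\n1,2', 'TOKEN').method); // "post"
```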
There are various ways you could trigger the Apps Script to run. For example, you could add a custom drop-down menu to your spreadsheet that runs the data upload when the user chooses a menu item.
Custom Menus in Google Apps
The Apps Script can be bound to your Google Sheet. So, I'm quite sure this can be done.
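A sketch of such a menu in a script bound to the sheet; 'uploadSheetData' is a placeholder for whatever function actually performs the upload:

```javascript
// onOpen runs automatically when the spreadsheet is opened (inside Apps
// Script only, where SpreadsheetApp is defined).
function onOpen() {
  SpreadsheetApp.getUi()
      .createMenu('Maps Engine')
      .addItem('Upload sheet to table', 'uploadSheetData') // your upload function
      .addToUi();
}
```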
My ASP.NET Web Forms app generates a report, via a fairly complicated process: authentication, ViewState, and asynchronous retrieval of data.
I would like to give the user the option of downloading the report they are viewing as a PDF file. I'd prefer to convert the HTML to PDF, so the two cannot get out of sync due to incremental changes.
I can find .NET components that can point at a URL and generate a PDF, but I'm not sure how this will work with my complex reports. Should I grab the ViewState and auth cookies from the user's request and pass them through to the PDF generator? I could get all the settings into the URL if needed, eliminating ViewState.
Anyone have experience with this sort of setup? Suggested software?
You could save the HTML page you're sending to the client to disk, pass it to a tool such as wkhtmltopdf (http://code.google.com/p/wkhtmltopdf/), and use XHR to periodically check whether the file has been rendered; if so, provide a download link.
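A sketch of shelling out to wkhtmltopdf from the ASP.NET side; the cookie name, URL, and output path are placeholders. wkhtmltopdf's --cookie option forwards the auth cookie so the tool sees the same session as the user, which addresses the ViewState/auth concern in the question:

```csharp
using System;
using System.Diagnostics;

public class PdfRender
{
    // Build the wkhtmltopdf command line: --cookie name value, source, destination.
    public static string BuildArguments(string cookieName, string cookieValue,
                                        string source, string pdfPath)
    {
        return string.Format("--cookie {0} {1} \"{2}\" \"{3}\"",
                             cookieName, cookieValue, source, pdfPath);
    }

    public static void Main()
    {
        string args = BuildArguments("ASP.NET_SessionId", "abc123",
                                     "http://localhost/Report.aspx", @"C:\temp\report.pdf");
        Console.WriteLine(args);
        // Process.Start("wkhtmltopdf", args).WaitForExit(); // then serve report.pdf
    }
}
```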
I have an ASP.NET application with pages that use ReportViewer. Can someone give me a hint on how to approach the following requirement:
I want to get the report as PDF file from the page, without user interaction. I know I can render the report to a filestream, but since there's no user opening it in a browser, I need to collect the filestream from another application that might run during the night.
There might be other approaches, like a web service, for example, that could return the filestream to me, but this would also mean I'd have to modify the setup of the data sources the report receives its data from. There are a lot of controls on the page for supplying filter parameters; by using the page life cycle I can reuse what's already there.
I thought about wget, but haven't tried it yet, and I'm not sure how complicated logging in will be with cookies. I do have full control over the asp.net application though, so if I can modify something there to make it easier, I'd do it.
You can use the WebClient class in a .NET application to get the response from the site.
Please refer to the link below:
http://msdn.microsoft.com/en-us/library/system.net.webclient(v=vs.80).aspx
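A sketch of what the nightly job could look like; the URL, report ID, and file name are placeholders. Note WebClient has no built-in cookie support, so if the page sits behind forms authentication you'd need a derived WebClient (or HttpWebRequest) that attaches a CookieContainer:

```csharp
using System;
using System.Net;

public class NightlyFetch
{
    // Build the report URL (query-string shape is an assumption; the actual
    // page would be whatever endpoint renders the ReportViewer output).
    public static string BuildReportUrl(string host, string reportId)
    {
        return "http://" + host + "/Reports/Render.aspx?id=" +
               Uri.EscapeDataString(reportId) + "&format=pdf";
    }

    public static void Main()
    {
        string url = BuildReportUrl("localhost", "monthly");
        Console.WriteLine(url);
        // using (var client = new WebClient())
        // {
        //     client.Credentials = CredentialCache.DefaultCredentials;
        //     System.IO.File.WriteAllBytes("report.pdf", client.DownloadData(url));
        // }
    }
}
```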
I'm developing an online examination system in C#/.NET and want to copy files to the client machine as soon as the exam starts, so that even if the internet gets disconnected the examinee can continue with the test.
You may wish to consider a client-server solution, such as WPF or WinForms, as this is more suited to this type of development. You can use ClickOnce deployment to have the app still launched from the web and updated on every run.
If you do decide to use ASP.NET, this will result in a very JavaScript-heavy site with a very slow load of the first page.
To do this you would load all your test questions into a JavaScript data structure on the first page. Whenever the user moved to the next page you would, using JavaScript, collect all the answers and store them client-side, then re-render the entire page from your JavaScript definition of the test with no trip back to the server. Once the test was complete you would send the results back to the server, so the internet connection must be active again by the time the test is finished.
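The client-side flow above can be sketched as follows; the question data and field names are made up for illustration:

```javascript
// All questions shipped to the browser on the first page load.
var exam = {
  current: 0,
  answers: {},
  questions: [
    { id: 'q1', text: '2 + 2 = ?', options: ['3', '4', '5'] },
    { id: 'q2', text: 'Capital of France?', options: ['Paris', 'Rome'] }
  ]
};

// Store the answer for the current question, then advance, all without
// any round trip to the server.
function answerAndNext(choice) {
  var q = exam.questions[exam.current];
  exam.answers[q.id] = choice;
  exam.current += 1;
  // renderQuestion(exam.questions[exam.current]); // redraw the page via the DOM
  return exam.current < exam.questions.length;     // false once the exam is done
}

// When answerAndNext returns false, POST exam.answers back to the server;
// the connection must be up again at that point.
```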
You'll have to create a download package and provide a link for the user to click to request the files. You can't force a download.
If your exam is all in one web page, you don't need to do anything. Once the page appears in the user's browser, it has already been "copied locally".