I'm working with some new SCADA software, which uses a browser environment to display everything. One of its components is a PDF viewer; however, since we're in a browser environment, it can only load files that are served up over HTTP. According to the forums, this means that the source of the PDF needs to be a URL.
The forum also notes that I can use one of their modules (WebDev) to "stream the PDF bytes over HTTP", and provides directions for how to do so. However, the WebDev module is outside the budget of my project (it's quite a high-powered module, I'd be paying a premium price and then using 1% of its functionality). So I'm wondering if it's possible to serve up a PDF via HTTP some other way.
I'm not an experienced programmer - I'm self-taught, out of necessity, in a small handful of languages, and only to a basic level. As such, I don't fully understand the problem, nor do I know what search terms to use to find the sort of information I need to solve it.
If anyone's able to provide a partial solution, or even just able to help me understand what I'm asking for and where to go looking for some answers, I'd appreciate it!
The PC hosting the PDF files and the SCADA gateway is running Windows 10.
I had the same issue integrating PDF reports into our SCADA system, which has a web interface and runs Node.js on the backend.
The main point is:
Generate your PDF on the client end (web interface)
Convert it to a Base64 data URI
Preview it in the DOM or send it to the server (a rough sketch of the preview step is below)
Excel and PDF files can both be sent to the server side this way
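A minimal sketch of the preview step, assuming you already have the generated PDF as a Blob and an <embed> or <iframe> with id "pdfViewer" in the page (all names here are placeholders):

    // Turn the PDF bytes (a Blob) into a data: URI and show it in the page.
    function previewPdf(pdfBlob) {
        var reader = new FileReader();
        reader.onload = function () {
            // reader.result looks like "data:application/pdf;base64,JVBERi0x..."
            document.getElementById('pdfViewer').src = reader.result;
            // ...or POST reader.result to the server instead of previewing it.
        };
        reader.readAsDataURL(pdfBlob);
    }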
hope that helps!
Is there any way to access the WebKit JavaScript and HTTP errors that happen when capturing a page with CutyCapt? I'm trying to debug thumbnail capture for JavaScript-generated documents.
This is not currently possible. I would recommend using a tool like Wireshark to debug any HTTP errors that might occur. It would be easy to add tracing code for network replies to CutyCapt and there are some patches on SourceForge and/or GitHub that do that. I am not sure what could be done to trace JavaScript errors, but ordinarily there should not be any that do not also occur when a page is loaded in a browser. Qt comes with a sample application that implements a rudimentary web browser that could be used for this purpose. I think it also comes with the standard debugger, but I might be misremembering that.
I have a project which needs a little bit of web scraping. The main requirement is to let the user enter his data in a Java application; the application will then connect to a data-entry website and automatically input the data the user entered into that website. I haven't started to code it since I don't know where to start. I have already done some research, and it points me to jsoup and the Desktop API (jsoup for web scraping and the Desktop API for opening a browser). Hope to receive a reply from the Java experts here.
Thanks!
JSoup will certainly do the scraping for you. However, you need to handle the HTTP side (GETs, POSTs, etc.) yourself, and for that I would recommend Apache HttpComponents.
I'm not sure you want to open a browser at all. Rather, I would expect you to ask the user for input (perhaps via a Swing UI, or a browser-based UI) and then talk directly to the website using HTTP; I don't think you'd need to open a browser to the destination website. If you do, then check out Watij, which allows you to drive a browser directly from Java.
Dear developer friends,
I have developed a self-hosted API in ASP.Net MVC4 (e.g. http://blogs.msdn.com/b/henrikn/archive/2012/03/01/file-upload-and-asp-net-web-api.aspx), because I needed a solution where I could upload super large files. This works smoothly.
Now I want to upload files to my newly written API through the blueimp jQuery File Upload component.
This works fine, except for some small flaws:
- the progress bar is not showing
- jQuery raises an error: Unsafe JavaScript attempt to access frame with URL...
It seems clear that this error is raised because my API runs on a different port than the web application, and AJAX calls cannot be made across domains/ports.
I have already added the forceIframeTransport: true parameter to the fileupload component call. This does some good - without it I cannot upload files at all (because the component tries to upload with an AJAX call).
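Roughly, the call I'm making looks like this (the URL, selectors and handler are just illustrative):

    // jQuery File Upload call, posting to the self-hosted API on its own port.
    $('#fileupload').fileupload({
        url: 'http://localhost:49302/upload',  // placeholder address for the self-hosted API
        forceIframeTransport: true,            // upload through a hidden iframe instead of a cross-origin XHR
        progressall: function (e, data) {
            // The iframe transport cannot report upload progress,
            // which is presumably why the progress bar never moves.
            var percent = parseInt(data.loaded / data.total * 100, 10);
            $('#progress .bar').css('width', percent + '%');
        }
    });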
So I figured that if there's some way to run my self-hosted API on the same port as my web application (with explicitly defined routes), my world will be happy and shiny again. However, I'm not quite sure whether this is possible at all.
Unfortunately a proxy in my ASP.NET application will not help me here, as I wrote the API specifically to avoid the IIS limitations regarding maximum upload size. Using my self-hosted API as the proxy might do the job, but I think this is a bit of overkill.
Anyone? Thanks in advance!
Yahoo! Solved it!
var config = new HttpSelfHostConfiguration("http://localhost:49302/api");
In other words, the Web API only handles URLs that start with /api; all other requests are picked up by my MVC4 web application, and both run on the same port.
So to answer my question: yes, it's possible. Just give the self-host base address a path prefix.
We have a web application that uses AJAX to talk to an ASP.NET web service. We would like to write another version that can be used offline. We need to be able to re-use our existing code as much as possible. What approaches should we consider?
The app is currently using XmlHttpRequest to get dynamic data from the server. Obviously the offline version will not be able to talk to the server, but it does need to talk to something! I'm sure installing IIS or Cassini on the client would work, but I was hoping for a simpler solution. Is there no other way for JavaScript to talk to some external code?
There are plenty of offline web apps nowadays; they simply evolved from AJAX.
For example:
WoaS (Wiki on a Stick / Stickwiki), TiddlyWiki;
Google Docs and Gmail are also going offline.
You don't need a web server to run these web apps in offline mode. Just store the required data and scripts on the client side (usually as XML).
One of the possibilities would be to use Cassini. This is a web server that acts as a host for the ASP.NET runtime. You can host Cassini in a Windows application or a Windows service. In this scenario you do not have to rewrite either the web app or the web service.
Most other solutions do require a rewrite of both your web app and your web service. Depending on the way you have written the existing app you can reuse more or less code.
Have you considered HTML5 with application cache and offline storage?
If you hope to create an "offline" version of your package your biggest issue by far will be the need to install your site into a local copy of IIS (registering a virtual directory, etc.). I pursued this briefly a few years ago and gave up in frustration. It can be done: a number of software vendors such as DevExpress do this so you have local copies of their demonstration projects. Indeed, I was able to do this. The problem was the classic "it works on my computer" syndrome. There was simply no way to guarantee that most of my end-users had anywhere near the technical proficiency to make this work.
Thus, I would strongly recommend that you not pursue this path unless you have very technically proficient users and a huge support staff.
But there is one more very important question: did you abstract all data access code to a DAL? If not, then you have a lot of work to do in managing data access as well.
Update: user "Rine" has recommended Cassini. I just wanted to let you know that I pursued Cassini and another 3rd-party web server as well. I think that there are licensing issues with Cassini but may be wrong - it has been awhile. However, I do distinctly remember running into barrier after barrier with this approach and very little documentation to help me out.
If you want a web application to run offline, you need a web server (IIS for ASP) bound to the localhost (127.0.0.1) address. After this you can access your web application by typing http://127.0.0.1/ into your web browser, the same way as you do online.
If your AJAX relies on XMLHttpRequests, you can:
Make static versions of the XML you get over XMLHttpRequest and put them in a folder on disk.
Rewrite your XMLHttpRequest URLs so that they point to those files on disk.
Rewrite your XMLHttpRequests so that they don't check status (it's always 0 for the file:// protocol); see the sketch below.
All JScript works on file:// pages as well as on http:// ones.
Of course it's not the best way to develop static pages, but it may save you some time on rewriting.
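For example, a rewritten request might look like this (the file name and handler are placeholders):

    // Read a saved XML response from disk instead of calling the server.
    var xhr = new XMLHttpRequest();
    xhr.open('GET', 'data/customers.xml', true); // relative path, resolved against the file:// page
    xhr.onreadystatechange = function () {
        if (xhr.readyState !== 4) return;
        // Do not test xhr.status here: it is 0 for file:// requests, not 200.
        handleResponse(xhr.responseXML);
    };
    xhr.send(null);

Bear in mind that some newer browsers restrict XMLHttpRequest over file://, so you may need a browser setting or flag that allows local file access.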
I haven't come across any framework specifically built for ASP.NET like the ones available for PHP or RoR.
Here is a good article by Steven to get you started with HTML5 and ASP.NET: Creating HTML 5 Offline Application.
Obviously the offline version will not be able to talk to the server, but it does need to talk to something!
Enter HTML5 localStorage. It works like a small database and enables you to keep data on the client. You do have to rework parts of your code in JavaScript and ship it to the client, but then it will work offline.
Local Storage works like this:
- Setter: window.localStorage.setItem(KEY, VALUE)
- Getter: window.localStorage.getItem(KEY)
- Remove: window.localStorage.removeItem(KEY)
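For instance, a minimal sketch for caching server data so the offline version can still read it (the key name and data shape are just examples):

    // Save data fetched from the server so it is available offline.
    function saveOrders(orders) {
        window.localStorage.setItem('orders', JSON.stringify(orders));
    }

    // Read it back later; returns an empty list if nothing was stored yet.
    function loadOrders() {
        var raw = window.localStorage.getItem('orders');
        return raw ? JSON.parse(raw) : [];
    }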
To get the main page working offline you need to create a manifest. This is used to store complete sites on the client. Please refer to this for more information about manifests:
http://diveintohtml5.info/offline.html
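As a rough sketch, a minimal cache manifest (file names are placeholders) referenced from the page with <html manifest="app.appcache"> could look like this:

    CACHE MANIFEST
    # v1 - change this comment to force clients to re-download the cached files

    CACHE:
    index.html
    scripts/app.js
    styles/site.css

    NETWORK:
    *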
You want to build a web application to work offline?? It can't be done.
You could split the interface code from the rest (into different DLLs) and create a Windows application to mimic the behaviour of your web application. This way you have two distinct user interfaces but the same code for business rules and data access.
I don't really see any other way...