Searching for an Ada library to open a file over HTTP

I'm searching for a library that I can include in a program to open a file at a given internet address, such as http://foobar.com/foobar.txt.
Something like Ada.Text_IO.Open (File, Ada.Text_IO.In_File, "bla.txt");
but not limited to local files.

Well, you aren't liable to find something with that exact interface, as Text_IO is a standard library and can't easily be extended by third parties in that way.
If your platform's underlying filesystem were to support HTTP, then it would work just like you want. I don't know of any platform that works that way however.
What you probably want as a general solution is AWS (Ada Web Server). You could use it to implement a full-blown web server if you wanted, but it also contains HTTP client facilities, and the HTTP client is what you want here (see AWS.Client). It would be a bit more work on your part than a single standard API call, but probably not too much.
Here's an example, cribbed from Rosetta Code:
with Ada.Text_IO;
use Ada.Text_IO;
with AWS.Client;
with AWS.Response;

procedure HTTP_Request is
begin
   Put_Line
     (AWS.Response.Message_Body
        (AWS.Client.Get (URL => "http://www.rosettacode.org")));
end HTTP_Request;

Having used and implemented several HTTP clients, I would advise you to use an established, dedicated client. There are many intricacies to the HTTP standard (RFC 2616, https://www.rfc-editor.org/rfc/rfc2616) that naive implementations do not handle.
Consider using the Ada bindings for a mature library like libcurl: http://curl.haxx.se/libcurl/ada95/

Related

Looking for a good method to transfer critical real time data over internet

I am searching for a good method to transfer data over the internet; I work in a C++/Windows environment. The data is binary, a compressed blob of an extracted image. Input and requirements are as follows:
6 kB/packet at 10 packets/sec (60 kB per second)
Reliable data transfer
I am new to network programming, and so far I have figured out that one of the following methods may be suitable:
Sockets
MSMQ (MS Message Queuing)
The client runs in a browser (showing real-time images in the browser), while the server runs native C++ code. Please let me know if there are any other methods for achieving this. Which one should I go for, and why?
If the server determines the pace at which images are sent, which is what it looks like, a server-push style solution would make sense. What most browsers (and even non-browsers) are settling on these days is WebSockets.
The main advantage WebSockets have over most proprietary protocols, apart from being a widely adopted standard, is that the handshake runs over HTTP, so connections can pass through (most) proxies and firewalls.
On the server side, you could potentially integrate node.js, which lets you implement WebSockets easily and comes with a lot of other libraries. It's written in C++ and extensible via C++ and JavaScript (node.js embeds the V8 JavaScript VM). node.js's main feature is being asynchronous at every level, making that style of programming the default.
But of course there are other ways to implement WebSockets on the server side, maybe node.js is more than you need. I have implemented a C++ extension for node.js on Windows and use socket.io to do WebSockets and non-WebSocket transports for older browsers, and that has worked out fine for me.
But that was textual data. In your binary data case, socket.io wouldn't do it, so you could check out other libraries that do binary over WebSockets.
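As a rough illustration (not a recommendation of any particular library), here is a minimal synchronous sketch using Boost.Beast, which does support binary WebSocket frames; the port number and the zero-filled payload are placeholders, and a real server would accept connections in a loop and stream frames continuously while a browser client reads them via the standard WebSocket API.

// Hypothetical sketch using Boost.Beast: accept one WebSocket connection and
// push a single binary frame. Port 9002 and the 6 kB payload are placeholders.
#include <boost/asio/ip/tcp.hpp>
#include <boost/beast/core.hpp>
#include <boost/beast/websocket.hpp>
#include <cstdint>
#include <vector>

int main() {
    namespace websocket = boost::beast::websocket;
    using tcp = boost::asio::ip::tcp;

    boost::asio::io_context ioc;
    tcp::acceptor acceptor(ioc, tcp::endpoint(tcp::v4(), 9002)); // listen for the browser
    tcp::socket socket(ioc);
    acceptor.accept(socket);

    websocket::stream<tcp::socket> ws(std::move(socket));
    ws.accept();                                   // perform the WebSocket handshake
    ws.binary(true);                               // send binary frames, not text

    std::vector<std::uint8_t> frame(6 * 1024, 0);  // stand-in for one compressed image blob
    ws.write(boost::asio::buffer(frame));          // push it to the client
    return 0;
}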
Is there any specific reason why you cannot run a server on your Windows machine? At 60 kB/second, this looks like some kind of embedded device?
Based on your description, you need to show image information in real time in a browser. You could possibly use HTTP, but it's stateless: once the information is transferred, you lose the connection, so your client needs to poll the C++/Windows machine. If you are pretty confident the information is generated periodically, you can use this approach. It requires a server, so only if the answer to my first question is yes.
A chat protocol. Something like a Jabber client running on your client, and a Jabber server on your C++/Windows machine. Chat protocols allow near-real-time delivery.
While it may seem to make sense, I wouldn't use MSMQ in this scenario. You may not run into a problem now, but MSMQ messages are limited in size and you may eventually hit a wall because of this.
I would use TCP for this application; TCP is built with reliability in mind, and you can simply feed data through a socket. You may have to design a very simple protocol yourself, but it should be the best choice.
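For example, a very simple protocol could prefix each image blob with its length, so the receiver knows where one packet ends and the next begins. The sketch below is only an illustration (POSIX sockets for brevity; the address, port, and payload are placeholders, and on Windows the Winsock calls are nearly identical after WSAStartup):

// Hypothetical sketch of a length-prefixed framing protocol over TCP.
#include <arpa/inet.h>
#include <netinet/in.h>
#include <sys/socket.h>
#include <sys/types.h>
#include <unistd.h>
#include <cstdint>
#include <vector>

// Send one packet: a 4-byte big-endian length followed by the payload bytes.
static bool send_packet(int fd, const std::vector<std::uint8_t>& payload) {
    std::uint32_t len = htonl(static_cast<std::uint32_t>(payload.size()));
    if (send(fd, &len, sizeof(len), 0) != static_cast<ssize_t>(sizeof(len)))
        return false;
    std::size_t sent = 0;
    while (sent < payload.size()) {
        ssize_t n = send(fd, payload.data() + sent, payload.size() - sent, 0);
        if (n <= 0) return false;
        sent += static_cast<std::size_t>(n);
    }
    return true;
}

int main() {
    int fd = socket(AF_INET, SOCK_STREAM, 0);
    sockaddr_in addr{};
    addr.sin_family = AF_INET;
    addr.sin_port = htons(9000);                          // assumed server port
    inet_pton(AF_INET, "192.168.0.10", &addr.sin_addr);   // assumed server address
    if (connect(fd, reinterpret_cast<sockaddr*>(&addr), sizeof(addr)) == 0) {
        std::vector<std::uint8_t> blob(6 * 1024, 0);      // one 6 kB image blob
        send_packet(fd, blob);
    }
    close(fd);
    return 0;
}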
Unless you are using an embedded device that understands MSMQ out of the box, your best bet to use MSMQ would be to use a proxy and you are then still forced to play with TCP and possibly HTTP.
I do home automation that includes security cameras on my personal time and I use the .net micro framework and even if it did have MSMQ capabilities I still wouldn't use it.
I recommend that you look into MJPEG (Motion JPEG) which sounds exactly like what you would like to do.
http://www.codeproject.com/Articles/371955/Motion-JPEG-Streaming-Server
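For reference, an MJPEG stream is just an ordinary HTTP response of type multipart/x-mixed-replace in which each part is a JPEG image. The sketch below only shows that wire format; the boundary name is arbitrary, and obtaining the JPEG bytes and writing everything to the client socket is left out.

// Minimal sketch of the headers an MJPEG server sends; run as-is it just
// prints them for one hypothetical 6 kB frame to show the layout.
#include <cstddef>
#include <cstdio>
#include <string>

// Sent once, at the start of the HTTP response.
std::string mjpeg_response_header() {
    return "HTTP/1.1 200 OK\r\n"
           "Content-Type: multipart/x-mixed-replace; boundary=frame\r\n"
           "Cache-Control: no-cache\r\n"
           "\r\n";
}

// Sent before every JPEG frame, followed by the JPEG bytes and "\r\n".
std::string mjpeg_frame_header(std::size_t jpeg_size) {
    char buf[128];
    std::snprintf(buf, sizeof(buf),
                  "--frame\r\n"
                  "Content-Type: image/jpeg\r\n"
                  "Content-Length: %zu\r\n"
                  "\r\n", jpeg_size);
    return buf;
}

int main() {
    std::printf("%s%s", mjpeg_response_header().c_str(),
                mjpeg_frame_header(6 * 1024).c_str());
    return 0;
}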

What is the best method to send data from a device to a server

I am currently developing a website for an energy-monitoring company. We are trying to send high volumes of data from the devices that record the data to a server so it can be processed into a database. The guy developing the firmware thinks the best way to send the data is to produce CSV files and send them via FTP. A program on the server then needs to monitor the files received via FTP and run a PHP script to process them. I, however, feel that the best way of sending the data is via HTTP POST.
We had HTTP POST working, and then I began trying to work with the CSVs, which became a pain: reliably monitoring the files received via FTP meant editing the ProFTPD configuration file (which I found to be a near-impossible task) and installing a package called mod_exec (which comes with security risks) so that ProFTPD could run a PHP script. These issues, plus the fact that I am unfamiliar with the Linux console I would have to use extensively to set this up, make the CSV method very difficult to set up. HTTP POST seems to me a more direct way of sending the data, without having to worry about files or rely on ProFTPD. It would also let us use identifiers to give the data meaning, as opposed to a string of values whose meaning is not immediately apparent. In addition, the query string could be URL-encoded to pass a multidimensional array, which would work well given the type of data being passed.
Nevertheless, just because the HTTP POST method would be easier doesn't mean that the CSV method doesn't have advantages. Furthermore, the firmware guy has far more experience than me with computers so I trust his opinion.
Can you please help me to understand his point of view on the advantages of the CSV method and explain what the best method is?
You're right. FTP has major issues with firewalls, and especially doesn't work well on mobile (NAT'ted) IPv4. HTTP POST works far, far better under such circumstances, if only because nobody accepts an "internet" connection that breaks HTTP.
Furthermore, HTTP is a lot easier on the device as well. It's just a single-socket protocol, with trivial read/write semantics on that socket.
Some more benefits? HTTP has almost-native support for compression (gzip). HTTP transmission can start before the input is complete. HTTP is easier to secure (HTTPS)...
No, there really is little reason to use FTP.
The 'CSV method' (I'd call it the 'FTP method' though) has the advantage of being known to the embedded developer. The receiving side will have to create some way of checking if there is a file though. That adds complexity.
The 'HTTP method' has several advantages:
HTTP is easy to implement on the sending side
No need to create a file-checker
You can reply to the embedded device if everything went OK
I actually implemented a system just like that recently (not too much data, but still) and used HTTP POST to send the data; I implemented the HTTP POST myself.
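For illustration, the device-side upload can be as small as the following libcurl sketch; the URL and the form field names are invented here, and in practice you would build the form-encoded (or JSON) body from the real readings and check the result for retries.

// Hypothetical sketch of a device-side HTTP POST using libcurl.
#include <curl/curl.h>

int main() {
    curl_global_init(CURL_GLOBAL_DEFAULT);
    CURL* curl = curl_easy_init();
    if (curl) {
        curl_easy_setopt(curl, CURLOPT_URL, "https://example.com/ingest.php");
        // A form-encoded body; the keys give each value a meaning, and
        // array-style keys can carry a multidimensional structure.
        curl_easy_setopt(curl, CURLOPT_POSTFIELDS,
                         "device=42&reading[0][power]=1500&reading[0][ts]=1700000000");
        CURLcode res = curl_easy_perform(curl);  // POSTs because POSTFIELDS is set
        (void)res;                               // real code would check and retry
        curl_easy_cleanup(curl);
    }
    curl_global_cleanup();
    return 0;
}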

how would the different programs execute/call each other and exchange information

A Java servlet can be used as part of a hybrid solution (i.e. a website using many different programming languages). But how would a hybrid solution work, i.e. how would the different programs call each other and exchange information?
Please don't use code to explain. Thank you very much!
There are many ways you can exchange data; it all depends on your application.
You could use XML to define your data, as many programming languages support serializing and parsing XML. You could also use JSON (especially convenient for JavaScript), or a direct socket connection as mentioned above.
I have seen C# client applications that write Java byte code to send to a server implemented in Java. The answer depends on the specific details of your needs.
This can be done in many ways.
Java can call other languages that run in the JVM directly, or languages outside the JVM by using JNI.
Portions of such a system could also be running as separate processes with communication by methods such as web-services or sockets.

How strict to be when using Qt framework?

I'm building a Qt application that needs to use libssh, an SSH client library. libssh (understandably) performs its own network connections; however, Qt has its own infrastructure for network connections (QTcpSocket etc.).
Should I worry about these differences? Should I be trying to make libssh make network connections via QTcpSocket... Or if it works fine on the platforms I'm targeting, is that good enough?
The only downside is that you have another library that your code depends on.
The primary rule though is if it works, go with it.
I think it depends on what the abstraction you get from libssh looks like. If it is a socket-like API, you could create a QAbstractSocket implementation for it. If it is just some structure or handle to read from and write to, you could create a QIODevice subclass. Most I/O can be implemented generically on top of QIODevice (instead of explicitly operating on QFile, sockets, etc.).
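As a rough, untested sketch of the second option: wrap an already-connected libssh channel in a QIODevice subclass so the rest of the Qt code can treat it like any other I/O device. Error handling, non-blocking reads, and readyRead signalling are omitted, and obtaining the connected ssh_channel is assumed to happen elsewhere.

// Hypothetical sketch: expose a libssh channel through the QIODevice interface.
#include <QIODevice>
#include <libssh/libssh.h>

class SshChannelDevice : public QIODevice {
public:
    explicit SshChannelDevice(ssh_channel channel, QObject *parent = nullptr)
        : QIODevice(parent), m_channel(channel) {}

protected:
    qint64 readData(char *data, qint64 maxSize) override {
        // Blocking read from the SSH channel; returns bytes read or -1 on error.
        int n = ssh_channel_read(m_channel, data, static_cast<uint32_t>(maxSize), 0);
        return n >= 0 ? n : -1;
    }
    qint64 writeData(const char *data, qint64 maxSize) override {
        int n = ssh_channel_write(m_channel, data, static_cast<uint32_t>(maxSize));
        return n >= 0 ? n : -1;
    }

private:
    ssh_channel m_channel;
};

// Usage (assuming 'channel' is already connected and authenticated):
//   SshChannelDevice dev(channel);
//   dev.open(QIODevice::ReadWrite);
//   ... pass &dev to any code that works on a QIODevice ...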

HTTP server that handles requests through IO devices?

This question can probably only be answered for Unix-like systems that follow the "everything is a file" idiom.
Would it be hard to create a web server that mounts local devices for handling HTTP traffic? It would enable a program to read raw HTTP requests from /dev/httpin (for example) and write the responses to /dev/httpout. I think this would be nice because it would allow me to create an HTTP handler in any programming language that is capable of handling IO streams.
I don't really know where to start on this. Any suggestions on how to setup such a system?
I agree with Javier: check out standards such as FastCGI, Rack, WSGI, or the inchoate OWIN. These interfaces exist basically so that disparate software components can exchange HTTP messages on the same machine without each component having to implement the network transport itself.
Personally, if I was looking to do this, I'd use my favorite JVM library, Restlet (probably with Groovy or Jython) and implement a custom ServerConnector which would construct a Request object and then pass it off to the rest of the framework. This sort of flexibility of architecture is why I like Restlet.
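To make the idea concrete, here is a minimal sketch that reads a raw HTTP request from any std::istream, so the same handler code works whether the bytes come from a socket, a pipe, or a hypothetical device node like /dev/httpin. It only parses the request line and headers, ignoring bodies, chunked encoding, and the other intricacies a real server must handle.

// Minimal sketch: transport-agnostic HTTP request reading from a stream.
#include <iostream>
#include <map>
#include <sstream>
#include <string>

struct HttpRequest {
    std::string method, target, version;
    std::map<std::string, std::string> headers;  // values keep leading space/CR in this sketch
};

HttpRequest read_request(std::istream& in) {
    HttpRequest req;
    std::string line;
    std::getline(in, line);                      // e.g. "GET /status HTTP/1.1"
    std::istringstream request_line(line);
    request_line >> req.method >> req.target >> req.version;
    while (std::getline(in, line) && !line.empty() && line != "\r") {
        auto colon = line.find(':');
        if (colon != std::string::npos)
            req.headers[line.substr(0, colon)] = line.substr(colon + 1);
    }
    return req;
}

int main() {
    // Stand-in for /dev/httpin: any stream of raw HTTP bytes works the same way,
    // e.g. std::ifstream in("/dev/httpin") on the hypothetical device.
    std::istringstream in("GET /status HTTP/1.1\r\nHost: example\r\n\r\n");
    HttpRequest req = read_request(in);
    std::cout << req.method << " " << req.target << "\n";
    return 0;
}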
