I'm thinking about configuring the remind calendar program so that I can use the same .reminders file from my Ubuntu box at home and from my Windows box at work. What I'm going to try is to make the directory on my home machine that contains the file externally visible through WebDAV on Apache. (Security doesn't really concern me, because my home firewall only forwards ssh; to hit port 80 on my home box, you need to use ssh tunneling.)
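For reference, here is roughly the mod_dav setup I have in mind; the paths and the Alias are placeholders rather than my actual layout, and the syntax assumes Apache 2.4:

    # mod_dav_fs needs a writable lock database for LOCK/UNLOCK to work.
    DavLockDB /var/lib/apache2/davlock/lockdb

    # Expose the directory holding .reminders over WebDAV.
    Alias /reminders /home/me/reminders
    <Directory /home/me/reminders>
        Dav On
        Require all granted
    </Directory>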
Now, my understanding is that WebDAV was designed to arbitrate simultaneous access attempts. My question is whether this is compatible with direct file access from the host machine. That is, I understand that if I have two or more remote WebDAV clients trying to edit the same file, the WebDAV protocol is supposed to provide locking, so that only one client has access at a time, and hence the file will not be corrupted.
My question is whether these protections will also guard against local edits that go through the filesystem rather than through WebDAV. Should I mount the WebDAV directory on the host machine and direct all local edits through that mount? Or is this unnecessary?
(In this case, with only me accessing the file, it's exceedingly unlikely that I'd get simultaneous edits, but I like to understand how systems are supposed to work ;)
If you're not accessing the files via the WebDAV protocol, you're not honoring locks set via the LOCK and UNLOCK methods, and you therefore leave yourself open to overwriting changes made by another client. This situation is described in the WebDAV RFC here: https://www.rfc-editor.org/rfc/rfc4918#section-7.2
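To make the difference concrete, here is roughly what a well-behaved WebDAV client does around a write (a sketch using curl; the URL is made up, and the real opaquelocktoken value comes back from the server in the LOCK response):

    # Ask the server for an exclusive write lock on the file.
    curl -X LOCK -H "Timeout: Second-600" -H "Content-Type: application/xml" \
         --data '<?xml version="1.0" encoding="utf-8"?><D:lockinfo xmlns:D="DAV:"><D:lockscope><D:exclusive/></D:lockscope><D:locktype><D:write/></D:locktype></D:lockinfo>' \
         http://example.com/reminders/.reminders

    # Upload the new version, presenting the lock token in an If header.
    curl -T .reminders -H 'If: (<opaquelocktoken:TOKEN-FROM-LOCK-RESPONSE>)' \
         http://example.com/reminders/.reminders

    # Release the lock.
    curl -X UNLOCK -H 'Lock-Token: <opaquelocktoken:TOKEN-FROM-LOCK-RESPONSE>' \
         http://example.com/reminders/.reminders

A local editor opening the file through the filesystem sends none of these requests, so nothing stops it from writing while a remote client holds the lock. Mounting the directory over WebDAV on the host, as you suggest, is the way to route local edits through the same locking machinery.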
An SFTP client such as CuteFTP or FileZilla provides a rich user interface for an SFTP server. These are clients that are installed locally on the user's PC. Instead of a client installed on the user's side, is it possible to set up a web-based user interface on the SFTP server, so that a user with only a browser can access the files on the server? Are there open source or commercial products that can be deployed on the SFTP server to enhance the file transfer experience?
Note: The base server needs to be SFTP, as there will be scripts that clients use to transfer files in a non-interactive manner. For interactive usage, I am looking for a web interface that can serve as an add-on.
I suggest you check out: https://filebrowser.org/features
There is another good one called "droppy", but it's no longer maintained and currently has too many Git forks to tell where development will go.
The question you need to answer is what value the application adds and what scope it needs to cover.
You can always go with Dropbox or something similar, depending on the users' intended application.
I run Apache over HTTPS and can see in the log file that an HTTP/1.1 request is made for every single file of my repository. And for every single file, the full URL is disclosed.
I need to access my repository from a location where I don't want sysadmins looking over my shoulder and seeing all these individual URLs. Of course I know they won't see file contents, since I am using HTTPS and not HTTP, but I am really annoyed that they can see URLs and, as a consequence, file names.
Is there a way I can hide or encrypt HTTPS URLs with SVN?
This would be great, as I would prefer not to resort to svn+ssh, which does not easily support the path-based authorization that I am using heavily.
With HTTPS, the full URL is only visible to the client (your svn binary) and the server hosting the repository. In transit, only the hostname you're connecting to is visible (via DNS and the TLS SNI field).
You can further protect yourself by using a VPN connection between your client and the server, or by tunneling over SSH (not svn+ssh, but a direct ssh tunnel).
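For example, with a plain ssh tunnel (hostnames here are placeholders):

    # Forward a local port to the repository host's HTTPS port.
    ssh -N -L 8443:localhost:443 you@svnhost.example.com

    # Point svn at the tunnel; on your local network, only ssh traffic is visible.
    svn checkout https://localhost:8443/svn/repo/trunk

Note that svn will complain that the server certificate doesn't match the hostname "localhost", which you'd have to accept manually.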
If you're concerned about the sysadmin of the box hosting your repository seeing your activity in the Apache logs there, you have issues far beyond what can be solved with software. You could disable logging in Apache, but your sysadmin can switch it back on or use other means.
Last option: if you don't trust the system(s) and/or network you're on, don't engage in activities that you consider sensitive on them. They can't see something that isn't happening in the first place.
I have a requirement to add a new send port to a send port group in my BizTalk 2013 application. The send port should send to a File location with the destination folder set to \\ResearchServer\ResearchTeam\R&DReports
Although I can set the file handler settings to this path, I can't save the send port. This is because BizTalk automatically generates a URI of \\ResearchServer\ResearchTeam\R&DReports, which contains the invalid character & (ampersand).
Obviously one way to "fix" this would be to rename the folder, but that would have serious consequences for other apps and users who access this folder (and is, quite frankly, avoiding the issue rather than finding a fix).
I've also considered creating a dynamic port and then setting the destination at runtime. However, this seems extremely complicated and would require downtime on a live server.
I've looked on TechNet and StackOverflow for ideas and it seems I'm the first person to encounter any issues with ampersands in BizTalk file destinations.
Any ideas, please help!
I am developing a C# ASP.NET 4.0 application that will reside on a Windows Server 2003 machine. By accessing this application from a networked computer, any user will be able to upload files to the Windows server. But also, once these files are stored on the server, he/she should be able to copy them from the Windows server to another networked computer.
I have found a way to upload files to a specified location on the server disk, but now I need to send the files that are on the server disk to the client computers.
My question is: Is there any way to send or copy files from the server to other client computers (not the one that is accessing the web service) without needing a program receiving those files on the client computers? FTP, WCF, cmd commands, sockets?
Any idea?
If you want users of your webapp to download files, I'd look into an "ashx generic handler." It will allow you to send files back down to clients over HTTP(S).
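A minimal sketch of such a handler, assuming the uploaded files live in a hypothetical ~/App_Data/uploads folder and are requested by file name:

    <%@ WebHandler Language="C#" Class="FileDownloadHandler" %>

    using System.IO;
    using System.Web;

    public class FileDownloadHandler : IHttpHandler
    {
        public void ProcessRequest(HttpContext context)
        {
            // Keep only the file name so clients can't escape the
            // uploads folder with "..\" path tricks.
            string name = Path.GetFileName(context.Request.QueryString["file"]);
            string path = context.Server.MapPath("~/App_Data/uploads/" + name);

            if (!File.Exists(path))
            {
                context.Response.StatusCode = 404;
                return;
            }

            // Stream the file down to the browser as a download.
            context.Response.ContentType = "application/octet-stream";
            context.Response.AddHeader("Content-Disposition",
                "attachment; filename=" + name);
            context.Response.TransmitFile(path);
        }

        public bool IsReusable { get { return true; } }
    }

A client would then request something like http://server/FileDownloadHandler.ashx?file=report.pdf.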
If you are looking to have remote users tell your web server to copy files to other machines ON THE SAME LAN AS THE SERVER, you would write that using normal System.IO operations.
Over a LAN, if you have the correct permissions and so on, you can write to a disk on a different machine using File.Copy -- there's nothing special about that.
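For example (the share and paths are hypothetical, and the identity the code runs as, e.g. the app pool account, needs write permission on the target share):

    using System.IO;

    class CopyToShare
    {
        static void Main()
        {
            // A UNC path works anywhere a local path does;
            // the third argument overwrites an existing copy.
            File.Copy(@"C:\Uploads\report.pdf",
                      @"\\CLIENT-PC\Incoming\report.pdf",
                      true);
        }
    }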
If we're talking about remote machines over the internet, that's a different story. Something has to be listening, whether it's FTP, WCF, Dropbox, etc.
If the problem is that it can be painful to get something like WCF to work from a client due to problems like firewall issues under Windows 7, you could take a different route and have the client periodically ping the server looking for new content. To give the server a point of reference, the ping could contain the name or creation date of the most recent file received. The server could reply with a list of new files, and then the client could make several WCF calls, one by one, to pull the content down. This pattern keeps all the client traffic outbound.
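A rough shape for that pattern, with a hypothetical WCF contract (the names and the poll interval are made up for illustration):

    using System;
    using System.IO;
    using System.ServiceModel;
    using System.Threading;

    [ServiceContract]
    public interface IFileSync
    {
        // Returns the names of files newer than the client's newest file.
        [OperationContract]
        string[] GetFilesNewerThan(DateTime newestLocalFile);

        [OperationContract]
        byte[] DownloadFile(string fileName);
    }

    public static class PollingClient
    {
        // All traffic is outbound from the client, so no inbound
        // firewall exceptions are needed on the client machine.
        public static void Run(IFileSync proxy, string localDir)
        {
            while (true)
            {
                // Give the server a point of reference: the creation
                // date of the most recent file we already have.
                DateTime newest = DateTime.MinValue;
                foreach (string f in Directory.GetFiles(localDir))
                {
                    DateTime t = File.GetCreationTime(f);
                    if (t > newest) newest = t;
                }

                // Pull each new file down in its own call.
                foreach (string name in proxy.GetFilesNewerThan(newest))
                {
                    byte[] data = proxy.DownloadFile(name);
                    File.WriteAllBytes(Path.Combine(localDir, name), data);
                }

                Thread.Sleep(TimeSpan.FromMinutes(5));
            }
        }
    }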
You can, if you can run the program under an account that has access to that computer. However, having this sort of access on your network, which would let the outside world put an unfiltered file on your internal network, is just asking to be hacked.
Finally, I decided to install a FileZilla FTP server on each client computer, and my page is working very well. Another option is to create a workgroup on the Windows server and put every client computer in this workgroup, so that the Windows server has access to the computers in the same workgroup.
Here are some links that may help with creating the workgroups:
http://helpdeskgeek.com/networking/cannot-see-other-computers-on-network-in-my-network-places/
http://www.computing.net/answers/windows-2003/server-2003-workgroup-setup-/1004.html
I just got IIS7 set up on a Windows Server 2008 R2 machine in VirtualBox. After doing so, I could not connect from any other client, though http://localhost worked. For that matter, I was unable to even ping the server.
After doing some research, I found that enabling File and Print Sharing on the server solved the problem, but surely there has to be a better way, and I would much prefer to learn to use the best method, rather than the easiest one.
What, specifically, should I do to enable both pinging of the server as well as access to the web server running on it?
Isn't it that the inbound HTTP port is blocked by default? I'm not a server guru, but I remember going to the firewall to allow it through. The rule should already be there.
Out of the box on Windows Server 2008/2008 R2, the firewall is enabled, and users cannot access resources or services on the server unless you configure exceptions to the firewall. The one exception to this is services/resources on the server that you make available through the GUI tools (Initial Configuration Tasks wizard, Server Manager); these automatically create the required firewall exceptions for you.
So in your case, the required firewall exception was created for you either when File and Print Sharing was installed or when you used the File and Print Sharing configuration wizard / shared resource provisioning wizard (most likely the latter). The rule in question is File and Printer Sharing (Echo Request – ICMPv4-In). It actually allows ping, but I guess Windows also uses it for network resource discovery and other things implied by the role you installed.
Nothing prevents you from leaving File and Print Sharing disabled and just enabling the mentioned firewall exception manually.
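If memory serves, you can add the equivalent rules yourself from an elevated command prompt, something like this (the rule names are arbitrary):

    rem Allow inbound ping (ICMPv4 echo request).
    netsh advfirewall firewall add rule name="Allow ICMPv4 ping" dir=in action=allow protocol=icmpv4:8,any

    rem Allow inbound HTTP to IIS.
    netsh advfirewall firewall add rule name="Allow HTTP" dir=in action=allow protocol=TCP localport=80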