I am developing a C# ASP.NET 4.0 application that will reside on a Windows Server 2003 machine. By accessing this application through a networked computer, any user would be able to upload files to the Windows server. But also, once these files are stored on the server, he/she would be able to copy them from the Windows server to another networked computer.
I have found a way to upload files to a specified location on the server disk,
but now I need to send the files that are on the server disk to the client computers.
My question is: Is there any way to send or copy files from the server to other client computers (not the one that is accessing the web service) without needing a program receiving those files on the client computers? FTP, WCF, cmd commands, sockets?
Any idea?
If you want users of your webapp to download files, I'd look into an .ashx generic handler. It will allow you to send files back down to clients over HTTP(S).
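A minimal sketch of such a handler; the folder and file name below are placeholders:

    // FileDownload.ashx -- a generic handler that streams a file from the
    // server disk back to the browser over HTTP(S).
    using System.Web;

    public class FileDownload : IHttpHandler
    {
        public void ProcessRequest(HttpContext context)
        {
            // Hypothetical upload folder and file name, for illustration only.
            string path = context.Server.MapPath("~/Uploads/report.pdf");

            context.Response.ContentType = "application/octet-stream";
            context.Response.AddHeader("Content-Disposition",
                "attachment; filename=report.pdf");
            // TransmitFile streams the file without buffering it all in memory.
            context.Response.TransmitFile(path);
        }

        public bool IsReusable
        {
            get { return false; }
        }
    }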
If you are looking to have remote users tell your webserver to copy files to other machines ON THE SAME LAN AS THE SERVER, you would write that using normal System.IO operations.
Over a LAN, if you have the correct permissions and so on, you can write to a disk on a different machine using File.Copy -- there's nothing special about that.
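For example, assuming the account running the web application has write permission on the target share (the machine and share names here are made up):

    using System.IO;

    // Copy an uploaded file from the server's disk to a share on another
    // machine on the same LAN. The UNC path is a placeholder.
    File.Copy(@"C:\Uploads\report.pdf",
              @"\\CLIENT-PC\Incoming\report.pdf",
              overwrite: true);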
If we're talking about remote machines over the internet, that's a different story. Something has to be listening whether it's FTP, WCF, DropBox, etc.
If the problem is that it can be painful to get something like WCF to work from a client due to problems like firewall issues under Windows 7, you could take a different route and have the client periodically ping the server looking for new content. To give the server a point of reference, the ping could contain the name or creation date of the most recent file received. The server could reply with a list of new files, and then the client could make several WCF calls, one by one, to pull the content down. This pattern keeps all the client traffic outbound.
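A rough sketch of the client side of that pattern; the ISyncService contract and its operations are made-up names, not a real API:

    using System;
    using System.IO;
    using System.Linq;
    using System.Threading;

    // Hypothetical contract for the sync service, for illustration only.
    public interface ISyncService
    {
        string[] GetNewFileNames(DateTime newerThan);
        byte[] GetFileContent(string fileName);
    }

    public class PollingClient
    {
        private readonly ISyncService service; // e.g. a WCF channel to the server
        private readonly string localFolder;

        public PollingClient(ISyncService service, string localFolder)
        {
            this.service = service;
            this.localFolder = localFolder;
        }

        public void Run()
        {
            while (true)
            {
                // Give the server a point of reference: the newest file we already have.
                DateTime newest = Directory.EnumerateFiles(localFolder)
                    .Select(File.GetCreationTime)
                    .DefaultIfEmpty(DateTime.MinValue)
                    .Max();

                // All traffic is outbound from the client, so no inbound
                // firewall rules are needed on the client machine.
                foreach (string name in service.GetNewFileNames(newest))
                {
                    byte[] content = service.GetFileContent(name); // one call per file
                    File.WriteAllBytes(Path.Combine(localFolder, name), content);
                }

                Thread.Sleep(TimeSpan.FromMinutes(1)); // poll interval
            }
        }
    }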
You can, if you run the program as an account that has access to that computer. However, this sort of access, which would let the outside world put an unfiltered file on your internal network, is just asking to be hacked.
Finally, I decided to install a FileZilla FTP server on each client computer, and my page is working very well. But another option is to create a workgroup on the Windows server and put every client computer in this workgroup, so that the Windows server has access to the computers in the same workgroup.
Here are some links that may help with creating the workgroups:
http://helpdeskgeek.com/networking/cannot-see-other-computers-on-network-in-my-network-places/
http://www.computing.net/answers/windows-2003/server-2003-workgroup-setup-/1004.html
An SFTP client such as CuteFTP or Filezilla provides a rich user interface for an SFTP server. These are clients that are installed locally on the user's PC. Instead of a client installed at the user's side, is it possible to set up a web-based user interface on the SFTP server, so that a user with only a browser is able to access the files on the server? Are there open source or commercial products that can be deployed on the SFTP server to enhance the file transfer experience?
Note: The base server needs to be SFTP, as there will be scripts that clients will be using to transfer files in a non-interactive manner. For interactive usage, I am looking for a web interface that can serve as an add-on.
I suggest you check out: https://filebrowser.org/features
There is another good one called "droppy", but it's no longer active and apparently has too many Git forks at the moment to know where it will go.
The question you need to answer is what the application's value and scope are.
You can always go with Dropbox or something too depending on the user's intended application.
I have a WCF web service project, built on my local machine, which, when hosted using the test client and triggered, returns values from a remote database in JSON format.
For example, if you key in the URL with localhost, then you get results back in the format below:
{"Id":3,"Value1":"67.5687","Value2":"126.7125"}
I want to host this project on a remote server with a public URL, which should return the above results from any network. I have 3 questions regarding this:
** What modifications should I make to my current WCF project to host it on a remote server?
** Given the various types of hosting, like:
1) windows process activation services (WAS)
2) IIS
3) Self hosting
4) Hosting in a Windows service,
which type of hosting is best suited for a remote server?
** What changes should I make in my App.Config file (including the change in my endpoint address from localhost to the IP address) to make the service work?
Thanks.
1) You shouldn't need to make any changes to your project just because you want to host the code on another machine. I find this an odd question.
2) Given your choice of JSON as data format and a browser as test client, I'm guessing you want to make it available over HTTP using simple GET requests. In the Microsoft stack, IIS is the web server, and the natural choice for this scenario.
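For that scenario, the service contract typically uses webHttpBinding with [WebGet], so that a plain browser GET returns JSON. A minimal sketch; the type and member names are guesses based on the sample output in the question:

    using System.Runtime.Serialization;
    using System.ServiceModel;
    using System.ServiceModel.Web;

    // Would produce output like {"Id":3,"Value1":"67.5687","Value2":"126.7125"}
    // for a GET request. The names are assumptions, not the asker's real code.
    [DataContract]
    public class Reading
    {
        [DataMember] public int Id { get; set; }
        [DataMember] public string Value1 { get; set; }
        [DataMember] public string Value2 { get; set; }
    }

    [ServiceContract]
    public interface IReadingService
    {
        [OperationContract]
        [WebGet(UriTemplate = "readings/{id}", ResponseFormat = WebMessageFormat.Json)]
        Reading GetReading(string id);
    }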
3) It is quite impossible to answer. I don't know what's in your app.config today. I don't know if you're going to authenticate, and if so how. And I don't want to know! That said, it seems to me that if everything is supposed to behave as it does on your dev box, the bindings are already OK. I don't remember if a WCF service needs to know the endpoint at which it is itself hosted (hard to see why it would need to know this, really); I would have thought it more natural to do such configuration on the host, e.g. IIS. The client of course should use a different endpoint pointing to wherever you host the service. (You can put many endpoints in app.config and let the user choose one, btw.)
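If you'd rather pick the endpoint in code than in app.config, here is a sketch of the client side, assuming the hypothetical IReadingService contract above and a made-up host name:

    using System;
    using System.ServiceModel.Web;

    // Point the client at the deployed service instead of localhost.
    // "myserver.example.com" is a placeholder for your public host.
    var factory = new WebChannelFactory<IReadingService>(
        new Uri("http://myserver.example.com/ReadingService"));
    IReadingService client = factory.CreateChannel();
    Reading r = client.GetReading("3");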
I think most of us sin against the following advice now and then, but it is the best advice I can give: Read a book. Learn as much as possible about the thing you're using, in this case WCF. You'll get the time back later, and your software will be less bad!
I have been developing an ASP.NET application and I would like my work colleagues to give me feedback on it. I tried running it on IIS, but because the database is located on a remote server, I am unable to host it properly.
Also, how can my colleagues access that site? Via my host name, etc.?
You can update your connection string to point to your remote server (provided the DB accepts external requests).
Something like this (assuming SQL Server):
Data Source=190.190.200.100,1433;Network Library=DBMSSOCN;Initial Catalog=myDataBase;User ID=myUsername;Password=myPassword;
It's an ugly hack, but you can do an if/else: detect the domain name (Environment.UserDomainName), and if it matches the server, use one connection string; if it's your office's domain, use the other!
This means you can still use your local IIS!
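A sketch of that hack; the domain name and both connection strings are placeholders:

    using System;

    // Ugly but workable: pick a connection string based on the domain the
    // code is currently running in. All values here are made up.
    string connectionString;
    if (Environment.UserDomainName == "OFFICE-DOMAIN")
    {
        connectionString = "Data Source=190.190.200.100,1433;Network Library=DBMSSOCN;"
                         + "Initial Catalog=myDataBase;User ID=myUsername;Password=myPassword;";
    }
    else
    {
        connectionString = "Data Source=.\\SQLEXPRESS;Initial Catalog=myDataBase;"
                         + "Integrated Security=True;";
    }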
You can publish your web site on a local server that has IIS running on it. Create your application in IIS, and your colleagues can easily access your web site from the intranet via the name of the server.
Such as:
http://servername/yourprojectdirectory
The local server will also need network access to reach the remote database.
I'm trying to place a .mdf database on Computer A and access it simultaneously from Computer B, but I'm getting an error that says I cannot access the .mdf file because it's being used by another process.
The setup is: the database is hosted in a Public folder on Computer A. I have Visual Studio running on both computers, and Computer B accesses the database on Computer A.
Computer A Connection path string:
C:\Users\Public\database.mdf
Computer B Connection path string:
\\192.168.254.8\Public\database.mdf
Is there some sort of setting that I do not know of to enable multiple access on a db? Or is this not possible?
EDIT:
Let me rephrase my question, sorry.
There are two computers connected via a network, and I want to access one Visual Studio solution/project website from both. I wanted to do this as a demonstration; is it possible?
You should have an instance of SQL Server (even express) running on the host machine.
All requests from the clients should use the standard mechanisms for connecting to that server instance. In other words, they won't run their own copy of SQL Server; instead, they will connect to the instance running on the host machine.
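In practice that just means every machine uses a server-style connection string instead of pointing at the .mdf file. A sketch with placeholder instance, database, and credentials:

    using System.Data.SqlClient;

    // Both computers connect to the SQL Server instance on Computer A
    // rather than opening the .mdf file directly. All values are placeholders.
    using (var connection = new SqlConnection(
        "Data Source=192.168.254.8\\SQLEXPRESS;Initial Catalog=MyDatabase;" +
        "User ID=myUsername;Password=myPassword;"))
    {
        connection.Open();
        // ... run commands against the shared database ...
    }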
Simultaneous or direct access to database files under any DBMS's control is either a very bad practice or simply impossible.
Use proper DBMS tools to access the data.
And yes, it's possible to access a web site project for a demo.
Install SQL Server on the host machine (the one that you want to keep your database on).
Then in Visual Studio, use the "Server Explorer" to locate the remote instance of the database. From there, you can utilize the remote instance of SQL Server in your connection string, thus allowing you to connect multiple computers to a single database.
Using two instances of SQL Server to connect to the single .mdf file is a very bad practice.
Instead of specifying the MDF file in your connection string (which, I believe, uses a form of User Instancing), you should use SQL Server Management Studio (SSMS) to attach the MDF. Then connect to SQL Server, rather than to the MDF.
You may need to move the MDF and LDF files to a place where the SQL Server identity can access them, rather than keeping them in your VS project.
Most projects keep scripts to create the DB in their VS project, but not the binary MDF itself.
When you deploy, the process is similar: use SSMS to attach the MDF on your server.
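In connection-string terms, the change looks roughly like this (instance and database names are placeholders):

    // Before: user instancing against the file itself -- one process locks the .mdf.
    // "Data Source=.\\SQLEXPRESS;AttachDbFilename=C:\\Users\\Public\\database.mdf;" +
    // "Integrated Security=True;User Instance=True"

    // After: the .mdf is attached in SSMS and everyone connects to the server.
    string connectionString =
        "Data Source=.\\SQLEXPRESS;Initial Catalog=MyDatabase;Integrated Security=True;";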
Yes. If two or more computers are connected by a network such as a LAN, you can access the database Database1.mdf (say) from another SQL Server by entering its user name/password, or via a Visual Studio connection.
Disadvantages:
If more than 7-8 systems access the same database at the same time, system response slows down and hangs, or you may get errors like "connection pool exceeded" or "device not responding".
Big companies use huge multi-core, multi-processor servers, which respond faster.
To overcome this: when writing applications, open the connection only while executing the SQL queries; at all other times it should be closed.
The best programmers use transactions instead of plain statements. A transaction ensures that every user's queries either execute validly and successfully or not at all. These queries (transaction queries) mainly execute on the client machine and later update the main server when it gets a time slot for it (synchronization).
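A sketch of that open-late/close-early pattern combined with a transaction; the connection string, table, and values are all placeholders:

    using System.Data.SqlClient;

    // Open the connection only while the queries run, and wrap related
    // statements in a transaction so they succeed or fail together.
    string connectionString =
        "Data Source=192.168.254.8\\SQLEXPRESS;Initial Catalog=MyDatabase;" +
        "Integrated Security=True;";

    using (var connection = new SqlConnection(connectionString))
    {
        connection.Open();
        using (SqlTransaction transaction = connection.BeginTransaction())
        {
            using (var command = new SqlCommand(
                "INSERT INTO Readings (Value1, Value2) VALUES (@v1, @v2)",
                connection, transaction))
            {
                command.Parameters.AddWithValue("@v1", "67.5687");
                command.Parameters.AddWithValue("@v2", "126.7125");
                command.ExecuteNonQuery();
            }
            transaction.Commit();
        } // the connection closes when the outer using block ends
    }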
I'm thinking about configuring the remind calendar program so that I can use the same .reminders file from my Ubuntu box at home and from my Windows box at work. What I'm going to try to do is to make the directory on my home machine that contains the file externally visible through WebDAV on Apache. (Security doesn't really concern me, because my home firewall only forwards ssh; to hit port 80 on my home box, you need to use ssh tunneling.)
Now my understanding is that webdav was designed to arbitrate simultaneous access attempts. My question is whether this is compatible with direct file access from the host machine. That is, I understand that if I have two or more remote webdav clients trying to edit the same file, the webdav protocol is supposed to provide locking, so that only one client can have access, and hence the file will not be corrupted.
My question is whether these protections will also protect against local edits going through the filesystem, rather than through webdav. Should I mount the webdav directory, on the host machine, and direct all local edits through the webdav mount? Or is this unnecessary?
(In this case, with only me accessing the file, it's exceedingly unlikely that I'd get simultaneous edits, but I like to understand how systems are supposed to work ;)
If you're not accessing the files under the WebDAV protocol, you're not honoring locks set via the LOCK and UNLOCK methods, and therefore open up the potential to overwrite changes made by another client. This situation is described in the WebDAV RFC here: https://www.rfc-editor.org/rfc/rfc4918#section-7.2
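For illustration, here is roughly what a lock-aware client's exchange looks like, sketched with .NET's HttpClient and a placeholder URL. Local filesystem edits never perform these steps, which is exactly why they bypass the locks:

    using System;
    using System.Linq;
    using System.Net.Http;
    using System.Text;
    using System.Threading.Tasks;

    // Sketch: a WebDAV-aware client takes a LOCK before editing a resource
    // and releases it with UNLOCK afterwards.
    class WebDavLockSketch
    {
        static async Task Main()
        {
            var client = new HttpClient();
            var uri = new Uri("http://example.com/dav/.reminders"); // placeholder

            // LOCK: request an exclusive write lock on the resource.
            var lockRequest = new HttpRequestMessage(new HttpMethod("LOCK"), uri)
            {
                Content = new StringContent(
                    "<?xml version=\"1.0\"?>" +
                    "<D:lockinfo xmlns:D=\"DAV:\">" +
                    "<D:lockscope><D:exclusive/></D:lockscope>" +
                    "<D:locktype><D:write/></D:locktype>" +
                    "</D:lockinfo>",
                    Encoding.UTF8, "application/xml")
            };
            HttpResponseMessage lockResponse = await client.SendAsync(lockRequest);
            string lockToken = lockResponse.Headers.GetValues("Lock-Token").First();

            // ... PUT the edited file here, quoting the lock token in an If header ...

            // UNLOCK: release the lock so other clients can write again.
            var unlockRequest = new HttpRequestMessage(new HttpMethod("UNLOCK"), uri);
            unlockRequest.Headers.Add("Lock-Token", lockToken);
            await client.SendAsync(unlockRequest);
        }
    }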