I have several log files in a WebDAV repository (a Salesforce/SFDC logs repo) and I want to copy all of them to a Linux server so that I can feed them to Splunk. I am not sure if there is a better approach than this; please advise.
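For what it's worth, one common approach is to mirror the WebDAV repository onto the Linux server on a schedule and point a Splunk file monitor at the resulting directory. A minimal sketch using rclone, which speaks WebDAV natively (the remote name and destination path below are placeholders, not details from your setup):
rclone config                          # one-time: create a remote of type "webdav" named e.g. "sfdc"
rclone sync sfdc:/ /var/log/sfdc-logs  # run this from cron; point a Splunk monitor at /var/log/sfdc-logs
Any tool that can pull over WebDAV (cadaver, curl, ...) works just as well; the key point is to land the files in one local directory that Splunk is configured to watch.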
When developing for ASP.NET using Visual Studio for Web, it is really convenient that when you deploy your website to the web server, it cleverly checks which files have changed and only uploads those files. In addition, if you deleted some files from your source, it detects that too and deletes those files from the web server since they are no longer needed.
I started developing with the LAMP stack and am wondering how you can deploy to a web server in a similar way.
I tried using FileZilla, and when copying/pasting the source files to the web server, you get these options if matching files already exist:
-Overwrite
-Overwrite if source is newer
-Overwrite if different size
-Overwrite if different size or source newer
"Overwrite if source is newer" works, kind of, but it only checks the date modified, not the content of the file. Also, the above method does not delete files from the web server that were deleted from the source.
Is there a better way to do this with Filezilla? (or maybe use some other program?)
Thanks.
You can use rsync to accomplish this.
When you want to push out changes you would do something like this from your production server:
rsync -av user@<developmentIp>:/web/root/* /production/web/root/
The pattern is rsync --flags [user@host:]/source/dir [user@host:]/destination/dir
You only need the user@host part for remote hosts. The user must have SSH access to the host.
A couple of small suggestions:
The command can be run from the source or the destination. I find it better to run it from the destination, for permission reasons (i.e. you're reading from the remote and writing to the local machine).
Do some tests first; I always mix up the directory details (do I need the trailing slash, should I use the star, ...).
Read the man page; there are a lot of available options that may be helpful (--delete, --exclude, -a).
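For example, a run from the destination (production) side that mirrors the development web root, deletes files that no longer exist in the source, and skips VCS metadata could look like this (user, host and paths are the same placeholders as above):
rsync -av --delete --exclude '.git' user@<developmentIp>:/web/root/ /production/web/root/
Note the trailing slash on the source: with it, rsync copies the contents of /web/root into the destination instead of creating a nested root/ directory, and --delete also covers the "remove files that were deleted from the source" part of the question.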
The auto-generated PrecompiledApp.config is causing me some headaches.
I'm automating the deployment of an older website, and 50% of the time when I deploy I get this error:
System.IO.IOException: The process cannot access the file '\\web.prod.local\c$\Sites\Website\PrecompiledApp.config' because it is being used by another process.
Content:
<precompiledApp version="2" updatable="true"/>
To the best of my knowledge, websites use a shadow-copy feature to allow updating the site at runtime, with things such as app.config etc.
However, this one file seems to be an exception.
Can anyone suggest a workaround besides stopping the website while deploying?
Kind regards
Judging by the path in the error message, you're trying to copy the files over a network share while deploying. Updating files directly over a network share, FTP, etc. is bad practice, and it is actually the cause here: network deployment is slow, and while some files are still being updated/uploaded, ASP.NET on the server is already trying to recycle the app, copy the files to the "Temporary ASP.NET Files" folder, and so on.
Deployment best practice:
ZIP your precompiled site, upload, then run UNZIP on the server remotely
Here's how you run UNZIP remotely:
plink -ssh -l USERNAME -pw PASSWORD web.prod.local c:\Sites\Website\unzip -q -o c:\Sites\Website\site.zip -d c:\Sites\Website\
"plink" is a free SSH tool for windows (command-line) you need it on your dev machine
"web.prod.local" is your server address.
"c:\Sites\Website\" is the path your website on the server
You need SSH installed on your server to run commands remotely, the simplest option is too install the free tool: "freesshd" (google it)
Drop "unzip.exe" on the server as well, you see it's being called right there. Simplest way is to drop it right into the c:\Sites\Website\
PS. This is just an example, you can come up with your own solution
I have a file on my desktop and I need to get it onto another server, but I have no means of getting it there, e.g. email/USB or anything like that.
The server is on the same network as me.
I have heard of a way that the file can be copied via the command line.
Would anyone have any information on this and if so could you please help me?
I'm not sure whether you have command-line access to that server or not. If yes, are you accessing it via telnet or via SSH?
If SSH, you should be able to transfer the file via SCP (secure copy), since it uses the same SSH connection you use to get your CLI. If you want to transfer your file from a Windows environment, you may want to look at WinSCP; otherwise do a man scp on your Linux or Unix server and, assuming you have it, you'll get the hang of it... it's not complicated.
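For reference, basic scp usage looks like this (hostname, username and paths are placeholders):
scp ~/Desktop/myfile.txt user@server.example.com:/tmp/      # push a local file to the server
scp user@server.example.com:/var/log/some.log ~/Desktop/    # pull a file down from the server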
If SSH is not an option, then you depend on the server having some other service available for transferring the file, the most obvious one being FTP.
Does that help?
Is it possible to have a webapp/webservice fetch data from my PC? Can anyone give me an idea of whether it is possible?
This will depend on how the data is going to be exposed from your PC. If you want to read some files from the file system, the server hosting the web application will need to have access to those files. For example, you could create a network share and read from this share:
byte[] data = File.ReadAllBytes(@"\\yourpc\somefile.dat");
Yes, you can have your own localhost server too. Install WAMP or XAMPP, create a PHP script that returns your data, and that data can then be used in your mobile application.
How can I read a text file that resides on a remote machine? No share exists on that machine, and I am not allowed to create any share or file on it. I am also not allowed to run any client program on the remote machine. My program is an ASP.NET application in C#, residing on an IIS web server. For Linux machines we used SSH connections and file reads were easy. Is there something similar available by default in Windows?
Thanks,
Sreejith
The first question to ask is whether there's a good business reason to read that file. If yes, the IT people will have to allow you a reasonable solution to the problem.
I have frequently used SFTP (secure FTP) for this kind of problem. Unfortunately SFTP is not part of Windows, but there are free and low-cost SFTP servers available. Here's a list from Wikipedia.
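Once an SFTP server is in place, pulling the file is straightforward from any machine with an sftp client (the account, host and path below are placeholders); your C# code would do the same thing through an SFTP client library:
sftp svcaccount@remote.example.com
sftp> get /data/report.txt
sftp> quit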
Explain to IT why you need access to that file and discuss options including SFTP. If you have a valid business reason for this and they will "not let you because of policy", it's the job of your project manager or boss to clear out that roadblock. Ask them to help.
Finally, consider whether it's practical for the file on the remote machine to be pushed to you instead of you pulling it. If you can set up a file share on your PC, ask them to set up a job on the remote server that copies the file to your file share every time it changes.
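As a sketch of that push approach (the folder, share and file names are made up), the job on the remote server could be as simple as a scheduled task running:
robocopy C:\Data \\yourpc\incoming report.txt
robocopy skips the file when it hasn't changed since the last run, so scheduling this every few minutes is cheap.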
You could try accessing the admin share of the machine. Windows by default creates a share for every disk (named C$, D$, etc.). But in that case the application you write must run with the credentials of a user who has rights to that share ((local) administrators have sufficient rights for that).
If that doesn't work, you need to create a share or install software on that machine to serve the files (like FTP). This is all because of security; it's a good thing you are not able to just read a file from any machine...
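A quick way to sanity-check the admin-share route before writing any code (machine name, account and path are placeholders):
rem map the admin share; the * prompts for the password
net use \\REMOTEPC\C$ * /user:DOMAIN\someadmin
rem read the file over the share
type \\REMOTEPC\C$\Logs\somefile.txt
If that works from a command prompt, the same UNC path will work from your application as long as it runs under (or impersonates) an account with the same rights.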
I have done this many times with the Remote File port (34):
http://en.wikipedia.org/wiki/List_of_TCP_and_UDP_port_numbers