Problem with local uploads to ASP.NET website via CMS

I have a website built with ASP.NET and an admin web interface (mysite.com/cms) through which I manage the website content and normally upload local files from my computer. Since we migrated to another host, this feature stopped working. When I checked with the site developer, he directed me to check the FTP permissions with the host; note that I can upload files via FTP normally, but that does not give me all the CMS page features. I went back to the host and he reset all the user permissions, and I waited some time for the change to take effect, but still nothing changed! I'm lost here and don't know what else could be the root cause of this problem, since I have no experience in this field. It may be worth mentioning that, as a workaround, we set the CMS to accept attachments or files as hyperlinks instead (we upload the file via FTP and then insert the link (file path) in the CMS).

Related

Download publication files and copy them to the client's internal server with ASP.NET Core MVC in IIS

There is a site on the client's internal server. I want to add a page to it without publishing it manually on the client's computer; that is, provide an update button that downloads the publication files from the Internet and replaces the current publication on the client's internal server.
Like updating a program on the desktop. Is this possible or not?
The problem is that the previous publication is in use, and Windows does not allow deleting it to replace it with the new publication.
The goal is to reduce the cost of support and labor.
Well, unless you add code to their web server to do the download, then no, you can't just download file(s) to MY computer from the web and THEN place them anywhere on my computer or my computer network.
If a web site could mess around with your computer from just a browser? Then while you come to my web site to view some cute cat pictures, I could mess around on YOUR computer, grab a file called my-banking, grab your emails from Outlook, or grab a file called my-passwords.
In other words, if you could do as you ask, then no one would EVER trust and use the internet again!
So, YOU are NOT allowed to mess around with my computer. Hands off - leave my computer alone!
When you download a file from a browser, the USER has to choose where the file will be saved (usually my-downloads).
Now, you can suggest a default, but it depends on the browser, and in most cases all you can do is provide a file name, not a path + folder name. I mean, what would the path and folder names even be if I'm using your web site from an iPad or smartphone?
So, a web browser is "sand boxed" and VERY restricted. You can't, for example, select or set a file path used for uploading (since, as noted, if my web site could start messing around with files on YOUR computer while you view cute pictures, that ability would spell the end of security on the internet).
So, you can certainly "offer" files for download in a web browser, but the user will be in control of the location of that file in nearly all cases.
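In practice, the only lever the server has is the suggested file name, via the Content-Disposition header. A minimal ASP.NET Core sketch (the controller, route, and report.pdf file are hypothetical, just to illustrate the point):

    using Microsoft.AspNetCore.Mvc;

    public class DownloadController : Controller
    {
        // The server can only *suggest* a file name through the
        // Content-Disposition header; the browser and the user decide
        // where (and whether) the file is actually saved on disk.
        [HttpGet("download/report")]
        public IActionResult Report()
        {
            var bytes = System.IO.File.ReadAllBytes("report.pdf"); // hypothetical file
            return File(bytes, "application/pdf", fileDownloadName: "report.pdf");
        }
    }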
Now, you could have a process or program that you set up and run on their web server. It could, say, check your web site every hour for files to download, download them, and then place them in the correct folders for that working web site.
But any old user just hitting the web site? You don't have control over the file location, nor the ability to place code and content on my computer. (Hey, why not come to my site to view cat pictures, and while you're doing that, I'll install remote desktop or any software that would allow me to control YOUR computer!)
If you come to my web site, do you REALLY want my site to mess around with the file system on your computer, and place all kinds of content, files, and programs on YOUR computer just because you decided to visit my web site?
I don't think so!
Now, it is possible you want this kind of ability and don't care about security, but the rest of the world and the people who use the internet would not agree to granting you that ability!
Now, you could certainly provide them with a desktop program that they install on their workstations.
Once that program is installed, it could certainly pull content from your web site, and then place that content on the web server running on that same internal network.
So, you have to create + install + maintain a program you develop for the people at that company. Such a program could download the content from YOUR web site and place it on their internal web server in the correct folders.
I suppose you could also have them download some content along with a small program each time (maybe in a zip file), and after they download it, they would have to run that file/program.
However, the ability to JUST download files from a browser and place them into arbitrary locations on the computers at that company is simply not possible; if web browsers had this ability, no one would ever use and trust the internet again.
And if you're going to all that trouble to build some program they could download and run? Why not place that software on their web server, have it run every hour to check for content on your web site, and download it automatically?
That way, no user has to go to the web site to download content; the software on the web server checks for and downloads such content every hour on its own.
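A minimal sketch of that last idea, assuming a hypothetical package URL (https://example.com/publish/site.zip) and site folder; it uses the app_offline.htm trick so Windows releases the locked DLLs of the running publication before they are overwritten:

    using System;
    using System.IO;
    using System.IO.Compression;
    using System.Net.Http;
    using System.Threading.Tasks;

    // Runs on the client's internal server (e.g. as a scheduled task or
    // Windows service) and pulls the new publication every hour.
    class SitePuller
    {
        // Hypothetical URL and path -- adjust for the real deployment.
        const string PackageUrl = "https://example.com/publish/site.zip";
        const string SiteRoot   = @"C:\inetpub\wwwroot\mysite";

        static async Task Main()
        {
            using var http = new HttpClient();
            while (true)
            {
                var zipPath = Path.GetTempFileName();
                File.WriteAllBytes(zipPath, await http.GetByteArrayAsync(PackageUrl));

                // Taking the app offline makes ASP.NET unload it, which
                // releases the DLL locks that block overwriting the files.
                var offline = Path.Combine(SiteRoot, "app_offline.htm");
                File.WriteAllText(offline, "<html><body>Updating...</body></html>");
                await Task.Delay(TimeSpan.FromSeconds(10)); // give IIS time to unload
                try
                {
                    ZipFile.ExtractToDirectory(zipPath, SiteRoot, overwriteFiles: true);
                }
                finally
                {
                    File.Delete(offline);  // bring the site back online
                    File.Delete(zipPath);
                }
                await Task.Delay(TimeSpan.FromHours(1));
            }
        }
    }

The zip would contain the published output; the same loop works in the desktop-program variant, only the destination path differs.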

How can I change permissions for an Azure Web App so I can actually upload files via FTP?

What do I need to change in order to have permission to upload via FTP?
I have a running site on ASP.NET Core. Everything works.
For debugging purposes on a weird JS issue, I need to be able to edit a few JS files directly on the site via FTP.
When I connect via FTP (with the credentials from the "publish profile") I can connect just fine and download files - I use FileZilla.
But if I try to upload anything, I get "550 Access is denied."
I have full access to the Azure Portal etc. for the site, incl. Kudu.
It does not matter that files can be uploaded via Kudu or some other mechanism - I specifically need FTP.
Thanks for asking the question! Could you please check whether your firewall is blocking outgoing FTP writes?
Also, make sure you're not trying to write to a read-only file. To check, use the Kudu Console (https://[sitename].scm.azurewebsites.net/DebugConsole) to look at your files and inspect their attributes (e.g. with the 'attrib' command; 'attrib -r file.js' clears the read-only flag).
For more information about Kudu and FTP deployment, this document might be helpful: https://learn.microsoft.com/en-us/azure/app-service/deploy-ftp#get-ftp-connection-information
Also try checking the FTP link provided by the Deployment Center: go to your web app, click Get publish profile, and use the publish URL from the FTP entry in the downloaded file (the tag that starts with <publishProfile profileName="your-webapp" ...).
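For orientation, the FTP entry in a downloaded .PublishSettings file looks roughly like this (all values below are placeholders, not real credentials):

    <publishData>
      <publishProfile profileName="your-webapp - FTP"
                      publishMethod="FTP"
                      publishUrl="ftp://waws-prod-xxx.ftp.azurewebsites.windows.net/site/wwwroot"
                      userName="your-webapp\$your-webapp"
                      userPWD="..."
                      destinationAppUrl="https://your-webapp.azurewebsites.net" />
    </publishData>

The publishUrl host is what you point FileZilla at, and userName/userPWD are the credentials that should allow writes.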

How to improve my pushover process

Currently, in order to push my website live, I upload files to the server via FTP using FileZilla. If a user reloads the site while I'm pushing over the website DLL, they'll get a "File is being used by another process" type of error.
Are there any better pushover techniques I can use to get around this issue, or any techniques that are generally better than using an FTP client to upload my site?
You can always upload an app_offline.htm file while you are deploying the new site. If you do this in an ASP.NET application, the user will be directed to the app_offline.htm file no matter what page they try to load/reload. When you're ready for them to access the site again, you simply remove/delete/rename the file (I usually just rename it, so putting the site back into offline mode later is a simple rename).
http://weblogs.asp.net/dotnetstories/archive/2011/09/24/take-an-asp-net-application-offline.aspx
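A hedged sketch of automating that dance over plain FTP with FtpWebRequest (the server, credentials, and placeholder content below are illustrative, not a real deployment):

    using System.IO;
    using System.Net;

    // Wraps a deployment in the app_offline.htm on/off dance.
    // The FTP root and credentials are placeholders.
    class Deploy
    {
        const string FtpRoot = "ftp://ftp.example.com/site/wwwroot/";
        static readonly NetworkCredential Creds = new("user", "password");

        static void Main()
        {
            Upload("app_offline.htm", "<html><body>Back soon...</body></html>");

            // ... upload the new DLLs and content here; ASP.NET has unloaded
            // the app, so the old DLLs are no longer locked by IIS ...

            Delete("app_offline.htm"); // bring the site back online
        }

        static void Upload(string name, string content)
        {
            var req = (FtpWebRequest)WebRequest.Create(FtpRoot + name);
            req.Method = WebRequestMethods.Ftp.UploadFile;
            req.Credentials = Creds;
            using (var writer = new StreamWriter(req.GetRequestStream()))
                writer.Write(content);
            req.GetResponse().Close(); // completes the upload
        }

        static void Delete(string name)
        {
            var req = (FtpWebRequest)WebRequest.Create(FtpRoot + name);
            req.Method = WebRequestMethods.Ftp.DeleteFile;
            req.Credentials = Creds;
            req.GetResponse().Close(); // executes the command
        }
    }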

Making changes to an ASP.NET site without reuploading the whole thing

I'm building a site, and for the moment, when I want to put it on the web server, I go to Build > Publish Web Site to a local directory, and then over FTP I delete the whole existing content and upload the fresh content from my local directory. In the Publish Web Site popup, I see that there's an option for "Allow this precompiled site to be updatable".
If I make changes to some files in my App_Code directory, how do I update the server WITHOUT essentially shutting it down?
Thanks.
Unless you're using a precompiled site or a web application, all "regular" websites are updateable -- App_Code as well as content.
Precompiled sites can be made to be updateable, but I believe only for web pages, not for code files.
If you have a busy site with lots of updates it's possible that updates can break things until they're complete. You can work around that by creating a file called app_offline.htm at the top level of your site. That will effectively take your site offline as long as it's there. The contents of the file are sent to users instead of your active content, such as *.aspx files. When the update is complete, then remove the file.
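For illustration, a minimal app_offline.htm could be as simple as this (any static content works; while the file sits at the site root, ASP.NET returns it for every request instead of running the application):

    <html>
      <body>
        <h1>Site under maintenance</h1>
        <p>We are applying an update and will be back shortly.</p>
      </body>
    </html>

Delete (or rename) the file and the site comes back with the updated code.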
Try taking a look here: http://msdn.microsoft.com/en-us/sync/bb840038
All you need is synchronization between Visual Studio and the server.

Non-ASP.NET resources authentication and authorization in IIS 6

I'm creating a website which, among other tasks, will play some recorded files. These recorded files are on a remote server with a private IP address, so I've created a virtual directory which points to a shared directory on that server.
Now I'm able to play back the files using client-side controls like WMPlayer. BUT the problem is that the sound file URLs are accessible without any authentication or authorization.
Is there any way to enforce .NET authorization and authentication (in web.config) on this virtual directory? I should also mention that I cannot use solutions like HTTP handlers to download the files, because the files are streamed by IIS, so the user can seek within the file without downloading all of it.
Thanks
Open IIS (I suppose you use IIS 7.0 or later). Find the mentioned virtual directory and click on it. In the listed features find Authentication, right-click on it and press Open Feature. Then disable Anonymous Authentication for this folder. Does the problem persist?
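If the site does run under IIS 7+ in integrated pipeline mode, a web.config placed in the virtual directory can also push ASP.NET authorization onto the static sound files. A sketch, with the caveat that runAllManagedModulesForAllRequests is what makes the managed authorization modules run for non-ASP.NET resources:

    <?xml version="1.0"?>
    <configuration>
      <system.web>
        <authorization>
          <!-- "?" denies all unauthenticated (anonymous) users -->
          <deny users="?" />
        </authorization>
      </system.web>
      <system.webServer>
        <!-- run managed modules (incl. UrlAuthorization) for static files too -->
        <modules runAllManagedModulesForAllRequests="true" />
      </system.webServer>
    </configuration>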
