Where does WebMatrix save the site configuration?

I have a website saved in a Dropbox folder, and I successfully worked on it for days without any publishing problems. When I opened that site today in WebMatrix on another computer, I had to configure the publishing settings again, of course. I did that and tried to publish the site with only one file modified, but I was surprised to see all files marked for upload in the publish dialog.
One thing came to my mind: copy the site configuration from the first computer to the second, so that the second computer has information about the already uploaded files and continues to publish just the modified ones. But I don't know whether the site configuration is stored in a file, in the registry, or somewhere else...
So, before I start digging, I decided to ask the wise ones here :)

I found it in the following location:
C:\Users\Username\Documents\IISExpress\config\PublishUI

When you try to create a new publish setting, there is a link where you can say you want to import existing settings. It looks for files with the extension .PublishSettings or .xml, so I would start by searching your PC for files with those extensions. I would imagine the .xml file has your site's name in its filename, so that is worth a shot as well :)
This should be enough. If not, I look forward to hearing what others say, or what you dig up yourself.
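
For reference, a .PublishSettings file is plain XML, so it is easy to spot and to carry over to the other machine. It typically looks something like this (the values below are placeholders, and the exact attribute set can vary):

<publishData>
  <publishProfile
    profileName="MySite - Web Deploy"
    publishMethod="MSDeploy"
    publishUrl="deploy.example.com:8172"
    msdeploySite="MySite"
    userName="deployUser"
    userPWD=""
    destinationAppUrl="http://www.example.com" />
</publishData>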

Related

Rooted Path and FileUpload Control

I know it's been asked, and I have read the posts and Googled this all day. Still nowhere near something that works. Using an .aspx page, I need to upload a .pdf file to a specific website. I'm doing development using VS2017 and VB.Net. The app will run on different websites. It needs to upload client files to a specific, different website and path. Also, the file name of the uploaded file will not be the same as that of the local source file. Creating the new name is no problem.
Let's say a local file must be uploaded to a website at https://www.appfileserver.co.za/pdfdocs, but I'm on https://www.myownsite.com. So, when using FileUpload1.SaveAs(rootedpath), the path that goes in there must be the rooted path to the target. What would the rooted path look like for the example I provided?
FYI, I know the IP addresses, HTTP paths and anything else I need to know, because I control those sites. It would be great to do an FTP upload; I have done this many times from desktop apps. Unfortunately, I'd need the full path to the local file, and it seems there is no way a web page is allowed to get that full path, so FTP upload is out. Or is there a way?
After battling for two days trying to FTP upload from website to website (which is not possible, because server firewalls block this), I finally solved it. The solution was a simple one: I deployed the upload .aspx file on the target server, then embedded that page in an iframe in the apps on the client sites. The files are then uploaded once, straight to the right place. Simple and 100% effective. Hopefully somebody sees this and understands it, so as to avoid the troubles I had.
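
For anyone attempting the same thing, here is a minimal sketch of that arrangement, assuming an Upload.aspx page deployed on the target server (the page name, folder and control names are illustrative):

<%@ Page Language="C#" %>
<script runat="server">
    // This runs on the target server, so SaveAs gets a rooted path
    // that is local to that server.
    protected void UploadButton_Click(object sender, EventArgs e)
    {
        if (PdfUpload.HasFile &&
            System.IO.Path.GetExtension(PdfUpload.FileName)
                .Equals(".pdf", StringComparison.OrdinalIgnoreCase))
        {
            // Generate the new server-side name; the local file name
            // from the client is discarded.
            string newName = Guid.NewGuid().ToString("N") + ".pdf";
            PdfUpload.SaveAs(Server.MapPath("~/pdfdocs/" + newName));
        }
    }
</script>
<html>
<body>
    <form runat="server">
        <asp:FileUpload ID="PdfUpload" runat="server" />
        <asp:Button ID="UploadButton" runat="server" Text="Upload"
            OnClick="UploadButton_Click" />
    </form>
</body>
</html>

The client site then only needs to embed the page:

<iframe src="https://www.appfileserver.co.za/Upload.aspx"
        width="400" height="120"></iframe>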

Drupal Site configuration issues

Good day everyone,
I am new to Drupal.
I am having a problem with a Drupal site.
I got the repository for the Drupal site. I have successfully cloned it and got everything, along with the database. I have imported the database on the local host server.
Now, I can see that initially the directory is like this:
C:\wamp\www\test\site\docroot\sites\default\
Then, when I first open the site through the local host, the directory automatically becomes:
C:\wamp\www\test\site\docroot\sites\default\file
The "file" directory contains empty folders for css, images, etc.,
which I believe are downloaded from the database the first time.
The site is giving many console errors, like missing images, etc.
Instead of empty folders, the "file" directory should contain the images, the CSS files and everything else. I do not know what is wrong, because the folders should not be empty; the files, images and CSS files should be downloaded from the database when I first open the site.
Please help me to locate the problem.
Thank you very much.
Usually, you will put Drupal core, module and theme files on git, basically everything except the files uploaded by the user (admin). Those files are usually located at:
/sites/default/files
So, since they are not in the git repo, you need to copy them to your local environment from the working site (e.g. over (S)FTP).
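
For example, something along these lines from a machine with rsync available (the host and remote path are placeholders for your working site):

rsync -avz user@live-server:/var/www/docroot/sites/default/files/ docroot/sites/default/files/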
If your "file" is not "files" dir I'm talking about then it's something specific to your site - don't know nothing about it.

Unreal file in cPanel

I see an unknown tar file in the cPanel file manager. The thing that is bothering me is whether or not I should delete this file.
Recently I found some malicious files on my WordPress website, and the hosting provider sent me a message to look over the whole website. Can anyone tell me whether this unreal file was responsible or not?
(screenshot: the unknown tar.gz file in File Manager)
This is an unusual filename (with tar.gz). This isn't a WordPress file either. I think it is better to delete this file.
The file seems very odd and is not part of any installation.
It has also been uploaded to your home directory, which no application does.
If you have not uploaded the file yourself, you should indeed delete it.
Afterwards, it is a good idea to go through your access log files to see whether any odd-looking POST requests have been made to your website.
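Assuming the usual cPanel layout and Apache's combined log format, a quick first pass over the logs could look like this (the domain name is a placeholder):

grep '"POST ' ~/access-logs/example.com | awk '{print $1, $7}' | sort | uniq -c | sort -rn | head

This lists the most frequent source IP and URL pairs among POST requests, which makes requests to files that shouldn't accept POSTs stand out.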
You can also look through the file called .lastlogin in your cPanel home directory; it contains all IP addresses that have accessed your cPanel account.

Backing up ASP.NET website code files to a backup folder under the website folder

I want to back up my existing ASP.NET web app before updating it.
Therefore I create a backup folder inside the website (i.e. at the same level as App_Code and web.config), called something like Backup_20110910.
Then I move all the current website files and folders (excluding web.config and app_data) into the backup folder.
Then I extract the zip of the latest code into the now-clean folder.
Are there any potential problems with this approach? After all, you are increasing the number of C# files in your website folder; could there be conflicts, etc.?
I wouldn't back up within the folder structure; there's a possibility that someone then finds your backup folders and browses to them, running the older code. If you zip it, then you suddenly have files someone can download, too. Even more amusingly, if, as a lot of people do, you rename the old web.config to web.config.bak when you change it, a lot of security scanners look for that, because now it can be downloaded: it's no longer a .config file, but a .bak.
Back up outside the web root, not within it, and all of those worries will go away.
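
If a backup copy really does have to live under the web root, then on IIS 7 or later you can at least tell request filtering to refuse to serve it. A sketch for web.config, assuming the folder name from the question:

<system.webServer>
  <security>
    <requestFiltering>
      <hiddenSegments>
        <add segment="Backup_20110910" />
      </hiddenSegments>
    </requestFiltering>
  </security>
</system.webServer>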
There won't be an issue, except that it might become confusing to have identical folder structures within the current folder structure. It's always wisest to keep backups completely separate from the current build.

Download existing file from IIS results in "File does not exist" (404)

I have a fully working website that I ported to a new hosting company.
On some pages I have links to PDFs on the server (they do exist!).
On the old server: no problem.
On the new one, when the user clicks on the link: error 404, file does not exist...
Should I look in the web.config? I don't know where to start.
Thanks
John
Start with the file read permissions.
You need to read the log files, or the Event Viewer, to see what the problem really is.
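On the server itself, assuming you have console or RDP access, a quick permissions check could look like this (the folder path is a placeholder; on Server 2003 and earlier the tool is cacls rather than icacls):

icacls "C:\inetpub\wwwroot\pdfs"

This shows whether the identity IIS runs under (IUSR or the application pool identity) can read the folder. The IIS logs, typically under C:\inetpub\logs\LogFiles on newer versions or C:\Windows\System32\LogFiles on older ones, also record a substatus code with each 404, which narrows the cause down considerably.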
This is probably as simple as the files not being in the same location relative to the page as they used to be; e.g. there was a folder /pdfs in the root of the web where all the files lived, now they are just in the root folder, and the links were not updated.
You've not said which version of IIS you're using. However, for IIS5 this has been answered over at Server Fault; see https://serverfault.com/questions/79094/serve-pdf-fies-in-iis
It should be similar for IIS6. It's possible your hosting provider has removed the MIME type mapping, so IIS no longer recognises the file extension.
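If the MIME type mapping does turn out to be missing and the host allows configuration overrides, on IIS 7 or later you can often restore it yourself in web.config. A sketch (remove it if .pdf is already defined at server level, since duplicate mappings cause an error):

<system.webServer>
  <staticContent>
    <mimeMap fileExtension=".pdf" mimeType="application/pdf" />
  </staticContent>
</system.webServer>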
What you may end up needing to do, if your hosting company isn't forthcoming, is write a "file provider" page that takes the file to download on the query string (obviously with some sanity checking, so folk can't request any old file) and then just writes it out, bypassing what IIS would do normally.
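
A minimal sketch of such a page, written here as a generic handler; the handler name, the "file" parameter and the ~/pdfs folder are all illustrative:

<%@ WebHandler Language="C#" Class="PdfProvider" %>

using System;
using System.IO;
using System.Web;

public class PdfProvider : IHttpHandler
{
    public void ProcessRequest(HttpContext context)
    {
        // Sanity checks: strip any directory part, allow only .pdf,
        // and serve from one known folder.
        string name = Path.GetFileName(context.Request.QueryString["file"] ?? "");
        if (!name.EndsWith(".pdf", StringComparison.OrdinalIgnoreCase))
        {
            context.Response.StatusCode = 400;
            return;
        }

        string path = context.Server.MapPath("~/pdfs/" + name);
        if (!File.Exists(path))
        {
            context.Response.StatusCode = 404;
            return;
        }

        context.Response.ContentType = "application/pdf";
        context.Response.TransmitFile(path);
    }

    public bool IsReusable { get { return true; } }
}

A link then becomes something like /PdfProvider.ashx?file=manual.pdf instead of a direct file URL.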
