How can I load local images from a website?

I'm making a gallery site for a client, to be used internally. It's for browsing hi-res images; there's literally over 1 GB of images, each 3-4 MB, so loading the images over the web isn't an option due to load time.
My idea was to store the images on each machine locally, but maintain a central database online so all machines are in sync, and load the images using "file:///C:/images/file.jpg". But apparently browsers don't allow a website to load files from the local computer (for obvious security reasons).
How can I get around this?
Do I have to create a browser plugin myself to get access to the file system?
Alternatively, is there a better way to achieve my goal of (a) a centralized database of images and data, but (b) images stored locally?
Thanks for any advice you can offer.

You can store your images in your centralized database, but it would also be worth storing smaller, resized versions, so that a user who is interested in one can click it, or hover over it, and have it load the larger version. 3-4 MB isn't that insane for most computers to load, so long as the page isn't trying to load them all at once.
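If the back end happens to be .NET (an assumption on my part, since the question doesn't say), a minimal sketch of generating those smaller versions at upload time could look like the following, using System.Drawing; the 200 px maximum width is an arbitrary choice:

    using System;
    using System.Drawing;
    using System.Drawing.Drawing2D;
    using System.Drawing.Imaging;
    using System.IO;

    static class ThumbnailHelper
    {
        // Produces a JPEG thumbnail (as a byte array) that could be stored in the
        // central database alongside the full-size image's metadata.
        public static byte[] CreateThumbnail(string sourcePath, int maxWidth = 200)
        {
            using (var source = Image.FromFile(sourcePath))
            {
                int width = Math.Min(maxWidth, source.Width);
                int height = (int)(source.Height * (width / (double)source.Width));

                using (var thumb = new Bitmap(width, height))
                using (var g = Graphics.FromImage(thumb))
                using (var ms = new MemoryStream())
                {
                    g.InterpolationMode = InterpolationMode.HighQualityBicubic;
                    g.DrawImage(source, 0, 0, width, height);
                    thumb.Save(ms, ImageFormat.Jpeg);
                    return ms.ToArray();
                }
            }
        }
    }

The resulting byte array can live in the central database next to the image's metadata, while the full-size files stay wherever you decide to keep them.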
To get access to the file system, you can use the web host's file-access links, or you can use an FTP client, given that you know the FTP username and password.

Related

How to block anyone from downloading a web page from the browser using Ctrl+S or the browser's download option?

I am trying to prevent the user from downloading the page as an .html or .aspx file from the browser.
Or is there a way to change the content of the file if it's downloaded?
This is a complex area, with lots of moving parts. The short answer is "there is no way to do this with 100% success; there are a few things you can do which make it harder".
Firstly, you can include JavaScript to disable the right-click context menu. This doesn't stop Ctrl+S, but might discourage casual attempts.
Secondly, you can use DRM in the browser (though this is primarily aimed at protecting media content). As browser support is all over the show, this isn't realistic right now.
Thirdly, you could write your site as a single page web application, and build some degree of authentication into the "retrieve content" logic. This way, saving the page to disk wouldn't bring the content along, just the "page furniture". However, any mechanism you include to only download content when you think you should is likely to be easily subverted by anyone who is moderately motivated.
Also, any steps you take to stop people persisting your pages locally are likely to break the caching mechanisms on which the internet depends for performance, so your site would likely be dramatically slower.
No, you can't stop them.
Consider how the web actually works here: once the user has visited your website and loaded your page into their browser, they have already downloaded it - the web page was transmitted from your server to their computer and appeared on their screen.
All they have to do then is click the Save button to keep it permanently on their disk. That doesn't involve downloading it again; it just copies the page data from a temporary folder to a permanent one. Of course, it's also possible for people to use another HTTP client (i.e. not a browser, but maybe an existing program, or some code they wrote themselves) to visit the URL of your page and save the returned contents.
It's not clear what problem you think you would solve by stopping people from saving pages. Saving the page is something done within the browser - you as a site developer don't control the user's browser, so you can't prevent that. And if you stop them from downloading your page in the first place then - by definition - you also stop them from using your website...which kind of defeats the point of having one :-).
If you've got some sort of worry about security, you'll have to clarify exactly what you are concerned about, and maybe you can get advice about a sensible way to deal with it.

Publishing Umbraco pages in a development environment differs across clients

We're working on an Umbraco site - multiple development machines using a shared development database.
When one developer makes changes to content in the CMS and does a Save and Publish, the change is reflected on his machine but not on the other development machines.
This doesn't seem to make sense, as we're all looking at the same database. We've tried doing an IIS reset to see if caching is at work, but this doesn't seem to make a difference either.
Any ideas what on earth could be going on?
Umbraco does a lot of caching, so it doesn't have to hit the database all the time. Normally, all of the published content is cached in an xml file at App_Data\umbraco.config. You just need to have your developers right click on the root of the content tree in the umbraco backoffice and click "Republish the entire site" to regenerate that xml cache on disk from the xml cache in the database.
You also might need to reindex your Examine indexes. You can normally find the "Examine Management" dashboard in the developer section of the umbraco backoffice. By default, there are three indexes: InternalMember, Internal, and External. Unless you have membership going on in your umbraco site, you can ignore the InternalMember index. The External index is used mostly for site searches. The Internal index is much more critical: it is used to cache media. I believe it is also used in the backoffice, but I'm not 100% certain. Make sure that the Internal index is regenerated.
Remember that media files are stored in the /media directory by default. That means if developer 'A' uploads a file, the physical file won't show up on developer 'B's machine automatically.
I'll bet there are some cool ways to set up load balancing to handle caching for your dev setup. I'm pretty sure there are also ways to store the media in the database, so you don't have to worry about transferring the files back and forth.

Will the use of multiple subdomains speed up my website?

I am considering moving my images to a subdomain on my website, and I read somewhere that moving the scripts to a different one would make it even faster! Is it really true? Or should I just leave it as it is if I am not considering a real CDN?
Yes and no. The site itself won't be faster, but it may load faster in most browsers and thereby seem faster.
The reason is that most browsers limit themselves to a set maximum of concurrent connections per domain. Say you have your site on www.mysite.com. When your browser tries to download your HTML, CSS, scripts and images, it may need to download 20-30 files from the server. Since the browser limits itself to, say, 4 concurrent connections to your domain, it can only download 4 files at a time.
Now if you serve your CSS files from a separate subdomain css.mysite.com, your images from images.mysite.com and your scripts from scripts.mysite.com, your browser can open 4 concurrent connections to each of those domains. Hence it can download up to 16 files at the same time. If your bandwidth allows it, this may cause the page to load faster.
So your site may appear faster to the visitor, but the reason will be improved loading times, not any speedup of code or database access.

How would you allow users to edit attachments in a web application?

We have created a web application, using ASP.NET, that allows users to upload documents and attach them to business entities, like customers, contacts and so on.
The application runs on the intranet and all files are uploaded through the web application into a shared folder on the server.
I would like, right from the web page, for the user to open the actual file, edit it and then save the changes back to the original location. This is a piece of cake in a Windows environment, I'm just wondering what, if any, is the best way to handle this in a web environment?
The files are usually Word documents, Excel documents and images.
Clarification
We would display all the attachments in a list format. We would like the user to be able to click an edit link and have the file open in the appropriate application, for example Microsoft Word or Microsoft Excel. I think the file associations in Windows would already handle this. We are just trying to save our users the time of downloading the original file, making their changes, deleting the old file, and then uploading the new file.
SharePoint does this by exposing FrontPage extensions which Word and Excel know how to deal with.
If you want to look at a commercial product for ASP.NET that allows you to edit images with AJAX (no need for installed software), the company I work for has one (Atalasoft).
WebDAV is probably what you want. (Free)
If all your client computers are Windows, map a shared folder on the server to the same drive letter on every client and use the file:// format.
Let's say you share \\ServerName\ShareName as H: on every client's computer; then you can make the link as file://h:\path_to_the_file_under_your_share\fileName.doc
If not all of the client computers run Windows, then you might try making your links as follows (not sure if it works):
file://\\ServerName\ShareName\path_to_the_file_under_your_share\fileName.doc
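If the attachment list is generated server-side in the ASP.NET page, a small helper along these lines (purely illustrative; the drive letter and share name are the placeholders from the examples above) could emit either form of link:

    static class AttachmentLinkBuilder
    {
        // Builds a file:// link for clients that have the share mapped to H:,
        // or the UNC form for clients that don't. "h:" and \\ServerName\ShareName
        // are placeholders for your actual mapped drive and share.
        public static string BuildLink(string relativePath, bool clientHasMappedDrive)
        {
            string cleaned = relativePath.TrimStart('\\', '/');

            return clientHasMappedDrive
                ? @"file://h:\" + cleaned
                : @"file://\\ServerName\ShareName\" + cleaned;
        }
    }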
I'm trying to do something using file:// instead of http://, but it's really sporadic depending on the browser. It seems to work fine in IE, okay in Firefox, and goes nowhere in Chrome.
Looks like I may just be stuck with downloading, editing, and re-uploading the document.
It sounds like you want something similar to eRoom, where the browser works in conjunction with a component that intercepts a stream from HTTP, stores it in a temp folder, then fires up Word or Excel and allows you to edit the stream.
You may have to create a component that will intervene and create a temporary local copy of the file.
This tool should do what you need.
http://www.dlitools.com/dlitools/dlitoolsHome.nsf/0FA6B8B31F831F468525736B0001C606/4BBD7E8684EA8DB78525754E006C63A3?OpenDocument

How to prevent DOS attacks using image resizing in an ASP.NET application?

I'm currently developing a site where users can upload images to use as avatars. I know this makes me sound a little paranoid, but I was wondering: what if a malicious user uploads an image with incredibly large dimensions that will eat the server's memory (as a DoS attack)? I already have a limit on the file size that can be uploaded (250 KB), but even that size can allow an image with incredibly large dimensions if, for example, the image is a JPEG that contains one color and was created with a very low quality setting. Taking into consideration that the image is held as an uncompressed bitmap in memory when being resized, I wonder if such DoS attacks occur; even to check the image dimensions, it has to be loaded into memory first. Have you heard about any attacks that exploited this? Am I too worried?
You should be able to get at the dimensions without loading the entire image into memory. Maybe you can find out more on the issue at wotsit.org.
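For example, a PNG stores its width and height in the first few bytes of the file, so a rough sketch like the following hypothetical helper (PNG only; JPEG and GIF have different header layouts) can reject oversized images before any pixel data is decoded:

    using System.IO;

    static class PngHeaderReader
    {
        // Reads the width/height of a PNG from its header without decoding pixel data.
        public static bool TryGetDimensions(Stream stream, out int width, out int height)
        {
            width = height = 0;
            var buffer = new byte[24];  // 8-byte signature + IHDR length/type + width + height
            if (stream.Read(buffer, 0, buffer.Length) != buffer.Length)
                return false;

            // PNG files always start with this 8-byte signature.
            byte[] signature = { 137, 80, 78, 71, 13, 10, 26, 10 };
            for (int i = 0; i < signature.Length; i++)
                if (buffer[i] != signature[i])
                    return false;

            // Width and height are big-endian 32-bit integers at offsets 16 and 20.
            width = (buffer[16] << 24) | (buffer[17] << 16) | (buffer[18] << 8) | buffer[19];
            height = (buffer[20] << 24) | (buffer[21] << 16) | (buffer[22] << 8) | buffer[23];
            return true;
        }
    }

If the reported dimensions exceed your limit, you can reject the upload without ever handing the file to an image library.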
You have to validate that image files really ARE image files. The issue isn't an attack on your server. The issue is someone uploading an ActiveX control instead of an image. This file then downloads and installs and ruins every Windows machine that does the download.
The threat is not to you. The threat is that you will become a carrier for a virus.
You must validate each file to confirm that it is a real image file. You can check dimensions and what-not if you want. Most image-processing libraries can read the headers off the image, check the dimensions and number of pixels and what-not.
Often, folks make thumbnails from images, you can do that, also, once you've opened the image.
DoS may or may not be an issue; it depends on whether someone decides to target your site.
However, for your site to scale to thousands of concurrent users, you might consider handling the image processing in a separate process.
When the image processing is handled by page code, you run the risk of exhausting memory, CPU, or ASP.NET threads; the bottleneck depends on your server configuration.
Possible solution:
User uploads image.
Image is saved to shared directory.
Image path is saved to a queue in database.
Page returns with message "thanks for uploading, your avatar will be ready soon".
A Windows Service* wakes up periodically and checks the database queue.
The service resizes any images waiting in the queue (a rough sketch of this loop follows these steps), saves the outputs to a shared directory, and removes them from the queue.
Service updates the database indicating that the user's avatar is ready. So, next time they visit their profile page, they are shown the resized image.
*Ideally, the Windows Service runs on a separate server from the web server, which could be scaled up to meet future demands.
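To make the service step a little more concrete, here is a minimal sketch of the processing loop, assuming a hypothetical IAvatarQueue abstraction over the database queue and a fixed 100x100 output size; a real service would add error handling and logging:

    using System.Drawing;
    using System.Drawing.Drawing2D;
    using System.Drawing.Imaging;
    using System.IO;

    // Hypothetical abstraction over the database queue described above.
    public interface IAvatarQueue
    {
        string DequeueNextImagePath();   // returns null when the queue is empty
        void MarkAvatarReady(string originalPath, string resizedPath);
    }

    public class AvatarResizeWorker
    {
        private readonly IAvatarQueue _queue;
        private readonly string _outputDirectory;

        public AvatarResizeWorker(IAvatarQueue queue, string outputDirectory)
        {
            _queue = queue;
            _outputDirectory = outputDirectory;
        }

        // Called periodically by the Windows Service timer.
        public void ProcessPending()
        {
            string path;
            while ((path = _queue.DequeueNextImagePath()) != null)
            {
                string resizedPath = Path.Combine(_outputDirectory,
                    Path.GetFileNameWithoutExtension(path) + "_100x100.jpg");

                using (var original = Image.FromFile(path))
                using (var avatar = new Bitmap(100, 100))
                using (var graphics = Graphics.FromImage(avatar))
                {
                    graphics.InterpolationMode = InterpolationMode.HighQualityBicubic;
                    graphics.DrawImage(original, 0, 0, 100, 100);
                    avatar.Save(resizedPath, ImageFormat.Jpeg);
                }

                _queue.MarkAvatarReady(path, resizedPath);
            }
        }
    }

The page code then only has to save the file and enqueue its path, so a burst of uploads can't tie up ASP.NET worker threads on resizing.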
Whether this effort is worth it depends on your expected traffic. You could use load testing tools to script and simulate these actions, to see if your site can handle the load.
I think you should simply check the image dimensions. With only a few formats to support, this isn't that hard, and you can then easily filter large images out. Sites where you can upload avatars usually tell you not only to keep the image under a specific file size but also give maximum image dimensions, so checking this is common practice.
