Hosting images on Dropbox - asp.net

I'm looking for somewhere to host images for a web service I'm working on. The service will need to access the images many times, and I'll upload about 4GB of images per day to show to users. My idea is to host the images there and put the public links in the HTML.
So I'd like to know whether Dropbox is an adequate tool for this, because I've been studying the Dropbox API and I don't think it offers a good way to get an image's public link.
To summarize my question: are these kinds of hosts suited to this kind of service or not?

As the other comments and answers say, a normal CDN is probably a better choice for this.
For reference though, the Dropbox API does let you get publicly shareable links to files:
https://www.dropbox.com/developers/core/docs#shares
You can also modify these links as desired:
https://www.dropbox.com/help/201
There's also a direct and temporary version:
https://www.dropbox.com/developers/core/docs#media
Note however that there are bandwidth limits on these links:
https://www.dropbox.com/help/4204
Also, the API enables you to access file content directly:
https://www.dropbox.com/developers/core/docs#files-GET
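For illustration only, here is a rough C# sketch of calling the /shares endpoint linked above with an OAuth 2 access token; the exact URL format and response fields should be checked against the docs, and the path and token are placeholders.

using System;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Threading.Tasks;

class DropboxShareSketch
{
    // Returns the raw JSON from /shares, which contains a "url" field with the public link.
    static async Task<string> GetShareLinkJsonAsync(string accessToken, string dropboxPath)
    {
        using (var client = new HttpClient())
        {
            client.DefaultRequestHeaders.Authorization =
                new AuthenticationHeaderValue("Bearer", accessToken);

            // e.g. dropboxPath = "/Photos/image1.jpg" (placeholder)
            var response = await client.PostAsync(
                "https://api.dropbox.com/1/shares/auto" + dropboxPath, null);
            response.EnsureSuccessStatusCode();
            return await response.Content.ReadAsStringAsync();
        }
    }
}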

Try Amazon Web Services S3: http://aws.amazon.com/s3/
Lots of big websites use it for serving images, and it is very fast.
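As a hedged sketch (the bucket, key, and file path below are placeholders, and this assumes the AWS SDK for .NET), uploading an image with a public-read ACL and building its URL might look like this:

using System;
using System.Threading.Tasks;
using Amazon.S3;
using Amazon.S3.Model;

class S3UploadSketch
{
    static async Task Main()
    {
        // Credentials and region are picked up from the environment or config.
        var client = new AmazonS3Client();

        var request = new PutObjectRequest
        {
            BucketName = "my-image-bucket",          // placeholder
            Key = "images/photo1.jpg",               // placeholder
            FilePath = @"C:\uploads\photo1.jpg",     // placeholder
            CannedACL = S3CannedACL.PublicRead       // make the object publicly readable
        };
        await client.PutObjectAsync(request);

        // Public objects follow the standard S3 URL pattern:
        Console.WriteLine("https://my-image-bucket.s3.amazonaws.com/images/photo1.jpg");
    }
}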

How can I host a website and web application on the same server using AWS?

Excuse my lack of server-architecture knowledge, but I'm a bit confused about what applications, servers, environments, etc. are and how they can communicate with each other. I just got AWS, and here is what I ultimately want to do.
I want to create a Google Chrome extension. For simplicity, let's say I'm trying to make an app that records the number of times all users with the extension collectively visit a given webpage, plus information about the visits, such as the time they visited and the duration. So if I go to Facebook.com and 100 other people with the extension have too, I would see an iframe, let's say, that says "100 users have been here and they visited at these times: ...". Of course, the extension also needs to communicate with the server to increase the count by one. The point is, there is no need to visit any webpage for this app to work, since it's an extension and the point isn't to go to a webpage, although it still returns HTML and JavaScript.
Now, I also want a homepage for the app in case people are interested in the extension for whatever reason. Just like Adblock, you don't need to go to their actual website, but it's good to have one.
My question is, how do I set this up? Do I just have a normal website, i.e. www.example.com/, set it up normally with WordPress (which I'd like to use), and then designate one address, e.g. www.example.com/app, to be answered by my Python app? If so, how do I do that? What do I need in AWS? I'm familiar with Flask and have written apps on my local server using it; can that be integrated with WordPress?
Sorry if this is confusing.
I also want a homepage for the app in case people are interested in the extension
The simplest option is to host the home page as a static website (HTML, CSS, JS) in an S3 bucket.
But if you really want WordPress, you can do that too.
For the backend web services your extension talks to, you can use Elastic Beanstalk; it is a very simple way to do that without having to wire up all the components yourself.
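The original poster mentions Flask, but just to illustrate the shape of such a backend (the controller name, routing, and in-memory store are hypothetical, not a definitive implementation), a minimal visit-count endpoint could look roughly like this ASP.NET Web API sketch:

using System.Collections.Concurrent;
using System.Web.Http;

public class VisitsController : ApiController
{
    // In-memory store for illustration only; a real service would use a database.
    private static readonly ConcurrentDictionary<string, int> Counts =
        new ConcurrentDictionary<string, int>();

    // Called by the extension each time a user visits a page.
    [HttpPost]
    public int RecordVisit(string url)
    {
        return Counts.AddOrUpdate(url, 1, (key, current) => current + 1);
    }

    // Called by the extension to display "N users have been here".
    [HttpGet]
    public int GetCount(string url)
    {
        int count;
        return Counts.TryGetValue(url, out count) ? count : 0;
    }
}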

Accessing images in a different project, but in the same solution

I made a website for a client who wanted to be able to upload images and then use those images to create some dynamic content on his site. It all works fine, but now I want to isolate that administration part (where he can add images and create his content) on a subdomain.
So at the moment I have two projects: one that images get uploaded to, and one that has to access those images (this is my problem).
I have read multiple topics related to this issue but have not found a solution; I can never get a path outside of my current project.
The only option I can think of right now that could work is to have some kind of API on the main website, and when an image gets uploaded to the administration site, send that file over to the main site. But that seems pretty overkill, knowing that my images will be on the same server.
Can this be done?
What is the cleanest/best way to achieve this?
Please note:
Saving images to the database is not an option; uploading files to the server and storing only the path is much faster.
My images get uploaded at run time, so I can't use anything that relies on resources/compile time.
Thanks!
UPDATE (SOLUTION)
Rather than saving only the name of the file in the database (for example "image1.png") and then trying to retrieve the path in the other project, I ended up saving the absolute URL in the database so that I could use that URL directly.
public static string ResolveServerUrl(string serverUrl, bool forceHttps)
{
    if (serverUrl.IndexOf("://") > -1)
        return serverUrl;

    string newUrl = serverUrl;
    Uri originalUri = System.Web.HttpContext.Current.Request.Url;
    newUrl = (forceHttps ? "https" : originalUri.Scheme) +
        "://" + originalUri.Authority + newUrl;
    return newUrl;
}
This will give you a URL that looks like http://yourdomain/path/to/image.jpg, so you can save it directly in the database and use it as is in the other project.
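For example (the path and domain here are illustrative), at upload time the admin project can store the result of the helper directly:

// After saving the uploaded file under /Uploads on the admin site:
string absoluteUrl = ResolveServerUrl("/Uploads/image1.png", false);
// absoluteUrl -> "http://admin.yourdomain.com/Uploads/image1.png" (illustrative)
// Save absoluteUrl in the database; the main project can use it as-is in <img src="...">.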
The only option I can think of right now that could work is to have some kind of API on the main website, and when an image gets uploaded to the administration site, send that file over to the main site
I think you just about answered your own question. That is indeed the way to go, or I should say you're headed in the right direction towards an enterprise SOA architecture. You are still far from it, but this is a good start, where you begin to realize that your system is growing and demanding a more robust architecture.
but that seems pretty overkill knowing that my images will be on the same server.
This is a false assumption, because if you design it well, you can easily scale out to a different server and platform without affecting your client app(s). Let's say that in the future the content is moved to its own server: you will only make the pertinent modifications to your "Content Service", while your client apps will not need to change at all; they are still pointing to the same endpoint and will never notice what is happening inside the "Content Service". What this means is that your client apps only care about getting content from the "Content Service" without knowing where the content is actually hosted, whether on a Windows server, a Linux server, a SQL database, an Oracle database, in the US or in China. It is not the responsibility of the client app(s) to care about how the content is handled; they only need to know how the content is served.
Hope that makes sense. I could provide you with some links explaining the benefits of such architectures.
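As a purely illustrative sketch of that idea (all names here are hypothetical), the client apps would depend only on a contract like this, never on where the content lives:

public interface IContentService
{
    // Client apps only know how content is served, not how or where it is stored.
    string GetImageUrl(string imageName);
}

// Today the content happens to sit on the same server...
public class LocalContentService : IContentService
{
    public string GetImageUrl(string imageName)
    {
        return "http://admin.yourdomain.com/Uploads/" + imageName;
    }
}

// ...tomorrow a BlobContentService or S3ContentService could replace it
// without any change to the client apps.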

Storing CSS files on Windows Azure

I'm working on my first Windows Azure application and I'm wondering how people go about managing CSS & JS files within their apps?
At the moment my CSS and JS are just part of my cloud app so every time I make a small CSS change the app needs to be redeployed which isn't ideal. Is it best practice to remove those components from the cloud app and deploy them elsewhere? If that is the case where is the best place to store them? Inside a cloud storage account using blobs or something else?
Bear in mind that if you put your assets in storage, each time there is a page request that includes a link to storage, it counts as a storage transaction. Currently, they are priced at $0.01 per 10,000, so it would take a while to be costly. But if you have 2 CSS files, 2 JS files and 4 images on a given page, that's 8 transactions per page request.
If you get 1,000 page requests per day, that's 8 transactions × 1,000 × 30 days = 240,000 transactions per month, and 240,000 / 10,000 × $0.01 = $0.24. Not a big deal if your page requests stay low, but if your site gets even remotely higher traffic, it can start to add up quickly.
Yeah, throw your assets into a public container in storage and build absolute URLs to the storage account container from the web app (use a helper method; see the sketch below). This way you can upload individual assets as they change.
Next step would be to expose the container over the CDN to get the distributed edge caching too.
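A minimal sketch of that helper-method idea (the storage account and container names are placeholders) could be as simple as:

public static class AssetHelper
{
    // Public container serving static assets; swap in the CDN endpoint later
    // without touching the views.
    private const string BaseUrl = "https://mystorageaccount.blob.core.windows.net/assets/";

    public static string Asset(string fileName)
    {
        return BaseUrl + fileName;
    }
}

// In a view: <link rel="stylesheet" href="<%= AssetHelper.Asset("site.css") %>" />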
We store our JS and CSS in blobs with the Azure CDN and it works great.
A completely different 'solution' might be to check out:
http://blogs.msdn.com/b/windowsazure/archive/2011/07/12/now-available-windows-azure-accelerator-for-web-roles.aspx
I personally haven't used it yet, but it's supposed to let you alter/update your web role projects without needing to redeploy the entire thing.
I am not sure this will work as easily as you might expect for CSS files if they are referenced from a different domain.
CSS files hosted on a different domain might be blocked by the browser; see Cross-Origin Resource Sharing: http://www.w3.org/TR/cors/. However, I am not sure how widely this is implemented.
An alternative might be to use a handler that forwards requests for the CSS files on your server to the blob.
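A rough sketch of that handler idea, assuming classic ASP.NET (the storage URL is a placeholder): requests for CSS on your own domain are fetched from the blob and streamed back, so the browser never sees the other domain.

using System.Net;
using System.Web;

public class CssProxyHandler : IHttpHandler
{
    public bool IsReusable { get { return true; } }

    public void ProcessRequest(HttpContext context)
    {
        // Map e.g. /css/site.css on this domain to the corresponding blob.
        string fileName = VirtualPathUtility.GetFileName(context.Request.Path);
        string blobUrl = "https://mystorageaccount.blob.core.windows.net/css/" + fileName;

        using (var client = new WebClient())
        {
            byte[] data = client.DownloadData(blobUrl);
            context.Response.ContentType = "text/css";
            context.Response.BinaryWrite(data);
        }
    }
}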

How to index a web site

I'm asking on behalf of somebody, so I don't have too many details.
What options are available for indexing site content in an ASP.NET web site? I suspect SQL Server's Full Text index may be used if the page content is stored in the database. How would I index dynamic and static content if that content isn't stored in the DB, but in html and aspx pages themselves?
We purchased Karamasoft Ultimate Search several years ago. It is a search-engine add-on for your web site. I like it because it is a simple tool that gave us search on our site. It is pretty inexpensive, and we knew we could buy something else later if we needed more or different features. We needed something that would give us search without having to do a lot of programming.
Specifically, this tool is a web crawler. It runs on your web server and acts like an end user, navigating through your site and keeping a record of your web pages, so when a real user searches, they are shown the pages that have the content they want.
Keep in mind that because it acts like an end user, your dynamic data is indexed right along with the static content, since it indexes the final rendered page. We needed this feature and it is what appealed to us the most.
You can use a web crawler to crawl the site and add the content to a database, which is then full-text indexed. There are a number of web crawlers out there.
Lucene is a well-known open-source tool that would help you here. The main project is Java-based, but there is a .NET port too.
Main site: http://lucene.apache.org/
.Net port: http://incubator.apache.org/lucene.net/
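As a hedged sketch against the classic Lucene.Net 3.x API (newer versions differ in detail), indexing the text extracted from a crawled page looks roughly like this; the index path, URL, and content strings are placeholders:

using Lucene.Net.Analysis.Standard;
using Lucene.Net.Documents;
using Lucene.Net.Index;
using Lucene.Net.Store;

class IndexingSketch
{
    static void Main()
    {
        var directory = FSDirectory.Open(@"C:\search-index");
        var analyzer = new StandardAnalyzer(Lucene.Net.Util.Version.LUCENE_30);

        using (var writer = new IndexWriter(directory, analyzer, true, IndexWriter.MaxFieldLength.UNLIMITED))
        {
            var doc = new Document();
            doc.Add(new Field("url", "http://example.com/page.aspx",
                Field.Store.YES, Field.Index.NOT_ANALYZED));
            doc.Add(new Field("content", "text extracted from the rendered page",
                Field.Store.YES, Field.Index.ANALYZED));
            writer.AddDocument(doc);
        }
    }
}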
Having used several alternatives I would be loath to do anything other than Google Site Search.
The only reason I use SQL Full Text Search is to search through multiple columns. It's really hard to implement it in any effective manner.

Advice for setting up a static file server with ASP.NET/ASP.NET MVC

I would like to set up a subdomain (similar to Stack Overflow's http://sstatic.net/) in order to serve static content for my existing web applications. I have never done this before and was wondering if anyone has advice: which technology to use (I am using the Microsoft stack), how I should structure the static site, what the security and caching considerations are, etc.
Any advice would be appreciated.
Thanks in advance
Not to state the obvious, but if it's truly static, why do you need ASP.NET? This question has some advice on optimizing IIS as a static file server. If people do need to be authenticated to view static content, that will obviously complicate it slightly. sstatic.net does not use authentication.
I suggest you check out Amazon's S3 and Cloudfront services. Both are low cost and high performance. Their focus is serving up content.
I'm a happy S3 customer.
Added: You can easily set their services up so they appear as a subdomain of your site, e.g. assets.yourdomain.com.
