Accessing images in a different project, but in the same solution - asp.net

I made this website for a client who wanted to be able to upload images and then use those images to create some dynamic content on his site. It all works fine, but now I want to isolate the administration part (where he can add images and create his content) on a subdomain.
So at the moment I have two projects: one that images get uploaded to, and one that has to access those images (this is my problem).
I have read multiple topics related to this issue but have not found a solution; I can never get a path outside of my current project.
The only option I can think of right now that could work is to have some kind of API on the main website: when an image gets uploaded to the administration site, send that file over to the main site. But that seems like overkill given that my images will be on the same server.
Can this be done?
What is the cleanest/best way to achieve this?
Please note:
Saving images to the database is not an option. Uploading files to the server and storing only the path is much faster.
My images get uploaded at run time, so I can't use anything that relies on resources or compile time.
Thanks!
UPDATE (SOLUTION)
Rather than saving only the file name (for example "image1.png") in the database and then trying to resolve its path in the other project, I ended up saving the absolute URL in the database so that I could use that URL directly.
public static string ResolveServerUrl(string serverUrl, bool forceHttps)
{
    // Already an absolute URL? Return it unchanged.
    if (serverUrl.IndexOf("://") > -1)
        return serverUrl;

    // Prepend the scheme and authority of the current request.
    Uri originalUri = System.Web.HttpContext.Current.Request.Url;
    string newUrl = (forceHttps ? "https" : originalUri.Scheme) +
                    "://" + originalUri.Authority + serverUrl;
    return newUrl;
}
This will give you a URL that looks like http://yourdomain/path/to/image.jpg, which you can save directly in the database and use as is in the other project.
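For completeness, here is a rough sketch of how the helper might be called when the file is uploaded; the FileUpload control, the /uploads folder, and the SaveImageUrl call are placeholders, not part of the original code:

protected void UploadButton_Click(object sender, EventArgs e)
{
    if (!ImageUpload.HasFile)
        return;

    // Store the file under the site root (hypothetical /uploads folder).
    string fileName = System.IO.Path.GetFileName(ImageUpload.FileName);
    string relativeUrl = "/uploads/" + fileName;
    ImageUpload.SaveAs(Server.MapPath("~" + relativeUrl));

    // Save the absolute URL (e.g. http://admin.yourdomain.com/uploads/image1.png)
    // instead of just the file name, so the other project can use it as-is.
    string absoluteUrl = ResolveServerUrl(relativeUrl, false);
    SaveImageUrl(absoluteUrl); // placeholder for your data-access code
}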

The only option I can think of right now that could work is to have some kind of API on the main website: when an image gets uploaded to the administration site, send that file over to the main site
I think you just answered your own question. That is indeed the way to go; or, I should say, you're heading in the right direction toward an enterprise SOA architecture. You're still far from it, but this is a good start, where you begin to realize that your system is growing and demands a more robust architecture.
but that seems like overkill given that my images will be on the same server.
This is a false statement, because if you design it well you can easily scale out to a different server or platform without affecting your client app(s). Say that in the future the content is moved to its own server: you only make the pertinent modifications to your "Content Service", while your client apps do not need to change at all; they still point to the same endpoint and never notice what's happening inside the "Content Service". In other words, your client apps only care about getting content from the "Content Service", without knowing where that content is actually hosted: on a Windows server, a Linux server, a SQL database, an Oracle database, in the US or in China. It is not the responsibility of the client app(s) to care about how the content is stored; they only need to know how it is served.
Hope it makes sense. I could provide you with some links explaining the benefits of such architectures.
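To give a flavor of it, here is a minimal sketch of what such a content endpoint could look like, assuming ASP.NET Web API on the main site; the controller name, route, and /uploads folder are assumptions, not something from the question:

using System.Linq;
using System.Net;
using System.Net.Http;
using System.Threading.Tasks;
using System.Web.Http;

// Hypothetical endpoint the admin site posts uploaded images to.
public class ContentController : ApiController
{
    [HttpPost]
    public async Task<IHttpActionResult> UploadImage()
    {
        if (!Request.Content.IsMimeMultipartContent())
            return StatusCode(HttpStatusCode.UnsupportedMediaType);

        // Write the uploaded parts to ~/uploads on the main site.
        string root = System.Web.Hosting.HostingEnvironment.MapPath("~/uploads");
        var provider = new MultipartFormDataStreamProvider(root);
        await Request.Content.ReadAsMultipartAsync(provider);

        // Return the URLs of the stored files so the admin site can save them.
        var urls = provider.FileData
            .Select(f => "/uploads/" + System.IO.Path.GetFileName(f.LocalFileName));
        return Ok(urls);
    }
}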

Related

Programmatically spoofing an HTTP script request in an iframe

I'm building a backend admin system which edits JSON files that control the look and feel of the main site. I want to add a 'preview' button the user can press before hitting save. To do that, I want to use the main site itself, but instead of calling the actual JSON file in production, save a temp version of it and redirect this user's traffic for that file (requested from the original site code) to the temp file.
I've considered Chrome plugins, configuring an iframe somehow, or, in the worst case, grabbing the production front end, parsing out the call to the production JSON file and replacing it with the new temp JSON file. That is obviously not ideal, as it would entail a lot of work, and if anything changes on the production site this would have to be updated.
I would love your ideas!
Do you have access to the main site's source code? You could implement a preview option from the main site which accepts a GET parameter and uses a temporary JSON setting based on this GET parameter.
From the backend admin system's point of view, it's just a matter of adding the JSON as part of the Ajax GET request (a rough sketch of the idea follows below).
Unfortunately though, there is no easy way of doing this if you don't have access to the main site's source code or if you can't reach out to whoever maintains that main site.
Your cleanest option might be to recreate the main site's look and feel instead and pass it off as a "preview".
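To illustrate the GET-parameter idea mentioned above (assuming, purely as an example, that the main site is ASP.NET MVC; the action, the previewConfig parameter name, and the file locations are hypothetical):

public class HomeController : System.Web.Mvc.Controller
{
    // When ?previewConfig=<name> is present, load a temporary JSON file
    // instead of the production one; otherwise behave as usual.
    public System.Web.Mvc.ActionResult Index(string previewConfig = null)
    {
        string configPath = string.IsNullOrEmpty(previewConfig)
            ? Server.MapPath("~/App_Data/site-config.json")
            : Server.MapPath("~/App_Data/previews/" +
                             System.IO.Path.GetFileName(previewConfig) + ".json");

        ViewBag.SiteConfigJson = System.IO.File.ReadAllText(configPath);
        return View();
    }
}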

Meteor.js - Template Permissions

This has been asked in similar forms here and here but it seems pretty important, and the framework is under rapid development, so I'm going to raise it again:
Assuming your login page needs to face the public internet, how do you prevent Meteor from sending all of the authenticated user templates to a non-authenticated client?
Example use case: You have some really unique analytics / performance indicators that you want to keep secret. You've built templates to visualize each one. Simply by visiting the login page, any rando will be sent the templates which, even unpopulated, disclose a ton of proprietary information.
I've seen two suggestions:
Break admin into a separate app. This doesn't address the issue assuming admin login faces the public internet, unless I'm missing something.
Put the templates in the public folder or equivalent and load them dynamically. This doesn't help either, since the file names will be visible from other templates which will be sent to the client.
The only thing I can think of is to store the template strings in the server folder and have the client call a Meteor.method after login to retrieve and render them. And if you want them to behave like normal client templates, you'd have to muck around with the internal API (e.g., Meteor._def_template).
Is there any more elegant way to do this?
I asked a similar question here:
Segmented Meteor App(s) - loading only half the client or two apps sharing a database
Seems to be a common concern, and I certainly think it's something that should be addressed sometime.
Until then, I'm planning on making a smaller "public" app and sharing the DB with an admin app (possibly in Meteor, possibly in something else, depending on size/data for my admin)
These 2 packages try to address this issue:
https://atmospherejs.com/numtel/publicsources
https://atmospherejs.com/numtel/privatesources
They use an iron-router plug-in to load your specific files on every route.
The main drawback I see here is that you must change your app structure, as the protected files need to be stored in the /public or /private folder.
You are also expected to use iron-router.

Adobe Flex, loading a remote swf

I have a flex app running on my server.
I have had a request from some clients to have the SWF loaded on their server, so that their customers don't have to be transferred to my server to log in; i.e. from the user's point of view it looks like they are logging in from theirsite.com instead of mysite.com.
I tried something really simple, and that was to give them an HTML wrapper to host on their site. The only modification I made was to change the "src" var to:
"src", "https://www.mysite.com/app/myapp.swf"
and
embed src="https://www.mysite.com/app/myapp.swf"
To my surprise, this worked perfectly. And best of all, the service calls still seem to come from mysite.com, so I don't have to bother with modifying the crossdomain.xml file.
All good it seems.
Are there any issues or downsides to the above that I should be aware of?
If you're doing ExternalInterface calls to JavaScript in the enclosing page, this may cause a security error, since the SWF from your domain shouldn't be able to access HTML content served from your client's domain.
I expect that is a fringe case though. Aside from that, what you're doing is not much different from what YouTube does. I've done the same thing with The Flex Show player. I don't think you'll have any issues, and I do not believe this approach makes your app any more (or less) secure.

ASP.NET rdlc with external images not displaying images in PDF

I'm using the Microsoft ReportViewer that comes with ASP.NET and have a report parameter that should be setting the value (path) of an image in my report. I'm providing the path as a complete URL right now, starting with http://, but have also tried it as an app-relative path, site-rooted path, etc., and for some reason the image always shows as the red X when it exports to PDF. I'm just creating an instance of the control in code, setting the properties, and exporting directly to the response stream so it acts as a download.
I'm just not sure what the problem could be with the image not showing up, so if anyone has any ideas please let me know.
UPDATE 1
I've determined that I can embed the image with a URL if it is on my public web server, but when I'm running on localhost the image won't embed. I have confirmed that if I paste the same localhost URL into my browser the image opens fine. As far as I know, I don't have a proxy. So I can work around my issue, but I still don't understand what the problem is with localhost.
UPDATE 2
Forgot to mention that when the URL to the image is opened from a browser it works fine.
It is not possible for a PDF to contain a reference to an external image (at least to my understanding). For an image to appear in the PDF, it must be embedded into the document. Therefore, to use an external image, your app must retrieve the image and store it in the document. The report viewer will try to do this for you.
Two possible answers:
First, in order for your app to package the image into the PDF, it must be able to retrieve the image from the URL you are specifying. If that URL is behind a proxy (from the perspective of your app server) and/or requires credentials to access, this will present a challenge with the default configuration of the report viewer.
If a proxy server is the issue, see the settings you can add to your web.config below. You may also need to supply network credentials so your app can authenticate to the proxy. There are lots of ways to solve this, but one of the easiest is to run your application as a domain service account that has rights to traverse your proxy. You can test this by temporarily running the site as yourself (it should only be temporary, because this is a horrible security practice).
The image you are using could require credentials to access (try pulling up the image in Firefox with empty cookies and verify whether credentials are required). If it requires Windows authentication, the same fix as for proxy security may apply to authenticating against the remote image. If it requires some other form of authentication, you may be better off downloading the image and embedding it into your project.
It is also possible to download the image using other means in your code and convert it to a byte array for inclusion in the report. There are lots of examples of this on the web, including one on Stack Overflow here.
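A rough sketch of that byte-array approach, assuming the WebForms ReportViewer and a report parameter named ImageData (the report path and parameter name are placeholders):

// Download the image yourself and hand it to the report as a base64 string.
byte[] imageBytes;
using (var client = new System.Net.WebClient())
{
    imageBytes = client.DownloadData("http://yourdomain/path/to/image.jpg");
}

var viewer = new Microsoft.Reporting.WebForms.ReportViewer();
viewer.LocalReport.ReportPath = System.Web.HttpContext.Current.Server.MapPath("~/Reports/MyReport.rdlc");
viewer.LocalReport.SetParameters(
    new Microsoft.Reporting.WebForms.ReportParameter("ImageData", System.Convert.ToBase64String(imageBytes)));

// In the RDLC, set the image Source to "Database", pick the right MIMEType, and use
// =Convert.FromBase64String(Parameters!ImageData.Value) as the Value expression.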
Second, take a look at the following page:
http://msdn.microsoft.com/en-us/library/ms251715%28VS.80%29.aspx
Using external images in a ReportViewer report is not enabled by default. To use an external image, you must set the EnableExternalImages property in your code. Depending on your network configuration, you might also need to bypass proxy settings to allow the external image to appear. You can add the following settings to the Web.config file to bypass the local proxy. When modifying your Web.config file, be sure to specify the name of the proxy server that is used in your network:
<system.net>
  <defaultProxy>
    <proxy usesystemdefault="false" bypassonlocal="true" proxyaddress="http://<proxyservername>:80/" />
  </defaultProxy>
</system.net>
Hope one or both of these helps.
Jerry
When passing external image filenames to ReportViewer parameters, use this format: file://C:\app\images\pic.jpg. Anything else usually doesn't work well when deployed.
Okay, so this was our solution. The web server did not recognize its own fully qualified DNS name as a URL, so we had to edit the Hosts file in the C:\Windows\System32\drivers\etc folder and map the host name to localhost. The line we added to the file was:
127.0.0.1 ourserver.ourdomain.com
I don't think Adobe Reader (or maybe the PDF specification itself?) allows external content to be loaded for security purposes. I vaguely remember having a similar issue that had nothing to do with reporting services (I was dynamically generating PDFs and using variable logos and had to embed them).
Did you try a regular file path (c:/temp/somefile.bmp)? A Reporting Services local report reads the file from disk and embeds it in the PDF file it produces. Make sure the identity of the app pool in IIS has read permission on the image file.
We are doing this, and our images are placed in an img folder under the web site, along with the rest of the web site's images. We avoid hard-coding the path by using Server.MapPath(relative path).
Hope this helps
I fixed my problem with this:
// For local relative paths:
string imgUrl = new Uri(HttpContext.Current.Server.MapPath("~/images/mylocalimage.jpg")).AbsoluteUri;

// OR, for complete URLs:
ServicePointManager.SecurityProtocol |= SecurityProtocolType.Tls | SecurityProtocolType.Tls11 | SecurityProtocolType.Tls12; // allows the ReportViewer to download the image from the URL
string imgUrl = /* Your image URL ("http://") */;

// Then pass imgUrl as the external source of your image.
Can the report viewer get an image from a relative url? I've never used it, so best to check that assumption.
Have you tried using the Url.Content() helper to set the URL? Whenever I have issues with my URLs, it's because I didn't use this to generate the correct URL for the view.

Browse Files Server-side in ASP.NET

I'm creating an ASP.NET web application to schedule tasks on our server from a remote location using a .NET Wrapper for Scheduled Tasks. However, I'm stuck.
The user needs to be able to browse the file system on the server to retrieve a "file to run" for the new task that the user's creating in this application. I need to get the filepath/filename and pass it into the .NET wrapper.
I've tried using HTMLInputFile, but I haven't found a way to make that work for me.
Any help is appreciated.
Thanks
Update:
For this project, we've decided to simply list the available executables in a dropdown box, since users don't really need total access to the file system; it's also better for security.
HTMLInputFile is used to browse the client's file system and upload a file to the server. It isn't used to browse the server's file system.
You will need something quite different: server-side code that displays the server's folder structure to the user via the browser.
There is an example of a basic implementation of this here.
Update:
With that sample, the path that you replace "yourfolderHere" with needs to be a virtual path, rather than an absolute path. So for example "C:\Inetpub\wwwroot\uploads" won't work, but "uploads" will work.
I hope it goes without saying that there are serious security issues to think about when implementing something like this.
The HTMLInputFile will only work on the client-side machine.
You need to write a filesystem browser in ASPX/HTML that browses on the server-side.
Shouldn't be that hard to do.
You can't use the <input type="file"> tag.
This brings up a client-side dialog that browses the client machine.
As far as I am aware you need to create your own 'browser'.
E.g., you could use the My.Computer.FileSystem classes to retrieve a list of files in a folder and show those on the web page. The user then selects the relevant file and posts a response back to the server.
You can use System.IO.Directory to get directories and files. These can be displayed in a number of ways. A simple browser / file selection should be possible in less than 50 lines of code.
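For instance, a bare-bones Web Forms sketch (the FoldersList/FilesList controls, the root path, and the *.exe filter are all assumptions) might look like this:

using System;
using System.IO;

public partial class BrowseFiles : System.Web.UI.Page
{
    // Restrict browsing to a known root rather than the whole drive.
    private const string RootPath = @"C:\ScheduledTaskTargets";

    protected void Page_Load(object sender, EventArgs e)
    {
        if (IsPostBack) return;
        FoldersList.DataSource = Directory.GetDirectories(RootPath);
        FoldersList.DataBind();
    }

    protected void FoldersList_SelectedIndexChanged(object sender, EventArgs e)
    {
        // Show only executables in the selected folder; the selected value is the
        // full path you pass to the scheduled-task wrapper.
        FilesList.DataSource = Directory.GetFiles(FoldersList.SelectedValue, "*.exe");
        FilesList.DataBind();
    }
}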
Also be aware that you may need to grant extra permissions to the user that your web app runs as so the file system is accessible.
There are also various security implications around this, so don't grant access to everything unless you really need this.
