I have a SWF that I created that makes a few HTTP posts. When I compile and run the SWF locally, it successfully makes posts to my PHP code hosted on my domain. However, when I then upload my SWF to my domain and alter the embed tag to use a fully qualified path to the hosted SWF, the SWF will load correctly but it will not make any of the posts to my PHP scripts. The reason that I have an embed tag with the fully qualified URL in it is that my goal here is to be able to place the HTML embed code on a number of different sites.
I have it working on a single remote site, which has a wildcard crossdomain.xml file on it. However, when I try to apply a crossdomain.xml to any of the other hosted sites, or to my computer locally, nothing runs when I use the embed with the fully qualified URL; if I just use the locally hosted SWF on my computer, the HTTP posts work just fine.
I feel this is related to the crossdomain.xml file, but I guess I'm not understanding some aspect of the security model.
So my goal is to be able to paste the embed HTML code with a fully qualified SWF URL (for example, "http://www.abc.com/myswf.swf") on a number of other sites and have it make standard HTTP posts to my home site (http://home.com).
Thanks for any help on this one. I'm soooooo close; like I said, I got it working on one remote site, but I'm not sure how. All other sites where I post the embed code fail.
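For reference, the wildcard crossdomain.xml mentioned here is a minimal file like the following sketch, served from the web root of the domain that receives the posts:

<?xml version="1.0"?>
<cross-domain-policy>
    <!-- allow SWFs loaded from any domain to read responses from this server -->
    <allow-access-from domain="*" />
</cross-domain-policy>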
Ok, I'm editing my answer, following the examples you've just given.
According to the error report I get, the problem is with an ExternalInterface call that's not allowed from the swf to the site hosting your swf.
It works in the second example because the call is made within the same sandbox.
Taken from the docs:
SecurityError — The containing environment belongs to a security sandbox to which the calling code does not have access. To fix this problem, follow these steps:
1. In the object tag for the SWF file in the containing HTML page, set the following parameter:
<param name="allowScriptAccess" value="always" />
2. In the SWF file, add the following ActionScript:
flash.system.Security.allowDomain(sourceDomain)
You may also have to check the allowNetworking parameter in your embed code...
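For completeness, the relevant attributes in the embed code would look something like this (a sketch, using the example SWF URL from the question):

<object ...>
  <param name="allowScriptAccess" value="always" />
  <param name="allowNetworking" value="all" />
  <embed src="http://www.abc.com/myswf.swf"
         allowScriptAccess="always"
         allowNetworking="all" ... />
</object>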
I made an ASP.NET MVC application which allows users to create dynamic websites. I need to add a feature that will allow downloading an offline version of a chosen website from the server, as static HTML files with menus, hyperlinks, images, documents, etc. It should work similarly to applications such as Teleport Pro, but I have to choose from the Admin Panel which content should be exported.
The client wants to burn the static website to a CD or save it on a pen drive.
Do you have any ideas on how to begin? Please help.
I have implemented that in a current project...
The user is able to change anything in the frontend, and at the end they can publish and download the offline files... the site subscribes users and shows all the prizes, winners, and more information about that campaign.
It was all done in ASP.NET MVC3 under .NET4 and hosted in AppHarbor.
It's composed of several applications, but for what you want, you develop the Backend and the Frontend; to generate the static files, simply use the Frontend to grab the full HTML.
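As a rough sketch of that "grab the full HTML" step (the page URLs and output folder here are made up, and a real exporter would also need to rewrite links and download images, CSS, and documents):

using System;
using System.IO;
using System.Net;

class StaticExporter
{
    static void Main()
    {
        // hypothetical list of frontend pages chosen in the Admin Panel
        string[] pages = { "http://yoursite.example/", "http://yoursite.example/winners" };
        Directory.CreateDirectory("offline");

        using (var client = new WebClient())
        {
            for (int i = 0; i < pages.Length; i++)
            {
                // ask the running frontend for the fully rendered HTML
                string html = client.DownloadString(pages[i]);
                File.WriteAllText(Path.Combine("offline", "page" + i + ".html"), html);
            }
        }
        // zip the "offline" folder and offer it as the download
    }
}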
As an example, I can show what 2 users did...
Callme.dk did http://callme.julekal.info and
Sony Nordic did http://sony.julekal.info
Plus, you can point custom domains to it as well, like http://sonynordicxmas.net/
(Screenshots omitted: publishing and generating all the files, and one part of the editing UI.)
So I give the users offline access (through the .zip file), online access (through the frontend application), and the ability to use custom domains...
I think the only way this might be possible is if you go to every single page and then use your browser to "Save" the web page, script and all.
However, this causes several issues:
1. You never quite get everything, and you need to massage the HTML produced, download all the images, etc., to get the page to look right.
2. Each HTML file now has an associated folder with the same name, and each time you do this you will get another HTML file with a folder. You can combine all the folders into a single one, but that leads me to item 3.
3. You will need to edit each HTML file to clear up any pathing issues if you want to share a single source folder.
4. Data is no longer dynamic!
5. If you want to link all the pages to each other, you need to edit every single HTML file and resolve the anchor tags.
This is too much work and I think it actually breaks the true requirement.
Don't do it! :)
I've read through some of the questions here and my understanding is that this is true. Could someone confirm that visitors to an ASP.NET website can actually download the aspx files in their original format? Just like with the css files, etc. Thanks.
Clarification: Please be patient with me. I am a newbie and just want to make sure I understand. I know that using Dreamweaver, a person can download almost all the source files from a website. At least, that's what could be done some years ago with many websites. He could just change a bit of the text content and have a website similar to the original, with all the original design, images, etc.
So if he can do the same with an ASP.NET site, downloading all the files, he can look at the aspx file and see what the code does. I am not talking about him executing the page and using the View Source command; that output would naturally have been processed by the server and doesn't expose source code.
This is one of the reasons why code-behind is recommended: the code can be compiled, and the source is not uploaded to the site. Only the DLL is uploaded, and minimal logic is exposed through the aspx file.
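As an illustration (the page and class here are hypothetical), the .aspx that gets uploaded contains only markup plus a directive pointing at the compiled class; the C# source itself never goes to the server:

<%@ Page Language="C#" Inherits="MyApp.DefaultPage" %>
<!-- MyApp.DefaultPage lives in the compiled bin\MyApp.dll; its .cs source stays on the developer's machine -->
<html>
<body>
    <form runat="server">
        <asp:TextBox ID="NameBox" runat="server" />
    </form>
</body>
</html>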
No, they can't. The ASPX page contains server-side code that is executed, well, by the server, and ends up containing plain HTML that the client browser can understand.
When IIS receives a GET request for an ASPX page, the ASP.NET handler kicks in and returns the processed HTML. So unless IIS is misconfigured, that is not possible.
No. Visitors cannot see your business logic.
If that were the case, the markup asp:TextBox wouldn't get rendered as input type='text'.
Also, if that were the case, we would be seeing code snippets of sites written using scripting languages like PHP or Classic ASP.
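To make that concrete, here is a hypothetical control as it exists in the .aspx on the server, and the plain HTML the visitor's browser actually receives after the server processes it:

<!-- in the .aspx on the server: -->
<asp:TextBox ID="NameBox" runat="server" />

<!-- in the response the visitor can view: -->
<input name="NameBox" type="text" id="NameBox" />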
In newbie's terms:
No, the server won't give you the ASPX and code-behind files; these files don't mean anything to the end user/visitor/browser. That code is processed on the server, and what you get is only a bunch of HTML, JavaScript, CSS, images, etc., which browsers can render.
If you try to "download" .ASPX, .CS, and WEB.CONFIG files (by accessing them through your browser) to see the actual source code, well, you simply can't.
I'm using the Microsoft ReportViewer that comes with ASP.NET and have a report parameter that should set the value (path) of an image in my report. I'm providing the path as a complete URL right now, starting with http://, but I have also tried it as an app-relative path, a site-rooted path, etc., and for some reason the image always shows as the red X when it exports to PDF. I'm just creating an instance of the control in code, setting the properties, and exporting directly to the response stream so it acts as a download.
I'm just not sure what the problem could be with the image not showing up, so if anyone has any ideas please let me know.
UPDATE 1
I've determined that I can embed the image with a URL if it is on my public web server, but when I'm running on localhost the image won't embed. I have confirmed for localhost that if I paste the same URL into my browser, the image opens fine. As far as I know, I don't have a proxy. So I can work around my issue, but I still don't understand what the problem is with localhost.
UPDATE 2
Forgot to mention that when the URL to the image is opened from a browser it works fine.
It is not possible for a PDF to contain a reference to an external image (at least to my understanding). In order for an image to appear in the PDF, it must be embedded into the document. Therefore, to use an external image, your app must retrieve the image and store it in the document. The report viewer will try to do this for you.
Two possible answers:
First, in order for your app to package the image into the PDF, it must be able to retrieve the image from the URL you are specifying. If that URL is behind a proxy (from the perspective of your app server) and/or requires credentials to access, this will present a challenge with the default configuration of the report viewer.
If a proxy server is the issue, see the settings you can add to your web.config below. You may also need to supply network credentials so your app can authenticate to the proxy. There are lots of ways to solve this, but one of the easiest is to run your application as a service account on your domain that has rights to traverse your proxy. You can test this by running the site as yourself temporarily (it should be temporary because this is a horrible security practice).
The image you are using could require credentials to access (try pulling up the image in Firefox with empty cookies and verifying whether credentials are required to access it). If it requires Windows authentication, the same solution as for proxy security may apply to the authentication required on the remote image. If it requires some other form of authentication, you may be better off downloading the image and embedding it into your project.
It is also possible to download the image using other means in your code and convert it to a byte array for inclusion in the report. There are lots of examples of this on the web, including here on Stack Overflow.
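A minimal sketch of that byte-array approach (the URL is a placeholder, and how the bytes get into the report depends on your report definition):

using System;
using System.Net;

byte[] imageBytes;
using (var client = new WebClient())
{
    // fetch the image ourselves instead of relying on the report viewer
    imageBytes = client.DownloadData("http://yourserver/images/logo.png");
}
// e.g. hand the bytes to the report as a Base64-encoded parameter
string imageParam = Convert.ToBase64String(imageBytes);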
Second, take a look at the following page:
http://msdn.microsoft.com/en-us/library/ms251715%28VS.80%29.aspx
Using external images in a ReportViewer report is not enabled by default. To use an external image, you must set the EnableExternalImages property in your code. Depending on your network configuration, you might also need to bypass proxy settings to allow the external image to appear. You can add the following settings to the Web.config file to bypass the local proxy. When modifying your Web.config file, be sure to specify the name of the proxy server that is used in your network:

<system.net>
  <defaultProxy>
    <proxy usesystemdefault="false" bypassonlocal="true" proxyaddress="http://<proxyservername>:80/" />
  </defaultProxy>
</system.net>
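The EnableExternalImages property mentioned in that quote is set in code before you render; a minimal sketch (the control and variable names are placeholders):

// allow the local report to fetch images referenced by URL
reportViewer.LocalReport.EnableExternalImages = true;
byte[] pdfBytes = reportViewer.LocalReport.Render("PDF");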
Hope one or both of these helps.
Jerry
When passing external image filenames to ReportViewer parameters, use a format like this: file://C:\app\images\pic.jpg. Anything else usually doesn't work well when deployed.
Okay, so this was our solution. The web server did not recognize its own fully qualified DNS name as a URL, so we had to edit the Hosts file in the C:\Windows\System32\drivers\etc folder and add the host name as localhost. The line we added to the file was:
127.0.0.1 ourserver.ourdomain.com
I don't think Adobe Reader (or maybe the PDF specification itself?) allows external content to be loaded for security purposes. I vaguely remember having a similar issue that had nothing to do with reporting services (I was dynamically generating PDFs and using variable logos and had to embed them).
Did you try a regular file path (c:/temp/somefile.bmp)? A Reporting Services local report reads the file from disk and embeds it in the PDF file produced. Make sure that the identity of the app pool in IIS has read permission on the image file.
We are doing it, and our images are placed in an img folder under the web site, along with the rest of the website's images. We avoid hardcoding the path by using Server.MapPath(relative path).
Hope this helps
I fixed my problem with this:
// For local relative paths:
string imgUrl = new Uri(HttpContext.Current.Server.MapPath("~/images/mylocalimage.jpg")).AbsoluteUri;

// OR, for complete URLs:
// this allows the ReportViewer to download the image from the URL over TLS
ServicePointManager.SecurityProtocol |= SecurityProtocolType.Tls | SecurityProtocolType.Tls11 | SecurityProtocolType.Tls12;
string imgUrl = /* your image URL ("http://...") */;

// Then pass imgUrl as the external source of your image.
Can the report viewer get an image from a relative url? I've never used it, so best to check that assumption.
Have you tried using the Html.Content() helper to set the URL? Whenever I have issues with my URLs, it's because I didn't use this to generate the correct URL for the view.
I have an HttpModule that displays images that follow a certain URL pattern. For example, /images/employees/jason.jpg is handled by the module, but all other images aren't. It works just fine on my local machine (Cassini and IIS 7). However, the IIS6 production server isn't working. I've had the hosting company map the images to the ASP.NET worker process. Now, all images show that they can't render, except for the images that should be rendered by the module; those are working correctly.
I ran an HttpWatch instance on one of the files and received the following error:
ERROR_HTTP_INVALID_SERVER_RESPONSE
Any ideas?
Final Answer:
The module needed to be updated to transmit server files. So, I added an else to my original if, checked whether the request was for an image type (using a utility method), and then used Response.TransmitFile() to pass the file on to the browser.
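The shape of that change looks roughly like this (IsEmployeeImage, RenderEmployeeImage, IsImageType, and GetMimeType stand in for the module's real methods):

void OnBeginRequest(object sender, EventArgs e)
{
    var app = (HttpApplication)sender;
    string path = app.Request.Path;

    if (IsEmployeeImage(path))          // e.g. /images/employees/jason.jpg
    {
        RenderEmployeeImage(app);       // the module's original behavior
    }
    else if (IsImageType(path))         // the utility method mentioned above
    {
        // pass every other image straight through to the browser
        app.Response.ContentType = GetMimeType(path);
        app.Response.TransmitFile(app.Server.MapPath(path));
        app.Response.End();
    }
}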
I then ran into a spacing issue with the images. This was because I forgot that I had .aspx files registered as an image type for testing purposes, so each page would crash during debugging or add padding that came from the CSS. Doh!
Everything is just peachy now. Thanks to all!
There doesn't seem to be anything particularly wrong with your module, so the issue must be coming from somewhere else. Have you got security settings that might be blocking the images? What actually gets returned when you request a static file?
I'd suggest seeing what gets returned (and its headers) using something like Firebug, to check things like the response code, content type, the actual raw response, etc...
Check your web.config. IIS6 and IIS7 have different places to add modules, and it depends on what mode your IIS7 is running in.
http://arcware.net/use-a-single-web-config-for-iis6-and-iis7
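For example (the module and assembly names are placeholders), the two registrations look like this:

<!-- IIS6, or IIS7 in classic mode: -->
<system.web>
  <httpModules>
    <add name="ImageModule" type="MyApp.ImageModule, MyApp" />
  </httpModules>
</system.web>

<!-- IIS7 in integrated mode: -->
<system.webServer>
  <modules>
    <add name="ImageModule" type="MyApp.ImageModule, MyApp" />
  </modules>
</system.webServer>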
Let's say we have a web site with a CF app that was written in-house.
Assume that:
Server 2003 IIS6 or 2008 IIS7 will be used
ColdFusion 8 will be used
Directory browsing is denied
SSL is required to connect
The account login process is secure (yeah, I know that is a whole other ball of wax, but that concept is discussed ad nauseam on the web).
Say I have a file at https://domain.com/folder1/folder2/ with a name like picture92352.ext; imagine it as a jpg or pdf or whatever. The entire path between the domain name and the file varies widely in naming structure, depth, etc. Files are not all lumped together in one folder.
The app restricts links by user, such that a user would have to have access to a file to find it in the first place, but as it stands now, if a person knew the full URL to that file they could retrieve it without logging in to the app. It's the classic security-by-obscurity situation. A random person isn't likely to find a file they shouldn't get to, but once someone is given access they know how to access it from another PC, where their actions might not be traced back to them.
How do I restrict access to these files before someone logs in and still make them accessible to outside users after they log in? Is there a way to do it with permissions only or is the only answer to have code dynamically moving files around at the time of the request or is there some obvious step I'm not even thinking of?
Let me clarify this slightly. No matter how the file is presented on a page, a user can use the browser (IE, Firefox, etc.) to examine the URL the file comes from. If the image is a link, there is always "Copy Shortcut" in the right-click menu in IE, and the same functionality in Firefox is called "Copy Link Location". If the image is displayed inline as part of the page, an IE user can right-click and choose Properties to see the URL; in Firefox the same Properties functionality is present, but there is an even quicker, more convenient option labeled "Copy Image Location". Once a user knows the URL to a file, if the location or file name doesn't change, they can use that URL without authenticating in the CF app.
If I change the NTFS/share permissions so that IUSR can't see the content, then my CF app and IIS can't push it. What strategy do I use to provide the file in the CF app without leaving this hole open?
You could write a CFM page that serves up the images. Then you just make sure they are authenticated inside the CFM.
<!--- something like this: --->
http://localhost/GetFile.cfm?file=foobar.jpg

In GetFile.cfm, you would do something like:

<!--- first, make sure the user is authenticated (a sketch; use whatever your app's session check is) --->
<CFIF NOT StructKeyExists(session, "user")>
    <CFABORT showerror="Not authorized">
</CFIF>
<!--- the filename part is what the browser will pre-populate the file name in the download dialog as --->
<CFHEADER name="Content-disposition" value="attachment;filename=picture92352.ext">
<CFCONTENT type="text/plain" file="\\fileserver\folder1\folder2\picture92352.ext">
Take a look at the various MIME types.
If you wanted to do something similar but keep a more natural URL, I think you would need to leverage the Java servlet underpinnings of ColdFusion to create a handler for any URL matching a certain pattern.