I am building a very simple Meteor app with the following situation: there are some static files in the backend (images, audio, video) that I need to include in my page, but I only want authenticated users to be able to access them. How can I do this?
I suspect I might be able to do this using routers but I haven't managed to get that running using the official documentation (or any other resource for that matter). If somebody could point me in the right direction I would be thankful.
It depends on how secure you need it to be.
If you just want to hide/show the links, you can put those files in the public folder and add some logic to the front end that shows or hides them based on the user's authentication state.
If you need it to be more secure, then you need to put those files behind an API. You can use DDP APIs (Meteor methods or pub/sub), or you can use WebApp to create an HTTP endpoint.
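For the WebApp route, a minimal server-side sketch could look like the following. The /protected prefix, the files directory, and the token-in-query-string convention are all assumptions, and Accounts._hashLoginToken is an internal helper, so treat this purely as an illustration:

```js
import { Meteor } from 'meteor/meteor';
import { WebApp } from 'meteor/webapp';
import { Accounts } from 'meteor/accounts-base';
import fs from 'fs';
import path from 'path';

const FILES_DIR = '/srv/protected-files'; // assumption: files live outside public/

WebApp.connectHandlers.use('/protected', (req, res) => {
  const url = new URL(req.url, 'http://placeholder');
  const token = url.searchParams.get('token');

  // Look the user up by the hashed login token the client sends along.
  // (_hashLoginToken is internal; on Meteor 3+ use findOneAsync instead.)
  const user = token && Meteor.users.findOne({
    'services.resume.loginTokens.hashedToken': Accounts._hashLoginToken(token),
  });

  if (!user) {
    res.writeHead(403);
    res.end('Forbidden');
    return;
  }

  // path.basename keeps requests from escaping FILES_DIR via "../".
  const filePath = path.join(FILES_DIR, path.basename(url.pathname));
  fs.createReadStream(filePath)
    .on('error', () => { res.writeHead(404); res.end(); })
    .pipe(res);
});
```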
There are mainly two ways to do it:
Obfuscate the source URLs.
Only show them to signed-in users.
a) meteor-files: https://github.com/veliovgroup/Meteor-Files for overall file security.
b) CDN signed URLs: the root URL stays the same, but every download/view requires a signature. You can look at how Instagram, Facebook, etc. do it; the sketch after this answer shows the general idea.
Put the images on pages whose routes are only accessible to logged-in users. If they are delivered from a CDN, add policies matching your referrer/origin to disallow embedding of your images on other websites.
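For option b), the general idea behind a signed URL is roughly the following. This is a generic HMAC sketch with made-up parameter names; real CDNs such as CloudFront ship their own signing helpers, so this is only meant to show the concept:

```js
import crypto from 'crypto';

const SECRET = process.env.URL_SIGNING_SECRET; // assumption: shared with whatever verifies the link

// Issue a link that stops working after ttlSeconds.
function signUrl(filePath, ttlSeconds = 300) {
  const expires = Math.floor(Date.now() / 1000) + ttlSeconds;
  const sig = crypto.createHmac('sha256', SECRET)
    .update(`${filePath}:${expires}`)
    .digest('hex');
  return `${filePath}?expires=${expires}&sig=${sig}`;
}

// Verify before serving: reject expired or tampered links.
function verifySignedUrl(filePath, expires, sig) {
  if (Number(expires) < Date.now() / 1000) return false;
  const expected = crypto.createHmac('sha256', SECRET)
    .update(`${filePath}:${expires}`)
    .digest('hex');
  return expected === sig; // a constant-time compare is preferable in production
}
```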
This has been asked in similar forms here and here but it seems pretty important, and the framework is under rapid development, so I'm going to raise it again:
Assuming your login page needs to face the public internet, how do you prevent Meteor from sending all of the authenticated user templates to a non-authenticated client?
Example use case: You have some really unique analytics / performance indicators that you want to keep secret. You've built templates to visualize each one. Simply by visiting the login page, any rando will be sent the templates, which, even unpopulated, disclose a ton of proprietary information.
I've seen two suggestions:
Break admin into a separate app. This doesn't address the issue assuming admin login faces the public internet, unless I'm missing something.
Put the templates in the public folder or equivalent and load them dynamically. This doesn't help either, since the file names will be visible from other templates which will be sent to the client.
The only thing I can think of is to store the template strings in the server folder and have the client call a Meteor.method after login to retrieve and render them. And if you want them to behave like normal client templates, you'd have to muck around with the internal API (e.g., Meteor._def_template).
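A rough sketch of that workaround, assuming the HTML is kept in the private/ directory so it can be read with Assets.getText on the server (the method and template names here are made up):

```js
import { Meteor } from 'meteor/meteor';
import { check } from 'meteor/check';

// server/
Meteor.methods({
  getAdminTemplate(name) {
    check(name, String);
    if (!this.userId) {
      throw new Meteor.Error('not-authorized');
    }
    // Assets.getText reads from the app's private/ directory
    // (Assets.getTextAsync on Meteor 3+). Sanitize `name` in real code.
    return Assets.getText(`admin-templates/${name}.html`);
  },
});

// client/, after login -- rendering this as a real template still means
// touching internals such as Meteor._def_template, as noted above.
Meteor.call('getAdminTemplate', 'secretDashboard', (err, html) => {
  if (!err) {
    document.getElementById('admin-area').innerHTML = html; // naive render
  }
});
```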
Is there any more elegant way to do this?
I asked a similar question here:
Segmented Meteor App(s) - loading only half the client or two apps sharing a database
Seems to be a common concern, and I certainly think it's something that should be addressed sometime.
Until then, I'm planning on making a smaller "public" app and sharing the DB with an admin app (possibly in Meteor, possibly in something else, depending on size/data for my admin)
These 2 packages try to address this issue:
https://atmospherejs.com/numtel/publicsources
https://atmospherejs.com/numtel/privatesources
They use an iron-router plug-in to load your specific files on every route.
The main drawback I see here is that you must change your app structure, as the protected files need to be stored in the /public or /private folder.
Also, you are expected to use iron-router.
I wanted to know: is there some special requirement for a website to make use of a CDN?
I mean, is there some special scheme (or at least some considerations) around which your website must be built right from the start to make use of a CDN (content delivery network)?
Is there anything that can stop a website from making use of a CDN, for example the way it references content files, static file paths, or anything else conceivable?
Thanks
It depends.
You have two kinds of CDN services:
Services like AWS CloudFront that require you to upload the files to some special place that they read from (e.g. AWS S3). In this case you need a step in your build process to correctly upload the files and to handle the addresses somehow inside your application (see the small helper sketched after this list).
Services like Akamai that just need you to tweak your DNS records so that they serve the requests to your users instead of you. In this case you would have two domains (image.you.com and image2.you.com), with image.you.com pointing to Akamai and image2.you.com pointing to the original source of the file. Whenever a user requests an image that Akamai has not cached yet, it comes to you through the "back door" (image2.you.com), fetches the file, and serves it from then on.
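For the first kind of service, the "handle the addresses inside your application" part can be as small as a helper that prefixes asset paths with the CDN host. The host name and function name below are assumptions, not any particular CDN's API:

```js
// Rewrite static asset URLs to point at the CDN host once the build step
// has uploaded the files there; fall back to local paths in development.
const CDN_HOST = process.env.CDN_HOST; // e.g. a CloudFront distribution domain

function assetUrl(relativePath) {
  return CDN_HOST ? `${CDN_HOST}/${relativePath}` : `/${relativePath}`;
}

// Usage: <img src={assetUrl('images/logo.png')} />
```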
If you use the second approach it's really simple to have a CDN supporting your application.
There are a whole bunch of concerns when dealing with CDN solutions.
The first one is that a CDN can't serve a dynamic page - i.e. a page that is unique to every user. Typically, that includes PHP, ASPX, JSP, RubyOnRails etc. - so if you're hoping to support lots of users for a dynamic site, you have to come up with another solution. Some CDN providers support "Edge Side Includes" - this allows you to glue dynamic pages together with cached content on the CDN, but this creates quite a complex application.
Of course, even on a dynamic application, a CDN can still serve static files - images, stylesheets, javascript files, videos etc.
#Tucaz explains the two major options here (actually, Akamai also provides a "filestore" CDN option). If you select the second option - effectively, the CDN becomes a caching reverse proxy in front of your website - it makes sense to tweak the cache headers on your HTTP server, and tell the CDN to honour those. Make sure you set your .ASPX files to not cache!
I made an ASP.NET MVC application which allows users to create dynamic websites. I need to add a feature which will allow downloading from the server an off-line version of a chosen website as static HTML files with the menu, hyperlinks, images, documents, etc. It should work similarly to applications such as Teleport Pro, but I have to choose from the Admin Panel which content should be exported.
The client wants to burn the static website to a CD or save it on a pen drive.
Do you have any ideas how to begin? Please help.
I have implemented that in a current project...
The user is able to change anything in the frontend, and at the end they can publish and download the offline files... The site subscribes users and shows all the prizes, winners and more information about that campaign.
All was done in ASP.NET MVC3 under .NET4 and hosted in AppHarbor.
It's composed of several applications, but for what you want, you develop the Backend and the Frontend, and to generate the static files you simply use the Frontend to grab the full HTML.
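The project above is ASP.NET, but purely to illustrate the "grab the full HTML" step, a throwaway Node (18+, ESM) sketch could look like this; the base URL, routes and output folder are made up:

```js
import fs from 'fs/promises';

const BASE_URL = 'http://your-campaign-frontend.example'; // placeholder frontend URL
const routes = ['/', '/prizes', '/winners'];               // assumed pages to export

await fs.mkdir('./offline-site', { recursive: true });

for (const route of routes) {
  const res = await fetch(BASE_URL + route);
  const html = await res.text();
  const file = route === '/' ? 'index.html' : `${route.slice(1)}.html`;
  // Real code would also download images/CSS and rewrite links to be relative.
  await fs.writeFile(`./offline-site/${file}`, html);
}
```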
As an example, I can show what 2 users did...
Callme.dk did http://callme.julekal.info and
Sony Nordic did http://sony.julekal.info
plus, you can simply point custom domains to it as well like http://sonynordicxmas.net/
Screenshots in the original post showed how to publish and generate all the files, and one part of the editing.
So I give the users offline access (through the .zip file), online access (through the frontend application), and the ability to use custom domains...
I think the only way this might be possible is if you go to every single page and then use your browser to "Save" the web page, script and all.
However, this causes several issues:
You never quite get everything, and you need to massage the HTML produced, download all the images, etc., to get the page to look right.
Each HTML file now has an associated folder with the same name, and each time you do this you will get another HTML file with a folder. You can combine all the folders into a single one, but that leads me to item 3.
You will need to edit each HTML file to clear up any pathing issues if you want to share a single source folder.
Data is no longer dynamic!
If you want to link all the pages to each other, you need to edit every single HTML file and resolve the anchor tags.
This is too much work and I think it actually breaks the true requirement.
Don't do it! :)
Is there a way to programmatically set the name of a file to be uploaded from a web page? I suspect that browser security restrictions make this impossible, but I'm hoping someone will prove me wrong.
I have a web application that needs to let the administrator upload HTML. The admin selects the HTML file, then the app uploads that file, plus figures out all the supporting files (images, stylesheet, etc) and uploads them too. There doesn't seem to be a way to programmatically upload the supporting files from a web page, since the user has to specify each file explicitly.
Currently I have a separate Windows app to do this, but it would be ideal to have this functionality integrated with the rest of the app. My back end is ASP.NET with C#.
There is no way to programmatically grab files from a user's computer via the browser. It would be a security violation if a website could just grab things.
Yes you can (in modern browsers)...
You can get and set the value of HTMLInputElement.files.
See this answer.
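A small sketch of what getting and setting it looks like. The input selector is assumed to exist on the page, and note that the file contents still have to come from your own code (e.g. a generated File/Blob), not from the user's disk:

```js
// Programmatically set the files of an <input type="file"> via DataTransfer.
const input = document.querySelector('input[type="file"]'); // assumed to exist

const file = new File(['<html><body>Hello</body></html>'], 'page.html', {
  type: 'text/html',
});

const dt = new DataTransfer();
dt.items.add(file);
input.files = dt.files; // setting HTMLInputElement.files

console.log(input.files[0].name); // "page.html"
```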
No, you cannot do this without a client-side application or special plug-in.
Browser security doesn't allow the server to obtain information about the hard drive contents of the client.
You may be able to do this using some form of browser plug-in. This is more work for you (and there are potential security implications beyond those of just having users run your app). However, it may provide a more integrated experience for your users. I'd hesitate to eliminate the application completely, though. Browser compatibility issues are common.
Imagine you have a large number of video files stored on a server, and a Flex app which lets users play those videos they have access to. How can you best set this up? Wouldn't the Flex app just be sent the name of the video to play... in which case couldn't someone else write another flex app if they knew the file names? Can Flex play videos hosted on other sites? Is there some clever piece on the server I'm missing, which sits between the Flex video player and the files?
Ask users to log in with a username and password - you can use OpenID if you want.
Update:
Set the crossdomain.xml of your server in such a way that only Flash movies from your domain can access content from there.
You can write a server-side script (in PHP or something else) that serves the file only if your user is allowed to see it (how to determine that you'll have to come up with yourself). This is a bit of a performance hit, although not so much if you use PHP's readfile().
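The answer mentions PHP's readfile(); purely to illustrate the same "script in front of the file" idea, here is a minimal Node/Express sketch instead. The session check and directory are placeholder assumptions:

```js
import express from 'express';
import path from 'path';

const app = express();
const VIDEO_DIR = '/srv/videos'; // assumed location of the protected videos

app.get('/video/:name', (req, res) => {
  // How you decide "is this user allowed?" is up to you (cookie, session, token...).
  if (!req.headers['x-session-token']) {
    return res.sendStatus(403);
  }
  // res.sendFile streams the file; path.basename blocks "../" traversal.
  res.sendFile(path.join(VIDEO_DIR, path.basename(req.params.name)));
});

app.listen(3000);
```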
Can Flex play videos hosted on other sites?
Your crossdomain.xml can control this. e.g. myvideoserver.com/crossdomain.xml would contain entries based on who you want to grant access to, like myflexserver.com. Then just ripping off your main flex application wouldn't give them access to your video files.