I can't play my recordings when my recorded streams are pushed to S3 - ant-media-server

I can't play my recordings when my recorded streams are pushed to S3. How can I fix this issue? How can I watch the recordings while I'm sending the stream to S3?

This happens because Ant Media Server tries to play VoDs from your local machine while the recordings are actually stored elsewhere.
You can play these files by using HTTP forwarding.
I assume that you have already added your S3 account to Ant Media Server, since you are recording to that destination.
Open the file {AMS-DIR}/webapps/{APPLICATION}/WEB-INF/red5-web.properties
Add comma-separated file extensions to the file, like this: settings.httpforwarding.extension=mp4
If you want to serve your preview images as well, add png.
Add the base URL for forwarding with settings.httpforwarding.baseURL=https://{s3BucketName}.s3.{awsLocation}.amazonaws.com
Replace {s3BucketName} with your own bucket name and {awsLocation} with your bucket's AWS region. Pay attention that there are no leading or trailing whitespaces.
If you completed the steps above correctly, your VoD play request will look like the following: https://{s3BucketName}.s3.{awsLocation}.amazonaws.com/streams/{streamId}.mp4
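Putting the steps together, the relevant lines in red5-web.properties would look something like this (the bucket name and region here are placeholders; use your own):

```properties
# Forward playback requests for these extensions to the S3 bucket
settings.httpforwarding.extension=mp4,png
# Base URL of the bucket - no leading or trailing whitespace around the value
settings.httpforwarding.baseURL=https://my-recordings.s3.eu-west-1.amazonaws.com
```

Restart the application (or the server) after editing the file so the settings take effect.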

Related

Nginx Reversed Proxy Requesting Files Above Proxied Folder

I am using NGINX Proxy Manager with a Custom Location "/setup" which redirects to a reolink IP camera on ip address 192.168.1.50.
The problem is when I open a browser and connect to the target hostname i.e.
http://somedomain.com/setup
What happens is that the reolink camera host attempts to serve JavaScript, CSS, and other files from one level above /setup, i.e. the root of the host, and of course these files don't exist there.
For example (there are lots of files, this is just one), the host is trying to get its CSS files from here:
https://somedomain.com/css/glDatePicker.default.css?timeVersion=1603795049091
When in fact it should be getting them from here: https://somedomain.com/setup/css/glDatePicker.default.css?timeVersion=1603795049091
Any suggestion on how this can be resolved?
One way you could resolve this is by bundling everything into one HTML file. You can use a tool like webpack to do this, though webpack certainly isn't the only option. There is another SO question about how to do this with webpack. Honestly, just googling "bundle html, css, and js into one file" will bring up some good results too.
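As a sketch of that bundling approach, a webpack config along these lines can inline the emitted JavaScript into the HTML output (the plugin names are real npm packages, but the file layout is an assumption):

```javascript
// Sketch only: assumes `npm install html-webpack-plugin html-inline-script-webpack-plugin`
// plus an entry file at ./src/index.js and a template at ./src/index.html.
const HtmlWebpackPlugin = require('html-webpack-plugin');
const HtmlInlineScriptPlugin = require('html-inline-script-webpack-plugin');

module.exports = {
  mode: 'production',
  entry: './src/index.js',
  plugins: [
    new HtmlWebpackPlugin({ template: './src/index.html' }),
    // Inlines the generated bundle into <script> tags in the emitted HTML
    new HtmlInlineScriptPlugin(),
  ],
};
```

To get the CSS into the same file as well, you would additionally import the stylesheets from your JS entry and use style-loader, so the styles are injected by the inlined script.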

How can I let user select the location for downloading an Export to Excel file in asp.net webforms?

I'm exporting an Excel file that's created dynamically at run time from DataTables in an aspx.cs file on the server side, using the ClosedXML lib. I want to let the user select the download location on the client side; currently the file just goes to Downloads.
You unfortunately cannot do this. This is also why you can never select a local file or location from the server side.
So, the user's local file system is 100% off limits.
And the reason is quite simple. If you come to my web site to look at some cute cat picture? Well, while you are doing that, do you think it would be ok if my web code also starts looking around at your files? Hmm, maybe a file called banking? Maybe a file called passwords? Hmm, how about I steal all your email files? How about I look for an Excel sheet called passwords?
So, when it comes to poking around, looking at, and deciding things like file locations? You cannot GET ANY information from the server side, nor can you even find and define which file to pick for uploading, and the SAME applies to downloading of files. If I could pick a location, then gee, why don't I start overwriting some of your system files - including some that would give me remote access to your computer, right?
So, things like which folder, which file, even the computer name, etc.? These things are 100% hidden, off limits, and simply not allowed. Now, it would be possible for someone to come out with a new web browser that allowed local file rights and access. But then again, no one in their right mind would ever use such a browser - the security hole would be too large. As a result, for reasons of security, such information, and even simple knowledge of the local file system, is not allowed, nor even exposed to the web server.
But then again, the user might be on an iPad or an Android phone, and their file systems and how their folders work are not even the same as, say, a Windows desktop computer anyway.
However, you can see from the above that your ability, or even your options, to mess with or choose local file locations is not allowed for reasons of security.
So, if your web site provides a file, or even streams down a file, it will go into the download folder as per the user's browser settings - you unfortunately can't change this - it works that way due to security concerns.
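The one thing the server can control is the suggested file name, via the Content-Disposition header. A minimal code-behind sketch with ClosedXML (names like `btnExport_Click` and `myDataTable` are illustrative, not from the question):

```csharp
// Sketch: stream a dynamically built workbook down to the client.
// The browser decides where the file lands; the server only suggests a name.
using ClosedXML.Excel;
using System.IO;

protected void btnExport_Click(object sender, System.EventArgs e)
{
    using (var wb = new XLWorkbook())
    {
        wb.Worksheets.Add(myDataTable, "Report"); // myDataTable: your DataTable
        using (var ms = new MemoryStream())
        {
            wb.SaveAs(ms);
            Response.Clear();
            Response.ContentType =
                "application/vnd.openxmlformats-officedocument.spreadsheetml.sheet";
            // Suggested file name only; the save location is up to the browser
            Response.AddHeader("Content-Disposition", "attachment; filename=Report.xlsx");
            Response.BinaryWrite(ms.ToArray());
            Response.End();
        }
    }
}
```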

Rooted Path and FileUpload Control

I know it's been asked and I have read the posts and Googled this all day. Still nowhere near something that works. Using an .aspx page, I need to upload a .pdf file to a specific website. I'm doing development using VS2017 and VB.Net. The app will run on different websites. It needs to upload client files to a specific different website and path. Also, the file name of the uploaded file will not be the same as the local source file. Creating the new name is no problem.
Let's say a local file must be uploaded to a website at https://www.appfileserver.co.za/pdfdocs, but I'm on https://www.myownsite.com. So, when using FileUpload1.SaveAs(rootedpath) the path that goes in there must be the rooted path to the target. What would the rooted path look like for the example I provided?
FYI, I know the IP addresses, http paths and anything else I need to know because I control those sites. It would be great to do an FTP upload. I have done this many times from desktop apps. Unfortunately I'd need the full path to the local file. It seems there is no way a web page is allowed to get that full path, so FTP upload is out - or is there a way?
After battling for two days trying to FTP upload from website to website (which is not possible because server firewalls block this), I finally solved it. The solution was a simple one: I deployed the upload .aspx file on the target server, then embedded it in an iframe in the client apps. The files are then uploaded once, straight to the right place. Simple and 100% effective. Hopefully somebody sees this and understands it - so as to avoid the troubles I had.
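The embedding described above might look something like this on the client-side pages (`upload.aspx` is a placeholder name for the page deployed on the target server):

```html
<!-- The upload page runs on the target server, so FileUpload1.SaveAs()
     there writes directly into the destination site's own file system. -->
<iframe src="https://www.appfileserver.co.za/pdfdocs/upload.aspx"
        width="600" height="200"></iframe>
```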

Is there an open source solution to setup my own "CDN"?

I currently have 1 dedicated server with 5TB of static content (mainly videos). It was holding up nicely with a small number of users, but it seems like it can't keep up anymore.
I know a CDN is supposed to be in multiple locations, but all I want is to set up more servers that sync files from the origin server and serve those files alongside it.

Make files public from Google Compute Engine

I'm running RStudio server on an instance of Google Compute Engine. My RScript creates a map file that I would like to include in a public web site.
The file gets created OK.
Separately, I've also created a bucket and can upload images to it, viewing them from a web browser with a URL like this: https://storage.googleapis.com/...
Still, I'm confused as to how to make the image created by the R script viewable by a browser. Does the image have to find its way over to a bucket? Or is it viewable where it is somehow?
There are countless possible solutions, depending on what you want to implement and how much time you want to spend on it (and on whether you are the only one accessing the files, and whether you can share them or they are sensitive), so I will give you some hints:
The easiest one is to upload the file to a Google Cloud Storage bucket; you can then control who can access that link (a single user, a domain, or everyone), and it can be accessed from a browser with a link like the following:
https://storage.googleapis.com/bucket_name/folder1/folder2/file_name
There is no graphical interface; you will need to know the address to download the file (in the end, it is enough to know the name). You will need to create a small script that, every time an image is ready, uploads it to the bucket and makes it publicly available. Or you can decide to make the bucket itself public.
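As a sketch, such a script could use the `gsutil` CLI that ships with the Google Cloud SDK (the bucket and file names are placeholders):

```shell
#!/bin/sh
# Upload the file generated by the R script and make it publicly readable
gsutil cp /home/rstudio/output/map.png gs://my-bucket/maps/map.png
gsutil acl ch -u AllUsers:R gs://my-bucket/maps/map.png
# The file is then viewable at:
# https://storage.googleapis.com/my-bucket/maps/map.png
```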
The second possible solution is to do the same, but also create a really simple HTML page: basically a list of links to the files in the bucket, where each time you upload a file to the bucket you update the HTML file. At least you would solve the issue of knowing the names, and you can navigate the files a bit.
<html><body>
<a href="https://storage.googleapis.com/bucket_name/folder1/folder2/file_name">This is a link</a>
</body></html>
If you need to expose the resources to more people, or you would like something graphically nicer, you will have to spend more time and build a decent frontend. You can follow thousands of different approaches.
You really have thousands of possibilities.
P.S.
Documentation regarding uploading a file to bucket.
Documentation regarding managing access to file stored.
Notice that, depending on the extension of the file you want to share, the browser behaves differently: a .txt or a .jpg is shown, while an .exe is downloaded.
