Hi, I am new to AMFPHP. I am creating a Flex audio player.
Whenever I play a song in my player, the song's URL is visible through the Firebug add-on.
How can I encrypt and decrypt that URL using AMFPHP or PHP?
Some Flash audio players do this job using AMFPHP.
You can't. Firebug's Net tab sees all HTTP[S] net traffic. If you want to stream a song to the browser without an HTTP URL being visible in Firebug, you would have to use a different protocol to HTTP — typically RTMP.
The way some sites protect HTTP streams is to use a one-time URL, so that the player generates an authentication token (typically using crypto hashing) that can only be used to download the stream once; the stream is served with a Cache-Control: no-cache header to stop the browser storing it on disc and making it available to the user for download from the Net tab. Defeating caching of course means that you'll be serving a lot more data unnecessarily. And it's still pretty easy to circumvent.
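As a rough illustration of that token idea, here is a minimal sketch of an expiring signed URL (shown in Java; the same HMAC approach maps directly onto PHP's hash_hmac for an AMFPHP service). The endpoint, parameter names, and secret are made up for the example, and a strictly one-time URL would additionally need the server to remember which tokens have already been used:

    import java.nio.charset.StandardCharsets;
    import java.security.MessageDigest;
    import java.util.HexFormat;
    import javax.crypto.Mac;
    import javax.crypto.spec.SecretKeySpec;

    public class SignedStreamUrl {
        // Server-side secret; placeholder value for the example.
        private static final byte[] SECRET = "change-me".getBytes(StandardCharsets.UTF_8);

        // HMAC of "trackId|expiry": the token is only valid for this track until the expiry time.
        static String token(String trackId, long expiresEpochSeconds) throws Exception {
            Mac mac = Mac.getInstance("HmacSHA256");
            mac.init(new SecretKeySpec(SECRET, "HmacSHA256"));
            byte[] sig = mac.doFinal((trackId + "|" + expiresEpochSeconds).getBytes(StandardCharsets.UTF_8));
            return HexFormat.of().formatHex(sig); // Java 17+; any hex encoding works
        }

        // URL handed to the player, e.g. /stream?track=42&exp=1700000000&token=ab12...
        public static String signedUrl(String trackId, long expiresEpochSeconds) throws Exception {
            return "/stream?track=" + trackId + "&exp=" + expiresEpochSeconds
                    + "&token=" + token(trackId, expiresEpochSeconds);
        }

        // The streaming endpoint recomputes the HMAC, rejects expired or tampered requests,
        // and serves the audio with a Cache-Control: no-cache header.
        public static boolean verify(String trackId, long exp, String presentedToken) throws Exception {
            return exp > System.currentTimeMillis() / 1000
                    && MessageDigest.isEqual(token(trackId, exp).getBytes(StandardCharsets.UTF_8),
                                             presentedToken.getBytes(StandardCharsets.UTF_8));
        }
    }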
Don't imagine you can solve the Copy Protection Problem. Even “protected” RTMPE is very much downloadable.
I've been trying to figure this out for hours now. Consulting the official documentation, it says I need to make a POST request to https://www.googleapis.com/upload/youtube/v3/videos with a Content-Type header set to video/* or application/octet-stream (I've used the latter). It turns out that if I just POST a buffer of a video file to that URL, it works. But the documentation also says I can specify a whole bunch of options about the video (title, description, tags, etc.). However, it says to attach that information to the request body! I'm confused about how I'm supposed to send both the video bytes and the options in the same request. Maybe it's not supposed to be the same request, but they don't mention anything about using multiple.
Uploading videos using the YouTube API is done using a protocol that Google calls the "Resumable Uploads Protocol". Google uses this protocol across their APIs (e.g. Drive, YouTube), and it is recommended in the following scenarios:
Uploading large files
Unreliable network connections
The full details of how to use "Resumable Uploads Protocol" with the Youtube API can be found at https://developers.google.com/youtube/v3/guides/using_resumable_upload_protocol.
The following is a simplified set of steps:
Create a resumable upload session by sending a POST request to the insert API endpoint.
Read the resumable session URI from the Location header of the response to the above request.
Upload the video by sending a PUT request with the binary video data as the body to the resumable session URI.
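A minimal sketch of those three steps with plain HttpURLConnection might look like the following. The access token, file name, and metadata are placeholders, and a real client should also handle interrupted transfers by asking the session URI how many bytes arrived and resuming from there:

    import java.io.OutputStream;
    import java.net.HttpURLConnection;
    import java.net.URL;
    import java.nio.charset.StandardCharsets;
    import java.nio.file.Files;
    import java.nio.file.Path;

    public class ResumableUploadSketch {
        public static void main(String[] args) throws Exception {
            String accessToken = "ya29.EXAMPLE";   // placeholder OAuth 2.0 access token
            byte[] metadata = ("{\"snippet\":{\"title\":\"My video\","
                    + "\"description\":\"Uploaded via the resumable protocol\"}}")
                    .getBytes(StandardCharsets.UTF_8);
            // For brevity the whole file is read into memory; a real upload would stream it.
            byte[] video = Files.readAllBytes(Path.of("video.mp4"));

            // Step 1: POST the JSON metadata (title, description, ...) to start a resumable session.
            URL start = new URL("https://www.googleapis.com/upload/youtube/v3/videos"
                    + "?uploadType=resumable&part=snippet");
            HttpURLConnection post = (HttpURLConnection) start.openConnection();
            post.setRequestMethod("POST");
            post.setDoOutput(true);
            post.setRequestProperty("Authorization", "Bearer " + accessToken);
            post.setRequestProperty("Content-Type", "application/json; charset=UTF-8");
            post.setRequestProperty("X-Upload-Content-Type", "video/*");
            post.setRequestProperty("X-Upload-Content-Length", String.valueOf(video.length));
            try (OutputStream out = post.getOutputStream()) {
                out.write(metadata);
            }

            // Step 2: the resumable session URI comes back in the Location response header.
            String sessionUri = post.getHeaderField("Location");

            // Step 3: PUT the raw video bytes to the session URI.
            HttpURLConnection put = (HttpURLConnection) new URL(sessionUri).openConnection();
            put.setRequestMethod("PUT");
            put.setDoOutput(true);
            put.setFixedLengthStreamingMode(video.length);
            put.setRequestProperty("Content-Type", "video/*");
            try (OutputStream out = put.getOutputStream()) {
                out.write(video);
            }
            System.out.println("Upload finished with HTTP status " + put.getResponseCode());
        }
    }

This also answers the original confusion: the options (title, description, tags) go as JSON in the body of the session-creation POST, and the video bytes go in the separate PUT to the session URI.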
Is it possible to embed an m3u8 stream?
I tried several players and such, but it just doesn't play. Is it something about cross-domain?
Here is what I've got:
https://html.house/f16jzx1y.html
You must configure CORS on the server side. If you do not control the server, then it's not possible to embed the stream on another domain. That is exactly the purpose of CORS.
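Concretely, the server has to send CORS response headers on the playlist and segment responses. If the stream happened to be served by a Java servlet container, a filter along these lines is one way to add them (the mechanism differs per server, but the headers are the same everywhere):

    import java.io.IOException;
    import javax.servlet.Filter;
    import javax.servlet.FilterChain;
    import javax.servlet.ServletException;
    import javax.servlet.ServletRequest;
    import javax.servlet.ServletResponse;
    import javax.servlet.annotation.WebFilter;
    import javax.servlet.http.HttpServletResponse;

    // Adds the CORS headers a browser requires before a player hosted on another
    // domain may fetch the .m3u8 playlist and its media segments.
    // Assumes Servlet 4.0+, where Filter.init()/destroy() have default implementations.
    @WebFilter("/*")
    public class CorsFilter implements Filter {
        @Override
        public void doFilter(ServletRequest req, ServletResponse res, FilterChain chain)
                throws IOException, ServletException {
            HttpServletResponse response = (HttpServletResponse) res;
            // In production, echo back only the origins you actually trust instead of "*".
            response.setHeader("Access-Control-Allow-Origin", "*");
            response.setHeader("Access-Control-Allow-Methods", "GET, HEAD, OPTIONS");
            response.setHeader("Access-Control-Allow-Headers", "Range");
            chain.doFilter(req, res);
        }
    }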
I have a web application in which I generate a download link to an external Google resource. This request usually needs a cookie. Because of the cross-domain policy I currently download the files with curl and then pass them through to the user. Those files are large, so I was looking for a way to download them directly through the client's browser.
Playing around, I've found out that I can append the cookie in question to the HTTP query, but this only works if no other cookies are set! Since it's Google, almost all users will have some cookies set for .google.com. Is there any way (maybe some security feature or bug) I can trigger a download request for that file in the user's browser without sending any cookies along?
I discovered that I can make a request to *.google.com. (notice the . at the end), and then most browsers won't send any cookies set for .google.com. I did a quick test using Browsershots and on my own devices. The hack works in almost all browsers except for Safari (desktop and mobile) and some no-name browsers.
While this works, I've decided not to use that method because the file name will be set to something unusable (no file extension).
I want to upload files with JSF, and I want to be able to resume an upload after a pause (voluntary or not).
The files will be about 500 MB in size.
I'm working with PrimeFaces, which has a neat FileUpload tag, but it doesn't let me pause/resume uploads.
I did some research on this. The most common answer is "use an FTP client". Others were Java applets or Flash.
It should work on the current Firefox/Chrome and IE8.
It's indeed not possible to resume file uploads via the HTML <input type="file"> element. There's simply nothing in the multipart/form-data encoding which supports that. Even more, there's no standard form encoding specification at all which supports that. You'd basically need to invent a custom HTTP request format.
In Java terms, your best bet is to homebrew an applet or webstart application for this which uses Swing JFileChooser to pick files and uses URLConnection to send it via HTTP to the server. In the server side, you'd need a custom Servlet which understands the custom request format and processes the partial upload accordingly.
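As a sketch of what the client half of that could look like, the following slices the file into chunks and POSTs each one to a hypothetical /upload servlet, passing the offset so the server knows where to append and where to resume after an interruption:

    import java.io.OutputStream;
    import java.io.RandomAccessFile;
    import java.net.HttpURLConnection;
    import java.net.URL;
    import java.net.URLEncoder;
    import java.nio.charset.StandardCharsets;

    public class ChunkedUploadClient {
        private static final int CHUNK_SIZE = 1024 * 1024; // 1 MB per request

        // Sends the file in chunks; "offset" tells the (hypothetical) /upload servlet where
        // to append, so an interrupted transfer can resume at the last confirmed byte.
        public static void upload(String fileName, long resumeOffset) throws Exception {
            try (RandomAccessFile file = new RandomAccessFile(fileName, "r")) {
                byte[] buffer = new byte[CHUNK_SIZE];
                long offset = resumeOffset;
                file.seek(offset);
                int read;
                while ((read = file.read(buffer)) > 0) {
                    URL url = new URL("http://example.com/upload"
                            + "?name=" + URLEncoder.encode(fileName, StandardCharsets.UTF_8)
                            + "&offset=" + offset);
                    HttpURLConnection conn = (HttpURLConnection) url.openConnection();
                    conn.setRequestMethod("POST");
                    conn.setDoOutput(true);
                    conn.setRequestProperty("Content-Type", "application/octet-stream");
                    try (OutputStream out = conn.getOutputStream()) {
                        out.write(buffer, 0, read);
                    }
                    if (conn.getResponseCode() != 200) {
                        throw new IllegalStateException("Chunk failed at offset " + offset);
                    }
                    offset += read; // remember this offset to resume after a pause or crash
                }
            }
        }
    }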
There's a 3rd party applet which is capable of this all: JumpLoader. Its homepage is at the moment unfortunately down (you could however try Google Cache). To the point, it sends some specific HTTP request parameters along with the multipart/form-data upload request telling the server side if it's a partial upload and if so at which index it should start, so that the servlet can glue the pieces together.
Then, to integrate this all with JSF, your best bet is to let the applet pass the session ID around as URL path fragment so that the servlet shares the same HTTP session as the JSF application. This way the upload servlet can access session scoped JSF managed beans and/or the JSF application can poll for some servlet-specific variables in the HTTP session.
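And on the server side, a matching servlet (again a sketch with made-up names and paths) could glue the chunks together at the given offset and publish progress into the shared HTTP session for the JSF side to poll:

    import java.io.IOException;
    import java.io.InputStream;
    import java.io.RandomAccessFile;
    import javax.servlet.ServletException;
    import javax.servlet.annotation.WebServlet;
    import javax.servlet.http.HttpServlet;
    import javax.servlet.http.HttpServletRequest;
    import javax.servlet.http.HttpServletResponse;

    // Hypothetical upload endpoint: each POST carries one chunk plus name/offset parameters.
    @WebServlet("/upload")
    public class ResumableUploadServlet extends HttpServlet {
        @Override
        protected void doPost(HttpServletRequest request, HttpServletResponse response)
                throws ServletException, IOException {
            // In real code the file name must be sanitized to prevent path traversal.
            String name = request.getParameter("name");
            long offset = Long.parseLong(request.getParameter("offset"));

            // Glue the chunk into place at the requested offset.
            try (RandomAccessFile file = new RandomAccessFile("/uploads/" + name, "rw");
                 InputStream in = request.getInputStream()) {
                file.seek(offset);
                byte[] buffer = new byte[8192];
                int read;
                while ((read = in.read(buffer)) > 0) {
                    file.write(buffer, 0, read);
                }
                // Because the client appends ;jsessionid=... to the URL, this is the same HTTP
                // session the JSF application sees, so progress can be shared through it.
                request.getSession().setAttribute("uploadedBytes:" + name, file.length());
            }
            response.setStatus(HttpServletResponse.SC_OK);
        }
    }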
I read about "HTTP persistent connection" but somehow I don't seem to understand what persistent means in this context.
Could you elaborate?
It means the server doesn't close the socket once it's finished pushing out the response (so the length of the response has to be otherwise indicated, via headers or chunking), so the client can make other requests on the same socket. A web page often requests several other pieces (images, CSS, scripts, ...) on the same server as the page itself, so reusing the socket for some of those further requests to the same server can reduce overall latency compared to closing the original socket and opening new ones for all the follow-on requests.
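You can see this at the socket level with a small demo like the one below (example.com is just a stand-in, and both requests are written up front only to keep the demo short): it opens one TCP connection, writes two HTTP/1.1 requests into it, and reads both responses back off the same socket.

    import java.io.BufferedReader;
    import java.io.InputStreamReader;
    import java.io.OutputStream;
    import java.net.Socket;
    import java.nio.charset.StandardCharsets;

    // Two HTTP/1.1 requests over one TCP socket: that reuse is what "persistent" means.
    public class KeepAliveDemo {
        public static void main(String[] args) throws Exception {
            try (Socket socket = new Socket("example.com", 80)) {
                OutputStream out = socket.getOutputStream();
                BufferedReader in = new BufferedReader(
                        new InputStreamReader(socket.getInputStream(), StandardCharsets.US_ASCII));

                // HTTP/1.1 connections are persistent by default, so after answering the first
                // request the server keeps the socket open and the second request reuses it.
                String first  = "GET / HTTP/1.1\r\nHost: example.com\r\n\r\n";
                // "Connection: close" on the last request just lets this demo terminate cleanly.
                String second = "GET / HTTP/1.1\r\nHost: example.com\r\nConnection: close\r\n\r\n";
                out.write(first.getBytes(StandardCharsets.US_ASCII));
                out.write(second.getBytes(StandardCharsets.US_ASCII));
                out.flush();

                // Both responses arrive back on the same socket; the Content-Length header
                // (or chunked encoding) is what tells the client where one response ends
                // and the next begins.
                String line;
                while ((line = in.readLine()) != null) {
                    System.out.println(line);
                }
            }
        }
    }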
All the discussion so far has been from the browser's side of things. The browser first requests the actual page, parses it, and finds all the other resources it needs before it can render that page. It then requests these resources and their dependent resources one by one. So maintaining a persistent connection is very efficient here, as the overhead of creating and destroying connections is avoided.
Now from the web server's side of things, a persistent connection would be one that allows it to "push" content to the web browser. HTTP doesn't support this, so there are a few workarounds in JavaScript where the page is basically refreshed after a while.
You can see this trick being used by many web-based email providers, which continuously keep checking in the background for new mail. This gives the impression that when a new mail arrives, the server "pushes" the notification to the web browser. But in fact it's the web browser which keeps checking the server for new mail.
Another point I would like to make is that we don't actually see a page refresh; that's because of another trick which allows only specific parts of the page to be refreshed by the request (hint: AJAX).
I think this is about switching between http and https for the website in the browser. If you have an old https:// setup and are now using http via the .htaccess file, this problem can be created by the Yoast plugin's one-page crawl. Don't worry, it is not an important error. For hackers, though, this is a way to attack your website: if your SSL configuration is left empty, they can attach their own page or domain to your SSL connection.
E.g. your site is http://www.example.com, and when you browse https://www.example.com in the browser, some other link opens under your site's domain.
The solution is to always use the full address for your website: to protect your website against hackers, use SSL and serve the https:// version of your pages.
Then this problem should never be seen on any test site or page.