I have:
public TextAsset ttt;

void OnGUI() {
    GUI.TextArea(new Rect(600, 10, 350, 300), ttt.text, style_text);
}
I already have hosting and a text file at http://host.com/fff.txt
My question is: how can I read this file from the web and put it in the TextArea?
You can use WWW to do that:

public string GetHTML(string uri)
{
    WWW www = new WWW(uri);
    while (!www.isDone)   // spin until the download finishes (note: this blocks the calling code)
        ;
    if (www.error != null)
        return null;
    return www.text;
}
The above is a simple example; for more, see http://docs.unity3d.com/Documentation/ScriptReference/WWW.html
In fact, you can use WWW to get text, textures, and audio from the web or a file path. I'm sure you can handle it after reading that manual :)
TextArea: http://docs.unity3d.com/Documentation/ScriptReference/GUI.TextArea.html
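Note that the busy-wait above blocks everything until the download finishes. A minimal coroutine sketch (my own class and field names, untested against your project) that avoids the freeze and feeds the result into your TextArea might look like this:

using UnityEngine;
using System.Collections;

public class RemoteText : MonoBehaviour
{
    string downloadedText = "";   // stays empty until the request completes

    void Start()
    {
        StartCoroutine(LoadText("http://host.com/fff.txt"));
    }

    IEnumerator LoadText(string uri)
    {
        WWW www = new WWW(uri);
        yield return www;          // Unity keeps running while the download happens
        if (www.error == null)
            downloadedText = www.text;
    }

    void OnGUI()
    {
        GUI.TextArea(new Rect(600, 10, 350, 300), downloadedText);
    }
}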
As you want to display text, i.e. UI, I suggest you leave the built-in Unity GUI alone. Generally, for now, if you want to develop quality games quickly, the Unity GUI system isn't your best choice; it is somewhat of a time sink. Take a look at NGUI (http://www.tasharen.com/?page_id=140); most people use it (or perhaps iGUI, fastGUI, etc.) in shipped products rather than the native one.
With NGUI, you can create a UILabel widget/component, and call UILabel.text = "SomeString".
I'm working on an app that has video streaming functionality. I'm using the Firebase database and Firebase storage. I'm trying to find some documentation on how Firebase Storage handles video files, but can't really find much.
There's a mention in the docs that Firebase Storage works with other Google app services to allow for CDN and video streaming, but all searches seem to lead to a dead end. Any advice?
I think there are several types of video streaming, which could change our answer here:
Live streaming (subscribers are watching as an event happens)
Youtube style (post a video and end users watch at their convenience)
Having built a live streaming Periscope-style app using Firebase Storage and the Firebase Realtime Database, I pretty strongly recommend against it: we uploaded three-second chunks and synced them via the Realtime Database. While it worked (surprisingly well), there was roughly five seconds of latency even over a very good connection, and it also wasn't the most efficient solution (after all, you're uploading and storing all that video, plus there wasn't any transcoding). I recommend using something WebRTC-style, built for video transport, and using the Realtime Database for signaling alongside the stream.
On the other side, it's definitely possible to build a mobile YouTube-style app on Firebase features. The trick here is going to be transcoding the video (using something like Zencoder or Bitmovin, more here: https://cloud.google.com/solutions/media/) to chop your uploaded video into smaller chunks of different resolutions (and different formats; iOS requires HLS for streaming, for instance). Your client can store chunk information in the Realtime Database (chunk name, resolutions available, number of chunks) and can download those chunks from Storage as the video progresses.
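As a rough illustration of that chunk bookkeeping, a client could keep the metadata in the Realtime Database and read it back roughly like this (Android/Java fragment, assuming the usual com.google.firebase.database imports; the node names, fields, and videoId are hypothetical):

// Hypothetical layout: /videos/{videoId}/chunks/{index} = { "name": ..., "resolutions": ... }
DatabaseReference chunksRef = FirebaseDatabase.getInstance()
        .getReference("videos")
        .child(videoId)
        .child("chunks");

chunksRef.addListenerForSingleValueEvent(new ValueEventListener() {
    @Override
    public void onDataChange(DataSnapshot snapshot) {
        for (DataSnapshot chunk : snapshot.getChildren()) {
            String chunkName = chunk.child("name").getValue(String.class);
            // fetch this chunk from Storage shortly before playback reaches it
        }
    }

    @Override
    public void onCancelled(DatabaseError error) {
        // handle the cancelled read
    }
});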
If you want to stream a video from Firebase Storage, this is the best way I found. It will depend on the size of your video file; I'm only requesting 10-30 MB files, so this solution works well for me. Just treat the Firebase URL as a regular URL:
String str = "fire_base_video_URL"; // the video's Firebase Storage download URL
Uri uri = Uri.parse(str);
videoViewLandscape.setVideoURI(uri);
progressBarLandScape.setVisibility(View.VISIBLE);
videoViewLandscape.requestFocus();
videoViewLandscape.start();
If you want to loop the video:
videoViewLandscape.setOnPreparedListener(new MediaPlayer.OnPreparedListener() {
    @Override
    public void onPrepared(MediaPlayer mp) {
        mp.setLooping(true);
    }
});
And if you want to show a progress bar before the video starts do this:
videoViewLandscape.setOnInfoListener(new MediaPlayer.OnInfoListener() {
    @Override
    public boolean onInfo(MediaPlayer mp, int what, int extra) {
        if (what == MediaPlayer.MEDIA_INFO_BUFFERING_END) {
            progressBarLandScape.setVisibility(View.GONE);
            return true;
        } else if (what == MediaPlayer.MEDIA_INFO_BUFFERING_START) {
            progressBarLandScape.setVisibility(View.VISIBLE);
            return true;
        }
        return false;
    }
});
This is not the best way of doing things but it works for me for now until I can find a good video streaming service.
2020: Yes, Firebase Storage video streaming is easy and possible.
All the other answers suggest that you use a protocol like HLS. However, this is only necessary if you develop an app for the Apple App Store that serves videos longer than 10 minutes.
In all other cases, you can simply encode your videos as MP4 and upload them to Firebase. Your clients can then stream the MP4 without a problem. Just make sure that the moov atom is at the beginning of your MP4 file; this allows playback to start immediately, even if the file is not fully loaded.
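If you encode with ffmpeg, moving the moov atom to the front is just a remux with the faststart flag (no re-encode; the file names here are placeholders):

ffmpeg -i input.mp4 -c copy -movflags +faststart output.mp4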
Users can also skip ahead or go back thanks to HTTP range requests, which Firebase Storage supports.
To test it, just upload a video to your Firebase Storage bucket and open it in your browser.
You can host HLS videos on Firebase Cloud Storage. It works pretty well for me.
The trick is to modify the playlist .m3u8 files to contain the storage folder prefix, and the ?alt=media suffix for each file entry in the playlist:
#EXTM3U
#EXT-X-VERSION:3
#EXT-X-TARGETDURATION:3
#EXT-X-MEDIA-SEQUENCE:0
#EXTINF:2.760000,
<folder_name>%2F1_fileSequence_0.ts?alt=media
#EXT-X-ENDLIST
You also don't really have to use server-side transcoding; you can have the client that uploads the video do it, and save considerable costs.
I've written a full tutorial with source code here: https://itnext.io/how-to-make-a-serverless-flutter-video-sharing-app-with-firebase-storage-including-hls-and-411e4fff68fa
This is my exact implementation: it starts playing a video from Firebase Storage as soon as the view opens, dismisses the player view when the video finishes, and then a button lets you replay the video afterwards.
I have a demo link with the key included so you can see that it works. Any questions, hit me up.
You will just have to create an IBAction if you want the button after the video disappears.
// BackMuscles.swift
// Messenger
//
// Created by Zach Smith on 8/12/21.
// Copyright © 2021 spaceMuleFitness. All rights reserved.
//

import UIKit
import AVKit
import AVFoundation

class BackMuscles: UIViewController {

    @IBOutlet weak var playv: UIButton!

    let avPlayerViewController = AVPlayerViewController()
    var avPlayer: AVPlayer?

    override func viewDidLoad() {
        super.viewDidLoad()
        self.view.addBackground()
        let movieUrl: NSURL? = NSURL(string: "https://firebasestorage.googleapis.com/v0/b/messenger-test-d225b.appspot.com/o/test%2FTestVideo.mov?alt=media&token=bd4ccba3-b446-43bc-809e-b1152aa3c2ff")
        if let url = movieUrl {
            self.avPlayer = AVPlayer(url: url as URL)
            self.avPlayerViewController.player = self.avPlayer
        }
        NotificationCenter.default.addObserver(self, selector: #selector(playerDidFinishPlaying), name: NSNotification.Name.AVPlayerItemDidPlayToEndTime, object: avPlayerViewController.player?.currentItem)
        self.present(self.avPlayerViewController, animated: true) { () -> Void in
            self.avPlayerViewController.player?.play()
        }
    }

    @objc func playerDidFinishPlaying(note: NSNotification) {
        self.avPlayerViewController.dismiss(animated: true)
    }

    override func didReceiveMemoryWarning() {
        super.didReceiveMemoryWarning()
    }

    @IBAction func playV(sender: UIButton) {
        let amovieUrl: NSURL? = NSURL(string: "https://firebasestorage.googleapis.com/v0/b/messenger-test-d225b.appspot.com/o/test%2FTestVideo.mov?alt=media&token=bd4ccba3-b446-43bc-809e-b1152aa3c2ff")
        if let aurl = amovieUrl {
            self.avPlayer = AVPlayer(url: aurl as URL)
            self.avPlayerViewController.player = self.avPlayer
            self.present(self.avPlayerViewController, animated: true) { () -> Void in
                self.avPlayerViewController.player?.play()
            }
        }
    }
}
If you want to create a YouTube-like app, you can first compress the video. I recommend using a library to manage the video compression, specifically the one in this link; I've managed to compress a 118 MB video down to 6 MB in under 42 seconds. It also has a great demo app, so just follow the example.
After you get the compressed file, upload it to Storage; in your client app you will then play the video URL using a player like ExoPlayer.
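A minimal upload sketch for that step (Android/Java fragment; compressedUri and fileName are placeholders for your own values):

StorageReference videoRef = FirebaseStorage.getInstance()
        .getReference()
        .child("videos/" + fileName);

videoRef.putFile(compressedUri)   // Uri of the compressed file on disk
        .addOnSuccessListener(taskSnapshot ->
                videoRef.getDownloadUrl().addOnSuccessListener(downloadUrl -> {
                    // save downloadUrl.toString() in your database and hand it
                    // to ExoPlayer (or a VideoView) on the playback side
                }))
        .addOnFailureListener(e -> {
            // surface or log the failed upload
        });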
The video below is pretty good; it uses ExoPlayer to stream instead of MediaPlayer or the videoViewLandscape approach above:
https://www.youtube.com/watch?v=s_D5C5e2Uu0
try {
    BandwidthMeter bandwidthMeter = new DefaultBandwidthMeter();
    TrackSelector trackSelector = new DefaultTrackSelector(new AdaptiveTrackSelection.Factory(bandwidthMeter));
    simpleExoPlayer = ExoPlayerFactory.newSimpleInstance(this, trackSelector);

    String vid = "your_firebase_video_url";   // the file's Firebase Storage download URL, not a YouTube page link
    Uri uri = Uri.parse(vid);

    DefaultHttpDataSourceFactory dataSourceFactory = new DefaultHttpDataSourceFactory("exoplayer_video");
    ExtractorsFactory extractorsFactory = new DefaultExtractorsFactory();
    MediaSource mediaSource = new ExtractorMediaSource(uri, dataSourceFactory, extractorsFactory, null, null);

    videoView.setPlayer(simpleExoPlayer);
    simpleExoPlayer.prepare(mediaSource);
    simpleExoPlayer.setPlayWhenReady(true);
} catch (Exception e) {
    // at minimum, log the exception rather than swallowing it silently
}
Below are the dependencies you will need in your app-level build.gradle file:
implementation 'com.google.android.exoplayer:exoplayer:r2.4.0'
implementation 'com.google.android.exoplayer:exoplayer-core:r2.4.0'
implementation 'com.google.android.exoplayer:exoplayer-dash:r2.4.0'
implementation 'com.google.android.exoplayer:exoplayer-hls:r2.4.0'
implementation 'com.google.android.exoplayer:exoplayer-smoothstreaming:r2.4.0'
implementation 'com.google.android.exoplayer:exoplayer-ui:r2.4.0'
To use Firebase Storage to play videos, all you need is the full URL to the video. You then pass this URL into a VideoView or ExoPlayer; no full download is needed. The VideoView will stream the content YouTube-style.
First of all, you need to understand Firebase security rules; for development you can simply open them up (a development-only rules sketch follows at the end of this answer).
Then create a Storage reference for the videos in your app.
Then select the video you want to upload, create an UploadTask, and upload the video file to Storage.
For retrieving and playing the video, use the ExoPlayer library on Android.
You can visit here for more.
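For reference, a wide-open, development-only set of Storage rules (do not ship this; lock it down before release) looks something like:

rules_version = '2';
service firebase.storage {
  match /b/{bucket}/o {
    match /{allPaths=**} {
      // development only: anyone can read and write
      allow read, write: if true;
    }
  }
}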
Can anyone help me out with this problem? I am struggling to transfer an SQLite database to watchOS 2. If you have any example, please share it with me, or give me your suggestions.
I have not tried this with SQLite files, but it works with audio files.
What I did is:
Turn ON App Groups both on the watch extension and the main project.
Place the file in the App Group container.

// I create the file there, so my code is
NSURL *urlOut = [[NSFileManager defaultManager] containerURLForSecurityApplicationGroupIdentifier:APP_CONECTIVITY_ID];
urlOut = [urlOut URLByAppendingPathComponent:@"myfile.wav"];

Send the link via WCSession sendMessage; it needs to go across as a string, so send urlOut.absoluteString.
Profit?
You should be able to use something like sendMessageData (reading the file as data beforehand), and there is also WCSessionFileTransfer, but I haven't had a chance to try that yet.
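If you do go the file-transfer route, a minimal sketch using the standard WatchConnectivity API (the file name is just an example) would be:

// On the sending side, once the WCSession is activated:
NSURL *dbURL = [[NSFileManager defaultManager] containerURLForSecurityApplicationGroupIdentifier:APP_CONECTIVITY_ID];
dbURL = [dbURL URLByAppendingPathComponent:@"mydb.sqlite"];

// Queues the file for background transfer; the counterpart receives it
// in session:didReceiveFile: on its WCSessionDelegate.
WCSessionFileTransfer *transfer = [[WCSession defaultSession] transferFile:dbURL metadata:nil];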
First off, this question has been covered a few times (I've done my research), and, for example, on the right side of the SO webpage is a list of related items... I have been through them all (or as many as I could find).
When I publish my pre-compiled .NET web application, it is very slow to load the first time.
I've read up on this, it's the JIT which I understand (sort of).
The problem is, after the home page loads (up to 20 seconds), many other pages load very fast.
It would appear that the only reason they load quickly is that the resources have already been loaded (or that they share the same compiled DLLs); however, some pages still take a long time.
This indicates that maybe the JIT needs to compile different pages in different ways? If so, and using a contact form as an example (where the Thank You page needs to be compiled by the JIT and first time is slow), the user may hit the send button multiple times whilst waiting for the page to be shown.
After I load all these pages which use different models or different shared HTML content, the site loads quickly as expected. I assume this issue is a common problem?
Please note that I'm using .NET 4.0, but there is no database, no XML files, etc. The only IO is when an email fails to send and the error is written to a log.
So, assuming my understanding is correct, what is the approach to not have to manually go through the website and load every page?
If the above is a little too broad, then can this be resolved in the settings/configuration in Visual Studio (2012) or the web.config file (excluding adding compilation debug=false)?
In this case, there were two problems.
First, as per rene's comments, review http://msdn.microsoft.com/en-us/library/ms972959.aspx. The helpful part was adding the following code to the global.asax file:
const string sourceName = ".NET Runtime";
const string serverName = ".";
const string logName = "Application";
const string uriFormat = "\r\n\r\nURI: {0}\r\n\r\n";
const string exceptionFormat = "{0}: \"{1}\"\r\n{2}\r\n\r\n";

void Application_Error(Object sender, EventArgs ea)
{
    StringBuilder message = new StringBuilder();

    if (Request != null)
    {
        message.AppendFormat(uriFormat, Request.Path);
    }

    if (Server != null)
    {
        Exception e;
        for (e = Server.GetLastError(); e != null; e = e.InnerException)
        {
            message.AppendFormat(exceptionFormat,
                                 e.GetType().Name,
                                 e.Message,
                                 e.StackTrace);
        }
    }

    if (!EventLog.SourceExists(sourceName))
    {
        EventLog.CreateEventSource(sourceName, logName);
    }

    EventLog Log = new EventLog(logName, serverName, sourceName);
    Log.WriteEntry(message.ToString(), EventLogEntryType.Error);

    //Server.ClearError(); // uncomment this to cancel the error
}
Second, the server was maxing out while sending the email! My code was fine, but Task Scheduler showed it was hitting 100% memory...
The solution was to monitor the errors surfaced by the first point and fix them, and then find out why the server was being throttled when sending an email!
I am having the hardest time figuring this problem out. I have a Silverlight 4 application that loads audio and video files from URLs. The URLs are the same domain as the application is hosted on and it works great for video.
The URLs are actually ASP.NET MVC controllers that are responsible for reading the file from a shared location on the server and serving back a file stream. The URLs look something like this:
http://localhost:31479/CourseMedia?path=\omnisandbox1\ILMSShare2\Demo-Fire+Behavior\media\Disclaim.wma&encrypted=False&id=00000000-0000-0000-0000-000000000000
If I put the URL directly into the browser, the file loads and plays in Windows Media Player just fine, and if I use a separate test Silverlight project to load the URL it also works, but for the life of me I cannot get it to work properly in my main project.
This is the routine I use to actually do the source setting:
protected void SetPlayerURL(MediaElement player, string url)
{
    if (player != null && url.Length > 0)
    {
        player.ClearValue(MediaElement.SourceProperty);
        player.Source = new Uri(this.Packet.GetMediaUrl(url, false, Guid.Empty));
    }
}
and the GetMediaURL function simply builds the URL format seen above:
public string GetMediaUrl(
    string path,
    bool encrypted,
    Guid key)
{
    StringBuilder builder = new StringBuilder();
    builder.AppendFormat("http://{0}/CourseMedia?path={1}&encrypted={2}&id={3}",
        this.Host,
        System.Windows.Browser.HttpUtility.UrlEncode(path),
        encrypted,
        key);
    return builder.ToString();
}
The request to the controller is never made for the media when it is audio. That seems odd to me, as this exact code works fine for video. The MediaElement state never leaves "Closed", and the CurrentStateChanged, MediaOpened, and MediaFailed events are never triggered.
I am at a loss!
Try setting the MediaElement's ScrubbingEnabled property to false; there were some problems with Framework version 3.5 and audio, and the workaround was setting that to false. Might be worth trying.
Also try capturing the BufferingStarted, BufferingEnded, and MediaEnded events along with your MediaFailed and MediaOpened events; I'm curious whether it is a buffering issue.
For an Asp.Net software as a service application, I want to do account based subdomains like Basecamp and the rest of the 37Signals products have. E.g. acme.myapp.com will load the account for that customer and pull back only their information.
This is easy to do in Ruby on Rails, but how would you handle this functionality in ASP.NET MVC and be able to scale to possibly hundreds of accounts?
Maarten Balliauw's blog covered one method extending RouteBase. I think I've also seen a custom route handler used for this.
Also, this StackOverflow question covered the same ground using a simpler approach.
I definitely recommend factoring this code out into the routing side rather than embedding the logic to get domain information in your controllers.
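For illustration, a minimal routing-side sketch (my own class and route-value names, not Maarten's implementation) is a route constraint that only matches account subdomains and exposes the value to the rest of the pipeline:

using System.Web;
using System.Web.Routing;

// Matches only requests made on a non-"www" subdomain and copies that
// subdomain into the route values so controllers never parse the Host header.
public class SubdomainConstraint : IRouteConstraint
{
    public bool Match(HttpContextBase httpContext, Route route, string parameterName,
                      RouteValueDictionary values, RouteDirection routeDirection)
    {
        string[] hostParts = httpContext.Request.Url.Host.Split('.');   // acme.myapp.com -> [acme, myapp, com]
        if (hostParts.Length < 3 || hostParts[0] == "www")
            return false;

        values["account"] = hostParts[0];
        return true;
    }
}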
We use:
public static string GetSubDomain()
{
    string subDomain = String.Empty;

    if (HttpContext.Current.Request.Url.HostNameType == UriHostNameType.Dns)
    {
        // Strip the final two host segments (domain + TLD), leaving the subdomain,
        // e.g. "acme.myapp.com" -> "acme"; hosts with fewer than three segments yield "".
        subDomain = Regex.Replace(HttpContext.Current.Request.Url.Host, "((.*)(\\..*){2})|(.*)", "$2");
    }

    if (subDomain.Length == 0)
    {
        subDomain = "www";
    }

    return subDomain.Trim().ToLower();
}
It's not very different from RoR. Just get the HTTP request, take the Host value, split it at each dot, and take the first part to get the subdomain.

string subdomain = requestContext.HttpContext
    .Request.Headers["Host"].Split('.')[0];

Then just resolve the subdomain to the company's account.
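A sketch of that resolution step (the repository and model types here are hypothetical, just to show the shape):

using System.Web;
using System.Web.Mvc;

public class HomeController : Controller
{
    private readonly IAccountRepository _accounts;   // hypothetical data-access interface

    public HomeController(IAccountRepository accounts)
    {
        _accounts = accounts;
    }

    public ActionResult Index()
    {
        string subdomain = Request.Headers["Host"].Split('.')[0];   // e.g. "acme"
        Account account = _accounts.GetBySubdomain(subdomain);      // hypothetical lookup
        if (account == null)
            throw new HttpException(404, "Unknown account");

        return View(account);   // pull back only this customer's information
    }
}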