Xamarin Forms NAudio - xamarin.forms

I need some advice.
I found this ShazamApi project that uses NAudio. I need this project to work in Xamarin.Forms (Android), but I don't know what to use instead of NAudio to make it work on Android. Would someone know how to do it? Thanks a lot.
Specifically, I have a problem with this code.
https://github.com/AlekseyMartynov/shazam-for-real/blob/master/Program.cs
https://github.com/AlekseyMartynov/shazam-for-real/blob/master/Analysis.cs
public void ReadChunk(ISampleProvider sampleProvider)
{
    if (sampleProvider.Read(WindowRing, WindowRingPos, CHUNK_SIZE) != CHUNK_SIZE)
        throw new Exception();

    ProcessedSamples += CHUNK_SIZE;

    if (ProcessedSamples >= WINDOW_SIZE)
        AddStripe();
}
// Capture system audio with NAudio (Windows-only WASAPI loopback)
WasapiLoopbackCapture capture = new WasapiLoopbackCapture();
var captureBuf = new BufferedWaveProvider(capture.WaveFormat);

capture.DataAvailable += (s, e) =>
{
    captureBuf.AddSamples(e.Buffer, 0, e.BytesRecorded);
};

capture.StartRecording();

// Resample the captured audio to the mono 16-bit format the analysis expects
using (var resampler = new MediaFoundationResampler(captureBuf, new WaveFormat(Analysis.SAMPLE_RATE, 16, 1)))
{
    var sampleProvider = resampler.ToSampleProvider();
    var retryMs = 3000;
    var tagId = Guid.NewGuid().ToString();

    while (true)
    {
        // Wait until at least one second of audio has been buffered
        while (captureBuf.BufferedDuration.TotalSeconds < 1)
            Thread.Sleep(100);

        analysis.ReadChunk(sampleProvider);

        if (analysis.StripeCount > 2 * LandmarkFinder.RADIUS_TIME)
            finder.Find(analysis.StripeCount - LandmarkFinder.RADIUS_TIME - 1);

        if (analysis.ProcessedMs >= retryMs)
        {
            var sigBytes = Sig.Write(Analysis.SAMPLE_RATE, analysis.ProcessedSamples, finder);
            var result = ShazamApi.SendRequest(tagId, analysis.ProcessedMs, sigBytes).GetAwaiter().GetResult();

            if (result.Success)
                return result;

            retryMs = result.RetryMs;
            if (retryMs == 0)
                return result;
        }
    }
}

but I don't know what to use instead of NAudio to make it work on
Android.
In Xamarin.Forms, you can use the Xamarin Community Toolkit MediaElement to achieve this function.
MediaElement is a view for playing video and audio. Media that's supported by the underlying platform can be played from the following sources:
The web, using a URI (HTTP or HTTPS).
A resource embedded in the platform application, using the ms-appx:/// URI scheme.
Files that come from the app's local and temporary data folders, using the ms-appdata:/// URI scheme.
The device's library.
There is also an official sample here: Xamarin Community Toolkit - MediaElement.
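For example, a minimal sketch of playing a remote file from code (the namespaces come from the Xamarin.CommunityToolkit NuGet package, and the URL is only a placeholder):

using System;
using Xamarin.CommunityToolkit.Core;
using Xamarin.CommunityToolkit.UI.Views;
using Xamarin.Forms;

public class PlayerPage : ContentPage
{
    public PlayerPage()
    {
        var media = new MediaElement
        {
            // FromUri has FromFile/FromResource counterparts for local media
            Source = MediaSource.FromUri(new Uri("https://example.com/audio.mp3")),
            AutoPlay = true,
            ShowsPlaybackControls = true
        };

        Content = media;
    }
}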
Note:
On Android, you can also try Android Audio. The Android OS provides extensive support for multimedia, encompassing both audio and video.
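For the capture side of the code above, the Android Audio APIs expose Android.Media.AudioRecord, since NAudio's WasapiLoopbackCapture is Windows-only. A minimal sketch of microphone capture, assuming a 16 kHz mono PCM format (adjust to whatever Analysis.SAMPLE_RATE the project expects) and that the RECORD_AUDIO permission has been granted:

using Android.Media;

public class MicCapture
{
    const int SampleRate = 16000; // assumption: should match Analysis.SAMPLE_RATE

    public void Capture()
    {
        var minBufferSize = AudioRecord.GetMinBufferSize(
            SampleRate, ChannelIn.Mono, Encoding.Pcm16bit);

        var recorder = new AudioRecord(
            AudioSource.Mic, SampleRate, ChannelIn.Mono,
            Encoding.Pcm16bit, minBufferSize * 4);

        recorder.StartRecording();

        var buffer = new short[minBufferSize];
        for (var i = 0; i < 100; i++) // how long to capture is up to the caller
        {
            var read = recorder.Read(buffer, 0, buffer.Length);
            // Convert the 16-bit samples to floats here and feed them into the
            // project's Analysis / LandmarkFinder pipeline in CHUNK_SIZE blocks.
        }

        recorder.Stop();
        recorder.Release();
    }
}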

Related

Xamarin Android use SetSound for Notification Channel to play custom sound on notification

I have been wasting at least a day trying to make this work. I am trying to play an mp3 file that I placed in Resources/raw once a notification is received. I do not know exactly how to get the Uri. My questions please are:
1. To play a custom file, do you have to place it in Resources/raw, or can it also be in Assets/Sounds under the Xamarin Android project?
2. How do I get the Uri correctly based on where the mp3 file resides?
This is my code:
private void createNotificationChannel()
{
    var channelName = GetString(Resource.String.noti_chan_urgent);
    var channelDescription = GetString(Resource.String.noti_chan_urgent_description);

    // set the vibration pattern for the channel
    long[] vibrationPattern = { 100, 200, 300, 400, 500, 400, 300, 200, 400 };

    // Creating an Audio Attribute
    var alarmAttributes = new AudioAttributes.Builder().SetUsage(AudioUsageKind.Alarm).Build();

    // Create the uri for the alarm file
    var alarmUri = Android.Net.Uri.Parse("MyApp.Android/Resources/raw/alarm.mp3"); // this must be wrong because it's not working

    // create chan1 which is the urgent notifications channel
    var chan1 = new NotificationChannel(PRIMARY_CHANNEL_ID, channelName, NotificationImportance.High)
    {
        Description = channelDescription
    };

    // set the channel properties
    chan1.EnableLights(true);
    chan1.LightColor = Color.Red;
    chan1.EnableVibration(true);
    chan1.SetVibrationPattern(vibrationPattern);
    chan1.SetSound(alarmUri, alarmAttributes);
    chan1.SetBypassDnd(true);
    chan1.LockscreenVisibility = NotificationVisibility.Public;

    var manager = (NotificationManager)GetSystemService(NotificationService);
    manager.CreateNotificationChannel(chan1);
}
I figured it out, and I hope this will help someone better than getting a downvote for a question. This is how you do it:
(Note: Make sure you put your mp3 file in your Xamarin Android project under Resources/raw/soundFile.mp3 and build the file as an AndroidResource.)
Then create the Uri like this:
Android.Net.Uri alarmUri = Android.Net.Uri.Parse($"{ContentResolver.SchemeAndroidResource}://{Context.PackageName}/{Resource.Raw.soundFile}");
Create the Alarm Attributes like this:
var alarmAttributes = new AudioAttributes.Builder()
    .SetContentType(AudioContentType.Sonification)
    .SetUsage(AudioUsageKind.Notification)
    .Build();
And finally, SetSound on the channel itself ONLY from Android Oreo onwards (not on the notification; create the channel at application launch):
chan1.SetSound(alarmUri, alarmAttributes);
uri = Android.Net.Uri.Parse(
    "android.resource://" + Application.Context.PackageName + "/raw/sound2");
This was the only change I had to make to Fredsomofspeech's answer.
Android 9, Visual Studio 2019, Xamarin.Forms mobile (iOS/Android). My sound2.mp3 was a file Android could not play, so make sure you download an mp3 file for testing that is verified to work.
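To tie it together, a sketch of posting a notification to that channel. The channel id value, icon, text, and notification id below are placeholders, NotificationCompat comes from the AndroidX.Core NuGet package, and on Oreo and later the sound is taken from the channel rather than the builder:

using Android.Content;
using Android.OS;
using AndroidX.Core.App;

public class UrgentNotifier
{
    // Assumption: same id used when the channel (chan1) was created above.
    const string PRIMARY_CHANNEL_ID = "urgent_channel";

    public void Show(Context context)
    {
        var alarmUri = Android.Net.Uri.Parse(
            $"{ContentResolver.SchemeAndroidResource}://{context.PackageName}/{Resource.Raw.soundFile}");

        var builder = new NotificationCompat.Builder(context, PRIMARY_CHANNEL_ID)
            .SetSmallIcon(Resource.Drawable.ic_notification) // placeholder icon
            .SetContentTitle("Urgent")
            .SetContentText("Something needs your attention");

        if (Build.VERSION.SdkInt < BuildVersionCodes.O)
        {
            // Pre-Oreo devices ignore channels, so the sound goes on the notification itself.
            builder.SetSound(alarmUri);
        }

        NotificationManagerCompat.From(context).Notify(1001, builder.Build());
    }
}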

Detect default android browser by Wurfl

I want to detect the default browser of Android inside my MVC controller.
I have mobile detection right now:
public ActionResult Index()
{
    if (WurflHelper.Instance.IsMobile(Request.UserAgent))
    {
        return View("MobileView");
    }
    else
    {
        return View();
    }
}
How can I detect the Android default browser (not Chrome)? I need the UserAgent parameter matches for this detection.
Thanks for advice.
-------------- EDIT --------------
I found this solution for the client (JavaScript): How to detect the stock Android browser. I need the same solution for ASP.NET MVC.
I noticed that you are already using WURFL to detect the device. You could simply use the mobile_browser or advertised_mobile_browser capability to determine the browser of the device.
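The WurflHelper in the question looks like a custom wrapper, so the GetCapability call below is a hypothetical addition to it that forwards the lookup to the WURFL device, and the "Android Webkit" value is an assumption about what the capability reports for the stock browser:

public ActionResult Index()
{
    if (WurflHelper.Instance.IsMobile(Request.UserAgent))
    {
        // "mobile_browser" (or "advertised_mobile_browser") names the browser
        var browser = WurflHelper.Instance.GetCapability(Request.UserAgent, "mobile_browser");

        if (browser == "Android Webkit") // assumed value for the stock Android browser
        {
            return View("StockAndroidBrowserView"); // placeholder view name
        }

        return View("MobileView");
    }

    return View();
}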
You can use Request.Browser to get the browser info. Below is an example:
var moo = Request.Browser;
var Name = moo.Browser;
var Version = moo.Version;
var Major_Version = moo.MajorVersion;
var Minor_Version = moo.MinorVersion;
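Alternatively, the client-side check from the linked question edit can be translated into a server-side User-Agent test. This is only a heuristic sketch: the stock Android browser sends an Android token plus a Version/x.y token but no Chrome token:

using System.Text.RegularExpressions;

public static class StockBrowserDetector
{
    // Heuristic from the linked client-side answer: stock Android browser
    // user agents contain "Android" and "Version/x.y" but not "Chrome".
    public static bool IsStockAndroidBrowser(string userAgent)
    {
        if (string.IsNullOrEmpty(userAgent))
            return false;

        return userAgent.Contains("Android")
            && Regex.IsMatch(userAgent, @"Version/\d+\.\d+")
            && !userAgent.Contains("Chrome");
    }
}

In the controller this could be called as StockBrowserDetector.IsStockAndroidBrowser(Request.UserAgent) before falling back to the generic MobileView.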

youtube api updating video information using asp.net

I am using the YouTube API for ASP.NET to rename a video:
public static void UpdateVideoInfo(string video_id, string new_title)
{
    Uri entry = new Uri("http://gdata.youtube.com/feeds/api/videos/" + video_id);
    Video video = AuthRequest().Retrieve<Video>(entry);

    if (video.ReadOnly == false)
    {
        video.Title = new_title;
    }
    else
    {
        video.Title = video.Title;
    }

    Video updatedvideo = AuthRequest().Update(video);
}
but I get this error:
Object reference not set to an instance of an object
on the last line.
What am I doing wrong?
Thanks
Maybe it is the same problem reported here, with a different error message:
YouTube API .NET C# editing video problem

Embedding metadata time limit for Flash Builder 4.5.1 mobile project?

I am working on a project that requires me to embed metadata on the fly with a recorded stream from a webcam. I am utilizing Flash Builder 4.5.1 to create a mobile project. I am using a simple netStream.send function to set the metadata I want. This works just fine until my netstream time goes over around 10 seconds; then the function ceases to work or will not embed into the video. All my connections are correct and I can record to the Flash Media Server.
The only thing I can think of is that my Flash Media Server 4 Developer edition is being overloaded and does not process the metadata I send.
Any ideas would greatly help.
private function sendMetadata():void {
    infotxt.text += 'called';
    trace("sendMetaData() called");

    myMetadata = new Object();
    myMetadata.customProp = "This message is sent by #setDataFrame.";
    myMetadata.customOther = cueHolder;

    ns.send("#setDataFrame", "onMetaData", myMetadata);
}
And here is my onMetaData function
public function onMetaData(info:Object):void {
    trace("caught");
    infotxt.text = 'caught';

    var key:String;
    for (key in info) {
        outputWindow.text += (key + ": " + info[key] + "\n");
    }

    //cueHolderReturn = info.customOther;
    for (var i:int = 0; i < info.customOther.length; i++)
    {
        infotxt.text += info.customOther[i];
    }
    //infotxt.text = info.customOther[0];
}
Just wondering - is this problem occurring on both a real mobile device and a mobile emulator? If not, it could be the mobile connection - HTH

Get the status of a live stream for a VideoDisplay control

I'm looking for a way to find the status of a live stream through a VideoDisplay (or any other method really). I am interested to know if the stream is currently being published to or if the publisher has stopped. This is for a Flex/Flash ActionScript 3 project.
Is there a way to do this or is this ANOTHER oversight by adobe?
I've only found one solution, and that's using the NetStream object in combination with a video control.
The video control must be manually added to a UI container:
nsListen = new NetStream(nc);
nsListen.addEventListener(NetStatusEvent.NET_STATUS, nsListenHandler);
nsListen.play(streamname);

var v:Video = new Video();
v.attachNetStream(nsListen);
uicontrol.add(v);
Finally, the event status is returned in nsListenHandler:
private function nsListenHandler(e:Event):void
{
    if (e is NetStatusEvent)
    {
        var nse:NetStatusEvent = NetStatusEvent(e);

        if (nse.info.code == "NetStream.Play.Failed")
        {
            // Big error.
        }
        if (nse.info.code == "NetStream.Play.PublishNotify")
        {
            // Stream has just been published
        }
        if (nse.info.code == "NetStream.Play.UnpublishNotify")
        {
            // Stream has just been unpublished
        }

        trace(NetStatusEvent(e).info.code);
        trace(NetStatusEvent(e).info.description);
    }
}
The only thing this code won't do is tell you if a stream is already successfully being published to.
You can dig into NetStatusEvent events.
Check the live docs.
