In the iOS 13 beta, Apple supports on-device speech recognition.
In the documentation I saw that if you set the "requiresOnDeviceRecognition" property to true, the audio is converted to text on the device only.
But whenever I set this property to true, I always get this error:
Error Domain=kAFAssistantErrorDomain Code=1103 "No models installed yet" UserInfo={NSLocalizedFailureReason=No models installed yet}
NSLocalizedFailureReason = "No models installed yet";
Please ensure that you're passing the correct locale in the SFSpeechRecognizer.
For example, I live in India and my current device language is English (India), so I've initialized the SFSpeechRecognizer in the following way:
private let speechRecognizer = SFSpeechRecognizer(locale: Locale(identifier: "en_IN"))!
This should work for you!
The problem is that you didn't check the SFSpeechRecognizer's supportsOnDeviceRecognition property. If it is false, you cannot ask for on-device recognition.
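As a minimal sketch (assuming the en_IN recognizer from the answer above and an audio-buffer request; adapt to your own setup), you could gate the property on that check:
import Speech

let recognizer = SFSpeechRecognizer(locale: Locale(identifier: "en_IN"))!
let request = SFSpeechAudioBufferRecognitionRequest()

if recognizer.supportsOnDeviceRecognition {
    // The on-device model is available for this locale, so it is safe to require it.
    request.requiresOnDeviceRecognition = true
} else {
    // No on-device model installed: fall back to server-based recognition to avoid error 1103.
    request.requiresOnDeviceRecognition = false
}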
I'm following this blog post, https://agileapp.co/Stripe-and-ApplePay-with-Xamarin-Forms/, which explains how to integrate Stripe with Apple Pay using Xamarin Forms.
I followed all the code; the only things I changed are in PaymentButtonRenderer: the CountryCode and CurrencyCode properties of PKPaymentRequest, set for use in Spain (ES and EUR), and of course the MerchantIdentifier, set to the one I created in the Apple developer portal.
Everything I have to do on the Stripe side is already done.
As you can see in the images below, the payment never finishes, and I don't know why. I also don't know where I'm supposed to create the Stripe charge (var myCharge = new ChargeCreateOptions()....): maybe in the PCL ViewModel, in PaymentWillAuthorize, or in PaymentDidAuthorize. I have breakpoints in both, but PaymentDidAuthorize never breaks, only PaymentWillAuthorize, and I don't know what I should code in these methods.
Another strange thing is that if I set MerchantCapabilities to PKMerchantCapability.ThreeDS, I get an error message saying "Apple Pay is not available in my App", but if I change it to PKMerchantCapability.Debit, the payment still doesn't finish.
Could you tell me what is wrong?
Thanks
Everything works fine now. I just created the Apple merchant certificate again, and the code does what it has to do.
Problem solved, thanks @karllekko
In my Unity project I want to play a custom sound when I get a Firebase cloud message, not the system default sound.
After following other answers, my message looks like this:
{
  "to": "some_key",
  "notification": {
    "title": "Title",
    "android_channel_id": "2",
    "body": "Body",
    "sound": "custom_sound.wav"
  }
}
and I placed custom_sound.wav in Assets/Plugins/Android/res/raw. When I unzip my .apk, I can see that my sound file is in the right location.
But it keeps playing the system default sound, even after I remove the sound field. Is there anything else I should check?
First: a quick tip when debugging. If you select "Export Project", you can open the generated Gradle project with Android Studio:
Occasionally you have to update the Gradle wrapper, but it helps a ton to debug things like "is my sound file in res/raw?" without having to decompress your APK and poke around.
I think the issue you're running into is that sounds are now associated with NotificationChannels (as of Android O) rather than with individual notifications, as noted by this StackOverflow post describing a similar issue. This isn't exposed via the Unity SDK, but fortunately you can add a channel with Unity.Notifications.Android.
It should be as simple as creating a new
public AndroidNotificationChannel(string id, string title, string description, Importance importance)
with your id set to "2" (to match your sample notification above; since this is a string, I would recommend giving it a better name :D).
Then you can call RegisterNotificationChannel with the channel you create as its parameter.
For example, to get your notification above to work, I believe you can write:
var notificationChannel = new AndroidNotificationChannel("2", "Channel 2 (working title)", "This is the 2nd channel", Importance.Default);
AndroidNotificationCenter.RegisterNotificationChannel(notificationChannel);
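Put together, here is a minimal sketch of where this might live in a startup script (the class name and the Awake placement are just illustrative; the channel values come from the snippet above):
using Unity.Notifications.Android;
using UnityEngine;

public class NotificationChannelSetup : MonoBehaviour
{
    void Awake()
    {
        // Register the channel once at startup so it already exists when an FCM
        // message with "android_channel_id": "2" arrives.
        var channel = new AndroidNotificationChannel(
            "2",                         // must match android_channel_id in the payload
            "Channel 2 (working title)",
            "This is the 2nd channel",
            Importance.Default);
        AndroidNotificationCenter.RegisterNotificationChannel(channel);
    }
}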
Let me know if this helps!
--Patrick
I created a fresh iOS Single View App (including SwiftUI) with Xcode 11.1 and enabled Mac Catalyst. After running the fresh project on my Mac (macOS 10.15, of course), I get the following errors after clicking once on the window.
2019-10-18 12:59:48.479186+0200 test[3130:122148] Metal API Validation Enabled
2019-10-18 12:59:50.960734+0200 test[3130:122148] [AXRuntimeCommon] Unknown client: test
2019-10-18 12:59:50.962261+0200 test[3130:122148] [AXRuntimeCommon] This class 'SwiftUI.AccessibilityNode' is not a known serializable element and returning it as an accessibility element may lead to crashes
2019-10-18 12:59:51.313 test[3130:122148] **************_____________**************AXError: AVPlayerView is not a kind of NSView
1 AccessibilityBundles 0x00007fff42ee3b69 _AXBValidationCheckIsKindOfClass + 201
2019-10-18 12:59:51.386 test[3130:122148] **************_____________**************AXError: MKStarRatingView is not a kind of NSView
1 AccessibilityBundles 0x00007fff42ee3b69 _AXBValidationCheckIsKindOfClass + 201
Note: I also removed the Sandbox capability, otherwise I get an error about not being able to write ApplicationAccessibilityEnabled.
Does anyone know how to solve that?
As far as I can tell, there isn't a way to get rid of that error, and there isn't a need to; it's something inherent in SwiftUI. It occurs on iOS, iPadOS, and (therefore) Mac Catalyst, even in a brand new project. It also doesn't seem to hurt anything, other than to worry us developers.
I've been working in SwiftUI full-time for the past six months on an app that is now in production, running on iOS, iPadOS, and macOS (Catalyst). The "This class 'SwiftUI.AccessibilityNode' is not a known serializable element" error has been there since the beginning. I haven't traced it to be the source of any problem in six months of SwiftUI development.
If you open Xcode, create a new single-view iOS project, and run it without change, it'll display "Hello, World!". Click "Hello, World!" and your console will log [AXRuntimeCommon] This class 'SwiftUI.AccessibilityNode' is not a known serializable element and returning it as an accessibility element may lead to crashes.
I've tried adding accessibility modifiers, e.g.:
struct ContentView: View {
var body: some View {
Text("Hello, World!")
.accessibility(hint: Text("Just say hi"))
.accessibility(identifier: "helloWorld")
}
}
The error still gets logged when I click "Hello, World!".
I've also tried extending SwiftUI.AccessibilityNode to make it a serializable element, e.g.:
import SwiftUI
extension SwiftUI.AccessibilityNode {
}
Xcode says type SwiftUI.AccessibilityNode doesn't exist.
If you find these messages as annoying as I do, you can silence them as mentioned in this answer:
Hide strange unwanted Xcode logs
I need to automate checking the status (Disconnected / Connected / Trying to connect / Need Password) of Outlook without using the Outlook UDF.
Below is the code I am trying:
Global $iPID = "C:\Program Files (x86)\Microsoft Office\Office14\outlook.exe"
Sleep(600)
Run($iPID, "", #SW_SHOWMAXIMIZED)
Sleep(6000)
;check the Status
$status =
Could someone please help me check the status?
About the ONLY option I can think of is to use hardcoded coordinates to check whether the red X that appears in the lower right when Outlook is offline is present.
You can use the AutoIt Window Info tool to get the coordinates and the color.
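A minimal sketch of that idea, using placeholder coordinates and a placeholder colour value that you would first read from the AutoIt Window Info tool:
; The coordinates and colour below are placeholders; replace them with the values
; you read from the AutoIt Window Info tool for the red X in Outlook's status bar.
Local $iX = 1890
Local $iY = 1015
Local $iOfflineColor = 0xC8112E

Local $iColor = PixelGetColor($iX, $iY)
If $iColor = $iOfflineColor Then
    $status = "Disconnected"
Else
    $status = "Connected"
EndIf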
Yesterday I got the Android 5.0 update on my Nexus 4, and the AltBeacon library stopped detecting beacons. It appears that didEnterRegion and didRangeBeaconsInRegion are not even getting called when monitoring and ranging, respectively.
Even the Locate app from Radius Networks behaves differently now: the values from beacons, once they are detected, don't get updated anymore, and it often appears as if the beacons went out of range.
One thing I noticed is that the following line now appears in logcat: "BluetoothLeScanner: could not find callback wrapper". I went ahead and looked up that class and saw that it was introduced with Android L, but I don't know whether that has something to do with it.
It's important to say that before the update I had been working with both the Locate app and the Reference Application without any trouble.
I don't know if this is a generalized problem or not, but if it happened to me I'm sure it could happen to someone else, so any help would really be appreciated.
Thanks in advance!
UPDATE:
After failing to get the library to work, I decided to try the Android L branch of the library. I plugged the new library into the Reference App, but it didn't work as expected either.
The Monitor Activity seems to be working OK: it notifies when the device has entered a new region. However, the Ranging Activity doesn't report any beacons; didRangeBeaconsInRegion is getting called, but it always reports zero beacons. Curiously, when the activity is paused (by switching momentarily to another app), logcat shows that didRangeBeaconsInRegion does get called with actual beacons.
I'm kind of stuck right now because I don't know how to get any of the libraries working on Android L, so again, any help would really be appreciated.
I'm using the latest AltBeacon build on 5.0+ and have no problem with it. In fact, I never used it on KitKat, so I'm not really sure I can help, but here is my working code which listens for iBeacons.
Implement BeaconConsumer:
public class MainActivity implements BeaconConsumer
Init the BeaconManager:
beaconManager = BeaconManager.getInstanceForApplication(this);
if (beaconManager != null && !beaconManager.isBound(this)) {
    beaconManager.getBeaconParsers().add(new BeaconParser()
            .setBeaconLayout("m:0-3=4c000215,i:4-19,i:20-21,i:22-23,p:24-24"));
    beaconManager.bind(this);
}
On connect, start the listener:
@Override
public void onBeaconServiceConnect() {
    beaconManager.setRangeNotifier(new RangeNotifier() {
        @Override
        public void didRangeBeaconsInRegion(Collection<Beacon> beacons, Region region) {
            if (beacons.size() > 0) {
                Beacon firstBeacon = beacons.iterator().next();
            }
        }
    });
    try {
        beaconManager.startRangingBeaconsInRegion(new Region("com.example.app", null, null, null));
    } catch (RemoteException e) {
        e.printStackTrace();
    }
}
This code is working on 3 devices:
Nexus 4 5.0.1
Samsung Galaxy s4 - Stock 5.0.1
Samsung Galaxy s4 - CM12 5.1.1
Old question, but maybe some people will be looking for an answer on newer Android versions where you have to ask for permissions at runtime. You need to ask for Manifest.permission.ACCESS_FINE_LOCATION before scanning; at least, that was the problem I ran into. In my opinion the library should at least crash in such cases and indicate the problem.
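A minimal sketch of that runtime request, assuming it runs inside your Activity (the request code 1234 is an arbitrary value chosen for this example):
// Requires the usual androidx.core ActivityCompat/ContextCompat imports plus
// android.Manifest and android.content.pm.PackageManager.
if (ContextCompat.checkSelfPermission(this, Manifest.permission.ACCESS_FINE_LOCATION)
        != PackageManager.PERMISSION_GRANTED) {
    // Not granted yet: prompt the user before binding and scanning for beacons.
    ActivityCompat.requestPermissions(this,
            new String[]{Manifest.permission.ACCESS_FINE_LOCATION}, 1234);
}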