App declining in-app updates through App Center - Xamarin.Forms

I set up App Center in-app updates according to the App Center documentation; however, I am getting the following dialog.
https://learn.microsoft.com/en-us/appcenter/sdk/distribute/xamarin
I am using Android 9.1 and I used the normal download function.
Steps to reproduce
1. I had to install the new version on the phone, so I downloaded it via email.
2. Sent a new version to App Center for testing.
3. Was presented with that dialog when I started the app.
I placed this in OnCreate in MainActivity.cs:
AppCenter.Start("{secret key}", typeof(Distribute));
And the following in my App.xaml.cs:
AppCenter.Start("android=secret-key;",
typeof(Distribute), typeof(Distribute));
I also placed this in OnStart
Distribute.SetEnabledAsync(true);
Can someone tell me why I am getting the dialog below?
My app is also in release mode.

You should not have two calls to Start in different places, with two Distribute types passed.
If you have a Xamarin.Forms app, the call to Start should happen
inside App.xaml.cs.
If you have a Xamarin.Android app, the call to Start should happen
inside MainActivity.cs.
Not both!
typeof(Distribute) should be passed only once.
Please refer to this part of the documentation.
Also, calling SetEnabledAsync is unnecessary.
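For a Xamarin.Forms app that means keeping a single Start call in App.xaml.cs and removing the one from MainActivity.OnCreate. A minimal sketch of what that can look like (the app secret placeholder is your own):

using Microsoft.AppCenter;
using Microsoft.AppCenter.Distribute;

public partial class App : Xamarin.Forms.Application
{
    protected override void OnStart()
    {
        // Start App Center exactly once, passing typeof(Distribute) a single time.
        AppCenter.Start("android={Your App Secret}", typeof(Distribute));
    }
}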

Related

MRTK: How to fix SpectatorView on an Android device not pairing with HoloLens? (QR code)

Problem summary
I'm attempting to establish a connection between HoloLens and an Android device, which worked sporadically in the beta version of the MRTK.
However, since moving to MRTK RC1 (also Refresh), I've encountered issues with the QR scanning. When pressing connect, the two devices seemingly find each other; however, when the wearer of the HoloLens 1 looks directly at the QR code, nothing happens (the white dot and "Locating marker..." text are showing).
Background summary
1. The Setup:
Implemented working MRTK RC1 Refresh
Cloned Feature-SpectatorView separately, copying only the "MixedRealityToolkit.Extensions" folder to the MRTK project.
"Spectator View - HoloLens" prefab added to scene.
First pressing "HoloLens" in the PlatformSwitcher, building for HoloLens1, then switching to "Android" and exporting the project to Android Studio.
Building the .apk from Android Studio
(OpenCV binaries have been downloaded and in place since the beta version; I haven't changed them since they last worked.)
2. The Process:
On the HoloLens, I press the "Connect" button, after which white text appears saying "Locating Marker..."
On the Android phone I press connect and it goes to "Waiting for User"; then, as soon as a HoloLens is connected, it switches immediately to a QR code that should be readable by said HoloLens.
Looking directly at the QR code, nothing new happens; the connection does not establish further.
I checked whether something was not ticked in Player Settings/Capabilities, but I can't seem to find what the culprit would be. Did I forget something in this process?
There are a few things that could be causing this issue.
If the Android device is showing a marker, this means the two devices have established a network connection and are communicating with one another. Typically, when I run Spectator View I enable the following capabilities: "Internet (Client & Server), Internet (Client), Microphone, Pictures Library, Private Networks (Client & Server), Spatial Perception, Videos Library, Webcam" in the Package.appxmanifest in Visual Studio. Pressing "HoloLens" on Spectator View's Unity platform switcher should typically enable these capabilities, but sometimes the Package.appxmanifest doesn't get updated correctly in the Visual Studio project with subsequent builds in Unity. You can fix this by deleting your Visual Studio directory and rebuilding the Visual Studio project from Unity.
If these capabilities are checked in the package.appxmanifest, it may be that you rejected a capability request when first running the application. If you open Settings -> Privacy -> Camera on the HoloLens, you can check whether your deployed spectator view application has camera access granted. You should be able to enable the camera functionality here if it is disabled.
There have been changes to both MixedRealityToolkit and MixedRealityToolkit-Unity spectator view logic, so cloning these items at different points in time may cause functions to no longer resolve (we're hoping to consolidate this code into the same repo/commit history in the future to prevent this from continuing to happen). Typically, the Unity logs will contain errors stating that a function was not found for SpectatorViewPlugin.dll if the dll functionality is not resolving correctly. It sounds like this is not the issue you are hitting, since things worked previously. But if it does turn out to be the case, you may need to rebuild the SpectatorViewPlugin.dll to match the feature/spectatorView code you are using.
If you recently copied the SpectatorViewPlugin.dll and its dependencies to a new Unity project, it may be that they aren't getting registered as usable by the Windows UWP Unity player. Make sure these binaries are in a Plugins\WSA\x86 folder within your Assets folder. Also check the *.dll.meta definitions in the Unity inspector to ensure the DLLs are declared as usable for the Unity WSA player/x86 builds.
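If you want to verify those import settings from code instead of clicking through the inspector, a rough Unity editor-script sketch along these lines should do it (the asset path is an assumption based on the Plugins\WSA\x86 layout mentioned above):

using UnityEditor;

public static class SpectatorViewPluginImport
{
    [MenuItem("Tools/Mark SpectatorViewPlugin for WSA x86")]
    public static void MarkForWsaX86()
    {
        // Assumed location of the plugin inside the Assets folder.
        var importer = (PluginImporter)AssetImporter.GetAtPath(
            "Assets/Plugins/WSA/x86/SpectatorViewPlugin.dll");

        // Declare the dll usable only for the Unity WSA player, x86 builds.
        importer.SetCompatibleWithAnyPlatform(false);
        importer.SetCompatibleWithPlatform(BuildTarget.WSAPlayer, true);
        importer.SetPlatformData(BuildTarget.WSAPlayer, "CPU", "X86");
        importer.SaveAndReimport();
    }
}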

Xamarin iOS: Next is grayed out in the Sign and Distribute process for iOS

I am trying to publish my app to the App Store following this documentation https://learn.microsoft.com/en-us/xamarin/ios/deploy-test/app-distribution/app-store-distribution/publishing-to-the-app-store?tabs=vsmac
But I have reached the Sign and Distribute step and I can't press Next (grayed out). Can someone please tell me why, and how I can solve this issue?
Please click here to see a picture describing my problem.
As you can see in the image you attached, you actually have two products, both of which need signing: one for your main app, and one for your extension. Select your extension and specify a provisioning profile for it as well (you might have to generate one on the Developer portal if you haven't done so already!).

After integrating Firebase + GTM, events are not shown in Google Analytics

I followed this link - https://developers.google.com/tag-manager/ios/v5/.
I can see my events in the Firebase console but not in Google Analytics. I defined my custom as well as default Apple log events. This also shows up in my console: "[Firebase/Analytics][I-ACS023024] No data to upload. Upload task will not be scheduled". I am using Xcode 8.3.2 and Swift 3. Please help if anyone has implemented this.
There are a few issues that could cause this, and more information is needed to find the cause; the best place to look is the debug console in Xcode.
The GTM container is not set up correctly.
GTM tags are not set up correctly.
The app/SDK checks for newer versions of the GTM container at 12-hour intervals. So if you ran the app for testing with a container that had no tags, then added the tags to test for GA input, the new container will not be used until the updated container is fetched 12 or so hours later. (The workaround for me was to uninstall the app from the simulator and run it again from Xcode; just to be safe, I clean the build too before running again. I did read somewhere that there was a way to always check for newer containers, but I never got that to work.)
It would be good if you could provide more information from the Xcode debug console.

Turn on GPS when the app starts in Qt

I was working on a positioning app which needs GPS to be turned on, so I wanted a way to automatically turn on GPS in the background, if possible with high accuracy, or at least have a popup window come up so the user can turn on the GPS right away. But sadly I couldn't find a single way to do so in Qt; if there is a solution, it's always written in Java. Can I somehow do it in Qt too, or import Java code into it?
Would I also be able to keep my GPS updates running in the background? Because as soon as I press the home button the updates stop coming up in the console with qDebug... or is it just the qDebug function that can't run when the app is not open?
Will I be able to use GPS in my Qt app?
First, check if the target platform has GPS and/or is capable of resolving geo-coordinates. Next, check out the manual: Qt Location. Mind that they refer to this functionality as 'location', so it may be partially available even without an actual GPS unit on the device if there is another provider type (I guess partially and not for all platforms). I could only find this list of supported platforms:
Qt Location Classes for accessing GPS and other location services and
for mapping and navigation. Split off from the Qt 4 Mobility module of
Qt Location. Supported on Android, BlackBerry, iOS, Linux (using
GeoClue), Windows and Sailfish OS.
As for starting the GPS (location services provider), there is such an entry for QML for sure, namely start() of PositionSource. It also implies one can find the same functionality in C++.
Would I also be able to keep my GPS updates running in the background? Because as soon as I press the home button the updates stop coming up in the console with qDebug... or is it just the qDebug function that can't run when the app is not open?
Home button: does that imply Android? It's unclear what you are asking, but the Android app lifecycle is a somewhat different matter. The GPS will keep working independently of your app, but whether the app responds to messages is determined more by Android.

Messenger Register set twice, only receives message in one place, MVVM-Light

I've got a weird problem using MVVM-Light in a WP8.1 universal app. When the download folder has been changed, I am sending out a message via Messenger that contains the new StorageFolder download folder.
I am registered to receive this message in two different viewmodel constructors (one for the main page, one for a custom file manager page). Instances of both have already been created.
When I run this in my emulator, everything works as it should. I get the message on both viewmodels. However, when I run this on my phone (Lumia 920 with developer preview), I only get the message on the main page, not the file manager page that even sends the message (from the page's own codebehind).
At a glance I'm wondering if there's a thread problem here due to the difference in speeds... but I also wonder if there's a bug on the ARM side of the MVVM-light toolkit.
Based on how I was using the MessengerInstance.Register method, I was the victim of a limitation of the Messenger system (not a bug exactly). Someone else described it here exactly:
https://mvvmlight.codeplex.com/workitem/7640
Basically, I was using a lambda that captured a variable from outside the lambda statement, which turned the handler into a compiler-generated closure class held only by a weak reference. That closure was garbage collected and then gone forever. Fixed by rewriting the entire thing...
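As a rough illustration of the pitfall (the message type and handler names below are made up for the example, not taken from the original code):

// Hypothetical message carrying the new StorageFolder.
public class DownloadFolderChangedMessage
{
    public Windows.Storage.StorageFolder Folder { get; set; }
}

public class FileManagerViewModel : GalaSoft.MvvmLight.ViewModelBase
{
    public FileManagerViewModel()
    {
        // Problematic: the lambda captures the local 'label', so the compiler
        // generates a closure object. The Messenger holds that closure only via
        // a weak reference, and once it is garbage collected the handler
        // silently stops firing.
        string label = "Downloads";
        MessengerInstance.Register<DownloadFolderChangedMessage>(this,
            msg => System.Diagnostics.Debug.WriteLine(label + ": " + msg.Folder.Path));

        // Safer: register an instance method (or a lambda that captures nothing
        // but 'this'), so the registration lives as long as the recipient does.
        MessengerInstance.Register<DownloadFolderChangedMessage>(this, OnDownloadFolderChanged);
    }

    private void OnDownloadFolderChanged(DownloadFolderChangedMessage msg)
    {
        System.Diagnostics.Debug.WriteLine(msg.Folder.Path);
    }
}

Later MVVM Light releases also added a keepTargetAlive parameter to Register for exactly this closure case, if upgrading the package is an option.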
