Unable to train model using Microsoft CustomSpeech service - microsoft-cognitive

Half a year ago I successfully trained multiple models using 'Audio + human-labeled transcript' data in the Microsoft Speech portal. Now I want to train a new model, but I only get the error message:
The base model isn't valid for this operation.
I tried training with the baseline model '20200115' and with the older one '20190927'. Both attempts failed with the error message above.
I already checked my audio data, which fits the requirements. The audio files are WAV files, 16-bit PCM, 1 channel.
I also tried to create a new model using the old data with which I had created the currently working model. This data is still available in the Speech portal, so I did not have to upload it again. For this attempt I used the same baseline model '20190927' and got the same error message.
So Microsoft, is there anything you've changed? Is the website about data requirements still up to date? Or is this a bug?
What can I do to train a new model?

It's a known issue and a hotfix is currently being deployed. Because of the current situation the deployment is taking longer than usual and should complete by the end of the week.
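In the meantime, one way to check which base models your subscription can actually see is the Speech-to-text REST API. The sketch below is a minimal Python example, assuming the v3.0 `models/base` endpoint; the region and subscription key are placeholders you would substitute with your own.

```python
import json
import urllib.request

# Placeholders -- substitute your own Speech resource region and key.
REGION = "westeurope"
SUBSCRIPTION_KEY = "<your-speech-subscription-key>"


def build_base_models_request(region: str, key: str):
    """Build the URL and headers for listing base models
    (Speech-to-text REST API v3.0)."""
    url = f"https://{region}.api.cognitive.microsoft.com/speechtotext/v3.0/models/base"
    headers = {"Ocp-Apim-Subscription-Key": key}
    return url, headers


def list_base_models(region: str, key: str):
    """GET the base-model list and return the entries visible to this
    subscription (v3.0 list responses wrap results in a 'values' array)."""
    url, headers = build_base_models_request(region, key)
    req = urllib.request.Request(url, headers=headers)
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["values"]
```

Calling `list_base_models(REGION, SUBSCRIPTION_KEY)` with valid credentials should enumerate the base models (with names and self-links) that are currently available for training in your region, which makes it easy to see whether '20200115' or '20190927' is actually offered.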

Related

R Shiny keeps dropping offline while running a long process

I'm writing to seek help with an issue in R Shiny related to a long-running process. My code runs several simulations, which take about 1 minute in the R console. However, when I run it in the Shiny app, the app drops offline after about 20 seconds. Does anybody know the reason for this problem and how to fix it?
It would be really helpful to get some insights and potential solutions on this. Thank you.
--- Further edit below: I hope the information below clarifies the problem.
The error message is not a real error message: the app simply drops offline and asks to reload the page. Here is the app that we developed: https://glasgowhehta.shinyapps.io/MetaInsight_continuous_Bayesian/
(Apologies, due to the nature of the issue we could not think how to give a simple example.)
If you click the ‘Data analysis’ tab - ‘3. Bayesian network meta-analysis’ tab - ‘3d. Nodesplit model’ tab, then click the button ‘Click here to run the main analysis for all studies’, the app will drop offline before generating results (and this takes nowhere near as long as it would take for the server to disconnect due to user inactivity). The programme behind this button runs a series of simulation evaluations using JAGS, which uses Gibbs sampling, is computationally intensive, and can produce a lot of data (https://en.wikipedia.org/wiki/Just_another_Gibbs_sampler).
When only a small number of simulations is required, the results are rendered; however, when a larger number of simulations is required, the app drops offline. You can test this by ticking studies in the sidebar to exclude them, say all the studies from ‘Kuo 2006’ down to ‘Derosa 2005’, so that only a small proportion of the data enters the ‘nodesplitting analysis with studies excluded’. After ticking these studies out, click the button on the right-hand side, ‘Click here to run the nodesplitting analysis with studies excluded’, and the results should be rendered.
So our questions are:
1) Why does the app drop offline when the background computations are more extensive (i.e. is it a memory issue, a time issue, or something else altogether)?
2) What can be done to get round this problem (we have tinkered with the server settings but without success)?
We would be very grateful for any advice.

How to limit the request parameters in Azure Machine Learning

I'm stuck with web services in Azure ML :/
I am setting up a web service with Azure Machine Learning to estimate a car price based on 5 attributes out of 150 in my database. It works, in the sense that if I provide the test endpoint with the 5 attributes out of the 150 it requires, it gives me a valid answer, as you can see below: "Scored Label: 10185...".
My question is the following: how do you get the web service to require only 4 inputs? The ones I want are shown in the output (gearingType, MakeTxt, mileage, modelTxt). Price is, of course, what I am trying to predict.
Thanks for any help!
Regards,
Alexandre
Here is what my experiment looks like; as you can see, I used "Select Columns in Dataset" to select my 4 inputs + 1 output.
Assuming you're not doing any pre-processing on your price column, you need to remove the price from your Import Data before joining with the Web Service Input in your predictive experiment.
This is because ML Studio uses the structure from your training dataset to determine the structure of the Web Service Input (see MSDN for more info).
Here's a light sample of how it could look (note that this particular sample was doing some light pre-processing on the price column, which I had to remove in order for it to work).
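Once the Web Service Input no longer includes the price column, the request body only needs the four remaining columns. Here is a minimal Python sketch of calling a classic ML Studio request-response endpoint with just those inputs; the endpoint URL, API key, and column values are placeholder assumptions.

```python
import json
import urllib.request

# Placeholders -- substitute your own endpoint URL and API key from the
# web service dashboard.
ENDPOINT_URL = "https://<region>.services.azureml.net/workspaces/<ws>/services/<id>/execute?api-version=2.0"
API_KEY = "<your-api-key>"


def build_request(gearing_type: str, make: str, mileage: int, model: str) -> dict:
    """Build the classic ML Studio request body: only the 4 input columns,
    with the price column removed from the Web Service Input schema."""
    return {
        "Inputs": {
            "input1": {
                "ColumnNames": ["gearingType", "MakeTxt", "mileage", "modelTxt"],
                "Values": [[gearing_type, make, str(mileage), model]],
            }
        },
        "GlobalParameters": {},
    }


def score(gearing_type: str, make: str, mileage: int, model: str) -> dict:
    """POST the request to the scoring endpoint and return the JSON response,
    which contains the Scored Label."""
    body = json.dumps(build_request(gearing_type, make, mileage, model)).encode("utf-8")
    req = urllib.request.Request(
        ENDPOINT_URL,
        data=body,
        headers={
            "Content-Type": "application/json",
            "Authorization": "Bearer " + API_KEY,
        },
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)
```

If the price column is still part of the Web Service Input, the service will reject a request shaped like this, which is a quick way to confirm whether the schema change took effect.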

Aged Debt Report on Sage200 - LinqDataProvider

I'm trying to pull the SQL script that builds the Aged Debt Report in Sage 200 Report Designer. This report is available out of the box.
I have a Power BI model which pulls data from the Sage data warehouse; however, I am struggling to replicate the above report in Power BI because I can't access the SQL behind it. When I open this report in Sage 200 Report Designer, I can see that the data source is a LINQ provider with the connection string:
LinqDataProvider
Sage200Accounts
Data Model=Sage 200 Accounts;Root Path=\SERVERNAME\Sage\REPORTING;Report Types=\SERVERNAME\Sage\REPORTING\DEFAULT\REPORTS\Sage200Accounts.reporttypes;Default Report=\SERVERNAME\Sage\REPORTING\DEFAULT\DEFAULTS\Default.report;Timeout=1800
I can't locate where this data model lives, nor do I have enough understanding of how this report connects to the data model.
If anyone has Sage 200 experience, I would be very grateful for your advice. Basically, I have access to the SQL Server data warehouse, but the above report has some "variables" that were somehow built and saved inside this data model.
Appreciate your help
The report designer is reading the datamodel from %userprofile%\appdata\local\sage\sage200\assembly in Sage.accounting.datamodel.dll
The model source is available on the server in SLAgedTransaction.cs; however, it is protected by a licence agreement, so I can't post it.
In Sage 200 2011 it was possible to use the data model through LINQPad, which may still work in the current version. With this in mind, perhaps it would be a better idea to connect Power BI to the data model rather than trying to extract the SQL script. Also take a look at the good work Pan Intelligence are doing with regard to BI reporting.

Issue exists in auto generation of number sequence AX

I have been trying to import the customer master through the data management tool in AX7 using the standard "Customers" data entity, and I have marked "Auto-Generate" for the customer account field. I am facing a number sequence error when the data gets inserted into staging. When I check the execution log, I see the error below.
"Issue exists in auto generation of number sequence
Issue exist in generate staging data
'4' 'Customers' record(s) inserted in staging"
I checked the number sequence setup for the customer account and it is correct; it is as below:
Note:
The same error occurs irrespective of whether "Continuous" is marked for the number sequence code.
Any quick inputs would be appreciated!
Thanks Fh-Inway!
I have figured out the issue, and it's an issue with standard AX. An application hotfix (metadata) is available for this, which can be found in LCS as part of AX Update 2. I have installed it and tested that it works fine.
I am not sure whether I can share the hotfix KB article number here.
Note: that hotfix is a common fix that addresses auto-generation of number sequences across all data entities.

Date Filtering Malfunction in Powerview

I am building a reporting solution using Power View on SharePoint Server 2013 with an SSAS multidimensional data source.
In the Power View reports I have encountered a strange problem: when I filter using a date attribute with multiple values, only the non-calculated facts (simple measures, not MDX) are filtered, while for a single date selection everything is filtered correctly.
The strange thing is that in the cube browser everything works fine for all facts and all dates selections.
Any idea would be highly appreciated.
Thanks!
Sample Calculations :
CREATE MEMBER CURRENTCUBE.Measures.NewRequestsCount
as
aggregate({[DM RMS Workflow Actions].[Standard Action FK].&[13],[DM RMS Workflow Actions].[Standard Action FK].&[1]},[Measures].[FC RMS Request Actions Count]),
ASSOCIATED_MEASURE_GROUP='FC RMS Request Actions',format_string="#,##0";
and
CREATE MEMBER CURRENTCUBE.Measures.ForwardsCount
as
aggregate(([DM RMS Workflow Actions].[SN].&[62],[DM RMS Is Fake].[Value].&[Real]),[Measures].[FC RMS Request Actions Count]),
ASSOCIATED_MEASURE_GROUP='FC RMS Request Actions',format_string="#,##0";
Finally I found a solution. The problem was solved by installing SQL Server 2012 SP3:
https://www.microsoft.com/en-us/download/details.aspx?id=49996.
