Stanford dataset to CoreML - coreml

I have a dataset, downloaded from link.
I know about coremltools (created by Apple).
The question is:
is it possible to convert the Stanford dataset to CoreML?
If yes, can somebody give me instructions?
Thanks in advance!

This question is asked so often that I've finally decided to draw a diagram.
Explanation:
Dataset is a "fuel" that you put into your model to make it work.
Model is a machine learning algorithm: neural network, decision tree etc.
Supported ML frameworks and models are listed here together with the instructions for conversion.

You can make your own .mlmodel file from your own dataset with a Python script and a Python library called coremltools. You can train your model using sklearn, keras, etc., and can customize which algorithm it uses to train, like SVM, kNN, regression, and so on. Then you save it as a .mlmodel file and drop that into your project. This video is helpful:
https://youtu.be/T4t73CXB7CU

Related

How to do Multitask Learning in R-Keras?

I'm dealing with an image classification problem in which I have different types of labels for every observation. I'm trying to do multitask learning using R Keras. The only problem is that I have to use a Keras data loader because the dataset is huge, and I don't know how to use the flow_images_from_directory function for multi-output models. Can someone please help?

Difference between "mlp" and "mlpML"

I'm using the Caret package in R to create prediction models for maximum energy demand. What I need to use is a neural network multilayer perceptron, but in the Caret package I found there are two mlp methods, "mlp" and "mlpML". What is the difference between the two?
I have read the description in a book (Advanced R Statistical Programming and Data Models: Analysis, Machine Learning, and Visualization), but it still doesn't answer my question.
Caret has 238 different models available! However, many of them are just different methods to call the same basic algorithm.
Besides mlp there are 9 other methods of calling a multilayer perceptron, one of which is mlpML. The real difference is only in the parameters of the function call, and which model you need depends on your use case and what you want to adapt about the basic model.
Chances are, if you don't know what mlpML, mlpWeightDecay, etc. do, you are fine to just use the basic mlp.
Looking at the official documentation we can see that:
mlp exposes a single tuning parameter (size), while mlpML exposes three (layer1, layer2, layer3). So with the first method you can only tune the size of the multilayer perceptron, while with the second you can tune each hidden layer individually.
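To make the difference concrete, here is a minimal sketch (assuming the caret and RSNNS packages are installed; the grid values and the iris data are arbitrary stand-ins):

```r
library(caret)

# "mlp": one tunable parameter, the size of the single hidden layer
grid_mlp <- expand.grid(size = c(3, 5, 7))
fit_mlp <- train(Species ~ ., data = iris, method = "mlp",
                 tuneGrid = grid_mlp)

# "mlpML": three separately tunable hidden layers
grid_mlpML <- expand.grid(layer1 = c(3, 5), layer2 = 3, layer3 = 2)
fit_mlpML <- train(Species ~ ., data = iris, method = "mlpML",
                   tuneGrid = grid_mlpML)
```

Both methods end up calling the same RSNNS multilayer perceptron underneath; only the tuning grid differs.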
Looking at the source code here:
https://github.com/topepo/caret/blob/master/models/files/mlp.R
and here:
https://github.com/topepo/caret/blob/master/models/files/mlpML.R
It seems that the difference is that mlpML allows several hidden layers:
modelInfo <- list(label = "Multi-Layer Perceptron, with multiple layers",
while mlp has one single layer with hidden units.
The official documentation also hints at this difference. In my opinion, it is not particularly useful to have many different models that differ only very slightly, and the documentation does not explain those slight differences well.

Apply Ensemble for timeseries forecasting

I'm using multiple time series models like ARIMA, Holt-Winters, and Prophet. Now I want to build an ensemble of all of these and produce combined results. I need suggestions on the best way to apply ensembling to time series. Please help, I'm new to this.
There is a recent package called tsensembler (full disclosure, I am the author).
link for documentation with useful examples
link for github
It essentially trains a set of regression models for predicting the next value(s) of the time series, and combines them automatically using a metalearning approach. The scientific basis was presented at the ECML-PKDD 2017 conference, where it won the best student machine learning paper award.
I suggest exploring the opera package.
install.packages("opera")
Here is the vignette:
https://cran.r-project.org/web/packages/opera/vignettes/opera-vignette.html
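A minimal sketch of what opera does (assuming the package is installed; the two "experts" below are just noisy stand-ins for real ARIMA/ETS/Prophet forecasts):

```r
library(opera)

# Target values and two competing expert forecasts for the same series
set.seed(1)
Y <- as.numeric(AirPassengers[101:144])
experts <- cbind(expert1 = Y + rnorm(44, sd = 10),
                 expert2 = Y + rnorm(44, sd = 20))

# Learn time-varying weights for the experts online
mix <- mixture(Y = Y, experts = experts,
               model = "MLpol", loss.type = "square")
summary(mix)
head(mix$weights)  # weight given to each expert over time
```

The combined forecast then tracks whichever expert has been doing better recently, instead of committing to one model up front.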
The following link provides sample code and walks through using two ensemble packages in R. The packages are 'opera' and 'forecastHybrid'.
Opera & forecastHybrid

How to retrain model using old model + new data chunk in R?

I'm currently working on trust prediction in social networks - for obvious reasons I model this problem as a data stream. What I want to do is "update" my trained model using the old model + a new chunk of the data stream. The classifiers I am using are SVM, NB (e1071 implementation), a neural network (nnet), and a C5.0 decision tree.
Sidenote: I know that this solution is possible using the RMOA package by defining the "model" argument in the trainMOA function, but I don't think I can use it with those classifier implementations (if I am wrong, please correct me).
According to strange SO rules, I can't post this as a comment, so be it.
The classifiers that you've listed need the full dataset at the time you train a model, so whenever new data comes in, you should combine it with the previous data and retrain the model. What you are probably looking for is online machine learning. One of the very popular implementations is Vowpal Wabbit, and it also has bindings to R.
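For the classifiers listed above, the "combine and retrain" step looks like this (a sketch using e1071's svm on a toy dataset; the variable names are illustrative):

```r
library(e1071)

# Initial model trained on the data seen so far
set.seed(1)
old_data <- iris[sample(nrow(iris), 100), ]
fit <- svm(Species ~ ., data = old_data)

# A new chunk of the stream arrives: append it and retrain from
# scratch, since e1071's svm has no incremental-update interface
new_chunk <- iris[sample(nrow(iris), 50), ]
all_data <- rbind(old_data, new_chunk)
fit <- svm(Species ~ ., data = all_data)
```

This is O(all data) per update, which is exactly the cost that true online learners like Vowpal Wabbit avoid.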

Is it possible to create a new model in RNetLogo or change the source code of sample models?

I am pretty new to the RNetLogo package in R, but so far all the examples of using RNetLogo that I have seen were about loading sample models and doing something with them. I did not see any examples showing that we can create our own model and write down the rules according to which our agents will interact with each other (or see the code of the sample models and change it). Is it possible to write these rules in R, or does RNetLogo only allow us to play with already implemented models (samples) without changing the code?
For example, when we open in NetLogo Models Library-->Earth Science-->Climate Change (just random example) then we can go to the Code tab and see the code written in NetLogo prog.language:
globals [
sky-top ;; y coordinate of top row of sky
...
My question is: can we see this code in R and change it?
My answer: I do not think so :-). You need to develop your model in NetLogo; with RNetLogo you can then run it from R, play with the data, send a data.frame to your model, and change some variables.
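What you can do from R looks roughly like this (a sketch; both paths are placeholders for your local installation, and setting sky-top directly is just an illustration of changing a global):

```r
library(RNetLogo)

# Start NetLogo and load a model written in the NetLogo language
NLStart("C:/Program Files/NetLogo")          # placeholder path
NLLoadModel("models/Climate Change.nlogo")   # placeholder path

NLCommand("setup")
NLCommand("set sky-top 50")      # change a model variable from R
NLDoCommand(100, "go")           # run 100 ticks
temp <- NLReport("temperature")  # read a value back into R
```

So the rules themselves stay in the NetLogo code tab, but R can drive the simulation, set variables, and pull results out for analysis.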
