Does anyone have resources to share as a starting point, or a package, for analyzing and reporting Chrome UX Report data with R?
I know it's mostly a matter of SQL and BigQuery, but any tools that speed up the process would be useful.
Thanks
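As a starting point, here is a minimal sketch using the bigrquery package against the public CrUX dataset. The billing project, the month table (chrome-ux-report.all.202306) and the origin below are placeholders to swap for your own.

# Minimal sketch: query the public Chrome UX Report dataset from R with bigrquery.
# "my-billing-project" is a placeholder for your own GCP project (it is billed for the query).
library(bigrquery)

sql <- "
SELECT
  fcp.start AS fcp_bin_start,
  SUM(fcp.density) AS density
FROM
  `chrome-ux-report.all.202306`,
  UNNEST(first_contentful_paint.histogram.bin) AS fcp
WHERE
  origin = 'https://www.example.com'
GROUP BY fcp_bin_start
ORDER BY fcp_bin_start
"

tb  <- bq_project_query("my-billing-project", sql)  # query runs in BigQuery
fcp <- bq_table_download(tb)                        # download only the small aggregated result
head(fcp)

From there, reporting is ordinary R work (e.g. plotting the downloaded histogram with ggplot2).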
tl;dr I want to deploy "live" model results in Python and R, and while Salesforce Einstein advertises this functionality for R and Python, I have only found support for Python. Shiny is too expensive to justify for our limited R-language requirements. Does Einstein R support actually exist?
UPDATE: Tableau has a separate solution from Einstein Analytics that hosts both R and Python - see answer below. Not a feature-rich direct competitor to Shiny, but that's not our use-case.
According to the documentation for Salesforce Einstein Analytics Plus (aka Tableau CRM AI Analytics), data scientists can upload (operationalize) their Python, R, and Matlab code, as described here:
https://www.tableau.com/solutions/ai-analytics (see the section on "Data Science" at the bottom of the page).
I signed up for a trial of Einstein Analytics Plus, and found a link to the "Model Manager." Using Model Manager to deploy Python-language models is well-documented here:
https://help.salesforce.com/s/articleView?id=sf.bi_edd_model_upload_prepare.htm&type=5
For Python, this seems to match the advertised functionality. But there is no indication of how to deploy R language models, which may be part of my team's use case.
I would like to find the equivalent method for deploying an R-language model in Einstein. In particular, is there some other Salesforce / Tableau product I should try, or is this a feature that is simply not available in the trial version? Unlike with Python deployment, searching the documentation has not yielded answers.
To be clear, the only reason we're interested in Einstein R support is that it appears to be about 1/10 the cost of Shiny, which is hideously expensive. So any recommendations regarding lightweight alternatives to Shiny would also be helpful.
TIA for anyone who can shine a light on this problem.
ANSWER: There is actually a separate feature in Tableau, distinct from Einstein Analytics, that supports both R and Python; documentation here:
https://help.tableau.com/current/prep/en-us/prep_scripts.htm
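For what it's worth, the Prep script step for R connects to a running Rserve instance and calls a function you name inside an .R file; the function receives the incoming rows as a data.frame and must return a data.frame. A rough sketch follows (the function name, model file and added column are hypothetical, and the linked docs describe the exact contract, e.g. when a schema function is also required):

# Rough sketch of an R script for a Tableau Prep script step (executed via Rserve).
# The function name (score_customers), the model file and the added column are hypothetical;
# Prep passes the flow's rows in as a data.frame and expects a data.frame back.
score_customers <- function(df) {
  model <- readRDS("customer_model.rds")    # a model saved earlier with saveRDS()
  df$score <- predict(model, newdata = df)  # add a score column to the incoming rows
  df                                        # return the (modified) data.frame to Prep
}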
Has anyone done a comparison between RStudio Connect and Power BI? We are trying to lay out the benefits of RStudio Connect over Power BI in order to convince our super strict IT management to go with RStudio Connect.
Thank you.
You should first figure out which decision variables are important to you. This RStudio thread goes into detail about the benefits; in short, if you are going lightweight, Connect comes out ahead. Most likely your users are more technical and want more ability to build powerful tools themselves.
Power BI seems better suited to Excel "power" users. It does not handle large datasets well, and it is mostly aimed at a non-technical crowd.
Consider the end users before all else, then work backward from there.
I just started using GCP for Data Science and I'm a bit overwhelmed with all the different tools. I have customer data in BigQuery which I'd like to analyze further for customer segmentation purposes.
However, I am not allowed to download the data or keep any local copies, which is exactly what most of the R + BigQuery tutorials I've seen seem to do. I am currently looking into analyzing the data using DataLab, but there I can't seem to use R, only Python.
What would be a safe and cheap (!) way to analyze BigQuery data (< 100 GB) in R without downloading it? Which GCP tool would be suited to it? Is there a way that does not involve running R code from inside Python?
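One pattern that keeps both the data and the computation in BigQuery is bigrquery plus dbplyr: you write dplyr code against a remote table, it gets translated to SQL and executed by BigQuery, and you only collect() the small aggregated result into R. A sketch with placeholder project, dataset and table names (whether pulling aggregates back counts as a "copy" is something to check against your policy):

# Sketch: analyze BigQuery data from R without downloading the raw table.
# Project, dataset and table/column names are placeholders.
library(bigrquery)
library(DBI)
library(dplyr)
library(dbplyr)

con <- dbConnect(
  bigrquery::bigquery(),
  project = "my-gcp-project",
  dataset = "customer_data",
  billing = "my-gcp-project"
)

customers <- tbl(con, "transactions")      # lazy remote table, nothing is downloaded yet

segments <- customers %>%                  # these verbs are translated to SQL and run in BigQuery
  group_by(customer_id) %>%
  summarise(
    n_orders    = n(),
    total_spend = sum(amount, na.rm = TRUE)
  ) %>%
  collect()                                # only the per-customer summary comes back to R

head(segments)

To avoid a local machine entirely, the same code can run from an R session inside GCP, e.g. a small Compute Engine VM or a notebook environment with an R kernel.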
I have a Visual Studio 2010 Professional license, so Ultimate is a long way off, and all I want it for is load testing for a few weeks or so.
Is there a cheaper way of getting just the load testing part of VS2010 Ultimate, or an alternative load testing tool with decent metrics of how the load test is doing? I'd like to know pages/sec, page speed, and database connections, and, less importantly, CPU, memory, and errors.
I have always found that the Radview Webload tool is a good alternative, but you will probably need to set up performance counters for CPU etc. and tie the data together yourself. Have you considered using an evaluation version of Ultimate to check whether it would meet your needs? ;)
We developed a free Fiddler extension for load testing called StresStimulus. It replays sessions recorded in Fiddler under varying load levels and measures performance metrics. It is nowhere near as complete as the testing tools in Visual Studio 2010 Ultimate, but it satisfies some of your requirements: you can get requests/sec and response times. The interface with PerfMon is not ready yet, so Windows performance counters have to be tied together manually.
Maybe you can set up Instrumentation on your app?
I am tasked with improving the performance of a particular page of the website that has an extremely high response time, as reported by Google Analytics.
A few Google searches reveal a product that came with VS2003 called ACT (Application Center Test) that did load testing. It doesn't seem to be distributed any longer.
I'd like to be able to get a baseline test of this page before I try to optimize it, so I can see what my changes are doing.
Profiling tools such as dotTrace from JetBrains may play into it, and I have already used tracing to isolate some operations within the page that are taking a while.
What are the best practices and tools surrounding performance and load testing? I'm mainly looking for ways to see the results, not how to accomplish them.
Here is an article showing how to profile using the VSTS profiler:
If broken it is, fix it you should
Also, apart from all the tools, why not try enabling the "Health Monitoring" feature of ASP.NET?
It provides good information for analysis: it emits essential information about the process, memory, disk usage, counters, etc. Health Monitoring combined with VSTS load testing gives you a good platform for analysis.
Check out the link below:
How to configure HealthMonitoring?
Also, for a reference checklist, have a look at the following rules/tips from Yahoo:
High performance website rules/tips
HttpWatch is also a good tool for identifying specific performance issues.
Also have a look at some of the tips here:
10 ASP.NET Performance and Scalability Secrets
Take a look at the ANTS Profiler from Red Gate. I use a whole slew of the Red Gate products and am very satisfied!
There are a lot of different paths you can go down. Assuming a MS environment you can leverage some of the team system tools such as MS Team Tester to record tests and perform load testing against your site. These can be set to run as part of an automated build process.
A list of tools is located at: http://www.softwareqatest.com/qatweb1.html#LOAD
Now, you might start off simple. In that case, install two Firefox plugins: Firebug and YSlow for Firebug. These will give you stats and point out issues such as page size, the number of requests made to get the page, etc. They will also make recommendations on some things to fix.
Further, you can use unit tests to execute a lot of the code behind to see what functions are hurting you.
You can do all sorts of testing if you have a full MS dev system with TFS and Visual Studio Team Edition, based on what I see here.
I recently had a nice .NET bug that was running rampant. This tool sort of helped, but in your case I could see it working nicely:
http://www.jetbrains.com/profiler/
Most of the time we've used WCAT from Microsoft. If your searches were bringing up ACT, then this is probably the tool you want to grab if you are looking for requests per second and the like. Mike Volodarsky has a good post pointing the way on how to get it.
We use it quite a bit internally for testing our network infrastructure or a new web application, and it is incredibly flexible once you get going with it. And in every demo Microsoft has done for us with new web tech, they seem to bust out WCAT to show off the improvements.
It's command line driven so it's kinda old school, but if you want power and customization it can't be beat. Especially for free.
We also use dotTrace on our own applications when trying to track down performance issues, and the Red Gate tools are nice as well. I'd definitely recommend a combination of the two. They both give you pretty solid numbers for tracking down which part of your app is the slowdown, and I can't imagine life without dotTrace.
Visual Studio Test Edition (2008 or 2010) comes with a very good load testing component for ASP.NET apps.
It allows you to collect all the PerfMon stats for a server (from basics like CPU and disk waits to garbage collection and SQL locks).
Create a load test for the page and run it, storing the stats in a database as the baseline. Subsequent runs can then be compared against it.