Connect Tableau Desktop to Unix server - unix

I have my SAS datasets on a Unix server and need to access them from Tableau Desktop for reporting. They are very large, and I don't have enough space on my desktop to download them.
Could someone please help me with the steps required in connecting to Unix server?
Thanks,
CKP
I have tried a database connection from Tableau, but it didn't work.

Tableau can connect to a variety of statistical files, including SAS files. Take a look at this link for more info.
The file from SAS will have to be brought to the local machine on which Tableau Desktop is running.
With regard to the space issues, there's not much that Tableau can do to help. Your Tableau Desktop license allows you to install on two computers, so if one doesn't have enough space, it's worth finding another.
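For the "bring the file local" step itself, the usual route is a plain file copy from the Unix box, e.g. scp or an SFTP client such as WinSCP, rather than anything Tableau-specific. Purely as an illustrative sketch of that transfer, here is a small Java example using the JSch SFTP library; the host name, credentials, and file paths are made up, so substitute your own.

```java
import com.jcraft.jsch.ChannelSftp;
import com.jcraft.jsch.JSch;
import com.jcraft.jsch.Session;

public class PullSasFile {
    public static void main(String[] args) throws Exception {
        // Hypothetical host, user, and paths; assumes the JSch library is on the classpath.
        JSch jsch = new JSch();
        Session session = jsch.getSession("myuser", "unixhost.example.com", 22);
        session.setPassword("secret");
        session.setConfig("StrictHostKeyChecking", "no"); // fine for a sketch, not for production
        session.connect();

        // Copy the SAS dataset from the Unix server to the local disk.
        ChannelSftp sftp = (ChannelSftp) session.openChannel("sftp");
        sftp.connect();
        sftp.get("/data/sas/mydata.sas7bdat", "C:/TableauData/mydata.sas7bdat");
        sftp.exit();
        session.disconnect();
    }
}
```

Once the .sas7bdat file is on the local disk, you can point Tableau Desktop's statistical file connector at it.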

Related

ODBC link between MS Access and qGIS .gpkg data?

I am an MS Office veteran with self-taught basic GIS skills (Tatuk Editor), including use of SQLite-based layers that link to MS Access. In the past few years I've been learning to use qGIS, and for the most part, the experience has been very positive.
What hasn't been so great in the qGIS learning curve is my attempt to link a qGIS-created geopackage layer (using the SQLite ODBC driver) to an MS Access application for the express purpose of editing and, ideally, for programmatic updating of attribute fields in existing records. Yes, the gpkg table will link, but unfortunately the connection is read-only. The problem apparently stems from an rtree trigger in the underlying database that won't allow the edited or updated records to be written/saved.
At the recommendation of a friend who is more highly versed in these technicalities, I tried to resolve the 'no gpkg editing' problem by adding the spatialite .dll files to the system folder and the appropriate extensions in the ODBC setup dialog, all without success. I next dumped the 32-bit version of my Office 365 software and transitioned to the 64-bit version, which fortunately didn't faze my existing documents, databases, etc., but had no effect whatsoever on the 'no gpkg editing' problem. At the end of the day, I'm no closer to achieving the desired solution, i.e. an editable connection between Access and the gpkg table.
Without going into immense detail about the various steps I've tried, I will stop here and give folks an opportunity to respond. I'm hopeful that someone reading this has not only encountered the 'no gpkg editing' problem when linking to a geopackage from MS Access, but has also learned how to resolve the issue. If you are that person, please explain the process as best you can. If it simply can't be done, I would appreciate knowing that as well.
I have the exact same problem. I downloaded the spatialite DLLs and tried putting them in the same folder as the ODBC driver, in System32, and in other folders. No dice. I tried the 32-bit and the 64-bit driver. No dice. I tried the environment variable. No dice.
I'm also an ArcGIS user who will miss being able to use Access Databases. Now that Pro can edit geopackages, we'd have a great option if we could edit the data in Access via ODBC. Frustrating!

R - Removing all ability to connect to the internet

I have been trying to make the case for having R installed at my place of employment. However, the IT department has come back with a risk assessment saying that R carries potential risks. After much debating (and head-against-brick-wall interaction), I suggested removing all internet connectivity from R.
I know (hope?) in principle that it can be done, since 'base' R is open source and can be edited. My questions are:
How do I disable all internet access in 'base' R by editing the source code?
Once 1 is done, will the lack of internet carry over to packages? That is, will all internet access be cut off from every package, no matter what the package is trying to do?
(Sorry, I'm a stats/maths guy, not so much a 'deep' programming dude.)
Wow, what a strict working environment! Yeah, data analysis can sometimes be very dangerous. :-)
Generally speaking, it's possible to use R without an internet connection; you just have to download the packages and install them from source (.zip/.tar.gz files).
But adjusting the R source code would be unnecessary effort. I think your IT department should be able to block internet access in the firewall settings for the R applications (R, RGui and/or RStudio), which takes only a few minutes to set up. For example, on Windows they can use Windows Defender Firewall with Advanced Security to block outbound connections from the R executables.
If they use another firewall or network rules, they should be able to set it up correctly and quickly as well.

New Azure Server - CSV Reader takes much longer

We have an ASP.NET website that allows users to import data from a CSV file. We recently moved from a dedicated server to an Azure virtual machine, and the import is taking much longer. The hardware specs of the two systems are similar.
It used to take less than a minute for the data to import; now it can take 10-15 minutes. The file upload itself is fine; it is looping through the data and organizing it in the SQL database that takes the time.
Why is the Azure VM with similar specs taking so much longer and what can I do to fix it?
Our database is using Microsoft SQL Server 2012 installed on the same VM as the website.
It's very hard to compare the two environments. Was the previous environment virtualized? It might have to do with the speed of the hard disks, the placement of the SQL Server files, or some other part of the infrastructure (or simply the iron). I would recommend having a look at the performance of the machine under load (Resource Monitor). This kind of operation is usually both processor- and I/O-intensive, and it should also be done in parallel.
Hth
//Peter
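One thing worth checking while you profile: if the import loop issues one INSERT per CSV row, every row is a round trip to SQL Server, and that pattern is very sensitive to the extra disk and network latency a VM can add. The site is ASP.NET, but purely as a hedged sketch of the batching idea (shown here in Java/JDBC; the connection string, table, and columns are hypothetical):

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.util.List;

public class BatchedCsvImport {
    public static void main(String[] args) throws Exception {
        // Hypothetical connection string; assumes the SQL Server JDBC driver is on the classpath.
        String url = "jdbc:sqlserver://localhost;databaseName=ImportDb;user=app;password=secret";

        // Stand-in for whatever CSV parsing the site already does.
        List<String[]> rows = List.of(new String[] {"a", "1"}, new String[] {"b", "2"});

        try (Connection conn = DriverManager.getConnection(url)) {
            conn.setAutoCommit(false); // commit once per batch, not once per row

            String sql = "INSERT INTO ImportedRows (Name, Value) VALUES (?, ?)";
            try (PreparedStatement ps = conn.prepareStatement(sql)) {
                int count = 0;
                for (String[] row : rows) {
                    ps.setString(1, row[0]);
                    ps.setString(2, row[1]);
                    ps.addBatch();
                    if (++count % 1000 == 0) {
                        ps.executeBatch(); // one round trip per 1000 rows
                        conn.commit();
                    }
                }
                ps.executeBatch(); // flush the remainder
                conn.commit();
            }
        }
    }
}
```

The usual ADO.NET equivalents would be SqlBulkCopy or table-valued parameters; either way the goal is to turn a per-row chat with the database into a handful of round trips.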

working on a remote R session

The R session that I am working on is on a remote cluster due to memory constraints, and the data is stored remotely. I am therefore using Notepad++ to edit my files and just paste them into my SSH session as I go along. What is the best way to integrate with the remote session to take advantage of code completion and other features available in editors like RStudio? Any best-practice suggestions for working over remote connections? I imagine this must be the situation for most R users who work with large data sets.
From: http://www.sciviews.org/_rgui/projects/Editors.html
The famous Vim editor now also provides syntax highlighting for R. You can get a plugin to integrate R here. For Windows, there is another plugin using R-DCOM here. There is an alternate, multiplatform plugin here. Also look at the Vim website, because there are other interesting resources there for R users (see here for an overview). There is also an R package to better integrate debugging facilities in Vim: adtdbg

what is network programming

I've completed a program in Java under Eclipse. I converted my code to a .jar file, then converted the .jar file to a .exe file, and I've even created an installer for my .exe file. Every time I use the program to encode data, all of the data is recorded in the folder where my .exe file is located (on my local hard drive), and the program works just fine.
The problem is that the concept of my program is not only to use it on my computer (one of the workstations in our LAN) but also to deploy it to our local area network, so that I can encode data from one of our computer stations (a client) and save that data to our server's hard disk rather than to the client's hard disk, as my current program does. I've done some research on how to achieve this concept of encoding data from a workstation and saving it to the server's hard disk, and so far I've come up with TCP/IP programming and network programming.
My question is: am I on the right path? If I proceed with this part of Java, will I achieve my goal, or is there another way of achieving my concept? I'm not really asking anybody to teach me how to achieve this goal; I am merely asking whether I'm on the right path (studying TCP/IP and network programming) or whether I should study another part of Java instead. Please consider my way of asking; I feel very elementary. To be honest, I have very little idea about TCP/IP programming and whether TCP/IP is the correct topic to study for my concept. I'm hoping someone could give me a tip on this matter. Thank you and more power to Stack Overflow.
You can certainly achieve your solution using network programming by writing a client and server, but depending on your needs it may not be the simplest solution.
For example, if you use the program from only one workstation at a time, you can output to a file saved on a shared network drive, such as a Windows shared folder or an NFS mount.
If you can set up a shared drive or folder on your server that is accessible to all the workstations you are using, you can simply alter your program to read and write the data file in that network-shared directory (there is a small sketch of this below).
If this will not work for you, then go ahead with learning socket programming.
I suggest looking through the official documentation to get started:
http://docs.oracle.com/javase/tutorial/networking/sockets/
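To make the shared-folder option concrete, here is a minimal sketch of the "write to the server's disk" idea, assuming the server exposes a Windows share; the UNC path and the record format are hypothetical, so adjust them to whatever your program actually writes.

```java
import java.io.IOException;
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.nio.file.StandardOpenOption;

public class SharedFolderWriter {

    // Hypothetical UNC path; replace with the share your server actually exposes.
    private static final Path SHARED_FILE =
            Paths.get("\\\\MYSERVER\\EncodedData\\records.csv");

    public static void main(String[] args) throws IOException {
        // Append one encoded record to the file that lives on the server's disk.
        // The workstation only needs read/write rights on the share.
        String record = "2016-01-01,sample-record" + System.lineSeparator();
        Files.write(SHARED_FILE,
                record.getBytes(StandardCharsets.UTF_8),
                StandardOpenOption.CREATE, StandardOpenOption.APPEND);
    }
}
```

The only change to your existing program is where the output file lives; the encoding logic itself stays exactly as it is now.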
