I am trying to change the default data directory of MonetDB. I am running out of space, and I would like to migrate the data to another folder.
Does anyone know how to do that?
I have installed MonetDB using the Ubuntu package, and by default the data is stored in:
/var/lib/monetdb
I would welcome a solution that doesn't involve compiling MonetDB from source...
MonetDB data can reside anywhere. You can place the dbfarm wherever you like and simply pass its path to monetdbd when starting it. To migrate existing data, stop monetdbd, move the dbfarm directory to its new location, and start monetdbd with the new path.
Refer: http://www.monetdb.org/Documentation/UserGuide/Tutorial
1) monetdbd create /path/to/mydbfarm
2) monetdbd start /path/to/mydbfarm
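Since the goal is migrating existing data rather than creating a fresh farm, the whole move can be sketched as follows. This is a sketch, not a tested procedure: it assumes monetdbd is on the PATH, the old farm is the Ubuntu default /var/lib/monetdb, and /data/mydbfarm is just an example target on the larger disk.

```python
import shutil
import subprocess

def monetdbd_cmd(action, farm):
    """Command line for `monetdbd <action> <farm>` (e.g. stop, start)."""
    return ["monetdbd", action, farm]

def migrate(old_farm, new_farm):
    """Stop the daemon, move the dbfarm to the bigger disk, restart it."""
    subprocess.run(monetdbd_cmd("stop", old_farm), check=True)
    shutil.move(old_farm, new_farm)  # copies then deletes across filesystems
    subprocess.run(monetdbd_cmd("start", new_farm), check=True)

# migrate("/var/lib/monetdb", "/data/mydbfarm")  # example target path
```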
I'm new to Databricks. I am trying to access a .R file that is present in DBFS storage, but I cannot figure out how to do so. Any help is really appreciated.
I can read data from storage using the /dbfs file path, and I can source the script, but I want to make edits to the script.
You need an editor to do that. For example, you can set up RStudio on your cluster and connect to it via the RStudio UI; in that case you can edit R files directly on DBFS.
But really, the simplest option for you would be to use the Databricks CLI fs command to copy the file to your local machine, make changes in the editor of your choice, and upload the file back.
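A minimal sketch of that round trip, assuming the Databricks CLI is installed and configured; dbfs:/scripts/analysis.R is a made-up path:

```python
import subprocess

def cli_cp(src, dst):
    """Argument list for `databricks fs cp --overwrite <src> <dst>`."""
    return ["databricks", "fs", "cp", "--overwrite", src, dst]

# Download, edit locally with any editor, then upload back:
# subprocess.run(cli_cp("dbfs:/scripts/analysis.R", "analysis.R"), check=True)
# ... edit analysis.R ...
# subprocess.run(cli_cp("analysis.R", "dbfs:/scripts/analysis.R"), check=True)
```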
Earlier, when using AzureML from the Notebooks blade of the Azure ML UI, we could access local files in AzureML using simple relative paths:
For example, to access the CSV sitting next to test.ipynb we could just use the relative path:
df = pandas.read_csv('WHO-COVID-19-global-data.csv')
However, we are not able to do that anymore.
Also when we run
import os
os.getcwd()
We see the output as
'/mnt/batch/tasks/shared/LS_root/mounts/clusters/<cluster-name>'.
Hence, we are unable to access the files in the FileStore, which was not the case earlier.
When we run the same from the JupyterLab environment of the compute environment we get:
'/mnt/batch/tasks/shared/LS_root/mounts/clusters/<cluster-name>/code/Users/<current-user-name>/temp'.
We can easily work around it by appending '/code/Users/<current-user-name>/temp' to the base path and using that instead. But this is not recommended: whenever the environment we are using changes, the code needs to change as well. How do we resolve this issue without this path-appending method?
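For completeness, the kind of lookup we would rather not hard-code could instead be done like this (a sketch; the helper name and search strategy are ours, not an AzureML API):

```python
from pathlib import Path

def locate(filename, start=None):
    """Search downward from `start` (default: the current working
    directory) for the first file with the given name, so the notebook
    works whether the kernel starts at the mount root or inside
    .../code/Users/<user>/temp."""
    start = Path(start) if start is not None else Path.cwd()
    for match in start.rglob(filename):
        return match
    raise FileNotFoundError(filename)

# df = pandas.read_csv(locate("WHO-COVID-19-global-data.csv"))
```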
I work on the Notebooks team in AzureML, and I just tried this. Did it just start happening today?
It seems like things are working as expected:
I am new to AEM, and would like to know how to bring content from the author instance into my local instance. I can pull the pushed project changes with git and build locally with mvn, but I can't see the content changes I made in AEM prod on my local AEM.
Can you help me with this? Thank you in advance!
If you are asking how to get page changes from any environment into your local instance, just create a content package in CRXDE on the environment you want the changes from. After creating the package, download it and install it in your local instance's CRXDE.
The more direct approach is via the Package Manager: create a package, enter the desired filters, then build and download it. There are also automated ways, such as driving the Package Manager with curl commands, Grabbit, etc. For local development you probably do not need to bring all the content, especially if the DAM is too big.
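A sketch of that curl-style automation (the endpoint path is the one documented for the Package Manager HTTP interface; host, credentials, and the package name below are examples):

```python
import base64
from urllib.request import Request, urlopen

PACKMGR = "/crx/packmgr/service/.json/etc/packages"

def command_url(host, group, name, cmd):
    """URL for a Package Manager command such as cmd=build."""
    return f"{host}{PACKMGR}/{group}/{name}?cmd={cmd}"

def post(url, user="admin", password="admin"):
    """POST with basic auth; needs a running AEM instance."""
    token = base64.b64encode(f"{user}:{password}".encode()).decode()
    req = Request(url, data=b"", headers={"Authorization": "Basic " + token})
    with urlopen(req) as resp:
        return resp.read()

# Build, then download the zip with an authenticated GET (example package):
# post(command_url("http://localhost:4502", "my_packages", "content-dump.zip", "build"))
# ...then fetch http://localhost:4502/etc/packages/my_packages/content-dump.zip
```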
Windows seems to put R libraries in a OneDrive directory by default if OneDrive is in use. This is undesirable, especially if you're using both R and OneDrive on multiple computers with the same OneDrive account.
How would I set my library to be put inside C:\Users\<username>\Documents instead of C:\Users\<username>\OneDrive\Documents? There are good solutions here (How do I change the default library path for R packages), but they're mostly focused on solving this for a single Windows account. Is there a general way to solve it for all accounts?
Every R installation has an etc/ directory for configuration; in it you can set Rprofile.site or, easier still, Renviron.site.
Files ending in .site should not get overwritten on the next install. Just make sure you don't bulk-delete them.
You can query where it is via R.home("etc"). On my (Linux) system:
> R.home("etc")
[1] "/usr/lib/R/etc"
>
Really excellent solution from here (https://github.com/r-windows/docs/issues/3):
just create an Renviron.site file in the etc/ folder of your R installation, then add the following line to it:
R_USER=C:/Users/${USERNAME}/Documents
This sets R_USER, which in turn sets R_LIBS_USER according to each account's user directory under Windows 10.
I have a Node-RED flow. It uses a sqlite node. I am using node-red-node-sqlite. My OS is Windows 10.
My SQL database is configured with just the name "db":
My question is, where is located the sqlite database file?
I already searched in the following places, but didn't find it:
C:\Users\user\AppData\Roaming\npm\node_modules\node-red
C:\Users\user\.node-red
Thanks in advance.
Edit
I am also using pm2 with pm2-windows-service to start Node-RED.
If you don't specify a full path to the file in the Database field, it will create the file in the current working directory of the process, which will be wherever you ran node-red or npm start. Note that since you are starting Node-RED via pm2-windows-service, the working directory is whatever pm2 and the service wrapper set it to, which may not be obvious.
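You can see the same behaviour with any SQLite client. For example, in Python a relative database name lands in the process's current working directory (a scratch directory is used here just for the demo):

```python
import os
import sqlite3
import tempfile

os.chdir(tempfile.mkdtemp())      # stand-in for "wherever Node-RED was started"
conn = sqlite3.connect("db")      # relative name, like the Database field "db"
conn.execute("CREATE TABLE IF NOT EXISTS t (x INTEGER)")
conn.close()

print(os.path.join(os.getcwd(), "db"))  # the file was created right here
```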
Use the full path, including the file name. That should work.
This isn't a valid answer, just a workaround for those who have the same problem.
I couldn't find my database file, but inside Node-RED everything worked just fine. So this is what I did as a workaround:
In Node-RED, add some select nodes to get all the data from the tables
Store the tables' values somewhere (in a .txt file or something like that)
Create your database outside Node-RED, somewhere like c:\sqlite\db.db, and check its read/write permissions
Create the tables and insert the values stored from the old database
In Node-RED, in the "Database" field, put the complete path to the database, for example c:\sqlite\db.db
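The export/re-import steps above can also be done in one go with Python's built-in sqlite3 module (a sketch; the paths are examples):

```python
import sqlite3

def copy_database(src_path, dst_path):
    """Dump schema and rows from the old database file and recreate
    them at a known location such as c:\\sqlite\\db.db."""
    src = sqlite3.connect(src_path)
    dst = sqlite3.connect(dst_path)
    dst.executescript("\n".join(src.iterdump()))  # SQL for tables + rows
    src.close()
    dst.close()

# copy_database("somewhere/old.db", r"c:\sqlite\db.db")
```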
In my case this was easy because I only had two databases with fewer than 10 rows each.
Hope this can help others.
Anyway, still waiting for a valid answer :)