Importing legacy tables into a Drupal database

I have a legacy database with tables that I would like to import into Drupal. Here's an example table structure:
Table : Projects
ProjectID
ProjectName
CountryID
TypeID
ProjectID is the primary key; CountryID and TypeID are foreign keys pointing to the Countries and Type tables, respectively.
I think I would make a Projects content type first and reflect the fields present in the legacy tables using CCK. My only problem is importing the data. Is there any way to automate this?
Thanks!

If you can get the data into CSV/TSV format, Node Import should do the trick, and is geared towards site maintainers rather than developers.

The Migrate module handles importing from tables. Migrate has hooks for doing more complex imports, but those hooks aren't very well documented, and you should be able to flatten your data enough to avoid them by creating a new table from a join of your existing tables. Something like this (untested):
CREATE TABLE combined AS
SELECT p.*, c.*, t.*
FROM Projects p
LEFT JOIN Countries c ON c.CountryID = p.CountryID
LEFT JOIN Type t ON t.TypeID = p.TypeID;
(You may need to list the columns explicitly instead of using *, since the joined tables repeat the CountryID and TypeID column names.)
If you do decide you want to keep things more separated, with countries and types in separate content types, a coworker of mine wrote a pretty good tutorial on using migrate hooks.

Node Import is fairly good if you just export the data as CSV and import the foreign keys first. It works with complex fields like node references.
Otherwise you can write a basic module that goes through the legacy database row by row and inserts the records as nodes. Roughly, using the Drupal node API:
$node = new stdClass();
$node->type = 'project';                                  // machine name of the content type
$node->title = $row['projectName'];
$node->field_country[0]['value'] = $row['country_name'];  // CCK fields are typically named field_*
node_save($node);                                         // populates $node->nid on success
drupal_set_message('Imported node ' . $node->nid);
Drupal supports multiple database connections (see db_set_active()), so you can switch between the legacy database and the Drupal database inside the module.

Related

SQLite Importer will overwrite my database when I load my application?

I have an Ionic App using SQLite. I don't have any problems with implementation.
The issue is that I need to import an SQL file using SQLitePorter to populate the database with configuration info.
But also, on the same database I have user info, so my question is:
Every time I start the app, will it import the SQL file, fill the database, and probably overwrite my user data too, since it is all in the same database?
I assume you can always initialize your tables using string queries inside your code; the problem is not that you are importing a .sql file, right?
According to https://www.sqlitetutorial.net/sqlite-create-table/, you always create a table with the IF NOT EXISTS switch. Writing a query like:
CREATE TABLE IF NOT EXISTS schema_name.table_name (
    column_1 data_type PRIMARY KEY
);
you let SQLite decide whether the table needs to be created, with no risk of overwriting an existing one. You can trust that SQLite is smart enough not to overwrite any information, especially if you wrap the import in a 'BEGIN TRANSACTION' - 'COMMIT' block.
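For illustration, a minimal idempotent seed script along those lines (the table and values here are hypothetical):
BEGIN TRANSACTION;
CREATE TABLE IF NOT EXISTS app_config (
    key   TEXT PRIMARY KEY,
    value TEXT
);
-- INSERT OR IGNORE skips rows whose primary key already exists,
-- so re-running this script leaves earlier data untouched
INSERT OR IGNORE INTO app_config (key, value) VALUES ('api_url', 'https://example.com');
COMMIT;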
I give my answer assuming that you keep imported data and user data in distinct tables, so you can control what you populate and what you don't. Is that right?
What I usually do is have an SQL file like this:
DROP TABLE IF EXISTS configuration_a;
DROP TABLE IF EXISTS configuration_b;
CREATE TABLE configuration_a (...);
INSERT INTO configuration_a (...);
CREATE TABLE configuration_b (...);
INSERT INTO configuration_b (...);
CREATE TABLE IF NOT EXISTS user_data (...);
This means that every time the app starts, I update the configuration data I have at that time (which is why we use http.get to fetch an updated configuration file from a remote repo in the future), and create the user data table only if it is not there (hopefully only on the initial start).
Conclusion: in my opinion, it's always good practice to trust the database product and let it handle any transaction that would be risky to implement yourself in your code, since it provides tools for exactly that. For example, the IF NOT EXISTS keyword is always safer than implementing a table checker yourself.
I hope that helps.
PS: In case you are referring to the create-database procedure: SQLite connects to a database file, and if the file doesn't exist, it creates it. For someone comfortable with the sqlite command line, typing
sqlite3 /home/user/db/configuration.db
will connect you to this db and, if the file is not there, will create it.

Rails - Create and operate on a temporary table?

My background is in data science with R, but in my current position I'm pulling data through Rails and ActiveRecord. I want to perform transformations to my data and create new columns and save it in a temporary way that allows me to continue querying it like a regular table, but without actually making changes to the database.
In R, this might look something like:
new_table <- old_table[old_table$date >= '2020-01-01', ]
new_table$average <- mean(new_table$value)
I would take this new_table and perform any number of queries I could have done to the old_table, and once I close my app I expect this temporary table to be removed as well.
This particular transformation is simple and wouldn't require a new table, but for example, there are a number of tables I'd like to join with my new_table. It would be easier if I could perform my transformations once and then join it, rather than joining the old_table and performing the transformation each time.
Since your question is somewhat open-ended, I'll give a general answer that might not fit your use case, but it's a best guess at this point. There are numerous ways to use the DB connection in Rails to query directly, as referenced in the link in my comments above. But as an experiment I wanted to see if this would work, and it does, at least with a project that is using Postgres. I wanted it to be DB-agnostic, so I'm avoiding calling the DB connection directly.
First create a temporary class in the Rails console:
rails c
Loading development environment (Rails...
class MyTempTable < ActiveRecord::Base
end
=> nil
EDIT:
In addition to the method below, you can also do this to create the table:
MyTempTable.find_by_sql('create temp table my_temp_tables AS select...')
This will create the temp table directly from a query. You could then use a join statement if you wanted data from more than one table in the new temp table, and you can add any additional columns you want
End Edit
Now you have a class that will act like a table with the usual ActiveRecord methods. Rails now assumes there is a table in the DB called my_temp_tables (must be plural). You can then create a temp table (if your DBMS supports temp tables) like this:
MyTempTable.find_by_sql('create temp table my_temp_tables(col1, col2... ')
Now you have a temp table with the columns you want. You can then do SQL operations using
MyTempTable.find_by_sql('INSERT INTO my_temp_tables SELECT * FROM ....')
You can then treat MyTempTable like any other model in Rails. If you wanted all the columns from one table joined with some columns from another, you can create the temp table as above; you just have to create all the columns first (at least in Postgres; in MSSQL you can probably create the temp table by inserting directly from a select => join statement). If you are new to Rails you can grab column names by doing this on existing tables:
some_columns = SomeTable.column_names
=> ["id", "name", "serial", "purchased", ...]
Now you have an array of the column names so you don't have to type all of them. You can list out the columns you want from the various tables, cut and paste them into the create temp table... statement, then INSERT the joined data into MyTempTable, as in the sketch below.
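To make that concrete, here is a minimal Postgres-flavored sketch of the create-then-insert step (all table and column names here are hypothetical):
CREATE TEMP TABLE my_temp_tables (
    id    integer,
    name  text,
    total numeric
);
-- populate the temp table from a join, filtering and aggregating as needed
INSERT INTO my_temp_tables
SELECT c.id, c.name, SUM(o.total)
FROM customers c
LEFT JOIN orders o ON o.customer_id = c.id
WHERE c.created_at >= '2020-01-01'
GROUP BY c.id, c.name;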
If you do much of this regularly you'll probably want to keep a listing of all your column names in a text file. You can also create Rake tasks that do all of this and save the data to some format, or send it off to wherever it is supposed to go. That way you can have it all in a file that you can just run; it will create the temp tables and do the work, and when it closes out, the temporary classes and tables go away.
You might want to investigate some Ruby Gems, there are probably existing gems that do some of what you want. But as a proof of concept this works. You could also spin up a local Rails app and use scripting to import the data you want into tables, then just flush and recreate it at will.
Any Rails gurus that know of a better way, please add an answer or edit this one. This is mostly a thought experiment for me since I wanted to see if it was possible.
If you want to create views that you can access later on you could use a gem like https://github.com/scenic-views/scenic
Or something like this might be of interest: https://github.com/igorkasyanchuk/rails_db
Sounds like you're keen on the benefits of having some structure and tools available to work on the data, but don't want the data persisted in a db table.
Maybe use a model without a table like this.

Oracle 11g data pump 10 column limit

I am using an Oracle data pump to do a schema "rename." There is a primary key column on all (2000) tables. For example, I need to run this on all tables:
update mytable set mykey='foo2' where mykey='foo';
I would use the remap_data option of expdp to do this. The problem is that on some tables I would need to apply the remap to 10+ columns. Has anyone had a problem like this and found a way to handle it?
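For context, remap_data pairs each column with a PL/SQL function that rewrites its values. A minimal sketch (the package, function, and schema names here are hypothetical):
CREATE OR REPLACE PACKAGE remap_pkg AS
  FUNCTION remap_key (p_value IN VARCHAR2) RETURN VARCHAR2;
END remap_pkg;
/
CREATE OR REPLACE PACKAGE BODY remap_pkg AS
  FUNCTION remap_key (p_value IN VARCHAR2) RETURN VARCHAR2 IS
  BEGIN
    -- rewrite 'foo' keys to 'foo2'; pass every other value through unchanged
    RETURN CASE WHEN p_value = 'foo' THEN 'foo2' ELSE p_value END;
  END remap_key;
END remap_pkg;
/
-- then one REMAP_DATA entry per column on the expdp command line:
-- expdp ... REMAP_DATA=myschema.mytable.mykey:myschema.remap_pkg.remap_key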
Previously, I had tried using "Create Table As." The problem would be having to recreate the schema structure for all of the tables (views/triggers/grants/indexes/constraints). I am aware of the DBMS_METADATA.GET_DDL package. Offhand, doing a diff of the database schema before and after and recreating the diffs seems ugly.
I have also tried doing inserts on the table without any constraints or indexes, so I would only have to re-enable constraints and recreate the indexes, but I would like to try something faster.
I am using Oracle 11.2.0.3.0.
If I understand correctly, your real problem (or goal) is to RENAME a schema.
You chose to export/import using Oracle Data Pump, importing under a different name to achieve the rename.
Then you DROP the old schema once it is redundant.
If this is correct, here are the steps you can take to achieve your goal. I did it successfully on my DEV env. All objects (including PKs and FKs) were imported successfully.
-- Export RMCORE_QA
expdp DIRECTORY=DMPDIR DUMPFILE=RMCORE_QA.dmp SCHEMAS='RMCORE_QA' LOGFILE=RMCORE_QA_EXP_DP.lst
-- Import as RMCORE_QA3
impdp DIRECTORY=DMPDIR DUMPFILE=RMCORE_QA.dmp REMAP_SCHEMA='RMCORE_QA:RMCORE_QA3' SCHEMAS='RMCORE_QA' LOGFILE=RMCORE_QA_IMP_DP.lst TRANSFORM=OID:N
You can also compare objects between schemas with:
SELECT object_name, status, object_type FROM dba_objects WHERE owner = 'RMCORE_QA'
MINUS
SELECT object_name, status, object_type FROM dba_objects WHERE owner = 'RMCORE_QA3';
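To catch objects that exist only in the new schema, it's worth running the reverse comparison as well:
SELECT object_name, status, object_type FROM dba_objects WHERE owner = 'RMCORE_QA3'
MINUS
SELECT object_name, status, object_type FROM dba_objects WHERE owner = 'RMCORE_QA';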
HTH. Let me know if I did not get your problem...

Database schema advice for storing form fields and field values

I've been tasked with creating an application that lets users enter data into a web form; the data will be saved and eventually used to populate PDF form fields.
I'm having trouble thinking of a good way to store the field values in a database, as the forms will be dynamic (based on the PDF fields).
In the app itself I will pass data around in a hash table (fieldname, fieldvalue), but I don't know the best way to convert the hash to DB values.
I'm using MS SQL Server 2000 and ASP.NET WebForms. Has anyone worked on something similar?
Have you considered using a document database here? This is just the sort of problem they solve a lot better than traditional RDBMS solutions. Personally, I'm a big fan of RavenDb. Another pretty decent option is CouchDb. I'd avoid MongoDb, as it really isn't a safe place for data in its current implementation.
Even if you can't use a document database, you can make SQL pretend to be one by setting up your tables to have some metadata in traditional columns alongside a payload field that holds serialized XML or JSON. This lets you search on metadata while staying out of EAV-land. EAV-land is a horrible place to be.
UPDATE
I'm not sure if a good guide exists, but the concept is pretty simple. The basic idea is to break out the parts you want to query on into "normal" columns in a table -- this lets you query in the standard manner. When you find the record(s) you want, you can then grab the CLOB and deserialize it as appropriate. In your case you would have a table that looked something like:
SurveyAnswers
Id INT IDENTITY
FormId INT
SubmittedBy VARCHAR(255)
SubmittedAt DATETIME
FormData TEXT
A few protips:
a) Use a text-based serialization routine. It gives you a fighting chance to fix data errors and really helps debugging.
b) For SQL 2000, you might want to consider breaking the CLOB (the TEXT field holding your payload data) into a separate table. It's been a long time since I used SQL 2000, but my recollection is that TEXT columns did bad things to tables.
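To illustrate the pattern, a hedged sketch of querying the hybrid table above (the filter values are made up):
SELECT Id, FormData
FROM SurveyAnswers
WHERE FormId = 42
  AND SubmittedAt >= '2010-01-01';
-- then deserialize FormData in application code once the rows are narrowed down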
The solution for what you're describing is called Entity Attribute Value (EAV), and this model can be a royal pain to deal with, so you should limit your usage of it as much as possible.
For example, if there are fields that are almost always in the forms (First Name, Last Name, Email, etc.), you should put them in a table as real columns.
The reason is that if you don't, sooner or later somebody is going to realize that they have these names and emails and ask you to build this query:
SELECT
    fname.Value fname,
    lname.Value lname,
    email.Value email,
    ....
FROM
    form f
    INNER JOIN formFields fname
        ON f.FormId = fname.FormId
        AND fname.AttributeName = 'fname'
    INNER JOIN formFields lname
        ON f.FormId = lname.FormId
        AND lname.AttributeName = 'lname'
    INNER JOIN formFields email
        ON f.FormId = email.FormId
        AND email.AttributeName = 'email'
    ....
when you could have written this
SELECT
common.fname,
common.lname,
common.email,
....
FROM
form f
INNER JOIN common c
on f.FormId = c.FormId
Also, get off of SQL 2000 as soon as you can, because you're really going to miss the UNPIVOT clause.
It's also probably not a bad idea to look at previous SO EAV questions to get an idea of the problems people have encountered in the past.
I'd suggest mirroring the same structure:
Form
-----
form_id
User
created
FormField
-------
formField_id
form_id
name
value
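A minimal SQL Server 2000-compatible sketch of that structure (the column types are assumptions):
CREATE TABLE Form (
    form_id INT IDENTITY PRIMARY KEY,
    [user]  VARCHAR(255),
    created DATETIME
);

CREATE TABLE FormField (
    formField_id INT IDENTITY PRIMARY KEY,
    form_id      INT REFERENCES Form (form_id),
    name         VARCHAR(255),
    value        TEXT
);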

Sqlite: Create a table and fill with information only once

I have an Appcelerator Titanium project (you don't need to be familiar with the platform to help me) that I am using to create an iOS app. I want a database with a table that is filled with several rows the first time the app runs, and then left alone after that.
I know you can create a table if it doesn't exist. Is there something similar for inserting data?
Thanks!
Expanding on MPelletier's response with some Titanium-specific code, you could do the following in your project:
var my_db = Ti.Database.open('nameofdb');
var my_result_set = my_db.execute('SELECT * FROM MyTable');
var records = my_result_set.rowCount;
The records variable will indicate whether or not you have data in your table and then you can act accordingly.
There are a couple of nice ORM-ish utilities out there for Titanium: TiStore and Joli are the two I've used. Both are inspired by ActiveRecord and can be helpful in reducing your DB-related code. They're on Github if you want to know more about them!
There's INSERT OR REPLACE if you want, but you might as well just check against the number of rows in the table:
SELECT COUNT(*) FROM MyTable;
Then decide whether to insert the seed rows based on that.
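If you'd rather keep the whole check in SQL, "insert only when the table is empty" can be expressed in one statement (the table and values here are hypothetical):
-- safe to run on every launch: inserts nothing once the table has rows
INSERT INTO MyTable (id, name)
SELECT 1, 'default'
WHERE NOT EXISTS (SELECT 1 FROM MyTable);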
You can create your DB and insert data into it via SQLiteManager or whatever tool you want, and then dump it.
Take the file you dumped, put it in your Titanium project folder (somewhere in the Resources folder).
Then this line of code will take the content of the .sqlite file and insert it in the iOS db:
var db = Ti.Database.install('../your-db.sqlite', 'your-db-name');
Just don't forget the CREATE TABLE IF NOT EXISTS statement in your SQL file.
Or use REPLACE instead of INSERT, as MPelletier said.
