I have installed Oracle XE 11g R2 on my machine. I ran a few scripts that set up our application by creating its schemas and procedures. Now I want to clone this database so that other people can use the cloned DBF files to get the base schema on their respective machines and work on their individual requirements on top of it.
It currently has 6 DBF files:
CONTROL.DBF
SYSAUX.DBF
SYSTEM.DBF
TEMP.DBF
UNDO.DBF
USER.DBF
Can I just give them the files, or do I need to create a server parameter file (SPFILE) or control file as well? What about the redo logs?
I have very little knowledge of database administration, so please advise. I understand that this is not Enterprise Edition, so not everything may be supported, but I am assuming the cloning process is similar for XE.
While it is possible to restore a database using the data files, I strongly suspect that is not what you're really after. If you're not an experienced DBA, the number of possible issues you'll encounter trying to restore a backup on a different machine and then create an appropriate database instance is rather large.
More likely, what you really want to do is generate a full export of your database. The other people that need your application would then install Oracle and import the export that you generated.
The simplest possible approach would be to run, at a command line:
exp / as sysdba full=y file=myDump.dmp
You would then send myDump.dmp to the other users, who would import it into their own databases:
imp / as sysdba full=y file=myDump.dmp
This will only be a logical backup of your database. It will not include things like the initialization parameters the database is configured with, so other users may end up with more (or less) memory, a different file layout, or even a slightly different version of Oracle. But it does not sound like you need that degree of cloning. If you have a large amount of data, the Data Pump versions of the export and import utilities would be more efficient. My guess, given that you haven't even created a new tablespace, is that you don't have enough data for this to be a concern.
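If you do go the Data Pump route, the equivalent export would look roughly like this (DATA_PUMP_DIR is a directory object that 11g normally creates by default; the credentials and file names here are placeholders):
expdp system/password full=y directory=DATA_PUMP_DIR dumpfile=myDump.dmp logfile=myExport.log
and the other users would then run:
impdp system/password full=y directory=DATA_PUMP_DIR dumpfile=myDump.dmp logfile=myImport.log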
For more information, consult the Oracle documentation on the export and import utilities.
I'm a beginner with Oracle and I want to know how I can reinstall Windows without losing the data already saved in my database.
Another related question is:
To avoid this in the future, can I install Oracle 11g on a partition other than the one that contains Windows?
You can use datapump export to create an export file of your data. This can then be re-imported into another database.
Well, it seems you have 2 partitions. Do a full export of your database (using exp or Data Pump) onto the other partition (let's say you called it drive D:). Then, when you reinstall Windows, you can import it again using imp or Data Pump.
You can also shut down your database and then copy all of your datafiles over to drive D:. Be careful not to miss the online redo log files, all of the datafiles, and your admin folder.
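A minimal sketch of that cold copy, assuming the default Windows XE layout (the destination folders below are just placeholders; run the queries first to see the real file locations on your machine):
sqlplus / as sysdba
SQL> select name from v$datafile;
SQL> select member from v$logfile;
SQL> select name from v$controlfile;
SQL> shutdown immediate;
SQL> exit
rem copy every file listed above, plus the admin folder, to drive D:
xcopy /E /I C:\oraclexe\app\oracle\oradata\XE D:\orabackup\oradata\XE
xcopy /E /I C:\oraclexe\app\oracle\admin\XE D:\orabackup\admin\XE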
Then you can safely format the first partition (drive C:) and reinstall windows on drive C:. You can then install oracle on drive D: and create your database files there too. Note however if you reinstall windows, the RDBMS software will not work any more but your database will be intact on drive D:
I'm working on a migration from Alfresco 4 to 5, and applying add-ons to Alfresco 4 for this purpose is not an option. The databases used by the two versions are different from each other. I have tried ACP files, and it is very time consuming. Is there a size limitation on ACP files? What other methods can be used?
Use Standard Upgrade Procedure
What is your main intention? "Just" doing an upgrade from 4 to 5?
In that case the robust, easy way would be to:
Install the required modules containing your custom models on your target system (or, if you customized models in the extension path, copy that configuration)
Back up and restore the Alfresco repo database to your new (5.x) system. If your target system uses a different db product (not just a different version), you need to manage the db migration using db-specific migration tools; Alfresco export/import is no alternative here.
Sync alf_data/contentstore to your new system (make sure the db dump is always older, or do an offline sync); see the sketch after this list.
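A rough sketch of the database restore and contentstore sync, assuming a PostgreSQL-backed repository and default alf_data locations (the user names, hosts, and paths are placeholders; use your db product's own dump/restore tools if it is not PostgreSQL):
# on the old (4.x) server: dump the repo database
pg_dump -U alfresco -Fc alfresco > alfresco_repo.dump
# copy the content store to the new (5.x) server
rsync -av /opt/alfresco-4/alf_data/contentstore/ newhost:/opt/alfresco-5/alf_data/contentstore/
# on the new (5.x) server: restore the database before the first startup
pg_restore -U alfresco -d alfresco --clean alfresco_repo.dump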
During startup, Alfresco recognizes that the repo needs to be upgraded and does everything itself. Check catalina.out for any output during the migration.
If you only need a subset of your previous system, it is much easier to delete the unwanted content afterwards (don't forget to purge the trash, and configure the cleaner job not to wait 14 days).
Some words concerning ACP
It is nice tooling for exporting single directories, but unfortunately it is limited:
no support across Alfresco versions (exactly your case)
no support for site metadata / no site export/import (it may work after the 4.x changes that put site metadata in nodes, but I suppose nobody has tested this)
must run in one transaction, so the hard limits depend on your hardware / JVM configuration, but I wouldn't recommend exporting/importing more than a few thousand nodes at once.
If you really need to export/import a huge number of documents, you should run the export/import tool in a separate Java process, which means your Alfresco needs to be shut down. See https://wiki.alfresco.com/wiki/Export_and_Import#Export_Tool
ACP does have a file limit (I can't remember the actual number), but we've had problems with ones below that limit too. We've given up on this approach in favor of the Alfresco bulk import tool.
One big advantage of this tool is that it can continue a failed import from the point of failure; there is no need to delete the partially imported batch and start all over again. It can also update files as needed, something the ACP method can't do (it would fail with DuplicateChildNameNotAllowed).
I have a quick question about Oracle Data Pump. I have a small Oracle 11gR2 database which contains several schemas (more than 8). I want to move this database to a new server, and I am trying to use the expdp/impdp method. I did a full export of the database as the SYSTEM user. The new server also runs 11gR2.
Now, if I create a new database with the same tablespaces on the new server, can I do a full import? Is this the recommended way to do it?
I know I can do it schema by schema, but that would require me to create the roles and other supporting objects first, and also to identify which schemas actually contain objects.
I don't think there's such a thing as a "recommended" way to do it. You might as well ask what's the "recommended" data to put in an Oracle database. It kind of depends on what your needs are.
The only problem I've had with a "full" import/export in the past is that the export then also includes SYSTEM (and other default Oracle schemas), which you don't really want overwriting your new database. (Actually, this used to cause me some problems with the old imp/exp commands -- but in theory it would be the same problem with Data Pump.)
Fortunately, Data Pump allows you to exclude certain objects from your export. When I do full exports, I tend to exclude all the schemas that are already created in the new database, at db-creation time. Include the following in your parameter file:
EXCLUDE=SCHEMA:" IN ('SYS','SYSTEM','WMSYS','OUTLN','MGMT_VIEW','XDB','ANONYMOUS','SYSMAN','ORDSYS','ORDPLUGINS','SI_INFORMTN_SCHEMA','MDSYS','EXFSYS','DBSNMP','DMSYS','CTXSYS','DIP','TSMSYS','ORACLE_OCM')"
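For example, if that EXCLUDE line lives in a parameter file named full_exp.par (the file name, directory object, and credentials below are placeholders), the export and import would look something like:
expdp system/password full=y directory=DATA_PUMP_DIR dumpfile=full.dmp logfile=full_exp.log parfile=full_exp.par
impdp system/password full=y directory=DATA_PUMP_DIR dumpfile=full.dmp logfile=full_imp.log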
I have been using a central MS SQL database located in the cloud to develop a web site project. I have recently found myself in situations where I need to develop without an internet connection. I want to begin using a locally available copy of the existing database, placed in the App_Data folder.
What is the correct set of steps I need to undertake to get the project to work with local DB?
For example:
Detach a db from an existing SQL instance.
Copy to a development machine.
etc.
Moving a SQL Server DB is not that hard. Look here for some methods to do it:
http://support.microsoft.com/kb/314546
I usually find the sp_detach_db + sp_attach_db method really easy.
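A minimal sketch of that approach (the database name and file path are placeholders; run the detach on the source instance and the attach on your local instance):
-- on the source instance
EXEC sp_detach_db 'MyWebSiteDb';
-- copy the .mdf/.ldf files to the development machine, then attach them locally
CREATE DATABASE MyWebSiteDb
    ON (FILENAME = 'C:\Projects\MySite\App_Data\MyWebSiteDb.mdf')
    FOR ATTACH;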
I would create an empty shell database locally, then use one of the many schema comparison solutions available to make the local database look exactly like the cloud database.
The correct way is to create and regularly update your standalone copy of the database using import/export. In particular, MS SQL Server provides the Import/Export Wizard for exactly this purpose.
Make a backup
Copy backup file
Restore on your server
Restore/organise users
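A rough sketch of those steps in T-SQL (the database name, logical file names, and paths are placeholders; the logical names can be checked with RESTORE FILELISTONLY):
-- on the cloud/source server
BACKUP DATABASE MyWebSiteDb TO DISK = 'C:\Backups\MyWebSiteDb.bak';
-- copy the .bak file down, then on the local server
RESTORE DATABASE MyWebSiteDb FROM DISK = 'C:\Backups\MyWebSiteDb.bak'
    WITH MOVE 'MyWebSiteDb' TO 'C:\Projects\MySite\App_Data\MyWebSiteDb.mdf',
         MOVE 'MyWebSiteDb_log' TO 'C:\Projects\MySite\App_Data\MyWebSiteDb_log.ldf';
-- afterwards, remap any orphaned users to local logins, e.g.
-- ALTER USER someuser WITH LOGIN = somelogin;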
I have been developing locally for some time and am now pushing everything to production. Of course, I was also adding data to the development server without considering that I hadn't yet reconfigured it to use Postgres.
Now I have a SQLite DB whose data I need to get into a Postgres DB on a remote VPS.
I have tried dumping to a .sql file but am getting a lot of syntax complaints from Postgres. What's the best way to do this?
For pretty much any conversion between two databases the options are:
1. Do a schema-only dump from the source database. Hand-convert it and load it into the target database. Then do a data-only dump from the source DB in the most compatible form of SQL dump it offers. Try loading that into the target DB. When you hit problems, script transformations of the dump using sed/awk/perl/whatever and try again. Repeat until it loads and the results match.
2. Like (1), hand-convert the schema. Then write a script in your preferred language that connects to both databases, SELECTs from one, and INSERTs into the other, possibly with some transformations of data types and representations.
3. Use an ETL tool like Talend or Pentaho to connect to both databases and convert between them. ETL tools are like a "somebody else already wrote it" version of (2), but they can take some learning.
4. Hope that you can find a pre-written conversion tool. Heroku has one called sequel that will work for SQLite -> PostgreSQL, if it's available without Heroku and able to function without all the other Heroku infrastructure and code.
After any of those, some post-transfer steps, like using setval() to initialize sequences, are typically required.
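For example, for a table named tweets with a serial id column (the table and column names are just illustrative), the sequence can be re-synchronized after the data load with:
SELECT setval(pg_get_serial_sequence('tweets', 'id'), COALESCE(MAX(id), 1)) FROM tweets;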
Heroku's database conversion tool is called sequel. Here are the ruby gems you need:
gem install sequel
gem install sqlite3
gem install pg
Then this worked for me for a sqlite database file named 'tweets.db' in the current working directory:
sequel -C sqlite://tweets.db postgres://pgusername:pgpassword@localhost/pgdatabasename
PostgreSQL supports "foreign data wrappers", which allow you to directly access other data sources from within the DB, including SQLite, even up to automatically importing the schema. You can then use create table localtbl as (select * from remotetbl) to get your data into actual PG storage.
https://wiki.postgresql.org/wiki/Foreign_data_wrappers
https://github.com/pgspider/sqlite_fdw
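A rough sketch with sqlite_fdw (the paths and column list are placeholders, and the exact option names may vary with the sqlite_fdw version you install):
CREATE EXTENSION sqlite_fdw;
CREATE SERVER sqlite_srv FOREIGN DATA WRAPPER sqlite_fdw
    OPTIONS (database '/path/to/tweets.db');
-- declare the SQLite table you want to pull from (columns are illustrative)
CREATE FOREIGN TABLE tweets_remote (id integer, body text)
    SERVER sqlite_srv OPTIONS (table 'tweets');
-- copy the rows into a regular PostgreSQL table
CREATE TABLE tweets AS SELECT * FROM tweets_remote;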