How can I possibly replicate data from Postgres version 9.1 to 14 using WAL log shipping - postgresql-9.1

I have the following painful scenario:

1. I have a system running a standalone PostgreSQL 9.1 database at a client site in Europe - I cannot upgrade the db version.
2. It is running file-based offline WAL archive replication, which produces replication files like 0000000100000009000000F1 that I FTP to the States.
3. I also have a database system running PostgreSQL 9.1 in the States, which I want to replicate the Europe data into.
4. My stateside production DB is running PostgreSQL 14 - this is where the data eventually needs to reside.

The db in step 3 was set up because I cannot replicate from 9.1 to 14 - at least as far as I know, it cannot be done.

A. Is there some way I can jump from step 2 to step 4 (pg 9.1 to 14) without step 3?
B. If not, is there a less painful way to get the data from step 3 to step 4 without exporting and importing data? There will be PK collisions otherwise.
Thank you.
Osupa
I thought I could get the actual database DDL statements from the shipped WAL files and apply those manually to pg 14 - I had done something like this in MS SQL Server many moons ago. No dice.
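As far as I know, A is off the table: WAL is a physical, block-level format tied to a major version, so a pg 14 server cannot replay 9.1 WAL segments, and the segments do not contain replayable SQL. For B, one way to avoid a flat export/import (and to fix the PK collisions in the same pass) is to pull the rows from the pg 14 side through postgres_fdw and remap the keys as you copy. A minimal sketch, assuming the step 3 server is reachable from the pg 14 host and that postgres_fdw's cross-version support still stretches back to 9.1 (I believe it does, but verify); every host, table, and column name below is made up:

    -- On the pg 14 database (step 4). All names here are illustrative.
    CREATE EXTENSION IF NOT EXISTS postgres_fdw;

    CREATE SERVER europe_copy FOREIGN DATA WRAPPER postgres_fdw
        OPTIONS (host 'step3-host.example.com', port '5432', dbname 'clientdb');

    CREATE USER MAPPING FOR CURRENT_USER SERVER europe_copy
        OPTIONS (user 'readonly', password 'secret');

    -- Declare one foreign table per table you need to copy
    -- (IMPORT FOREIGN SCHEMA is easier but may not work against 9.1).
    CREATE FOREIGN TABLE europe_orders (
        id     integer,
        placed timestamptz,
        total  numeric
    ) SERVER europe_copy OPTIONS (schema_name 'public', table_name 'orders');

    -- Copy the rows, shifting keys by a fixed offset so they cannot
    -- collide with locally generated ids (the offset is illustrative).
    INSERT INTO orders (id, placed, total)
    SELECT id + 1000000000, placed, total
    FROM europe_orders;

Note that if the step 3 database is a standby still replaying the shipped WAL, it must be running in hot standby mode (available since 9.0) to be queryable; otherwise run the copy against a promoted copy of it.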

Related

MariaDB LOAD XML much slower than MySQL

I am testing MariaDB for a possible replacement of a MySQL data warehouse. This data warehouse is rebuilt nightly from a legacy database.
The basic process is to generate XML documents/files from the legacy database and then use DROP TABLE, create the table from DDL, and LOAD XML LOCAL INFILE 'xml file'. A few of the XML files are 60-100 megabytes (about 300K rows). On MySQL these tables take a couple of minutes; on MariaDB they take significantly longer (e.g. an 83 megabyte XML file requires 16 minutes on MariaDB versus less than 1 minute on MySQL), and load times seem to grow much faster than linearly with file size.
I have read and followed the KB topic How to Quickly Insert Data Into MariaDB and have tried the suggestions there with no real change. Since the MariaDB tables are dropped and recreated immediately before the LOAD XML LOCAL INFILE, several performance improvements should be triggered.
I have not tried LOCK TABLE yet.
What can I try to improve performance? I don't want to return to MySQL, but this issue is a deal breaker.
Environment is RHEL 8, MariaDB 10.5.16
Used DISABLE KEYS/LOAD.../ENABLE KEYS with no apparent benefit.
Increased max_allowed_packet with no effect.
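Not a root-cause answer, but for completeness, this is the consolidated shape of the load session I would test, with all the usual switches turned off in one run. A minimal sketch; the table and file path are made up, and each setting is worth re-checking against the MariaDB 10.5 docs:

    -- Illustrative table and path; the real DDL comes from the nightly job.
    SET SESSION autocommit = 0;
    SET SESSION unique_checks = 0;
    SET SESSION foreign_key_checks = 0;

    DROP TABLE IF EXISTS warehouse_fact;
    CREATE TABLE warehouse_fact (
        id   INT PRIMARY KEY,
        name VARCHAR(100),
        qty  INT
    ) ENGINE=InnoDB;

    LOAD XML LOCAL INFILE '/data/warehouse_fact.xml'
        INTO TABLE warehouse_fact
        ROWS IDENTIFIED BY '<row>';

    COMMIT;
    SET SESSION unique_checks = 1;
    SET SESSION foreign_key_checks = 1;

If that still shows the 16-minute behaviour, I would compare innodb_buffer_pool_size and innodb_log_file_size between the MySQL and MariaDB servers next, since LOAD XML ultimately lands in InnoDB like any other bulk insert.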

Saving veins data in a SQLite database

Good morning everybody.
I'm using Veins 4.4 and OMNeT++ 4.6.
Is it possible to save data collected through a Veins simulation (i.e. WaveShortMessage fields) in my SQLite DB?
thanks in advance
BR
SQLite support for OMNeT++ 5.1 is a work in progress. There will be a preview release before the holidays, so if you can port your code to OMNeT++ 5.1 (PRE2), you will be able to configure an SQLite vector manager (instead of the current text-based one) which will write the vector data out to an SQLite database. A preliminary version was presented at the 2016 OMNeT++ Summit: https://summit.omnetpp.org/archive/2016/assets/pdf/OMNET-2016-Session_3-03-Presentation.pdf
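Until that ships, one workaround on OMNeT++ 4.6 / Veins 4.4 is to write the rows yourself from your application layer (e.g. wherever you handle a received WaveShortMessage), for instance through the sqlite3 C API. A hypothetical sketch of the schema side; the column names loosely mirror a few WaveShortMessage fields and should be adjusted to your actual .msg definition:

    -- Hypothetical log table for received WSMs.
    CREATE TABLE IF NOT EXISTS wsm_log (
        sim_time    REAL,     -- simTime() at reception
        sender_addr INTEGER,  -- senderAddress field
        serial      INTEGER,  -- serial field
        psid        INTEGER,  -- psid field
        wsm_data    TEXT      -- wsmData payload
    );

    -- Example row as an application-layer handler would insert it (values made up).
    INSERT INTO wsm_log (sim_time, sender_addr, serial, psid, wsm_data)
    VALUES (12.345, 17, 0, 32, 'demo payload');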

LiveCycle table growing out of control

Product: LiveCycle V. 2.5
Database: SQL Server 2008
OS: Windows 2008R2
Host: JBoss 4.2
I am submitting files to the watch folder service to compose PDFs through DDX. The process runs fine, but I am now getting errors that the table EDCATTRIBUTEVALUEENTITY is out of space, and when I checked, the database is around 300 GB. I am not storing any documents in the database as far as I know.
How should I go about purging that table? I tried the purge tool, but all my jobs have a status of "11", which according to the documentation isn't even a valid status.
Are you sure that it's the edcattributevalueentity table that's eating up all of the space? Is your GDS stored in the database?
You might want to take a look at http://michael.omnicypher.com/2013/03/livecycle-gds-demystified.html to ensure that it's not the GDS growing out of control.
Let us know how you make out with that article to start.
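Before purging anything, it is worth confirming where the 300 GB actually is. This standard SQL Server query (nothing LiveCycle-specific in it) lists per-table reserved space, largest first:

    SELECT t.name AS table_name,
           SUM(p.reserved_page_count) * 8 / 1024 AS reserved_mb
    FROM sys.dm_db_partition_stats AS p
    JOIN sys.tables AS t ON t.object_id = p.object_id
    GROUP BY t.name
    ORDER BY reserved_mb DESC;

If a LOB-heavy table tops the list, that points towards the GDS living in the database, as the linked article discusses.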

Copying company data from one server to another, Ax 2009

We will soon be migrating company data from our pre-production server to production with Microsoft Ax 2009. I believe there are two ways to do this and would like to know of any issues that might arise with the second. This is migrating data across servers, not a simple case of duplicating company data on the same Ax 2009 instance.
1. Use the Import/Export functionality: Administration -> Periodic -> Data Export/Import, create a new definition group and then export it. Recreate the same definition group on the second server, then import. Here's an example of how to do it for Ax 2012.
2. Export/back up the SQL Server database from the first server and restore it over the second. We would then merely eliminate our test and template company data from the second server. The Ax servers need to have the same patches and development layers installed first. I note that we would need to edit the SERVERID value within the SYSSERVERCONFIG table afterwards.
Thanks
The Backup/Restore way to do it is at least 10 times faster.
There are some issues to be aware of though:
- Ensure that references to the file systems are correct (e.g. Document parameters)
- Server configuration setup
- The UAC cache file issue described here
Update 1:
For AX 2012 you will also need the same model store on the production server; this can be accomplished by using database backup/restore or by using export/import of the object store as described here.
Update 2:
Update 1 is partly rubbish: both data and model store are copied in the SQL backup/restore. No need to synchronize the database afterwards! But it may come as a surprise.
What about this solution? How to copy an AOS instance from one server to another
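Regarding the SERVERID edit mentioned in option 2: after restoring onto the target SQL Server, it is a one-row fix-up. A sketch with made-up instance and machine names (the value format is typically '<instance>@<machine>'; check your actual row before updating):

    -- Run against the restored AX database on the target SQL Server.
    UPDATE SYSSERVERCONFIG
    SET SERVERID = '01@PRODAOS01'
    WHERE SERVERID = '01@PREPRODAOS01';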

Oracle 11 g release 2 sample schema

I installed Oracle 11g Release 2 on a Windows 7 laptop and created a database using DBCA. On one screen there was an option to create the sample schemas, but it was greyed out, so I could not select it.
I searched the dbhome\demo\schema\human_resources directory, but there is only one file in it - hr_code.sql (it creates triggers in the HR schema). There is no SQL to create the HR schema or populate its tables. I searched the net for the scripts to download but have no idea where to get them. Can you help me?
You can download the scripts from here.
Are you referring to the ASP.Net Membership / Role / etc schema? It's in <oraclehome><11g>\ASP.NET\SQL
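For reference once the scripts are downloaded: the HR schema is normally installed by running hr_main.sql (which calls the create and populate scripts, plus hr_code.sql at the end) from SQL*Plus as a privileged user. A sketch; the prompt answers shown are illustrative:

    -- From SQL*Plus connected AS SYSDBA; ? expands to ORACLE_HOME.
    @?/demo/schema/human_resources/hr_main.sql
    -- The script prompts for, roughly: the HR password, a default tablespace
    -- (e.g. USERS), a temporary tablespace (e.g. TEMP), the SYS password,
    -- and a directory for log files (e.g. $ORACLE_HOME/demo/schema/log/).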
