record data type as an input in postgres functions - postgresql-9.1

How can I take a record data type as an input to a Postgres function?
If a record data type cannot be taken as an input, is there any other way to do it?
Suppose I want to save employee details such as emp_id, emp_name, emp_address, and emp_contactno into an employee table. I want to take these values from user input, and I want to do it for multiple employees in a single function call, so the input to the function needs to be a list of employee records.
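One possible approach (a sketch, assuming a hypothetical employee table with the columns mentioned in the question) is to declare the function parameter as an array of the table's composite type, which PostgreSQL 9.1 supports, and unnest it inside the function:

```sql
-- Hypothetical employee table matching the columns in the question.
CREATE TABLE employee (
    emp_id        integer PRIMARY KEY,
    emp_name      text,
    emp_address   text,
    emp_contactno text
);

-- Every table automatically has a composite type of the same name,
-- so an array of that type can serve as a "list of records".
CREATE OR REPLACE FUNCTION save_employees(emps employee[])
RETURNS void AS $$
BEGIN
    INSERT INTO employee
    SELECT * FROM unnest(emps);
END;
$$ LANGUAGE plpgsql;

-- A single call inserting multiple employees:
SELECT save_employees(ARRAY[
    ROW(1, 'Alice', 'Street 1', '555-0101')::employee,
    ROW(2, 'Bob',   'Street 2', '555-0102')::employee
]);
```

A client application would typically build the array parameter instead of writing ROW constructors by hand; the function names here are illustrative, not standard.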

Related

How to store data in firebase without typing the key values again and again?

Currently I'm building a Firebase database, and I'm new to Firebase. When inserting data, we have to type key-value pairs, right? But I have some data that requires the same keys for the whole table, like "name", "city", etc.
How can I save those keys in Firebase so that I don't have to type them every time a new record is inserted?
There are no shortcuts for entering data in the console. It sounds like you should probably write a program to help make it easier to enter data.

Oracle APEX validate Input during Data Load with Transformation Rule

In APEX, when performing a Data Load (e.g. upload of a csv file into APEX application), is it possible to validate input data using a transformation rule?
For example, suppose you upload data about the cars that have been sold this month.
The target table has the column car_manufacturer and num_car_sold.
The column car_manufacturer must accept only three values, say ('A1', 'A2', 'A3').
In pseudo-PL/SQL, just to give an idea:
IF :car_manufacturer IN ('A1', 'A2', 'A3') THEN :car_manufacturer ELSE <error>
How can I check this in the upload phase? Is it possible to use a transformation rule, in order that if it fails, it returns an error message? Other ways?
Thanks in advance.
You could put a constraint on the table definition as per the other answer, or if you only want the error message for when the Data Load is used, you can use a Table Lookup.
Go to Shared Components -> Data Load Definitions
Open the Data Load Definition that you want to edit
Create Table Lookup
Select the column (e.g. car_manufacturer)
Set the Table Lookup attributes to a table that contains the list of valid values (you'll need either a table or a view for this)
Leave Insert New Value set to No (If set to 'No' (the default) then a new record will not be created in the lookup table if the lookup column value(s) entered do not already exist. If set to 'Yes' then a record will be created in the lookup table using the upload column(s) and the Upload Key Column will be retrieved from the newly created record.)
Set Error Message to the message you want to return if a match is not found.
How about having a check constraint on the table for the column "car_manufacturer"?
ALTER TABLE TABLE_NAME
ADD CONSTRAINT CHECK_CAR_MANUFACTURER
CHECK ( CAR_MANUFACTURER in ('A1', 'A2', 'A3'));

How to restrict loading data based on some criteria using sql loader in oracle?

I have a data file (.csv) which contains 10 lakh (1,000,000) records. I am uploading the file data into my table TBL_UPLOADED_DATA using Oracle SQL*Loader and the control file concept.
I am able to upload all the data from the file to the table smoothly without any issues.
Now my requirement is that I want to upload only relevant data based on some criteria.
For example, I have a table EMPLOYEE with columns EMPID, EMPNAME, REMARKS, and EMPSTATUS,
and I have a data file with employee data that I need to upload into the EMPLOYEE table.
Here I want to restrict some data so that it is not uploaded into the EMPLOYEE table by SQL*Loader. Assume the restriction criteria are that REMARKS should not contain 'NO' and EMPSTATUS should not contain '00'.
How can I implement this? Please suggest what changes need to be made in the control file.
You can use the WHEN syntax to choose to include or exclude a record based on some logic, but you can only use the =, != and <> operators, so it won't do quite what you need. If your status field is two characters then you can enforce that part with:
...
INTO TABLE employee
WHEN (EMPSTATUS != '00')
FIELDS ...
... and then a record with 00 as the last field will be rejected, with the log showing something like:
1 Row not loaded because all WHEN clauses were failed.
And you could use the same method to reject a record where the remarks are just 'NO' - where that is the entire content of the field:
WHEN (REMARKS != 'NO') AND (EMPSTATUS != '00')
... but not where it is a longer value that contains NO. It isn't entirely clear which you want.
But you can't use like or a function like instr or a regular expression to be more selective. If you need something more advanced you'll need to load the data into a staging table, or use an external table instead of SQL*Loader, and then selectively insert into your real table based on those conditions.
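The staging-table route could look roughly like this (a sketch; the staging table name is assumed, and the column names come from the question):

```sql
-- Load the whole file into a staging table with SQL*Loader first,
-- then copy across only the rows that pass the richer criteria.
INSERT INTO employee (empid, empname, remarks, empstatus)
SELECT empid, empname, remarks, empstatus
FROM   tbl_staging
WHERE  INSTR(remarks, 'NO') = 0   -- REMARKS must not contain 'NO'
AND    empstatus <> '00';         -- EMPSTATUS must not be '00'
```

Unlike the control file's WHEN clause, an ordinary INSERT ... SELECT can use INSTR, LIKE, or regular expressions freely.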

The field is too small to accept the amount of data you attempted to add

This is odd because I'm not inserting data, I'm pulling data with a query.
I'm trying to run
SELECT DISTINCT description FROM products;
Which outputs the error "The field is too small to accept the amount of data you attempted to add.".
However, running the following doesn't produce the error:
SELECT description FROM products;
So I'm confused as to what the issue would be.
I'm using OleDbDataReader and taking data out of an mdb database file.
This might be related to: http://support.microsoft.com/kb/896950/us
This problem occurs because when you set the UniqueValues query property to Yes, a DISTINCT keyword is added to the resulting SQL statement. The DISTINCT keyword directs Access to perform a comparison between records. When Access performs a comparison between two Memo fields, Access treats the fields as Text fields that have a 255-character limit. Sometimes Memo field data that is larger than 255 characters will generate the error message that is mentioned in the "Symptoms" section. Sometimes only 255 characters are returned from the Memo field.
Workaround:
To work around this problem, modify the original query by removing the Memo field. Then, create a second query that is based on both the table and the original query. This new query uses all the fields from the original query, and this new query uses the Memo field from the table. When you run the second query, the first query runs. Then, this data is used to run the second query. This behavior returns the Memo field data based on the returned data of the first query.
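The KB article's two-query pattern could look like this in Access SQL (a sketch; the key column product_id and the saved query name are assumptions, since the question doesn't show the table structure). Note it yields one description per distinct product rather than distinct description text, because the Memo column itself cannot safely be compared:

```sql
-- Query 1 (saved as qryProductIds): DISTINCT runs only on non-Memo fields.
SELECT DISTINCT p.product_id
FROM products AS p;

-- Query 2: pull the Memo field from the table, based on query 1,
-- so Access never compares Memo values and never truncates them.
SELECT p.description
FROM qryProductIds AS q
INNER JOIN products AS p ON p.product_id = q.product_id;
```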

MS Access, pass-through query with complex criteria. Criteria include SELECT statements and VBA functions

I currently have multiple queries that query data from a few tables linked through ODBC, and some temporary tables that are edited through the user interface. I have complex criteria in my queries such as:
SELECT * from ThingsData
WHERE (Thing In(SELECT Thing from ListOfThings) AND getThingFlag() = True);
In this case Thing is a field and ListOfThings is a temporary table that the user defines from the user interface. Basically, the user puts together a list of the field Thing that he/she wants to filter the data based on and I want to query only the data that matches the Thing values that the user adds to his/her list. Currently, the data I am querying is in the linked ODBC table, and the temp table ListOfThings is just a regular, local table and everything works peachy. I want to get rid of the linked table and use a pass through query instead. However, when i do that, unless the criteria is incredibly simplistic, i get an error:
"ODBC--Call Failed. Invalid object name ListOfThings."
If I dont have any criteria it works fine.
Long story short: in a pass-through query, how do I apply criteria that include SELECTs and functions from my modules, and basically filter the pass-through table based on data from my local tables?
What is at the other end of that ODBC link? In a pass-through query you will have to honor the syntax required by the database server, not Access syntax. I would first suspect that you can't have mixed case table names and I would try listofthings as the name.
If you have a tool that can be used to test queries directly against the database server, get the query working there and then simply cut and paste it into an Access pass-through query.
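One thing worth keeping in mind: the server cannot see Access's local tables or VBA functions, so the pass-through query's SQL has to arrive fully resolved. A sketch of what the final server-side text might look like, with the ListOfThings values already expanded (the literal values are placeholders; in practice they would be concatenated in from the local table via VBA before setting the QueryDef's SQL property):

```sql
-- Server-side SQL once local criteria are resolved on the Access side:
-- plain literal values, no local tables, no VBA function calls.
SELECT *
FROM ThingsData
WHERE Thing IN ('Widget', 'Gadget', 'Gizmo');  -- values copied from ListOfThings
```

The getThingFlag() check likewise has to be evaluated in VBA first; depending on its result, the code either runs the pass-through query or skips it.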
