I just want to design a new report in the Progress AppBuilder.
Example:
I just want to pass an employee number as an input parameter and get the respective employee details in an Excel sheet. How do I write the code for that in Progress? Please see the image.
Thanks,
Of course I've no idea how your db is actually organized, but something like the following will export all of the "employee" records matching the employee number passed to the procedure. The output will be a CSV file suitable for Excel to open. It should give you some ideas about the syntax that you might like to start with.
define variable employeeNum as integer no-undo.

update employeeNum.

run exportEmployees ( input employeeNum ).

procedure exportEmployees:

  define input parameter empNum as integer no-undo.
  define buffer employee for employee.

  /* write the matching records as comma-delimited rows */
  output to value( "employee.csv" ).
  for each employee no-lock where employee.employeeId = empNum:
    export delimiter "," employee.
  end.
  output close.

  return.

end.
A simple way to get data into Excel is to use EXPORT. For example:
OUTPUT TO file.csv.
FOR EACH Customer NO-LOCK:
EXPORT DELIMITER ";" Customer.
END.
OUTPUT CLOSE.
That will export without any formatting though.
Depending on your version of Excel, you can try com-handles to interact with it directly instead of opening a saved file.
http://knowledgebase.progress.com/articles/Article/21671
Your code would be similar to the example in that article, but you would change the /* Add Data */ section to something like the following.
FOR EACH EMP NO-LOCK
    WHERE EMP.EMPNO = INTEGER(EMPNO:SCREEN-VALUE):

    /* Add data */
    ASSIGN chWorksheet:Range("B1"):VALUE = EMP.ENAME
           chWorksheet:Range("B2"):VALUE = EMP.SAL.
END.
This assumes the fill-in is called "EMPNO" and your EMPNO field is an integer.
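If you are starting from scratch rather than from the KB example, a minimal, hedged sketch of driving Excel over COM from ABL might look like the following. The EMP, ENAME and SAL names are assumptions carried over from the fragment above; double-check the Excel automation calls against your Excel version.

DEFINE VARIABLE chExcel     AS COM-HANDLE NO-UNDO.
DEFINE VARIABLE chWorkbook  AS COM-HANDLE NO-UNDO.
DEFINE VARIABLE chWorksheet AS COM-HANDLE NO-UNDO.

/* start Excel and make it visible */
CREATE "Excel.Application" chExcel.
chExcel:Visible = TRUE.

/* add a workbook and grab the first worksheet */
chWorkbook  = chExcel:Workbooks:Add().
chWorksheet = chWorkbook:Worksheets:Item(1).

/* write some values (EMP, ENAME and SAL are assumed names) */
FIND FIRST EMP NO-LOCK NO-ERROR.
IF AVAILABLE EMP THEN
    ASSIGN chWorksheet:Range("B1"):VALUE = EMP.ENAME
           chWorksheet:Range("B2"):VALUE = EMP.SAL.

/* always release COM handles when you are done */
RELEASE OBJECT chWorksheet.
RELEASE OBJECT chWorkbook.
RELEASE OBJECT chExcel.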
Related
I'm working with Progress 4GL, and I'm dealing with quite a large table (130+ fields, 8,000+ rows).
In order to view that table, I've decided to dump its entries and import them into Excel. This seems quite easy: just use "Data Administration", menu "Admin", menu item "Dump data and definition", "Table contents" (file extension: *.d).
But there's a catch: in the resulting *.d file, the fields are separated by a space, but some fields themselves contain spaces, making the result "unimportable" in Excel. Therefore I'd like to separate the fields with a character that is certainly not used in that table (the pipe character ("|") is perfect for that).
Does anybody know how I can tell my general framework (OpenEdge Desktop, release 11.6, not any "Studio" IDE) to use another character as a separator in the *.d table contents dump?
Most probably this is done using an entry in the Progress.ini file (as in the answer to this question), but I don't know which entry I'm dealing with.
It would also be nice to have the column headers in that *.d table contents dump. Is that even possible?
Edit
Meanwhile I've found out that some of the character fields contain newlines, causing a single record's dump to be spread over several lines. This, of course, makes the whole thing unusable. Is there any way to keep each record on a single line in the *.d file?
Thanks in advance
My favorite data format is JSON; unfortunately Excel cannot read JSON out of the box, but it can read XML.
Unfortunately, the Progress ABL methods write-json and write-xml do not work on database buffers, so you need to create a temp-table like your database table, buffer-copy the records across, and then write-xml the temp-table.
def var ctable as char   no-undo initial "customer".
def var hq     as handle no-undo.
def var ht     as handle no-undo.
def var hbtt   as handle no-undo.
def var hbdb   as handle no-undo.

/* buffer for the database table */
create buffer hbdb for table ctable.

/* build a temp-table with the same schema */
create temp-table ht.
ht:create-like( hbdb ).
ht:temp-table-prepare( ctable ).
hbtt = ht:default-buffer-handle.

/* copy every record across */
create query hq.
hq:add-buffer( hbdb ).
hq:query-prepare( "for each " + hbdb:name ).
hq:query-open().
do while hq:get-next( no-lock ):
  hbtt:buffer-create().
  hbtt:buffer-copy( hbdb ).
end.

/* write the temp-table out as formatted XML */
ht:write-xml( "file", ctable + ".xml", true ).
Note that I am not cleaning up after myself.
You can add a filter to the query if necessary.
You can omit fields in the copy if necessary.
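Both of those, plus the cleanup, are small additions; a hedged sketch (custNum and someField below are hypothetical field names, substitute your own):

/* a filter is just extra text in the prepare string */
hq:query-prepare( "for each " + hbdb:name + " where custNum < 100" ).

/* omit fields by naming them in buffer-copy's except list */
hbtt:buffer-copy( hbdb, "someField" ).

/* delete the dynamic objects when done */
delete object hq.
delete object hbdb.
delete object ht.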
Check the Admin -> Export Data -> Text menu item in the Data Administration tool. You can select "All" for the "Fields to Export" option. The "Field Delimiter" can be left empty; choose "|" (or any character of your choice) as the "Field Separator". The only case where this might fail is when a field's value spans several lines.
Also, there is a KB article about this. It might help: http://knowledgebase.progress.com/articles/Article/P8426
You will have to write code to do this yourself. The EXPORT statement is your friend.
You can use something like the below. This is basically what the Data Dictionary does.
OUTPUT TO 'customer.psv'. // choose your filename
FOR EACH Customer NO-LOCK:
EXPORT DELIMITER '|' Customer.
END.
OUTPUT CLOSE.
If you wanted to write this generically/dynamically there's quite a bit more to do, and you'll probably end up using buffer and buffer-field handles and PUT UNFORMATTED, along the lines of the sketch below.
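A rough sketch, assuming a table of scalar fields only (array fields and values containing the delimiter would need extra handling):

define variable tbl as character no-undo initial "customer".
define variable hb  as handle    no-undo.
define variable hq  as handle    no-undo.
define variable hf  as handle    no-undo.
define variable i   as integer   no-undo.

create buffer hb for table tbl.
create query hq.
hq:add-buffer( hb ).
hq:query-prepare( "for each " + hb:name ).
hq:query-open().

output to value( tbl + ".psv" ).

/* header row with the field names */
do i = 1 to hb:num-fields:
  hf = hb:buffer-field( i ).
  put unformatted ( if i > 1 then "|" else "" ) + hf:name.
end.
put unformatted skip.

/* one pipe-separated line per record */
do while hq:get-next( no-lock ):
  do i = 1 to hb:num-fields:
    hf = hb:buffer-field( i ).
    put unformatted ( if i > 1 then "|" else "" ) + string( hf:buffer-value ).
  end.
  put unformatted skip.
end.

output close.
delete object hq.
delete object hb.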
I am trying to check whether a table exists before sending a SELECT query to that table.
The table name ends with a two-letter language code, and when I build the full table name from the user's language, I don't know whether the user's language is actually supported by my database and whether the table for that language really exists.
SELECT name FROM sqlite_master WHERE name = 'mytable_zz' OR name = 'mytable_en' ORDER BY ( name = 'mytable_zz' ) DESC LIMIT 1;
and then
SELECT * FROM table_name_returned_by_first_query;
I could have a first query to check the existence of the table like the one above, which returns mytable_zz if that table exists or mytable_en if it doesn't, and then make a second query using the result of the first as table name.
But I would rather have it all in one single query that would return the expected results from either the user's language table or the english one in case his language is not supported, without throwing a "table mytable_zz doesn't exist" error.
Does anyone know how I could handle this?
Is there a way to use the result of the first query as the table name in the second?
Edit: I don't have control over the database itself, which is generated automatically, and I don't want to get involved in a complex process of manually updating every new database that I get. Also, this query is called multiple times, and having to retrieve the result of a first query before launching a second one takes too long. I use plain-text queries that I send through a SQLite wrapper. I guess the simplest approach would be to check once and for all in my program whether the user's language is supported, store a string with either the user's language code or "en" if it is not supported, and use that string to compose my table name(s). I am going to pick that solution unless someone has a better idea.
Here is a simple MRE :
CREATE TABLE IF NOT EXISTS `lng_en` ( key TEXT, value TEXT );
CREATE TABLE IF NOT EXISTS `lng_fr` ( key TEXT, value TEXT );
INSERT INTO `lng_en` ( key , value ) VALUES ( 'question1', 'What is your name ?');
INSERT INTO `lng_fr` ( key , value ) VALUES ( 'question1', 'Quel est votre nom ?');
SELECT `value` FROM lng_%s WHERE `key` = 'question1';
where %s is to be replaced by the two-letter language code. This example will work if the provided code is 'en' or 'fr', but it will throw an error if the code is, e.g., 'zh'; in that case I would like the same result returned as with 'en'.
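For the check-once-and-store approach from the edit above, a minimal sketch in Python with the standard sqlite3 module (the actual wrapper in use is unknown, so treat this as illustrative only; "app.db" is a hypothetical file name):

import sqlite3

def language_table(conn, lang):
    """Return 'lng_<lang>' if that table exists, else fall back to 'lng_en'."""
    row = conn.execute(
        "SELECT name FROM sqlite_master WHERE type = 'table' AND name = ?",
        ("lng_" + lang,),
    ).fetchone()
    return "lng_" + lang if row else "lng_en"

conn = sqlite3.connect("app.db")
table = language_table(conn, "zh")  # checked once, reused afterwards
for key, value in conn.execute(
        "SELECT key, value FROM {0} WHERE key = 'question1'".format(table)):
    print(key, value)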
Not in SQL, without executing it dynamically. But if it is your front end that is running this SQL, then it doesn't matter so much. Because your table name came out of the DB, there isn't really any opportunity for SQL injection with it:
var tabName = db.ExecuteScalar("SELECT name FROM sqlite_master WHERE name = 'mytable_zz' OR name = 'mytable_en' ORDER BY ( name = 'mytable_zz' ) DESC LIMIT 1;")
var results = db.ExecuteQuery("SELECT * FROM " + tabName);
Yunnosch's comment is quite pertinent; you're essentially storing, in a table name, information that really should be in a column. You could consider making a single table and then a bunch of views like mytable_zz, defined as SELECT * FROM mytable WHERE lang = 'zz', and so on. You could also add INSTEAD OF triggers if you want to cater for a legacy app that you cannot change; the legacy app would select from / insert into the views thinking they are tables, while in reality your data sits in a single table and is easier to manage.
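A hedged sketch of that refactoring, reusing the names from the MRE above (the consolidated lng table and its lang column are assumptions):

-- one consolidated table instead of one table per language
CREATE TABLE lng ( lang TEXT, key TEXT, value TEXT );

-- per-language views keep the old names working for reads
CREATE VIEW lng_en AS SELECT key, value FROM lng WHERE lang = 'en';
CREATE VIEW lng_fr AS SELECT key, value FROM lng WHERE lang = 'fr';

-- an INSTEAD OF trigger lets a legacy app keep inserting into the "table"
CREATE TRIGGER lng_en_insert INSTEAD OF INSERT ON lng_en
BEGIN
    INSERT INTO lng ( lang, key, value ) VALUES ( 'en', NEW.key, NEW.value );
END;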
I've retrieved the field names of a table by giving the table name directly (INS_TEST is the name of my table). I used the _Field-Name and _Data-Type columns of the _Field system table to retrieve the field names and their data types.
I want to use the retrieved field names and insert field values into those fields.
FOR EACH _File WHERE _File-Name = "INS_TEST":
    FOR EACH _Field WHERE _File-Recid = RECID(_File):
        DISPLAY _Field._Field-Name.
        DISPLAY _Field._Data-Type.
        ASSIGN _File._File-Name._Field._Field-Name = 1 WHEN (_Field._Data-Type EQ "INTEGER").
    END.
END.
But the ASSIGN statement gives an error. Suggestions please!
The following procedure takes a table name, a field name and a character value as parameters. It will update the field in question in the first record of that table with the value provided.
Obviously you could write a more sophisticated WHERE clause and do other things to fancy it up for your specific needs.
procedure x:

  define input parameter tbl as character no-undo.
  define input parameter fld as character no-undo.
  define input parameter xyz as character no-undo.

  define variable qh as handle no-undo.
  define variable bh as handle no-undo.
  define variable fh as handle no-undo.

  /* dynamic buffer and query for the named table */
  create buffer bh for table tbl.
  create query qh.
  qh:set-buffers( bh ).
  qh:query-prepare( "for each " + tbl ).
  qh:query-open.

  /* update the named field in the first record */
  do transaction:
    qh:get-first( exclusive-lock ).
    fh = bh:buffer-field( fld ).
    display fh:buffer-value.
    fh:buffer-value = xyz.
  end.

  delete object bh.
  delete object qh.

  return.

end.
run x ( "customer", "name", "fred" ).
/* prove that we really did change it...
*/
find first customer no-lock.
display name.
Don't use the _Field table for this. _File and _Field (in fact, any table whose name starts with an underscore) are metaschema tables. They're your friends for learning dynamic programming, or for understanding how your schema is currently defined, but I strongly recommend AGAINST trying to manipulate them yourself. And it doesn't really sound like you're trying to do that anyway, if I understand you correctly.
So once you know the ins_test fields, you can do a static ASSIGN in a block (again, no need to query or cycle through the underscore tables in this case):
CREATE ins_test.
ASSIGN ins_test.field1 = value1
       ins_test.field2 = value2
       ins_test.field3 = value3 NO-ERROR.
IF ERROR-STATUS:ERROR THEN DO:
    /* Treat your error here */
END.
Or, if you're really looking into dynamic assigning (which is going to be harder, given that you're probably just starting out), you need to study dynamic queries and dynamic buffers first. You could then create the record through a handle to the buffer and assign the fields using the BUFFER-FIELD attribute to cycle through the field names, as in the procedure above.
Hope that helps.
I have an SQLite database I'm trying to read with QtSql.QSqlTableModel. The issue is that, via the setTable method, it won't read any table where a field name contains a ".".
As an example if I have table called MyTable with the column names
(ID, Name.First, Name.Last)
I can manually select it with the query
SELECT * FROM MyTable
or
SELECT "ID", "Name.First", "Name.Last" and all is ok
However, the QSqlTableModel won't use that query but will error out with "no such column Name.First Unable to execute statement."
When I dug a little deeper, I found that the SQLite driver in Qt rewrites the query as
SELECT "ID", "Name"."First", "Name"."Last" FROM MyTable
But this SELECT statement is wrong: it tries to grab columns from another table, "Name", whereas I want the column called "Name.First" in the table "MyTable".
I tried to circumvent this by subclassing the setTable method which worked for getting the data into the TableView:
def tableName(self):
    return self._tableName

def setTable(self, tableName):
    self.clear()
    self._tableName = tableName
    self.setQuery(QtSql.QSqlQuery("SELECT * FROM {0}".format(tableName), self.database()))
However, reimplementing the method in this fashion broke the method submitAll().
Inside the File Save method I have the following:
ok = self.tableModel.submitAll()
if not ok:
logging.error('Error %s' % self.tableModel.lastError().text())
logging.error('Error %s' % self.tableModel.query().lastQuery())
return False
This gives this log:
ERROR:root:Error near "SET": syntax error Unable to execute statement
ERROR:root:Error SELECT * FROM MyTable
But when I don't reimplement the setTable method, submitAll() works without errors.
So... How do I circumvent the "." in the Column name problem and also have the submitAll() work?
BTW: I agree that having "." in SQL field names is not a good idea, but this pairs up with another tool that generates the SQLite file in this manner, over which I have no control.
http://www.qtcentre.org/archive/index.php/t-7565.html
http://www.qtforum.org/article/11245/sqlite-how-to-insert-text-that-contains-character-in-field.html
Looks like you just need to call one or both of the functions below (sketched after the links) before sending the query to the database, in order to sanitize the input.
http://qt-project.org/doc/qt-4.8/qsqlquery.html#bindValue
http://qt-project.org/doc/qt-4.8/qsqlquery.html#prepare
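A hedged sketch of that suggestion with QSqlQuery.prepare and positional binding, reusing the table and column names from the question (PyQt4 and the db connection object are assumptions; whether this fully sidesteps QSqlTableModel's own identifier quoting is worth verifying):

from PyQt4 import QtSql  # assumed binding; adjust for PySide/PyQt5

# db is your open QSqlDatabase connection
query = QtSql.QSqlQuery(db)
# prepared statement: values are bound, never spliced into the SQL text
query.prepare('INSERT INTO MyTable ("ID", "Name.First", "Name.Last") '
              'VALUES (?, ?, ?)')
query.addBindValue(1)
query.addBindValue('John')
query.addBindValue('Doe')
if not query.exec_():
    print(query.lastError().text())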
http://xkcd.com/327/
:)
Hope that helps.
I have an SQLite database from which I want to extract a column of information with the datatype BLOB. I am trying this:
SELECT cast(data as TEXT) FROM content
This is obviously not working. The output is garbled text like this:
x��Uak�0�>�8�0Ff;I�.��.i%�A��s�M
The data in the content column is mostly text, but may also contain images (which I recognize could cause a problem if I cast as TEXT). I simply want to extract that data into a usable format. Any ideas?
You can use
SELECT hex(data) FROM content
or
SELECT quote(data) FROM content
The first will return a hex string (ABCD), the second quoted as an SQL literal (X'ABCD').
Note that there's (currently) no way of converting hexadecimal column information back to a BLOB in SQLite. You will have to use C/Perl/Python/… bindings to convert and import those.
You can write a simple script that saves all the blobs from your database into files. Later, you can look at those files and decide what to do with them.
For example, this Perl script will create lots of files in the current directory, each containing one of your data blob fields. Simply adjust the SELECT statement to limit the fetched rows as you need:
use DBI;

my $dbh = DBI->connect("dbi:SQLite:mysqlite.db")
    or die $DBI::errstr;

my $sth = $dbh->prepare(qq{
    SELECT id, data FROM content
});
$sth->execute();

while (my $row = $sth->fetchrow_hashref()) {
    # Create a file named after $row->{id}:
    open FILE, ">", "$row->{id}";
    binmode FILE;    # blob data may be binary
    # Save the blob data into this file:
    print FILE $row->{data};
    close FILE;
}

$dbh->disconnect();