Get cell types when reading and parsing Excel files - Axapta

I am trying to read and parse an Excel file and, as usual for me, some unclear things come into play.
Here is what I have:
while (true)
{
    comVariantCell1 = cells.item(row, 1).value().variantType();
    comVariantCell2 = cells.item(row, 2).value().variantType();

    // If an empty cell is found, processing will stop and the user will get an error message in order to solve the inconsistency.
    if (comVariantCell1 != COMVariantType::VT_EMPTY && comVariantCell2 != COMVariantType::VT_EMPTY)
    {
        // Both cells have values, check their types.
        importedLine = conNull();
        progress1.setText(strFmt("Importing row %1", row));

        if (cells.item(row, 1).value().variantType() == COMVariantType::VT_BSTR)
        {
            importedLine += cells.item(row, 1).value().bStr();
        }
        else
        {
            importedLine += cells.item(row, 1).value().double();
        }

        importedLine += cells.item(row, 2).value().double();
        importedLinesCollection += [importedLine]; //conIns(importedLinesCollection, row - 1, (importedLine));
        row++;
    }
    else
    {
        info(strFmt("Empty cell found at line %1 - import will not continue and no records were saved.", row));
        break;
    }
}
Excel format:

Item number    Transfer Qty
a100           50.5
a101           10
a102           25
This worked well to check whether the cell type is a string: COMVariantType::VT_BSTR. But what should I use to check for a real or an integer value?
I am pretty sure that in this case the quantity will not contain real values, but it could be useful in the future to be able to tell these two types apart.
I have to mention that even when a cell holds an int value, using cells.item(row, 1).value().int() won't work, and I can't see why.
Why do I want to make the difference? Because if it's forbidden to have real values in the quantity column (at least in my case), I want to check for that, give the user the opportunity to put a correct value in that place, and maybe investigate further why it ended up there.

Take a look at how it is done in \Classes\SysDataExcelCOM\readRow.
It basically uses a switch to test the type. This is really boring!
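For reference, a minimal sketch of that switch pattern, reusing the cells and row variables from the question: the real-valued types are VT_R4 and VT_R8, the integer ones VT_I2 and VT_I4. Note that Excel normally hands a numeric cell back as VT_R8 even when it displays as a whole number, which is also why value().int() alone won't work.

// Sketch only: dispatch on the variant type of a cell value.
COMVariant value = cells.item(row, 1).value();

switch (value.variantType())
{
    case COMVariantType::VT_BSTR:
        info(strFmt("String: %1", value.bStr()));
        break;
    case COMVariantType::VT_R4, COMVariantType::VT_R8:
        info(strFmt("Real: %1", value.double()));
        break;
    case COMVariantType::VT_I2, COMVariantType::VT_I4:
        info(strFmt("Integer: %1", value.int()));
        break;
    case COMVariantType::VT_EMPTY:
        info("Empty cell");
        break;
    default:
        throw error("Unhandled variant type");
}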
Also take a look at ExcelIO, a class I made some years ago. It reads Excel and returns each row as a container. This is a more high-level approach.
As a last resort you could save the Excel sheet as a tab-separated file, then use TextIO to read the content. This will be at least 10 times faster than using Excel!
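The TextIO route could look roughly like the sketch below; the file name is made up, and with the field delimiter set to tab, read() returns each line as a container:

// Sketch only: read a tab-separated export of the sheet.
TextIo    textIo = new TextIo(@"C:\items.txt", 'r');
container line;

textIo.inFieldDelimiter('\t');
line = textIo.read();
while (textIo.status() == IO_Status::Ok)
{
    if (conLen(line) >= 2)
    {
        info(strFmt("Item %1, qty %2", conPeek(line, 1), conPeek(line, 2)));
    }
    line = textIo.read();
}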

Related

Function for Google Sheets' Script Editor with a button for TODAY() and NOW() in two different columns, in the next available cell of each column

Currently I'm looking at some simple documentation for ways to make a "button" (an image) over a Google Sheet to trigger a function in the Script Editor. I'm not familiar with this type of syntax; I typically do AutoHotkey and a bit of Python.
All I want is for this button to populate two columns: the current date in one and the current time in the other (it doesn't even have to have the year or the seconds, tbh). I don't know whether the name of the page matters for how the script works. So the range is ( 'Log'!G4:H ).
Like if I were to make it for AutoHotkey, I would put it as:
WinGet, winid ,, A ; <-- need to identify window A = active
MsgBox, winid=%winid%
;do some stuff
WinActivate ahk_id %winid%
So it affects any page it's active on.
I would like to use the same function on the same columns across different sheets, ideally. I don't care if I have to clone a unique function for each page, but I just can't even grasp this first step, lol.
I'm not too familiar with this new macro feature. If I use this macro, does it only work for my client, because of, say, it recording relative aspect-ratio movements? I.e., if I record a macro on my PC and play it on my Android, will the change in platform change its execution?
If anyone can point me in any direction to any good documentation or resources for the Google Sheets Script Editor or its syntax, I would really appreciate it.
EDIT: Just to clarify, I'm really focused on this being a function that populates from a click/press (on mobile) of an image. I currently use an onEdit on the sheet, and it wouldn't serve the purposes that I want for this function. It's just a shortcut to quickly input a timestamp, and those fields can still be retouched without it just reapplying a new function for a newer current time/date.
EDIT: EDIT: I ended up with an image button that runs a script that can only input to the current cell.
function timeStamp() {
  SpreadsheetApp.getActiveSheet()
    .getActiveCell()
    .setValue(new Date());
}
It only works on the targeted cell. I would like to force the input into the next available cell in the column, split the date from the time, and put them into adjacent cells.
Maybe this will help... if the 1st column is edited, it will auto-print the date in the 2nd column and the time in the 3rd column on Sheet1:
function onEdit(e) {
  var s = SpreadsheetApp.getActiveSheet();
  if (s.getName() == "Sheet1") {
    var r = s.getActiveCell();
    if (r.getColumn() == 1) {
      var newDate = Utilities.formatDate(new Date(), "GMT+8", "MM/dd/yyyy");
      r.offset(0, 1).setValue(newDate);
      var newTime = Utilities.formatDate(new Date(), "GMT+8", "hh:mm:ss");
      r.offset(0, 2).setValue(newTime);
    }
  }
}
https://webapps.stackexchange.com/a/130253/186471
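For the button-driven variant the asker describes, a sketch along these lines would append to the next free row of 'Log'!G4:H; the sheet name and columns come from the question, everything else is an assumption. On desktop the function can be attached to the image via its "Assign script" menu item:

function timeStampLog() {
  // Sketch: find the first empty cell in column G of the 'Log' sheet
  // (starting at row 4), write the date there and the time next to it.
  var sheet = SpreadsheetApp.getActiveSpreadsheet().getSheetByName('Log');
  var dates = sheet.getRange('G4:G').getValues();
  var row = 4;
  while (row - 4 < dates.length && dates[row - 4][0] !== '') {
    row++;
  }
  var now = new Date();
  var tz = Session.getScriptTimeZone();
  sheet.getRange(row, 7).setValue(Utilities.formatDate(now, tz, 'MM/dd/yyyy')); // column G
  sheet.getRange(row, 8).setValue(Utilities.formatDate(now, tz, 'hh:mm:ss'));   // column H
}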

SQLite3 - how to know if the current row is the last row

This is trivial in MySQL thanks to mysql_num_rows, but no such equivalent is present in SQLite3. Hence the question: how do I know whether the current row is the last one?
Rearranging as follows doesn't help, because column data fetched for a previous row is no longer valid after the next sqlite3_step:
sqlite3_prepare_v2(db, sql, -1, &statement, NULL);

int fetched = 0;
int last = 0;
const unsigned char *data = NULL;

while (sqlite3_step(statement) == SQLITE_ROW) {
    // this is the previous row
    if (fetched) {
        process(data, last);
    }
    fetched = 1;
    data = sqlite3_column_text(statement, 0); // invalidated by the next sqlite3_step
}
last = 1;
if (fetched)
    process(data, last);
Executing the query twice (once with a count) is a trivial solution, but that's not what I am looking for.
Any ideas? Thanks in advance.
SQLite computes the next output row only when needed, so it is not possible to find out whether another row exists without actually trying to step to it.
If your code really needs to know whether the current row is the last, you have to make a copy of all the data in the row.
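In C, that look-ahead could be sketched as below; process() and the single text column are carried over from the question, and strdup makes the required copy so it survives the next sqlite3_step:

#include <sqlite3.h>
#include <stdlib.h>
#include <string.h>

extern void process(const char *data, int last); /* placeholder from the question */

void process_all_rows(sqlite3_stmt *statement)
{
    char *pending = NULL; /* private copy of the previous row's text */

    while (sqlite3_step(statement) == SQLITE_ROW) {
        if (pending) {
            process(pending, 0); /* another row followed, so it was not the last */
            free(pending);
        }
        const unsigned char *text = sqlite3_column_text(statement, 0);
        pending = strdup(text ? (const char *)text : "");
    }
    if (pending) {
        process(pending, 1); /* no row followed: this was the last */
        free(pending);
    }
}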

Form Running Totals, Ax 2009

Is there an example anywhere of a form that keeps running totals in a column located within a grid, where the user's ordering and filtering of the grid affect the running-totals column?
I can easily do the above if it is ordered only by transaction date, but with user ordering and filtering included I presume we would have to use the datasource range() and rangeCount() functions (see SysQuery::mergeRanges() for an example), iterate over these to apply the filtering, and then include the dynalinks. The same goes for the ordering, albeit that is more complicated.
Any suggestions appreciated. Any appreciations suggested (as in: vote the question up!).
You could implement it as a form datasource display method using this strategy:

1. Copy the form's datasource query (no need for SysQuery::mergeRanges):

QueryRun qr = new QueryRun(ledgerTrans_qr.query());

2. Iterate and sum over your records using qr, stopping after the current record:

while (qr.next())
{
    lt = qr.getNo(1);
    total += lt.AmountMST;
    if (lt.RecId == _lt.RecId)
        break;
}

This could be made more performant if the sorting order were fixed (using sum(AmountMST) and adding a where constraint).

3. Return the total.
This is of course very inefficient (quadratic time, O(n^2)).
Caching the results (in a map) may make it usable if there are not too many records.
Update: a working example.
Any observations or criticisms of the code below are most welcome. Jan's observation about the method being slow is still valid. As you can see, it's a modification of his original answer.
//BP Deviation Documented
display AmountMST XXX_runningBalanceMST(LedgerTrans _trans)
{
    LedgerTrans localLedgerTrans;
    AmountMST   amountMST;
    ;
    localLedgerTrans = this.getFirst();
    while (localLedgerTrans)
    {
        amountMST += localLedgerTrans.AmountMST;
        if (localLedgerTrans.RecId == _trans.RecId)
        {
            break;
        }
        localLedgerTrans = this.getNext();
    }
    return amountMST;
}
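To illustrate Jan's caching remark, one possibility (a sketch, not part of the original answer) is to memoize the display method per RecId in a Map declared in the form's classDeclaration, resetting the Map whenever the datasource query is re-executed so re-sorting and re-filtering invalidate it:

// Sketch only: cache running totals per RecId.
// runningTotals (Map) lives in the form's classDeclaration and should be
// set back to null from the datasource's executeQuery().
display AmountMST XXX_runningBalanceMSTCached(LedgerTrans _trans)
{
    if (!runningTotals)
    {
        runningTotals = new Map(Types::Int64, Types::Real);
    }
    if (!runningTotals.exists(_trans.RecId))
    {
        runningTotals.insert(_trans.RecId, this.XXX_runningBalanceMST(_trans));
    }
    return runningTotals.lookup(_trans.RecId);
}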

Determining HTML5 database memory usage

I'm adding SQLite support to my Google Chrome extension, to store historical data.
When creating the database, you are required to set its maximum size (I used 5MB, as suggested in many examples).
I'd like to know how much storage I'm really using (for example, after adding 1000 records), to get an idea of when the 5MB limit will be reached and act accordingly.
The Chrome console doesn't reveal such figures.
Thanks.
You can calculate those figures if you want to. Basically, the default limit for localStorage and Web Storage is 5MB, and names and values are saved as UTF-16 (two bytes per character), so in terms of stored characters it is really half of that, i.e. about 2.5 million. In an extension you can lift that limit by requesting the "unlimitedStorage" permission in the manifest.
The same applies to Web SQL, but there you have to go through all the tables and figure out how many characters there are per row.
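For localStorage specifically, a quick ballpark can be computed by summing the stored characters yourself (a sketch; two bytes per UTF-16 character, counting keys and values):

function localStorageBytes() {
  var total = 0;
  for (var i = 0; i < localStorage.length; i++) {
    var key = localStorage.key(i);
    // keys and values are both stored as UTF-16, so count both at 2 bytes/char
    total += (key.length + localStorage.getItem(key).length) * 2;
  }
  return total;
}

console.log((localStorageBytes() / (1024 * 1024)).toFixed(2) + ' MB used');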
In localStorage, you can test that with a population script:
var row = 0;
localStorage.clear();

var populator = function () {
  localStorage[row] = '';
  var x = '';
  for (var i = 0; i < (1024 * 100); i++) {
    x += 'A';
  }
  localStorage[row] = x;
  row++;
  console.log('Populating row: ' + row);
  populator();
};

populator();
The above should crash around row 25 with a not-enough-space error, which puts the limit at around 2.5 million characters. You can do the inverse, counting how many characters you store per row, to determine how much space you have left.
Another way to do this is to always add a "payload" and check for an exception; if one is thrown, then you know you're out of space.
try {
  localStorage['foo'] = 'SOME_DATA';
} catch (e) {
  console.log('LIMIT REACHED! Do something else');
}
Internet Explorer did something called "remainingSpace", but that doesn't work in Chrome/Safari:
http://msdn.microsoft.com/en-us/library/cc197016(v=VS.85).aspx
I'd like to add a suggestion.
If it is a Chrome extension, why not make use of Web SQL storage or Indexed DB?
http://html5doctor.com/introducing-web-sql-databases/
http://hacks.mozilla.org/2010/06/comparing-indexeddb-and-webdatabase/
Source: http://caniuse.com/

Full text search on a mobile device?

We'll soon be embarking on the development of a new mobile application. This particular app will be used for heavy searching of text-based fields. Any suggestions from the group at large for what sort of database engine is best suited to these types of searches on a mobile platform?
Specifics include Windows Mobile 6, and we'll be using the .NET CF. Some of the text-based fields will be anywhere between 35 and 500 characters. The device will operate in two different modes, batch and WiFi. For WiFi we can of course just submit requests to a full-blown DB engine and fetch results back. This question centres around the "batch" version, which will house a database loaded with information on the device's flash/removable storage card.
At any rate, I know SQLCE has some basic indexing, but you don't get the real fancy "full text" style indexes until the full-blown version, which of course isn't available on a mobile platform.
An example of what the data would look like:
"apron carpenter adjustable leather container pocket waist hardware belt" etc. etc.
I haven't gotten into evaluating any specific options yet, as I figured I'd first leverage the experience of this group to point me down some specific avenues.
Any suggestions/tips?
Just recently I had the same issue. Here is what I did:
I created a class to hold just an ID and the text for each object; in my case I call them a sku (item number) and a description. This creates a smaller object that uses less memory, since it is only used for searching. I'll still grab the full-blown objects from the database after I find matches.
public class SmallItem
{
    private int _sku;
    public int Sku
    {
        get { return _sku; }
        set { _sku = value; }
    }

    // Size of max description + 1 for the null terminator.
    private char[] _description = new char[36];
    public char[] Description
    {
        get { return _description; }
        set { _description = value; }
    }

    public SmallItem()
    {
    }
}
After this class is created, you can create an array (I actually used a List in my case) of these objects and use it for searching throughout your application. The initialization of this list takes a bit of time, but you only need to worry about it at start-up. Basically, just run a query on your database and grab the data you need to create the list.
Once you have the list, you can quickly go through it searching for any words you want. Since it's a "contains", it will also find words within words (e.g. drill would return drill, drillbit, drills, etc.). To do this, we wrote a home-grown, unmanaged C# Contains function. It takes in a string array of words, so you can search for more than one word; we use it for "AND" searches, where the description must contain all the words passed in ("OR" is not currently supported in this example). As it searches through the list it builds a list of IDs, which are passed back to the calling function. Once you have the list of IDs, you can easily run a fast query in your database to return the full-blown objects based on a fast indexed ID number. I should mention that we also limit the maximum number of results returned. This could be taken out; it's just handy if someone types in something like "e" as their search term, which is going to return a lot of results.
Here's an example of the custom Contains function:
public static int[] Contains(string[] descriptionTerms, int maxResults, List<SmallItem> itemList)
{
    // Don't allow more than the maximum allowable results constant.
    int[] matchingSkus = new int[maxResults];

    // Indexes and counters.
    int matchNumber = 0;
    int currentWord = 0;
    int totalWords = descriptionTerms.Length - 1; // - 1 because it will be used with 0 based array indexes
    bool matchedWord;

    try
    {
        /* Character array of character arrays. Each array is a word we want to match.
         * We need the + 1 because totalWords had - 1 (we are setting a size/length here,
         * so it is not 0 based... we used - 1 on totalWords because it is used for 0
         * based index referencing).
         * */
        char[][] allWordsToMatch = new char[totalWords + 1][];

        // Character array to hold the current word to match.
        char[] wordToMatch = new char[36]; // Max allowable word size + null terminator... I just picked 36 to be consistent with max description size.

        // Loop through the original string array of words to match and create the character arrays.
        for (currentWord = 0; currentWord <= totalWords; currentWord++)
        {
            char[] desc = new char[descriptionTerms[currentWord].Length + 1];
            Array.Copy(descriptionTerms[currentWord].ToUpper().ToCharArray(), desc, descriptionTerms[currentWord].Length);
            allWordsToMatch[currentWord] = desc;
        }

        // Offsets for the description and filter (word to match) pointers.
        int descriptionOffset = 0, filterOffset = 0;

        // Loop through the list of items trying to find matching words.
        foreach (SmallItem i in itemList)
        {
            // If we have reached our maximum allowable matches, stop searching and just return the results.
            if (matchNumber == maxResults)
                break;

            // Loop through the "words to match" filter list.
            for (currentWord = 0; currentWord <= totalWords; currentWord++)
            {
                // Reset our match flag and current word to match.
                matchedWord = false;
                wordToMatch = allWordsToMatch[currentWord];

                // Delving into unmanaged code for SCREAMING performance ;)
                unsafe
                {
                    // Pointer to the description of the current item on the list (starting at the first char).
                    fixed (char* pdesc = &i.Description[0])
                    // Pointer to the current word we are trying to match (starting at the first char).
                    fixed (char* pfilter = &wordToMatch[0])
                    {
                        // Reset the description offset.
                        descriptionOffset = 0;

                        // Continue our search on the current word until we hit the null terminator of the char array.
                        while (*(pdesc + descriptionOffset) != '\0')
                        {
                            // We've matched the first character of the word we're trying to match.
                            if (*(pdesc + descriptionOffset) == *pfilter)
                            {
                                // Reset the filter offset.
                                filterOffset = 0;

                                /* Keep moving the offsets together while we have consecutive character matches.
                                 * Once we hit a non-match or a null terminator, jump out of this loop.
                                 * */
                                while (*(pfilter + filterOffset) != '\0' && *(pfilter + filterOffset) == *(pdesc + descriptionOffset))
                                {
                                    // Increase the offsets together to the next character.
                                    ++filterOffset;
                                    ++descriptionOffset;
                                }

                                // We matched all the way to the null terminator: the entire word was a match.
                                if (*(pfilter + filterOffset) == '\0')
                                {
                                    // If the current word is the last word on the match list, we have matched all words.
                                    if (currentWord == totalWords)
                                    {
                                        // Add the sku as a match (as an int; ToString() would not fit an int[]).
                                        matchingSkus[matchNumber] = i.Sku;
                                        matchNumber++;

                                        // Break out of this item description; all needed words matched, move to the next item.
                                        break;
                                    }

                                    /* We've matched a word but still have more words on the list,
                                     * so set the flag and continue to search for the next word.
                                     * */
                                    matchedWord = true;
                                }
                            }

                            // No match on the current character; move to the next one
                            // (guarded so a partial match ending at the terminator can't step past it).
                            if (*(pdesc + descriptionOffset) != '\0')
                                descriptionOffset++;
                        }
                    }
                }

                /* The current word had no match, so no sense in looking for the rest of the words.
                 * Break to the next item description.
                 * */
                if (!matchedWord)
                    break;
            }
        }

        // We have our list of matching skus. Resize the array and pass it back.
        Array.Resize(ref matchingSkus, matchNumber);
        return matchingSkus;
    }
    catch (Exception)
    {
        // Handle the exception; return an empty result so every code path returns a value.
        return new int[0];
    }
}
Once you have the list of matching skus, you can iterate through the array and build a query command that only returns the matching skus.
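For instance, something along these lines with SQL CE; the Item table and column names are made up for the sketch, and since the skus are plain ints, the string concatenation is injection-safe here:

using System.Data.SqlServerCe;
using System.Text;

public static SqlCeDataReader FetchMatchingItems(SqlCeConnection connection, int[] matchingSkus)
{
    // Build "SELECT ... WHERE Sku IN (1,2,3)" from the matched sku numbers.
    StringBuilder inList = new StringBuilder();
    for (int i = 0; i < matchingSkus.Length; i++)
    {
        if (i > 0)
            inList.Append(",");
        inList.Append(matchingSkus[i]);
    }

    SqlCeCommand command = connection.CreateCommand();
    command.CommandText = "SELECT Sku, Description FROM Item WHERE Sku IN (" + inList + ")";
    return command.ExecuteReader();
}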
For an idea of performance, here's what we have found (doing the following steps):
- Search ~171,000 items
- Create a list of all matching items
- Query the database, returning only the matching items
- Build full-blown items (similar to the SmallItem class, but with a lot more fields)
- Populate a datagrid with the full-blown item objects
On our mobile units, the entire process takes 2-4 seconds (takes 2 if we hit our match limit before we have searched all items... takes 4 seconds if we have to scan every item).
I've also tried doing this without unmanaged code, using String.IndexOf (and I tried String.Contains, which had the same performance as IndexOf, as it should). That way was much slower... about 25 seconds.
I've also tried using a StreamReader and a file containing lines of [Sku Number]|[Description]. The code was similar to the unmanaged-code example. This way took about 15 seconds for an entire scan: not too bad for speed, but not great. The file-and-StreamReader method has one advantage over the way I showed you, though: the file can be created ahead of time. The way I showed you requires the memory and the initial time to load the List when the application starts up; for our 171,000 items, this takes about 2 minutes. If you can afford to wait for that initial load each time the app starts (which can of course be done on a separate thread), then searching this way is the fastest way (that I've found, at least).
Hope that helps.
PS - Thanks to Dolch for helping with some of the unmanaged code.
You could try Lucene.Net. I'm not sure how well it's suited to mobile devices, but it is billed as a "high-performance, full-featured text search engine library".
http://incubator.apache.org/lucene.net/
http://lucene.apache.org/java/docs/
