Error on binding the grid with a list - asp.net

List<business.clspluginsprp> objprp = new List<business.clspluginsprp>();
business.clsplugins obj = new business.clsplugins();
for (Int32 i = 0; i < k.Length; i++)
{
    Int32 z = Convert.ToInt32(k.GetValue(i));
    objprp.Add(obj.fnd_plugins(z));
}
GridView2.DataSource = objprp;
GridView2.DataBind();
This produces two errors. The first is: "The best overloaded method match for 'System.Collections.Generic.List<business.clspluginsprp>.Add(business.clspluginsprp)' has some invalid arguments". The second is: "Argument 1: cannot convert from 'System.Collections.Generic.List' to 'business.clspluginsprp'".

Try changing this:
objprp.Add(obj.fnd_plugins(z));
to this:
objprp.AddRange(obj.fnd_plugins(z));
It seems fnd_plugins returns a list itself, which is why Add rejects it. AddRange will still fail if fnd_plugins does not return something that implements the generic IEnumerable interface. Posting the signature of fnd_plugins would help debug this.
Thanks.

Assuming fnd_plugins returns a List of clspluginsprp objects (which is what the error message suggests), you want to use AddRange instead of Add. AddRange adds all the items of a collection in a single call.
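A minimal sketch of the corrected loop, assuming fnd_plugins(z) returns a List<business.clspluginsprp> (the k array and the business classes are taken from the question as-is):
// Assumption: fnd_plugins(z) returns List<business.clspluginsprp>.
List<business.clspluginsprp> objprp = new List<business.clspluginsprp>();
business.clsplugins obj = new business.clsplugins();

for (Int32 i = 0; i < k.Length; i++)
{
    Int32 z = Convert.ToInt32(k.GetValue(i));

    // AddRange copies each element of the returned list into objprp;
    // Add would try (and fail) to store the whole list as a single element.
    objprp.AddRange(obj.fnd_plugins(z));
}

GridView2.DataSource = objprp;
GridView2.DataBind();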

xBestIndex malfunction (passing non-literal parameters to table valued function)

I'm trying to implement a table valued function (as a SQLite virtual table).
It's a function that would take a string and return a table with all the words of the string.
If I call it with literal values like below, it works fine.
SELECT word FROM splitstring("abc def ghi")
If, however, I call it with a column from another table it doesn't work:
SELECT a.Name, word FROM article a, splitstring(a.Text)
The xBestIndex method gets called all right, but right after that, I get an exception from the ExecuteReader method. The exception message is "xBestIndex malfunction". The xFilter method does not get called because of the exception.
My xBestIndex implementation is simple, it just marks the parameter so I can see it in xFilter:
public override SQLiteErrorCode BestIndex(SQLiteVirtualTable table, SQLiteIndex index)
{
    index.Outputs.ConstraintUsages.ElementAt(0).argvIndex = 1;
    index.Outputs.ConstraintUsages.ElementAt(0).omit = 1;
    return SQLiteErrorCode.Ok;
}
Am I doing something wrong, or is it impossible to pass non-literal parameters to table-valued functions?
Found the issue! I was using constraints that had usable=0. The BestIndex method gets called multiple times by SQLite, the second time with a non-usable constraint.
Here is the fixed body of the BestIndex method.
public override SQLiteErrorCode BestIndex(SQLiteVirtualTable table, SQLiteIndex index)
{
    if (index.Inputs.Constraints.Count() != 2)
        throw new ArgumentException("The generate_series function requires two integer (long) parameters!");

    if (index.Inputs.Constraints.All(c => c.usable == 1))
    {
        index.Outputs.ConstraintUsages.ElementAt(0).argvIndex = 1;
        index.Outputs.ConstraintUsages.ElementAt(0).omit = 1;
        index.Outputs.ConstraintUsages.ElementAt(1).argvIndex = 2;
        index.Outputs.ConstraintUsages.ElementAt(1).omit = 1;
    }
    else
    {
        index.Outputs.IndexNumber = -1;
        index.Outputs.EstimatedCost = double.MaxValue;
    }
    return SQLiteErrorCode.Ok;
}
Now I check the usable flag. When BestIndex gets called with a constraint that has usable=0, I skip it, i.e. I return a high estimated cost for that index plan so it doesn't get chosen.

Increment multidimensional array in Java

I have a file whose contents I need to put into a multidimensional array. Index [0] must hold a date (long), and one of the other indices must be incremented depending on the value of the second token.
Here's the code:
BufferedReader bufStatsFile = new BufferedReader(new FileReader(statsFile));
String line = null;
List<Long[]> stats = new ArrayList<Long[]>();
stats.add(new Long[11]);
int i = 0; // will be in a loop later
while ((line = bufStatsFile.readLine()) != null) {
    StringTokenizer st = new StringTokenizer(line, ";");
    while (st.hasMoreTokens()) {
        stats.get(i)[0] = Long.parseLong(st.nextToken());
        stats.get(i)[Integer.parseInt(st.nextToken())]++; // Here is the problematic line.
    }
}
bufStatsFile.close();
bufStatsFile.close();
But the increment doesn't work. Maybe it is because my array declaration is not correct, but I didn't find a better way to do it.
OK, I found it, and it was of course something silly.
The problem was in my array declaration. I declared it like this:
List<Long[]> stats = new ArrayList<Long[]>();
stats.add(new Long[11]);
So I was trying to increment a Long object rather than a primitive long; the elements of a new Long[11] start out as null, so the ++ fails when it tries to unbox them.
Now I just declare it like this:
List<long[]> stats = new ArrayList<>();
stats.add(new long[11]);
And it works perfectly.
Check that the values in your file are numbers from 0 to 10. Why do you have a List if you are only manipulating row 0?
Which exception is your code throwing?

Linq2XML missing element

How do I modify the query below to properly handle the case where the "Summary" element is missing from one of the articles? Currently, when that happens, I get an "Object reference not set to an instance of an object." error.
var articles = from article in xmlDoc.Descendants("Article")
               select new {
                   articleId = article.Attribute("ID").Value,
                   heading = article.Element("Heading").Value,
                   summary = article.Element("Summary").Value,
                   contents = article.Element("Contents").Value,
                   cats = from cat in article.Elements("Categories")
                          select new {
                              category = cat.Element("Category").Value
                          }
               };
The problem is that article.Element("Summary") returns null if the element is not found, so you get a NullReferenceException when you try to get the Value property.
To solve this, note that XElement also has an explicit conversion to string. This won't throw if the XElement is null - you will just get a null string reference.
So to solve your problem you can change this:
summary = article.Element("Summary").Value,
to this:
summary = (string)article.Element("Summary")
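For completeness, here is the full query with the cast applied. Casting the other elements and the attribute the same way is only necessary if they can also be missing, which the question does not state, so treat that part as an assumption:
var articles = from article in xmlDoc.Descendants("Article")
               select new {
                   // The explicit string conversion returns null instead of throwing
                   // when the attribute or element is absent.
                   articleId = (string)article.Attribute("ID"),
                   heading = (string)article.Element("Heading"),
                   summary = (string)article.Element("Summary"),
                   contents = (string)article.Element("Contents"),
                   cats = from cat in article.Elements("Categories")
                          select new {
                              category = (string)cat.Element("Category")
                          }
               };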

dynamically build Intersect statement ASP.NET

I would like to use the IEnumerable extension method Intersect() to combine a few lists and get the integers that appear in every one of them. The problem I'm faced with is that I don't know how many lists I will need to compare.
Here is an example:
A{1,2,3,4}
B{1,2,3}
C{1,2}
results = A.Intersect(B).Intersect(C)
This works great, but the next time around I may also have a D{1,2} to include in the comparison.
I'd like to use the Intersect method, but I'm open to new ideas as well.
If you are receiving the collections in a list, you could do this:
List<List<int>> lists = new List<List<int>>();
var result = lists[0].AsEnumerable();
for (int i = 0; i < lists.Count - 1; i++)
{
    result = result.Intersect(lists[i + 1]);
}
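An alternative, not from the original answer but a common LINQ pattern, is to fold the intersection over the whole collection with Aggregate (this assumes the same non-empty lists variable as above and a using System.Linq directive):
// Sketch: intersect an arbitrary number of lists in one expression.
var result = lists
    .Select(l => l.AsEnumerable())
    .Aggregate((acc, next) => acc.Intersect(next));
With the A, B and C lists from the question, result enumerates to { 1, 2 }.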

Filehelpers ExcelStorage.ExtractRecords fails when first cell is empty

When the first cell of an Excel sheet imported with ExcelStorage.ExtractRecords is empty, the process fails. For example, if the data starts at column 1, row 2 and cell (2,1) is empty, the method fails.
Does anybody know how to work around this? I've tried adding a FieldNullValue attribute to the mapping class with no luck.
Here is a sample project that shows the code with problems.
Hope somebody can help me or point me in some direction.
Thank you!
It looks like you have stumbled upon an issue in FileHelpers.
What is happening is that the ExcelStorage.ExtractRecords method uses an empty cell check to see if it has reached the end of the sheet. This can be seen in the ExcelStorage.cs source code:
while (CellAsString(cRow, mStartColumn) != String.Empty)
{
    try
    {
        recordNumber++;
        Notify(mNotifyHandler, mProgressMode, recordNumber, -1);
        colValues = RowValues(cRow, mStartColumn, RecordFieldCount);
        object record = ValuesToRecord(colValues);
        res.Add(record);
    }
    catch (Exception ex)
    {
        // Code removed for this example
    }
}
So if the cell in the start column of any row is empty, it assumes that the file is done.
Some options to get around this:
Don't put any empty cells in the first column position.
Don't use excel as your file format -- convert to CSV first.
See if you can get a patch from the developer or patch the source yourself.
The first two are workarounds (and not really good ones). The third option is probably the best, but then what should the end-of-file condition be? An entirely empty row would probably be a good enough check (though even that might not work in every case).
Thanks to Tuzo's help, I figured out a way to work around this.
I added a method to the ExcelStorage class and changed the while loop's end condition: instead of checking whether the first cell is empty, I check whether all the cells in the current row are empty, and stop the loop when they are. This is the change to the while part of ExtractRecords:
while (!IsEof(cRow, mStartColumn, RecordFieldCount))
instead of
while (CellAsString(cRow, mStartColumn) != String.Empty)
IsEof is a method to check the whole row to be empty:
private bool IsEof(int row, int startCol, int numberOfCols)
{
    bool isEmpty = true;
    string cellValue = string.Empty;
    // Inspect the numberOfCols cells of the record, starting at startCol;
    // the row counts as end-of-file only if every one of them is empty.
    for (int i = startCol; i < startCol + numberOfCols; i++)
    {
        cellValue = CellAsString(row, i);
        if (cellValue != string.Empty)
        {
            isEmpty = false;
            break;
        }
    }
    return isEmpty;
}
Of course, if the user leaves an empty row between two data rows, the rows after it will not be processed, but I think this is good enough to keep working with.
Thanks
I needed to be able to skip blank lines, so I've added the following code to the FileHelpers library. I've taken Sebastian's IsEof code and renamed the method to IsRowEmpty and changed the loop in ExtractRecords from ...
while (CellAsString(cRow, mStartColumn) != String.Empty)
to ...
while (!IsRowEmpty(cRow, mStartColumn, RecordFieldCount) || !IsRowEmpty(cRow+1, mStartColumn, RecordFieldCount))
I then changed this ...
colValues = RowValues(cRow, mStartColumn, RecordFieldCount);
object record = ValuesToRecord(colValues);
res.Add(record);
to this ...
bool addRow = true;
if (Attribute.GetCustomAttribute(RecordType, typeof(IgnoreEmptyLinesAttribute)) != null
    && IsRowEmpty(cRow, mStartColumn, RecordFieldCount))
{
    addRow = false;
}
if (addRow)
{
    colValues = RowValues(cRow, mStartColumn, RecordFieldCount);
    object record = ValuesToRecord(colValues);
    res.Add(record);
}
What this gives me is the ability to skip single empty rows: the file is read until two successive empty rows are found.
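For reference, this is roughly how the record class and the call site could look once the patch is in place. The class name and fields below are made up for the example, and the ExcelStorage members shown should be checked against the FileHelpers version you are building, but the pattern follows the library's documented usage:
// Hypothetical record type; [IgnoreEmptyLines] is what the patched
// ExtractRecords checks before deciding to skip an empty row.
[IgnoreEmptyLines]
[DelimitedRecord(";")]
public class StatsRecord
{
    public int Id;
    public string Name;
}

ExcelStorage provider = new ExcelStorage(typeof(StatsRecord));
provider.StartRow = 2;
provider.StartColumn = 1;
provider.FileName = "Stats.xls";

// Single empty rows are now skipped; reading stops after two consecutive empty rows.
StatsRecord[] records = (StatsRecord[])provider.ExtractRecords();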
