How to add a new field / column to an existing xBase file / table - xbase

I'm working on a legacy desktop application written in Xbase++ from Alaska Software. I'm trying to add a new field to an existing db file, but I can't find any documentation on how to do it.
I have looked at
https://harbour.github.io/doc/ , http://www.ousob.com/ng/clguide/index.php ,
https://en.wikibooks.org/wiki/Clipper_Tutorial:_a_Guide_to_Open_Source_Clipper(s) and
http://www.alaska-software.com/support/kbase-old.cxp without any luck. Everything I can find only covers creating a new db file from scratch. Is it even possible to modify the structure of an existing db file?
cFieldExist := .F.
FOR nField := 1 TO (oDbfMaster:ProType)->( FCount() )
   IF (oDbfMaster:ProType)->( FieldName( nField ) ) == 'newFieldName'
      cFieldExist := .T.
   ENDIF
NEXT
IF !cFieldExist
   // Please help me here, I want to add the new field 'newFieldName'
ENDIF

In the old days, using dBase or Clipper we used to open the table, copy the structure to a new table:
USE dbFile
COPY STRUCTURE EXTENDED TO tempFile
In the new table, each record describes one field of the original table. You append a record for the new field and fill in the field name, data type, field length, number of decimals, etc.
Then, using the temp file, you create a new db file and append the records from your old db into it:
CREATE newFile FROM tempFile
USE newFile
APPEND FROM dbFile
Lastly, you rename the old file out of the way, rename the new file to the original name, and recreate any indexes.
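Putting it all together, a minimal Clipper-style sketch (assuming the new field is a 20-character field called NEWFIELD, and that the structure-extended file uses the standard FIELD_NAME / FIELD_TYPE / FIELD_LEN / FIELD_DEC columns; check the Xbase++ documentation for the exact equivalents):
USE dbFile
COPY STRUCTURE EXTENDED TO tempFile
USE tempFile
APPEND BLANK
REPLACE FIELD_NAME WITH "NEWFIELD", FIELD_TYPE WITH "C", FIELD_LEN WITH 20, FIELD_DEC WITH 0
CLOSE DATABASES
CREATE newFile FROM tempFile
USE newFile
APPEND FROM dbFile
CLOSE DATABASES
// rename dbFile out of the way, rename newFile to the original name,
// then rebuild the indexes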

Related

2SXC Import Content Items - Hyperlink File Libraries

I've looked through the database as best I can, but I can't find a way to import files into a Hyperlink-Library for each content item. I see that 2sxc uses DNN's Files and Folders tables, but I can't see how the 2sxc content type's field links to the folder and the files.
Basically, I have about 400+ content items to import and about 6000+ linked files that need to be imported.
I figure it probably isn't possible to import the files directly from the XML file, but is it possible to write an SQL script to link the files to the content items?
Thanks to @iJungleBoy, I have been able to create a process that can help automate the importing of file libraries into 2sxc content items.
The importing of content items follows the instructions at https://2sxc.org/en/Learn/Import-Export
Expanding upon his guidance, I created some MS SQL scripts to help do some of the heavy lifting.
Basically, we need to create the directory structure for the files in the ADAM folder, and the folders need to be named in a specific way so that they are associated with the right content item. The scripts rely on an additional table that holds information about the files to be imported, so they can be correlated with the content items that have already been imported into 2sxc.
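The scripts below assume that this extra table is called import2sxc_Files and that it has at least an OriginalId and a Filename column (only those two are referenced). A minimal sketch, to be adjusted to your source system:
CREATE TABLE dbo.import2sxc_Files
(
    OriginalId nvarchar(50) NOT NULL,  -- id of the record in the original system
    Filename nvarchar(400) NOT NULL    -- full path to the file on disk
)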
Here is an SQL script that can be modified based on your needs:
-- Create function to return content items guid converted to base64
CREATE FUNCTION dbo.import2sxc_BinaryToBase64
(
    @bin VARBINARY(MAX)
)
RETURNS VARCHAR(MAX)
AS
BEGIN
    DECLARE @Base64 VARCHAR(MAX)
    SET @Base64 = CAST(N'' AS XML).value('xs:base64Binary(xs:hexBinary(sql:variable("@bin")))', 'VARCHAR(MAX)')
    RETURN @Base64
END
GO
-- Create function to compress guid for 2sxc
CREATE FUNCTION dbo.import2sxc_GuidCompress
(
    @guidStr VARCHAR(36)
)
RETURNS VARCHAR(22)
AS
BEGIN
    DECLARE @guid uniqueidentifier = @guidStr
    RETURN Substring(Replace(Replace(dbo.import2sxc_BinaryToBase64(@guid), '+', '-'), '/', '_'), 1, 22)
END
GO
-- Define the app name
DECLARE @appName nvarchar(255) = 'MyAppName'
-- Define the name of the content type
DECLARE @contentType nvarchar(150) = 'MyContentType'
-- Set the path for the adam files for the app
DECLARE @adamPath nvarchar(max) = 'c:\path\to\Portals\x\adam'
-- For importing images, get the name of the field that holds the id of the record from the original system
DECLARE @idFieldname nvarchar(50) = 'OriginalId'

-- Get the attribute set id for the content item
DECLARE @attributeSetId int
SELECT @attributeSetId = AttributeSetID FROM dbo.ToSIC_EAV_AttributeSets WHERE Name = @contentType

-- Get the attribute id
DECLARE @attributeId int
SELECT @attributeId = a.AttributeID
FROM dbo.ToSIC_EAV_Attributes a
INNER JOIN dbo.ToSIC_EAV_AttributesInSets ais ON a.AttributeID = ais.AttributeID
WHERE ais.AttributeSetID = @attributeSetId AND StaticName = @idFieldname

-- Get all the content items, along with the compressed guid for the folder name, and generate the commands to create the directories
SELECT v.Value AS SourceId, EntityGUID, dbo.import2sxc_GuidCompress(EntityGUID) AS FolderName, 'mkdir "' + @adamPath + '\' + @appName + '\' + dbo.import2sxc_GuidCompress(EntityGUID) + '\Photos"' AS cmdMkdir
FROM ToSIC_EAV_Entities e
INNER JOIN ToSIC_EAV_Values v ON e.EntityID = v.EntityID AND v.AttributeID = @attributeId
WHERE AttributeSetID = @attributeSetId

-- Create command to move files into the new folders
SELECT 'copy "' + f.Filename + '" "' + @adamPath + '\' + @appName + '\' + dbo.import2sxc_GuidCompress(EntityGUID) + '\Photos"' AS cmdMove
FROM ToSIC_EAV_Entities e
INNER JOIN ToSIC_EAV_Values v ON e.EntityID = v.EntityID AND v.AttributeID = @attributeId
INNER JOIN import2sxc_Files f ON v.Value = f.OriginalId
WHERE AttributeSetID = @attributeSetId

DROP FUNCTION dbo.import2sxc_BinaryToBase64
DROP FUNCTION dbo.import2sxc_GuidCompress
After the script is run, you will have columns named cmdMkdir and cmdMove containing the command-line commands you can run to create the folders and copy the files into them as needed.
When the content items have been imported, and the scripts to create the folders and move the files have been run, you should clear the server cache in DNN and also go to the Site Assets (file manager) in DNN and refresh the ADAM folder and subfolders.
After you do that, all the files in the library for your content items should appear.
There is a way but it's a bit of a secret :)
Linking the images to the item (and the right field) happens automatically when the files are in the folder dedicated to that field in ADAM. The schema is roughly like this: [portal root]/adam/[app-name]/[entity-guid22]/[field-name]
Create one entry manually and verify what you see. You can basically import the data using the Excel / XML import https://2sxc.org/en/Learn/Import-Export and then your biggest challenge will be to generate the guid22. This is a more compact form of the guid, which takes a long guid and re-encodes it using URL-safe characters.
There's a command in 2sxc which does this, basically
ToSic.Eav.Identity.Mapper.GuidCompress(original guid)
see also https://github.com/2sic/eav-server/blob/05d79bcb80c109d1ceb8422875a6ff7faa34ff4f/ToSic.Eav.Core/Identity/Mapper.cs
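If you are generating the guid22 in .NET code rather than in SQL, a minimal C# sketch of the same idea might look like this (an assumption based on the SQL functions above and the mapper linked; the class and method names here are made up for illustration):
using System;

public static class GuidCompressor
{
    // Re-encode a GUID as a 22-character, URL-safe base64 string
    public static string Compress(Guid guid)
    {
        string base64 = Convert.ToBase64String(guid.ToByteArray());
        return base64.Replace('+', '-').Replace('/', '_').Substring(0, 22);
    }
}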

Insert data into columns of Excel sheet from SQL Server table

I have an Excel sheet with seven columns and two of those columns are empty. I have the parent table in SQL Server with several columns and complete data.
How can I fill these two empty columns with data from the SQL Server table in ASP.NET MVC? Thanks in advance!
Have you looked at using a C# library such as NPOI or EPPlus to modify your Excel sheet?
Are you familiar with Entity Framework and how to use NuGet to install it in your project? You'll want to use EF along with LINQ to retrieve the data from your SQL table. It all depends on the flow of your application and what you want to trigger the update of the Excel sheet, but you'll likely want an action method inside your controller that responds to user interaction, then reads the db data and writes it to the sheet. Or are you just looking to do a one-time extraction and load of the data from SQL Server to Excel? Is this going to be a web app or just a basic console app?
EDIT (EPPlus example):
Here's a simple example I just put together that shows how to modify an Excel sheet using EPPlus:
var file = new FileInfo("C:\\Projects\\ConsoleApplication1\\test.xlsx");
if (file.Exists)
{
    using (ExcelPackage ep = new ExcelPackage(file))
    {
        var workSheet = ep.Workbook.Worksheets[1];
        workSheet.Cells[2, 4].Value = "test";
        ep.Save();
    }
}
In this example, I open the Excel file using FileInfo from System.IO and then assuming the file exists, create a new ExcelPackage object and pass in the FileInfo object as a parameter. I then access the first worksheet and access the cell in the 2nd row, 4th column and set the value to "test". Finally, I save the changes. As you can see, it's very straightforward to use. If you need further clarification or an expanded example, let me know.
Just remember to include System.IO and OfficeOpenXml (after you've installed the EPPlus package from NuGet):
using OfficeOpenXml;
using System.IO;
EDIT (Entity Framework guidance):
Assuming you're not using .NET Core, and since you already have an existing database, I recommend you use the simpler Database First approach in Entity Framework 6 to query and retrieve the data from your SQL Server db. Follow the instructions listed here to set up your Entity Data Model within your web app. Once you've created the EDMX file and have your controller with your action method in place, you're ready to write a LINQ query to get the data. I don't know the particulars of your db schema, how much data you need to fill a column, or whether the user supplies input as part of the query, so it's difficult to provide a specific example, but I can show you a generic one and hopefully you can run with it from there. Here is a quick and dirty example which combines querying the database with writing the data to an Excel sheet. In this example, I'm querying for all the Blue cars in the Cars table and then writing the Make and Model of each car in the result set out to the Excel spreadsheet (starting in the first row). I increment the row count as I go through the result set, and this assumes that the Make is in the first column and the Model is in the second column of the sheet.
var file = new FileInfo("C:\\Projects\\ConsoleApplication1\\test.xlsx");
if (file.Exists)
{
    using (CarEntities dc = new CarEntities())
    {
        var carList = from c in dc.Cars
                      where c.Color == "Blue"
                      select c;
        var rowCount = 1;
        using (ExcelPackage ep = new ExcelPackage(file))
        {
            var workSheet = ep.Workbook.Worksheets[1];
            foreach (var car in carList)
            {
                workSheet.Cells[rowCount, 1].Value = car.Make;
                workSheet.Cells[rowCount, 2].Value = car.Model;
                rowCount++;
            }
            ep.Save();
        }
    }
}

Sqlite: How to cast(data as TEXT) for BLOB

I have a sqlite database from which I want to extract a column of information with the datatype BLOB. I am trying this:
SELECT cast(data as TEXT) FROM content
This is obviously not working. The output is garbled text like this:
x��Uak�0�>�8�0Ff;I�.��.i׮%�A��s�M
The data in the content column is mostly text, but may also have images (which I recognized could cause a problem if I cast as TEXT). I simply want to extract that data into a usable format. Any ideas?
You can use
SELECT hex(data) FROM content
or
SELECT quote(data) FROM content
The first will return a hex string (ABCD), the second quoted as an SQL literal (X'ABCD').
Note that there's (currently) no way of converting hexadecimal column information back to a BLOB in SQLite. You will have to use C/Perl/Python/… bindings to convert and import those.
You can write a simple script which will save all the blobs from your database into files. Later, you can take a look at these files and decide what to do with them.
For example, this Perl script will create lots of files in the current directory containing your data blob fields. Simply adjust the SELECT statement to limit the fetched rows as needed:
use DBI;

my $dbh = DBI->connect("dbi:SQLite:mysqlite.db")
    or die DBI::errstr();
my $sth = $dbh->prepare(qq{
    SELECT id, data FROM content
});
$sth->execute();
while (my $row = $sth->fetchrow_hashref()) {
    # Create file with name of $row->{id}:
    open FILE, ">", "$row->{id}" or die "Cannot write $row->{id}: $!";
    # Write raw bytes, not text (blobs may contain binary data):
    binmode FILE;
    # Save blob data into this file:
    print FILE $row->{data};
    close FILE;
}
$dbh->disconnect();

SQLite Temp Table in Navicat

I use Navicat and this command to create a temp table in SQLite:
create temp table search as select * from documents
Then when i try to query:
select * from search
I got:
no such table: temp.sqlite_master
or:
no such table
The table doesn't appear in the table list either, but when I try to create it again I get:
table search already exists
What is the problem? Is it from Navicat?
Your create statement looks correct to me. When you create a temp table, it is deleted when you close the connection that created it. Are you closing the connection after you create the table and then opening it again when you send the query?
If not, can you include your query statement too?
It looks like a bug in the SQLite DLL shipped with Navicat. Tested somewhere else, it worked OK.
The SQLite documentation says this about CREATE TABLE:
If a <database-name> is specified, it must be either "main", "temp", or the name of an attached database. In this case the new table is created in the named database. If the "TEMP" or "TEMPORARY" keyword occurs between the "CREATE" and "TABLE" then the new table is created in the temp database. It is an error to specify both a <database-name> and the TEMP or TEMPORARY keyword, unless the <database-name> is "temp". If no database name is specified and the TEMP keyword is not present then the table is created in the main database.
Maybe you should access the table via the temp prefix, like this: temp.search.

Delete file when deleting row from table

On my website I use simple file management. Users can upload files, see the list and delete them. In the database I have one table, Files, which contains information about the files (file name, description, insert date).
I display all the files in a GridView control with an SqlDataSource.
DeleteCommand="DELETE FROM Files WHERE id = @id"
What I want to do is delete the associated file when the user deletes a row from the table. I was trying to do this in the OnDeleting event, but it seems that I have to execute another SELECT to get the file name. Is that the only way to do this, or is there a better way? How do I get the file name from inside the OnDeleting event?
EDITED: The database is SQL Server, but that is not important in this case. I store the files in the file system; the database holds only the file names.
Rather than deleting your row directly, create a stored procedure called something like deleteFile(@ID int). Inside this proc, get the file name:
Select FileName From Files Where ID = @ID
Then delete the file using whatever method you're using to delete actual files.
Then delete the row:
Delete From Files Where ID = @ID
If you try this from the GridView.RowDeleting event, you can use the passed-in parameter GridViewDeleteEventArgs to get the row about to be deleted. Assuming the name is in this row, you can then use it to delete the file.
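For example, a handler along these lines (a sketch; GridView1, the FileName data key and the ~/Uploads folder are assumptions about your setup, and DataKeyNames must include the file name column):
protected void GridView1_RowDeleting(object sender, GridViewDeleteEventArgs e)
{
    // Read the file name of the row about to be deleted
    string fileName = (string)GridView1.DataKeys[e.RowIndex].Value;

    // Remove the physical file; the data source then deletes the row itself
    string path = Server.MapPath("~/Uploads/" + fileName);
    if (System.IO.File.Exists(path))
    {
        System.IO.File.Delete(path);
    }
}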
I know they can be evil, but this seems like a place where triggers could be used; that way, if the delete query is run from multiple places, the file would still be deleted.
If it is MS SQL:
DECLARE @Command varchar(8000)
SELECT @Command = 'del c:\' + DocID FROM Deleted
EXEC xp_cmdshell @Command
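Wrapped into a complete trigger, it might look roughly like this (a sketch; the trigger name, the FileName column and the c:\uploads\ path are assumptions, xp_cmdshell must be enabled, and a multi-row delete would need a loop or cursor):
CREATE TRIGGER trgFiles_AfterDelete ON Files
AFTER DELETE
AS
BEGIN
    DECLARE @Command varchar(8000)
    -- Build and run a del command for the deleted row's file
    SELECT @Command = 'del "c:\uploads\' + FileName + '"' FROM deleted
    EXEC master..xp_cmdshell @Command
END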
You need some way to cross-reference the table entry with the file; presumably they both share a unique file name? If so, you could use that to delete the file from the same place in your code-behind where you issue the DB delete command.
Lukasz,
If you are using a GridView to bind the data, you can define the DataKeyNames attribute and then, in the RowDeleting event, do the following:
string documentName = (string)GridName.DataKeys[e.RowIndex].Value;
DeleteDoc(documentName);
