I've looked through the database as best as I can, but I can't find a way to import files into a Hyperlink Library field for each content item. I see that 2sxc uses DNN's Files and Folders tables, but I can't see how the 2sxc content type's field links to the folder and the files.
Basically, I have about 400+ content items to import and about 6000+ linked files that need to be imported.
I figure it probably isn't possible to import the files directly from the XML file, but is it possible to write an SQL script to link the files to the content items?
Thanks to @iJungleBoy, I have been able to create a process that helps automate the importing of file libraries into 2sxc content items.
The importing of content items follows the instructions at https://2sxc.org/en/Learn/Import-Export
Expanding upon his guidance, I created some MS SQL scripts to help do some of the heavy lifting.
Basically, we need to create the directory structure for the files in the ADAM folder, and the folders need to be named specifically so that they are properly associated with each content item. The scripts rely upon an additional table that holds info on the files to be imported so that they can be correlated with the content items that were previously imported into 2sxc.
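The scripts below join against that helper table as import2sxc_Files. Its exact schema is up to you, but a minimal sketch with the two columns the scripts actually reference might look like this:
-- Minimal sketch of the helper table used to correlate files with content items
-- (only OriginalId and Filename are referenced by the scripts; add any other columns you need)
CREATE TABLE dbo.import2sxc_Files
(
    OriginalId nvarchar(50)   NOT NULL,  -- id of the record in the original system
    Filename   nvarchar(1000) NOT NULL   -- full path of the source file to copy into ADAM
)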
Here is an SQL script that can be modified based on your needs:
-- Create function to return the content item's guid converted to base64
CREATE FUNCTION dbo.import2sxc_BinaryToBase64
(
    @bin VARBINARY(MAX)
)
RETURNS VARCHAR(MAX)
AS
BEGIN
    DECLARE @Base64 VARCHAR(MAX)
    SET @Base64 = CAST(N'' AS XML).value('xs:base64Binary(xs:hexBinary(sql:variable("@bin")))', 'VARCHAR(MAX)')
    RETURN @Base64
END
GO
-- Create function to compress a guid for 2sxc
CREATE FUNCTION dbo.import2sxc_GuidCompress
(
    @guidStr VARCHAR(36)
)
RETURNS VARCHAR(22)
AS
BEGIN
    DECLARE @guid uniqueidentifier = @guidStr
    RETURN SUBSTRING(REPLACE(REPLACE(dbo.import2sxc_BinaryToBase64(@guid), '+', '-'), '/', '_'), 1, 22)
END
GO
-- Define the app name
DECLARE @appName nvarchar(255) = 'MyAppName'

-- Define the name of the content type
DECLARE @contentType nvarchar(150) = 'MyContentType'

-- Set the path to the adam files for the app
DECLARE @adamPath nvarchar(max) = 'c:\path\to\Portals\x\adam'

-- For importing images, get the name of the field that holds the id of the record from the original system
DECLARE @idFieldname nvarchar(50) = 'OriginalId'

-- Get the attribute set id for the content type
DECLARE @attributeSetId int
SELECT @attributeSetId = AttributeSetID FROM dbo.ToSIC_EAV_AttributeSets WHERE Name = @contentType

-- Get the attribute id
DECLARE @attributeId int
SELECT @attributeId = a.AttributeID
FROM dbo.ToSIC_EAV_Attributes a
INNER JOIN dbo.ToSIC_EAV_AttributesInSets ais ON a.AttributeID = ais.AttributeID
WHERE ais.AttributeSetID = @attributeSetId AND StaticName = @idFieldname

-- Get all the content items, along with the compressed guid for the folder name, and generate the commands to create the directories
SELECT v.Value AS SourceId, EntityGUID, dbo.import2sxc_GuidCompress(EntityGUID) AS FolderName,
       'mkdir "' + @adamPath + '\' + @appName + '\' + dbo.import2sxc_GuidCompress(EntityGUID) + '\Photos"' AS cmdMkdir
FROM ToSIC_EAV_Entities e
INNER JOIN ToSIC_EAV_Values v ON e.EntityID = v.EntityID AND v.AttributeID = @attributeId
WHERE AttributeSetID = @attributeSetId

-- Generate the commands to copy the files into the new folders
SELECT 'copy "' + f.Filename + '" "' + @adamPath + '\' + @appName + '\' + dbo.import2sxc_GuidCompress(EntityGUID) + '\Photos"' AS cmdMove
FROM ToSIC_EAV_Entities e
INNER JOIN ToSIC_EAV_Values v ON e.EntityID = v.EntityID AND v.AttributeID = @attributeId
INNER JOIN import2sxc_Files f ON v.Value = f.OriginalId
WHERE AttributeSetID = @attributeSetId

-- Clean up the helper functions
DROP FUNCTION dbo.import2sxc_BinaryToBase64
DROP FUNCTION dbo.import2sxc_GuidCompress
After the script is run, the result sets will contain columns named cmdMkdir and cmdMove holding the command-line commands you can run to create the folders and copy the files into them as needed.
When the content items have been imported and the scripts to create the folders and copy the files have been run, you should clear the server cache in DNN, then go to Site Assets (the file manager) in DNN and refresh the ADAM folder and its subfolders.
After you do that, all the files in the library for your content items should appear.
There is a way but it's a bit of a secret :)
Linking the images to the item (and the right field) happens automatically when the items are in the folder dedicated to that field in ADAM. The schema is approximately like this: [portal root]/adam/[app-name]/[entity-guid22]/[field-name]
Create one entry manually and verify what you see. You can basically import the data using the Excel/XML import (https://2sxc.org/en/Learn/Import-Export), and then your biggest challenge will be to generate the guid22. This is a more compact form of the guid, which takes a long guid and re-encodes it using URL-safe characters.
There's a command in 2sxc which does this, basically:
ToSic.Eav.Identity.Mapper.GuidCompress(original guid)
see also https://github.com/2sic/eav-server/blob/05d79bcb80c109d1ceb8422875a6ff7faa34ff4f/ToSic.Eav.Core/Identity/Mapper.cs
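If you need the same conversion in SQL, the import2sxc_GuidCompress helper from the script earlier on this page can be reused (run it before the DROP FUNCTION statements at the end). A small hedged example:
-- Example: compress a GUID to its 22-character ADAM folder name
DECLARE @sampleGuid uniqueidentifier = NEWID()
SELECT @sampleGuid AS EntityGuid,
       dbo.import2sxc_GuidCompress(CAST(@sampleGuid AS varchar(36))) AS Guid22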
Related
I'm working on a legacy desktop application. It was written using Xbase++ from Alaska software. I'm just trying to add a new field to an existing db file but I can't find any documentation about how to do it.
I have looked at https://harbour.github.io/doc/, http://www.ousob.com/ng/clguide/index.php, https://en.wikibooks.org/wiki/Clipper_Tutorial:_a_Guide_to_Open_Source_Clipper(s), and http://www.alaska-software.com/support/kbase-old.cxp without any luck. All that is documented is about creating a new db file from scratch. Is it even possible to modify a db file structure?
cFieldExist := .f.
FOR nField := 1 TO (oDbfMaster:ProType)->( FCount() )
IF (oDbfMaster:ProType)->( FieldName( nField ) ) == 'newFieldName'
cFieldExist := .t.
ENDIF
NEXT
IF !cFieldExist
//Please help me here, I want to add the new field 'newFieldName'
ENDIF
In the old days, using dBase or Clipper we used to open the table, copy the structure to a new table:
USE dbFile
COPY STRUCTURE EXTENDED TO tempFile
In the new table, each row describes a field from the original table. You append a new record for the field you want to add and fill in the field name, data type, field length, number of decimals, etc.
Then using the temp file, you create a new db file and append the records into it from your old db:
CREATE newFile FROM tempFile
USE newFile
APPEND FROM dbFile
Lastly, you need to rename the old file and then rename the new file to that name and recreate any indexes.
In Microsoft SQL Server Management Studio, if I use this query:
select * from Modules where ModuleTitle like '%MyOldString%'
I can find multiple results with different ModuleIDs like 1, 2, 5, 1257, etc.
So now I want to back up these ModuleIDs and then change the string "MyOldString" to "MyNewString" in all of those module titles. How should I do that?
To change one of them, I can use:
update Modules set ModuleTitle = 'MyNewString' where ModuleID = 1257
But now I need to replace the string in all of the search results at once. Is that possible? And I need to back up those ModuleIDs, in case I need to change them back.
You could update it with a replace query:
update Modules
SET ModuleTitle = REPLACE(ModuleTitle,'MyOldString','MyNewString')
WHERE ModuleTitle like '%MyOldString%'
;
First, you can easily change the string using the same WHERE expression as you do for searching.
select * from Modules where ModuleTitle like '%MyOldString%'
You can update (replacing just the matched part of each title) using:
update Modules set ModuleTitle = REPLACE(ModuleTitle, 'MyOldString', 'MyNewString')
where ModuleTitle like '%MyOldString%'
As far as backup, where do you want to back the records up to? Just title or the full record?
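If a one-time backup is enough, you could also just copy the affected rows into a new table before running the update; a minimal sketch (the backup table name is just an example):
-- One-off backup of the rows that are about to change
SELECT ModuleID, ModuleTitle, GETDATE() AS backupDate
INTO Modules_TitleBackup
FROM Modules
WHERE ModuleTitle LIKE '%MyOldString%'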
Another option would be to write a trigger and automatically back things up into a separate table (containing the ID, the old title, and an update date). The benefit of the trigger is that the records get logged every time the title changes, not just when your code does it.
CREATE TABLE Module_backup
(Module_ID int,
oldtitle VARCHAR(200),
updDate DATETIME DEFAULT getDate()
)
Add a trigger to the table. Basically, in the trigger, if both INSERTED and DELETED are populated (i.e. it is an update operation), insert the old title into the backup table:
INSERT INTO Module_backup(Module_id,oldtitle)
SELECT Module_Id,ModuleTitle FROM DELETED
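Put together, a minimal sketch of such a trigger might look like this (the trigger name is just an example, and it assumes ModuleID is the key of the Modules table):
-- INSERTED and DELETED are both populated for an UPDATE;
-- log the old title only for rows whose title actually changed
CREATE TRIGGER trg_Modules_TitleBackup ON Modules
AFTER UPDATE
AS
BEGIN
    INSERT INTO Module_backup (Module_ID, oldtitle)
    SELECT d.ModuleID, d.ModuleTitle
    FROM DELETED d
    INNER JOIN INSERTED i ON i.ModuleID = d.ModuleID
    WHERE d.ModuleTitle <> i.ModuleTitle
END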
I need to create a file date-wise; the folder is being created now.
But I need to put the date-wise file into the runtime folder.
string logfile = t.FileName.ToString() + "/" + LoginID + DateTime.Now.ToString("yyyyMMdd") + ".txt";
t.FileName = logfile;
LoginID is a value taken from the database and the folder changes on the fly, but I still need to create the text file. I am stuck here.
Any help?
I have a sqlite database from which I want to extract a column of information with the datatype BLOB. I am trying this:
SELECT cast(data as TEXT) FROM content
This is obviously not working. The output is garbled text like this:
x��Uak�0�>�8�0Ff;I�.��.i%�A��s�M
The data in the content column is mostly text, but may also have images (which I recognized could cause a problem if I cast as TEXT). I simply want to extract that data into a usable format. Any ideas?
You can use
SELECT hex(data) FROM content
or
SELECT quote(data) FROM content
The first returns a hex string (ABCD); the second returns the value quoted as an SQL literal (X'ABCD').
Note that there's (currently) no way of converting hexadecimal column information back to a BLOB in SQLite. You will have to use C/Perl/Python/… bindings to convert and import those.
You can write a simple script that saves all the blobs from your database into files. Later, you can take a look at those files and decide what to do with them.
For example, this Perl script will create a number of files in the current directory containing your data blob fields. Simply adjust the SELECT statement to limit the fetched rows as you need:
use DBI;
my $dbh = DBI->connect("dbi:SQLite:mysqlite.db")
or die DBI::errstr();
my $sth = $dbh->prepare(qq{
SELECT id, data FROM content
});
$sth->execute();
while (my $row = $sth->fetchrow_hashref()) {
# Create file with name of $row->{id}:
open FILE, ">", "$row->{id}" or die "Cannot open $row->{id}: $!";
# Write raw bytes, since the blob may contain binary data:
binmode FILE;
# Save blob data into this file:
print FILE $row->{data};
close FILE;
}
$dbh->disconnect();
On my website I use simple file management. Users can upload files, see the list, and delete them. In the database I have one table, Files, which contains information about the files (file name, description, insert date).
I display all the files in GridView control with SQLDataSource.
DeleteCommand="DELETE FROM Files WHERE id = @id"
What I want to do is delete the associated file when the user deletes a row from the table. I was trying to do this in the OnDeleting event, but it seems that I have to execute another SELECT to get the file name. Is that the only way to do this, or is there a better way? How can I get the file name from inside the OnDeleting event?
EDIT: The database is SQL Server, but that is not important in this case. I store the files in the file system; the database only holds the file names.
Rather than deleting your row directly, create a stored procedure called something like deleteFile(@ID int). Inside this proc, get the file name:
Select FileName From Files Where ID = @ID
Then delete the file using whatever method you're using to delete actual files.
Then delete the row:
Delete Files Where ID = @ID
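A minimal sketch of such a procedure, assuming the Files table has ID and FileName columns as above (the physical file itself still has to be removed by your code, or via something like xp_cmdshell):
CREATE PROCEDURE deleteFile
    @ID int
AS
BEGIN
    -- Look up the file name before the row disappears
    DECLARE @FileName nvarchar(260)
    SELECT @FileName = FileName FROM Files WHERE ID = @ID

    -- Delete the physical file here (from the calling code, or e.g. via xp_cmdshell), then remove the row
    DELETE FROM Files WHERE ID = @ID
END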
If you try this from the GridView.RowDeleting event, you can use the passed-in parameter GridViewDeleteEventArgs to get the row about to be deleted. Assuming the name is in this row, you can then use it to delete the file.
I know they can be evil, but this seems like a place where triggers could be used; that way, if the delete query is run from multiple places, the file would still be deleted.
If it is MS SQL:
DECLARE @Command varchar(8000)
SELECT @Command = 'del c:\' + DocID FROM Deleted
EXEC xp_cmdshell @Command
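Wrapped into a DELETE trigger, a rough sketch could look like this (it assumes the trigger sits on the Files table, that DocID holds the file name, and that xp_cmdshell is enabled; note it only handles single-row deletes):
CREATE TRIGGER trg_Files_DeleteFile ON Files
AFTER DELETE
AS
BEGIN
    -- Build and run a del command for the deleted row
    -- (a cursor or loop would be needed to handle multi-row deletes)
    DECLARE @Command varchar(8000)
    SELECT @Command = 'del "c:\' + DocID + '"' FROM Deleted
    EXEC master..xp_cmdshell @Command
END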
You need some way to cross reference the table entry with the file - presumably they both have a unique file name? If so, you could use that to delete the file from the same place in your code-behind that you are issuing the DB delete command.
Lukasz,
If you are using a Gridview to bind the data, you can define the DataKeyNames attribute and then in the RowDeleting event do the following:
string documentName = (string)GridName.DataKeys[e.RowIndex].Value;
DeleteDoc(documentName);