eXist-db serialize: is expand-xincludes=no ignored? - xquery

In eXist-db 4.4, XQuery 3.1, I am compressing a number of XML files to a .zip in a directory. The compression process uses serialize().
The XML files have some large xincludes which, according to the documentation, are automatically expanded during serialization. I have attempted to 'turn off' the xinclude expansion in two places in the code (the prolog declare option and the serialize() map), but the serializer is still expanding all xincludes:
declare option exist:serialize "expand-xincludes=no";
declare function zip:get-entries-for-zip()
{
(: get documents prefixed by 'MS609' :)
let $pref := "MS609"
(: get list of document names :)
let $doclist := xmldb:get-child-resources($globalvar:URIdata)[starts-with(., $pref)]
(: output serialized entries :)
let $entries :=
for $n in $doclist
return
<entry name="{$n}" type='text' method='store'>
{serialize(doc(concat($globalvar:URIdata, "/", $n)), map { "method": "xml", "expand-xincludes": "no"})}
</entry>
return $entries
};
The XML data with xincludes to reproduce this problem can be found here http://medieval-inquisition.huma-num.fr/downloads under the description "BM MS609 Edition (tei-xml)".
Many thanks in advance.

The expand-xincludes serialization parameter is specific to eXist and, as such (or at least at present), cannot be set using the fn:serialize() function. Instead, use the util:serialize() function:
util:serialize($document, "expand-xincludes=no")
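Applied to the entry-building function in the question, that looks roughly like this (a minimal sketch, assuming the same $globalvar:URIdata variable and MS609 naming convention as above):
(: sketch: same entry construction as the question, but with util:serialize()
   so that expand-xincludes=no is honoured; assumes the question's $globalvar:URIdata :)
let $pref := "MS609"
let $doclist := xmldb:get-child-resources($globalvar:URIdata)[starts-with(., $pref)]
for $n in $doclist
return
    <entry name="{$n}" type="text" method="store">
        {util:serialize(doc(concat($globalvar:URIdata, "/", $n)), "expand-xincludes=no method=xml")}
    </entry>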
Alternatively, since you're ultimately interested in zipping the contents of a collection, you can skip the explicit serialization step, declare your serialization options in the query's prolog (or set it inline using util:declare-option()), and simply provide the compression:zip() function the URI path(s) to the collections/documents you want to zip. For example:
xquery version "3.1";
declare option exist:serialize "expand-xincludes=no";
let $sources := "/db/apps/my-app/my-data" (: or a sequence of paths to individual docs:) ! xs:anyURI(.)
let $preserve-collection-structure := false()
let $zip := compression:zip($sources, $preserve-collection-structure)
return
xmldb:store("/db", "my-data.zip", $zip)
For more on serialization options in eXist, see my earlier answer to a similar question: https://stackoverflow.com/a/49290616/659732.

Related

eXist-db / XQuery compression:zip() of XML files saves text only

In eXist-db 4.4, XQuery 3.1, I am using automation to compress a number of XML files. The problem is that the compressed entries contain only the text content, not the XML markup.
This function uses compression:zip to create a zip from a batch of documents:
declare option exist:serialize "expand-xincludes=no";
declare option exist:serialize "method=xml media-type=application/xml";
declare function zip:create-zip-by-batch()
{
[...]
let $zipobject := compression:zip(zip:get-entry-for-zip($x), false())
let $zipname := "foozipname.zip"
let $store := xmldb:store("/db/foodirectory", $zipname, $zipobject)
return $store
};
The above calls this function, where the documents are serialized and put into <entry> per the documentation:
declare option exist:serialize "expand-xincludes=no";
declare option exist:serialize "method=xml media-type=application/xml";
declare function zip:get-entry-for-zip($x)
{
[...for each $foo document in $x, create an <entry>...]
let $serialized := serialize($foo, map { "method": "xml" })
let $entry :=
<entry name="somefooname" type='xml' method='store'>
{$serialized}
</entry>
[...return a sequence of $entry...]
}
I think it's missing a configuration for serialization, but I can't figure it out...
Thanks in advance for any help.
Here is a query for eXist demonstrating how to compress XML documents into a ZIP file and store it in the database:
xquery version "3.1";
(: create a test collection with 10 test files: 1.xml = <x>1</x>
thru 10.xml = <x>10</x> :)
let $prepare := xmldb:create-collection("/db", "test")
let $populate := (1 to 10) ! xmldb:store("/db/test", . || ".xml", <x>{.}</x>)
(: construct zip-bound <entry> elements for the documents in the test collection :)
let $entries := collection("/db/test") !
<entry name="{util:document-name(.)}" type="xml" method="store">{
serialize(., map { "method": "xml" })
}</entry>
(: compress the entries and store in database :)
let $zip := compression:zip($entries, false())
return
xmldb:store("/db", "test.zip", $zip)
The resulting ZIP file contains the 10 test XML documents, intact. For a variant showing how to write the ZIP file to a location on your file system, see https://gist.github.com/joewiz/aa8d84500b1f1478779cdf2cc1934348.
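The gist shows the full approach; a minimal sketch of writing the ZIP to disk from within the query (assuming eXist's file module is enabled and the process may write to the target path) might look like:
(: sketch: write the binary ZIP to the filesystem instead of the database;
   the /tmp path is illustrative and must be writable by eXist :)
let $zip := compression:zip(xs:anyURI("/db/test"), false())
return
    file:serialize-binary($zip, "/tmp/test.zip")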
For a fuller discussion of serialization options in eXist, see my answer to an earlier question: https://stackoverflow.com/a/49290616/659732.

Recursive copy of a folder with XQuery

I have to copy an entire project folder into the MarkLogic server, and instead of doing it manually I decided to do it with a recursive function, but it is becoming the worst idea I have ever had. I'm having problems with the transactions and with the syntax, and being new to this I can't find a good way to solve it. Here's my code, thank you for the help!
import module namespace dls = "http://marklogic.com/xdmp/dls" at "/MarkLogic/dls.xqy";
declare option xdmp:set-transaction-mode "update";
declare function local:recursive-copy($filesystem as xs:string, $uri as xs:string)
{
for $e in xdmp:filesystem-directory($filesystem)/dir:entry
return
if($e/dir:type/text() = "file")
then dls:document-insert-and-manage($e/dir:filename, fn:false(), $e/dir:pathname)
else
(
xdmp:directory-create(concat(concat($uri, data($e/dir:filename)), "/")),
local:recursive-copy($e/dir:pathname, $uri)
)
};
let $filesystemfolder := 'C:\Users\WB523152\Downloads\expath-ml-console-0.4.0\src'
let $uri := "/expath_console/"
return local:recursive-copy($filesystemfolder, $uri)
MLCP would have been nice to use. However, here is my version:
declare option xdmp:set-transaction-mode "update";
declare variable $prefix-replace := ('C:/', '/expath_console/');
declare function local:recursive-copy($filesystem as xs:string){
for $e in xdmp:filesystem-directory($filesystem)/dir:entry
return
if($e/dir:type/text() = "file")
then
let $source := $e/dir:pathname/text()
let $dest := fn:replace($source, $prefix-replace[1], $prefix-replace[2])
let $_ := xdmp:document-load($source,
<options xmlns="xdmp:document-load">
<uri>{$dest}</uri>
</options>)
return <record>
<from>{$source}</from>
<to>{$dest}</to>
</record>
else
local:recursive-copy($e/dir:pathname)
};
let $filesystemfolder := 'C:\Temp'
return <results>{local:recursive-copy($filesystemfolder)}</results>
Please note the following:
I changed my sample to the C:\Temp dir
The output is XML only because by convention I try to do this in case I want to analyze results. It is actually how I found the error related to conflicting updates.
I chose to define a simple prefix replace on the URIs
I saw no need for DLS in your description
I saw no need for the explicit creation of directories in your use case
The reason you were getting conflicting updates is that you were using just the filename as the URI. Across the whole directory structure, these names were not unique - hence the conflicting update on double inserts of the same URI.
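To illustrate (hypothetical paths): two files in different folders that share a filename both end up targeting the same URI, and two inserts on one URI in a single transaction raise the conflict:
(: hypothetical illustration: C:\Temp\a\doc.xml and C:\Temp\b\doc.xml
   both map to the URI "doc.xml", so the second insert conflicts with the first :)
xdmp:document-insert("doc.xml", <x>from a</x>),
xdmp:document-insert("doc.xml", <x>from b</x>)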
Note that my version above is still not solid code:
You would have to ensure that a URI is valid. Not all filesystem paths/names are OK for a URI, so you would want to test for this and escape chars if needed.
Large filesystems would time-out, so spawning in batches may be useful.
As an example, I might gather the list of docs as in my XML and then process that list by spawning a new task for every 100 documents. This could be accomplished with a simple loop over xdmp:spawn-function, as sketched below, or by using a library such as taskbot by @mblakele.
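A rough, hypothetical sketch of that loop (it assumes you have first stored a manifest of <record> elements like the ones produced above, rather than inserting while gathering):
(: hypothetical sketch: spawn one update task per batch of 100 manifest records;
   "/copy-manifest.xml" is an assumed document holding the <record> elements :)
let $records := fn:doc("/copy-manifest.xml")//record
for $start in (1 to fn:count($records))[. mod 100 = 1]
let $batch := fn:subsequence($records, $start, 100)
return
    xdmp:spawn-function(
        function() {
            for $r in $batch
            return
                xdmp:document-load($r/from/string(),
                    <options xmlns="xdmp:document-load"><uri>{$r/to/string()}</uri></options>)
        },
        <options xmlns="xdmp:eval"><update>true</update></options>)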

MarkLogic - How to insert element into XML

How do I insert a node into XML?
let $a := <a><b>bbb</b></a>
return
xdmp:node-insert-after(doc("/example.xml")/a/b, <c>ccc</c>);
Expected Output:
<a><c>ccc</c><b>bbb</b></a>
Please help me get this output.
I believe you should be using xdmp:node-insert-before, in the following way:
xdmp:document-insert('/example.xml', <a><b>bbb</b></a>);
xdmp:node-insert-before(fn:doc('/example.xml')/a/b, <c>ccc</c>);
fn:doc('/example.xml');
(: returns <a><c>ccc</c><b>bbb</b></a> :)
Nodes are immutable, so in-memory mutation can only be done by creating a new copy.
The copy can use the unmodified contained nodes from the original:
declare function local:insert-after(
$prior as node(),
$inserted as node()+
) as element()
{
let $container := $prior/parent::element()
return element {fn:node-name($container)} {
$container/namespace::*,
$container/attribute(),
$prior/preceding-sibling::node(),
$prior,
$inserted,
$prior/following-sibling::node()
}
};
let $a := <a><b>bbb</b></a>
return local:insert-after($a//b, <c>ccc</c>)
Creating a copy in memory and then inserting the copy is faster than inserting and modifying a document in the database.
Depending on how many documents are inserted, the difference could be significant.
There are community libraries for copying with changes, but sometimes it's as easy to write a quick function (recursive where necessary).
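For example, using the function above to build the modified document in memory and insert it once (a sketch; the URI and sample content are illustrative):
(: sketch: modify in memory with local:insert-after(), then insert the result once;
   the URI "/example-copy.xml" is illustrative :)
let $original := <a><b>bbb</b></a>
let $modified := local:insert-after($original/b, <c>ccc</c>)
return xdmp:document-insert("/example-copy.xml", $modified)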
You can use the code below to insert the element into the XML:
xdmp:node-insert-child(fn:doc('directory URI'), element { fn:QName('http://yournamespace', 'elementName') } { $elementValue })
Here we use fn:QName to avoid adding xmlns="" to the inserted node.
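For example (the namespace, element name, and document URI here are hypothetical):
(: hypothetical example: insert <my:note> into the root element of /example.xml;
   constructing the name with fn:QName keeps the node in its namespace instead of
   producing a no-namespace element with xmlns="" :)
xdmp:node-insert-child(
    fn:doc("/example.xml")/*,
    element { fn:QName("http://example.com/ns", "my:note") } { "hello" })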

How to tidy-up Processing Instructions in Marklogic

I have content in my legacy database which is neither valid HTML nor valid XML. Since it would be difficult to clean up the legacy data itself, I want to tidy it up in MarkLogic using xdmp:tidy. I am currently using ML 8.
<sub>
<p>
<???†?>
</p>
</sub>
I'm passing this content to the tidy functionality in this way:
declare variable $xml as node() :=
<content>
<![CDATA[<p><???†?></p>]]>
</content>;
xdmp:tidy(xdmp:quote($xml//text()),
<options xmlns="xdmp:tidy">
<assume-xml-procins>yes</assume-xml-procins>
<quiet>yes</quiet>
<tidy-mark>no</tidy-mark>
<enclose-text>yes</enclose-text>
<indent>yes</indent>
</options>)
As a result it returns:
<p>
<? ?†?>
</p>
This result is not valid XML (I checked it with an XML validator), so when I try to insert it into MarkLogic it throws an error: 'MALFORMED BODY | Invalid Processing Instruction names'.
I did some investigation around PIs but didn't have much luck. I could have tried saving the content without the PI, but this isn't a valid PI either.
That is because what you think is a PI is in fact not a PI.
From W3C:
2.6 Processing Instructions
[Definition: Processing instructions (PIs) allow documents to contain
instructions for applications.]
Processing Instructions
[16] PI ::= '<?' PITarget (S (Char* - (Char* '?>' Char*)))? '?>'
[17] PITarget ::= Name - (('X' | 'x') ('M' | 'm') ('L' | 'l'))
So the PI name cannot start with '?', as in your sample ??†.
You probably want to clean up the content before you pass it to tidy.
Like below:
declare variable $xml as node() :=
<content><![CDATA[<p>Hello <???†?>world</p>]]></content>;
declare function local:copy($input as item()*) as item()* {
for $node in $input
return
typeswitch($node)
case text()
return fn:replace($node,"<\?[^>]+\?>","")
case element()
return
element {name($node)} {
(: output each attribute in this element :)
for $att in $node/@*
return
attribute {name($att)} {$att}
,
(: output all the sub-elements of this element recursively :)
for $child in $node
return local:copy($child/node())
}
(: otherwise pass it through. Used for comments, PIs, and any other node types :)
default return $node
};
xdmp:tidy(local:copy($xml),
<options xmlns="xdmp:tidy">
<assume-xml-procins>no</assume-xml-procins>
<quiet>yes</quiet>
<tidy-mark>no</tidy-mark>
<enclose-text>yes</enclose-text>
<indent>yes</indent>
</options>)
This should do the trick to get rid of all PIs (real and fake).
Regards,
Peter

To remove the node but keep the value inside intact through XQuery

I have a content.xml modelled as below
<root>
<childnode>
Some text here
</childnode>
</root>
I am trying to remove the <childnode> and update content.xml with only its value,
so the output looks like:
<root>
Some Text here
</root>
I wrote a function to perform this, but any time I run it I get the error "unexpected token: modify". I was thinking of a way to accomplish this without using functx functions.
xquery version "1.0";
declare namespace request="http://exist-db.org/xquery/request";
declare namespace file="http://exist-db.org/xquery/file";
declare namespace system="http://exist-db.org/xquery/system";
declare namespace util="http://exist-db.org/xquery/util";
declare namespace response="http://exist-db.org/xquery/response";
declare function local:contentUpdate() {
let $root := collection('/lib/repository/content')//root/childNode
let $rmChild := for $child in $root
modify
(
return rename node $child as ''
)
};
local:updateTitle()
Thanks in advance
There are multiple problems with your query:
Updating functions must be declared as updating.
You're calling a different function than the one you defined (probably you didn't notice because there were still syntax errors).
Rename node expects some element (or processing instruction or attribute) as target; the empty string is not allowed.
At least BaseX doesn't allow updating statements when the code is declared as XQuery 1.0. Maybe eXist doesn't care about this; try it if you need to know.
You do not want to rename, but to replace all <childnode/> elements with their contents; use replace node.
This code fixes all these problems:
declare updating function local:contentUpdate() {
let $root := collection('/lib/repository/content')
return
for $i in $root//childnode
return
replace node $i with $i/data()
};
local:contentUpdate()
eXist-db's XQuery Update syntax is documented at http://exist-db.org/exist/update_ext.xml. Note that this syntax predates the release of the XQuery Update Facility 1.0, so the syntax is different and remains unique to eXist-db.
The way to do what you want in eXist-db is as follows:
xquery version "1.0";
declare function local:contentUpdate() {
let $root := doc('/db/lib/repository/content/content.xml')/root
return
update value $root with $root/string()
};
local:contentUpdate()
The primary changes, compared to your original code, are:
Inserted the eXist-db syntax for your update
Prepended '/db' to your collection name, as /db is the root of the database in eXist-db; replaced the collection() call with a doc() call, since you stated you were operating on a single file, content.xml
Changed //root to /root, since "root" is the root element, so the // (descendant-or-self) axis is extraneous
Replaced updateTitle() with the actual name of the function, contentUpdate
Removed the extraneous namespace declarations
For more on why I used $root/string(), see http://community.marklogic.com/blog/text-is-a-code-smell.
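As a quick illustration of the difference (hypothetical data):
(: hypothetical illustration: string() gives the concatenated text of all descendants,
   while text() gives only the direct text-node children :)
let $root := <root>one <childnode>two</childnode> three</root>
return (
    $root/string(), (: "one two three" :)
    $root/text()    (: only the text nodes "one " and " three" :)
)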
