How to import an XQuery module using Saxon

I am having some trouble running an XQuery with Saxon 9 HE that references an external module. I would like Saxon to resolve the module with a relative path rather than an absolute one.
The module declaration:
module namespace common = "http://my-xquery-utils";
From the main XQuery:
import module namespace common = "http://my-xquery-utils" at "/home/myself/common.xquery";
From my Java code:
public class SaxonInvocator {
private static Processor proc = null;
private static XQueryEvaluator xqe = null;
private static DocumentBuilder db = null;
private static StaticQueryContext ctx = null;
/**
* Utility for debug, should not be called outside your IDE
*
* @param args xml, xqFile, xqParameter
*/
public static void main(String[] args) {
XmlObject instance = null;
try {
instance = XmlObject.Factory.parse(new File(args[0]));
} catch (XmlException ex) {
Logger.getLogger(SaxonInvocator.class.getName()).log(Level.SEVERE, null, ex);
} catch (IOException ex){
Logger.getLogger(SaxonInvocator.class.getName()).log(Level.SEVERE, null, ex);
}
System.out.print(transform(instance, args[1], args[2]));
}
public static String transform(XmlObject input, String xqFile, String xqParameter) {
String result = null;
try {
proc = new Processor(false);
proc.getUnderlyingConfiguration().getOptimizer().setOptimizationLevel(0);
ctx = proc.getUnderlyingConfiguration().newStaticQueryContext();
ctx.setModuleURIResolver(new ModuleURIResolver() {
@Override
public StreamSource[] resolve(String moduleURI, String baseURI, String[] locations) throws XPathException {
StreamSource[] modules = new StreamSource[locations.length];
for (int i = 0; i < locations.length; i++) {
modules[i] = new StreamSource(getResourceAsStream(locations[i]));
}
return modules;
}
});
db = proc.newDocumentBuilder();
XQueryCompiler comp = proc.newXQueryCompiler();
XQueryExecutable exp = comp.compile(getResourceAsStream(xqFile));
xqe = exp.load();
ByteArrayInputStream bais = new ByteArrayInputStream(input.xmlText().getBytes("UTF-8"));
StreamSource ss = new StreamSource(bais);
XdmNode node = db.build(ss);
xqe.setExternalVariable(
new QName(xqParameter), node);
result = xqe.evaluate().toString();
} catch (SaxonApiException e) {
e.printStackTrace();
} catch (UnsupportedEncodingException e) {
e.printStackTrace();
} catch (IOException e) {
e.printStackTrace();
}
return result;
}
public static InputStream getResourceAsStream(String resource) {
InputStream stream = SaxonInvocator.class.getResourceAsStream("/" + resource);
if (stream == null) {
stream = SaxonInvocator.class.getResourceAsStream(resource);
}
if (stream == null) {
stream = SaxonInvocator.class.getResourceAsStream("my/project/" + resource);
}
if (stream == null) {
stream = SaxonInvocator.class.getResourceAsStream("/my/project/" + resource);
}
return stream;
}
}
If I change it into a relative path like
import module namespace common = "http://my-xquery-utils" at "common.xquery";
I get
Error on line 22 column 1
XQST0059: java.io.FileNotFoundException
I am not sure how the ModuleURIResolver should be used.

Saxon questions are best asked on the Saxon forum at http://saxonica.plan.io - questions asked here will probably be noticed eventually but sometimes, like this time, they aren't our first priority.
The basic answer is that for the relative URI to resolve, the base URI needs to be known, which means that you need to ensure that the baseURI property in the XQueryCompiler is set. This happens automatically if you compile the query from a File, but not if you compile it from an InputStream.
If you don't know a suitable base URI to set, the alternative is to write a ModuleURIResolver, which could for example fetch the module by making another call on getResourceAsStream().
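For example (a minimal sketch against the code in the question, reusing the /home/myself/ directory from the absolute import purely as an illustration), you could set a base URI on the XQueryCompiler before compiling from the InputStream:
XQueryCompiler comp = proc.newXQueryCompiler();
// Relative locations in "import module ... at" clauses are resolved against
// this static base URI, so "common.xquery" becomes file:/home/myself/common.xquery
comp.setBaseURI(new File("/home/myself/").toURI());
XQueryExecutable exp = comp.compile(getResourceAsStream(xqFile));
If the module actually lives on the classpath rather than on disk, the base URI only needs to be something your ModuleURIResolver can recognise; the resolver can then load the module with another getResourceAsStream() call, as described above.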

Related

Confluent Kafka consumer consumes messages only after changing groupId

I have a .NET Core console application that uses Confluent.Kafka.
I built a consumer for consuming messages from a specific topic.
The app is intended to run a few times every day, consume the messages on the specified topic and process them.
It took me a while to understand the consumer's behavior: it will consume messages only if its groupId is one that was never in use before.
Every time I change the consumer's groupId, the consumer fetches the messages in the subscribed topic. But on the next runs, with the same groupId, consumer.Consume returns null.
This behavior seems related to rebalancing between consumers in the same group. But I don't understand why, since the consumer should exist only throughout the application lifetime. Before leaving the app, I call consumer.Close() and consumer.Dispose(). These should destroy the consumer, so that on the next run, when I create the consumer again, it will be the first and only consumer on the specified groupId. But as I said, this is not what happens in fact.
I know there are messages on the topic; I checked via the command line. I also made sure the topic has only one partition.
The weirdest thing is that I have another .NET Core console app which does the same process, with no issue at all.
I attach the code of the two apps.
Working app - always consuming:
class Program
{
...
static void Main(string[] args)
{
if (args.Length != 2)
{
Console.WriteLine("Please provide topic name to read and SMTP topic name");
}
else
{
var services = new ServiceCollection();
services.AddSingleton<ConsumerConfig, ConsumerConfig>();
services.AddSingleton<ProducerConfig, ProducerConfig>();
var serviceProvider = services.BuildServiceProvider();
var cConfig = serviceProvider.GetService<ConsumerConfig>();
var pConfig = serviceProvider.GetService<ProducerConfig>();
cConfig.BootstrapServers = Environment.GetEnvironmentVariable("consumer_bootstrap_servers");
cConfig.GroupId = "confluence-consumer";
cConfig.EnableAutoCommit = true;
cConfig.StatisticsIntervalMs = 5000;
cConfig.SessionTimeoutMs = 6000;
cConfig.AutoOffsetReset = AutoOffsetReset.Earliest;
cConfig.EnablePartitionEof = true;
pConfig.BootstrapServers = Environment.GetEnvironmentVariable("producer_bootstrap_servers");
var consumer = new ConsumerHelper(cConfig, args[0]);
messages = new Dictionary<string, Dictionary<string, UserMsg>>();
var result = consumer.ReadMessage();
while (result != null && !result.IsPartitionEOF)
{
Console.WriteLine($"Current consumed msg-json: {result.Message.Value}");
...
result = consumer.ReadMessage();
}
consumer.Close();
Console.WriteLine($"Done consuming messages from topic {args[0]}");
}
}
ConsumerHelper.cs:
namespace AggregateMailing
{
using System;
using Confluent.Kafka;
public class ConsumerHelper
{
private string _topicName;
private ConsumerConfig _consumerConfig;
private IConsumer<string, string> _consumer;
public ConsumerHelper(ConsumerConfig consumerConfig, string topicName)
{
try
{
_topicName = topicName;
_consumerConfig = consumerConfig;
var builder = new ConsumerBuilder<string, string>(_consumerConfig);
_consumer = builder.Build();
_consumer.Subscribe(_topicName);
}
catch (System.Exception exc)
{
Console.WriteLine($"Error on ConsumerHelper: {exc.ToString()}");
}
}
public ConsumeResult<string, string> ReadMessage()
{
Console.WriteLine("ReadMessage: start");
try
{
return _consumer.Consume();
}
catch (System.Exception exc)
{
Console.WriteLine($"Error on ReadMessage: {exc.ToString()}");
return null;
}
}
public void Close()
{
Console.WriteLine("Close: start");
try
{
_consumer.Close();
_consumer.Dispose();
}
catch (System.Exception exc)
{
Console.WriteLine($"Error on Close: {exc.ToString()}");
}
}
}
}
Not working app - consuming only on first run after changing consumer groupId to one never in use:
Program.cs:
class Program
{
private static SmtpClient smtpClient;
private static Random random = new Random();
static void Main(string[] args)
{
try
{
var services = new ServiceCollection();
services.AddSingleton<ConsumerConfig, ConsumerConfig>();
services.AddSingleton<SmtpClient>(new SmtpClient("smtp.gmail.com"));
var serviceProvider = services.BuildServiceProvider();
var cConfig = serviceProvider.GetService<ConsumerConfig>();
cConfig.BootstrapServers = Environment.GetEnvironmentVariable("consumer_bootstrap_servers");
cConfig.GroupId = "smtp-consumer";
cConfig.EnableAutoCommit = true;
cConfig.StatisticsIntervalMs = 5000;
cConfig.SessionTimeoutMs = 6000;
cConfig.AutoOffsetReset = AutoOffsetReset.Earliest;
cConfig.EnablePartitionEof = true;
var consumer = new ConsumerHelper(cConfig, args[0]);
...
var result = consumer.ReadMessage();
while (result != null && !result.IsPartitionEOF)
{
Console.WriteLine($"current consumed message: {result.Message.Value}");
var msg = JsonConvert.DeserializeObject<EmailMsg>(result.Message.Value);
result = consumer.ReadMessage();
}
Console.WriteLine("Done sending emails consumed from SMTP topic");
consumer.Close();
}
catch (System.Exception exc)
{
Console.WriteLine($"Error on Main: {exc.ToString()}");
}
}
ConsumerHelper.cs:
using Confluent.Kafka;
using System;
using System.Collections.Generic;
namespace Mailer
{
public class ConsumerHelper
{
private string _topicName;
private ConsumerConfig _consumerConfig;
private IConsumer<string, string> _consumer;
public ConsumerHelper(ConsumerConfig consumerConfig, string topicName)
{
try
{
_topicName = topicName;
_consumerConfig = consumerConfig;
var builder = new ConsumerBuilder<string, string> (_consumerConfig);
_consumer = builder.Build();
_consumer.Subscribe(_topicName);
//_consumer.Assign(new TopicPartition(_topicName, 0));
}
catch (System.Exception exc)
{
Console.WriteLine($"Error on ConsumerHelper: {exc.ToString()}");
}
}
public ConsumeResult<string, string> ReadMessage()
{
Console.WriteLine("ConsumeResult: start");
try
{
return _consumer.Consume();
}
catch (System.Exception exc)
{
Console.WriteLine($"Error on ConsumeResult: {exc.ToString()}");
return null;
}
}
public void Close()
{
Console.WriteLine("Close: start");
try
{
_consumer.Close();
_consumer.Dispose();
}
catch (System.Exception exc)
{
Console.WriteLine($"Error on Close: {exc.ToString()}");
}
Console.WriteLine("Close: end");
}
}
}

How to call EJB from another app on the same server?

I have a Java SE sample client which runs on the desktop (code below). But I have access to the WebSphere server where the called EJB is deployed. How do I rewrite the code below to work on WebSphere? (When I leave this code as it is, the program works, but I think this can be done more simply and clearly.)
Main method:
WSConn connection = new WSConn();
final Plan plan = connection.getPlanBean();
com.ibm.websphere.security.auth.WSSubject.doAs(connection.getSubject(), new java.security.PrivilegedAction<Object>() {
public Object run() {
try {
// App logic
} catch (Throwable t) {
System.err.println("PrivilegedAction - Error calling EJB: " + t);
t.printStackTrace();
}
return null;
}
}); // end doAs
WSConn class:
public class WSConn {
private static final String INITIAL_CONTEXT_FACTORY = "com.ibm.websphere.naming.WsnInitialContextFactory";
private static final String JAAS_MODULE = "WSLogin";
private static final String MODEL_EJB_NAME_LONG = "ejb/com/ibm/ModelHome";
private static final String PLAN_EJB_NAME_LONG = "ejb/com/ibm/PlanHome";
private Subject subject;
private InitialContext initialContext;
private String serverName;
private String serverPort;
private String uid;
private String pwd;
private String remoteServerName;
private Model modelBean;
private Plan planBean;
public WSConn() {
Properties props = new Properties();
try {
props.load(WSConn.class.getClassLoader().getResourceAsStream("WSConn.properties"));
} catch (IOException e) {
e.printStackTrace();
}
serverName = props.getProperty("WSConn.serverName");
serverPort = props.getProperty("WSConn.serverPort");
uid = props.getProperty("WSConn.userID");
pwd = props.getProperty("WSConn.password");
remoteServerName = props.getProperty("WSConn.remoteServerName");
}
private void init() {
if (subject == null || initialContext == null) {
subject = login();
}
}
private Subject login() {
Subject subject = null;
try {
LoginContext lc = null;
// CREATE LOGIN CONTEXT
Hashtable<String, String> env = new Hashtable<String, String>();
env.put(Context.INITIAL_CONTEXT_FACTORY, INITIAL_CONTEXT_FACTORY);
env.put(Context.PROVIDER_URL, "corbaloc:iiop:" + serverName + ":" + serverPort);
initialContext = new InitialContext(env);
// Just to test the connection
initialContext.lookup("");
lc = new LoginContext(JAAS_MODULE, new WSCallbackHandlerImpl(uid, pwd));
lc.login();
subject = lc.getSubject();
} catch (javax.naming.NoPermissionException exc) {
System.err.println("[WSConn] - Login Error: " + exc);
} catch (Exception exc) {
System.err.println("[WSConn] - Error: " + exc);
}
return subject;
}
public wModel getModelBean() {
if (modelBean == null) {
init();
modelBean = (wModel) com.ibm.websphere.security.auth.WSSubject.doAs(subject,
new java.security.PrivilegedAction<wModel>() {
public wModel run() {
wModel session = null;
try {
Object o = initialContext.lookup(MODEL_EJB_NAME_LONG);
wModelHome home = (wModelHome) PortableRemoteObject.narrow(o, wModelHome.class);
if (home != null) {
session = home.create(remoteServerName);
}
} catch (Exception exc) {
System.err.println("Error getting model bean: " + exc);
}
return session;
}
}); // end doAs
}
return modelBean;
}
public wPlan getPlanBean() {
if (planBean == null) {
init();
planBean = (wPlan) com.ibm.websphere.security.auth.WSSubject.doAs(subject,
new java.security.PrivilegedAction<wPlan>() {
public wPlan run() {
wPlan session = null;
try {
Object o = initialContext.lookup(PLAN_EJB_NAME_LONG);
wPlanHome home = (wPlanHome) PortableRemoteObject.narrow(o, wPlanHome.class);
if (home != null) {
session = home.create(remoteServerName);
}
} catch (Exception exc) {
System.err.println("Error getting plan bean: " + exc);
}
return session;
}
}); // end doAs
}
return planBean;
}
public Subject getSubject() {
if (subject == null) {
init();
}
return subject;
}
}
As indicated in another answer, the classic mechanism is to lookup and narrow the home interface.
Get the initial context
final InitialContext initialContext = new InitialContext();
Look up the home by JNDI name, specifying either the full JNDI name
Object obj = initialContext.lookup("ejb/com/ibm/tws/conn/plan/ConnPlanHome");
or you can create a reference in your WAR and use java:comp/env/yourname
Then narrow the home to the home interface class
ConnPlanHome planHome = (ConnPlanHome)PortableRemoteObject.narrow(obj, ConnPlanHome.class);
and then create the EJB remote interface
ConnPlan plan = planHome.create();
The above calls should work for IBM Workload Scheduler distributed.
For IBM Workload Scheduler z/OS the JNDI name and the class names are different:
final InitialContext initialContext = new InitialContext();
String engineName = "XXXX";
Object obj = initialContext.lookup("ejb/com/ibm/tws/zconn/plan/ZConnPlanHome");
ZConnPlanHome planHome = (ZConnPlanHome)PortableRemoteObject.narrow(obj, ZConnPlanHome.class);
ZConnPlan plan = planHome.create(engineName);
User credentials are propagated from the client to the engine; the client needs to be authenticated, otherwise the engine will reject the request.
If you're trying to access an EJB from a POJO class, then there is nothing simpler than lookup+narrow. However, if the POJO is included in an application (EAR or WAR), then you could declare and look up an EJB reference (java:comp/ejb/myEJB), and then the container would perform the narrow rather than your code. If you change your code to be a managed class like a servlet, another EJB, or a CDI bean, then you could use @EJB injection, and then you would not even need a lookup.
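As a hedged illustration of that last option (not from the original answer; the servlet name is hypothetical, and ConnPlanHome/ConnPlan are the interfaces used earlier), @EJB injection in a managed class could look like this:
import java.io.IOException;
import javax.ejb.EJB;
import javax.servlet.ServletException;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

public class PlanServlet extends HttpServlet {

    // The container resolves the reference and performs the narrow;
    // no InitialContext lookup or PortableRemoteObject.narrow in application code.
    @EJB
    private ConnPlanHome planHome;

    @Override
    protected void doGet(HttpServletRequest req, HttpServletResponse resp)
            throws ServletException, IOException {
        try {
            ConnPlan plan = planHome.create();
            // ... call business methods on plan ...
        } catch (Exception e) {
            throw new ServletException("Error calling EJB", e);
        }
    }
}
For an EJB deployed in a different application on the same server, the reference may still need to be bound to the target bean's JNDI name in the deployment descriptor or via the server's binding configuration.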

Issue with Custom Pipeline component

The custom pipeline component I developed saves the incoming stream to a folder and passes only some metadata through the MessageBox. I am using the one already available on Code Project.
using System;
using System.Collections.Generic;
using System.Text;
using Microsoft.BizTalk.Message.Interop;
using Microsoft.BizTalk.Component.Interop;
using System.IO;
namespace SendLargeFilesDecoder
{
[ComponentCategory(CategoryTypes.CATID_PipelineComponent)]
[ComponentCategory(CategoryTypes.CATID_Decoder)]
[System.Runtime.InteropServices.Guid("53fd04d5-8337-42c2-99eb-32ac96d1105a")]
public class SendLargeFileDecoder : IBaseComponent,
IComponentUI,
IComponent,
IPersistPropertyBag
{
#region IBaseComponent
private const string _description = "Pipeline component used to save large files to disk";
private const string _name = "SendLargeFileDecoded";
private const string _version = "1.0.0.0";
public string Description
{
get { return _description; }
}
public string Name
{
get { return _name; }
}
public string Version
{
get { return _version; }
}
#endregion
#region IComponentUI
private IntPtr _icon = new IntPtr();
public IntPtr Icon
{
get { return _icon; }
}
public System.Collections.IEnumerator Validate(object projectSystem)
{
return null;
}
#endregion
#region IComponent
public IBaseMessage Execute(IPipelineContext pContext, IBaseMessage pInMsg)
{
if (_largeFileLocation == null || _largeFileLocation.Length == 0)
_largeFileLocation = Path.GetTempPath();
if (_thresholdSize == null || _thresholdSize == 0)
_thresholdSize = 4096;
if (pInMsg.BodyPart.GetOriginalDataStream().Length > _thresholdSize)
{
Stream originalStream = pInMsg.BodyPart.GetOriginalDataStream();
string srcFileName = pInMsg.Context.Read("ReceivedFileName", "http://schemas.microsoft.com/BizTalk/2003/file-properties").ToString();
string largeFilePath = _largeFileLocation + System.IO.Path.GetFileName(srcFileName);
FileStream fs = new FileStream(largeFilePath, FileMode.Create);
// Write message to disk
byte[] buffer = new byte[1];
int bytesRead = originalStream.Read(buffer, 0, buffer.Length);
while (bytesRead != 0)
{
fs.Flush();
fs.Write(buffer, 0, buffer.Length);
bytesRead = originalStream.Read(buffer, 0, buffer.Length);
}
fs.Flush();
fs.Close();
// Create a small xml file
string xmlInfo = "<MsgInfo xmlns='http://SendLargeFiles'><LargeFilePath>" + largeFilePath + "</LargeFilePath></MsgInfo>";
byte[] byteArray = System.Text.Encoding.UTF8.GetBytes(xmlInfo);
MemoryStream ms = new MemoryStream(byteArray);
pInMsg.BodyPart.Data = ms;
}
return pInMsg;
}
#endregion
#region IPersistPropertyBag
private string _largeFileLocation;
private int _thresholdSize;
public string LargeFileLocation
{
get { return _largeFileLocation; }
set { _largeFileLocation = value; }
}
public int ThresholdSize
{
get { return _thresholdSize; }
set { _thresholdSize = value; }
}
public void GetClassID(out Guid classID)
{
classID = new Guid("CA47347C-010C-4B21-BFCB-22F153FA141F");
}
public void InitNew()
{
}
public void Load(IPropertyBag propertyBag, int errorLog)
{
object val1 = null;
object val2 = null;
try
{
propertyBag.Read("LargeFileLocation", out val1, 0);
propertyBag.Read("ThresholdSize", out val2, 0);
}
catch (ArgumentException)
{
}
catch (Exception ex)
{
throw new ApplicationException("Error reading PropertyBag: " + ex.Message);
}
if (val1 != null)
_largeFileLocation = (string)val1;
if (val2 != null)
_thresholdSize = (int)val2;
}
public void Save(IPropertyBag propertyBag, bool clearDirty, bool saveAllProperties)
{
object val1 = (object)_largeFileLocation;
propertyBag.Write("LargeFileLocation", ref val1);
object val2 = (object)_thresholdSize;
propertyBag.Write("ThresholdSize", ref val2);
}
#endregion
}
}
The issue here is that LargeFileLocation is configurable in the receive pipeline. If I set a location for the first time, for example E:\ABC\, the files are sent to that location.
But if I change the location to E:\DEF\, the files are still sent to the previous location E:\ABC\. I tried creating a new BizTalk application after deleting the old one, but the files still get dropped into the old location E:\ABC\, and I am not sure why.
Most likely the issue is with the property definition of LargeFileLocation and how it is implemented and used through the IPersistPropertyBag interface. You can try the following:
Check whether you have set the E:\ABC path on the pipeline at design time. If so, remove it from there, set it in the Admin console for the first time as well, and see how it behaves; my feeling is that it will take the temp path location.
Change the properties and the IPersistPropertyBag implementation to use an auto-implemented property declared as public string LargeFileLocation { get; set; }, i.e. without the backing field _largeFileLocation.
Have you deleted the DLL in %BizTalkFolder%\Pipeline Components\?
To refresh the pipeline component, you need to delete the old DLL file and remove the item from the VS toolbox, then restart VS and deploy it again.
As for LargeFileLocation, I suggest you make it a design-time property so you can configure it.

How to accept mine if any conflicts occur while pull-rebase in JGit

I have a piece of code that does pull with rebase:
private void pullWithRebase(Git git) throws GitAPIException {
git.checkout().setName("master").call();
List<Ref> branches = git.branchList().setListMode(ListBranchCommand.ListMode.ALL).call();
String remoteMasterBranchName = "refs/remotes/origin/master";
for (Ref ref : branches) {
if (remoteMasterBranchName.equals(ref.getName())) {
PullResult result = git.pull().setRemoteBranchName("master").setRebase(true).call();
return;
}
}
}
However, it doesn't work if any conflicts occur while merging. If they do occur, I want to accept mine.
I ended up just merging the two branches and resolving any conflicts directly by modifying the files.
private static final String CONFLICT_HEAD_MARKER = "<<<<<<<";
private static final String CONFLICT_BORDER_MARKER = "=======";
private static final String CONFLICT_UPSTREAM_MARKER = ">>>>>>>";
private boolean acceptHead;
public void resolve(File file) throws IOException {
File temp = new File(file.getParent(), "temp" + System.currentTimeMillis());
try(BufferedReader reader = new BufferedReader(new FileReader(file));
BufferedWriter writer = new BufferedWriter(new FileWriter(temp))) {
String currentLine;
boolean removePartition = false;
while((currentLine = reader.readLine()) != null) {
if (currentLine.contains(CONFLICT_HEAD_MARKER)) {
removePartition = !acceptHead;
continue;
} else if (currentLine.contains(CONFLICT_BORDER_MARKER)) {
removePartition = acceptHead;
continue;
} else if (currentLine.contains(CONFLICT_UPSTREAM_MARKER)) {
removePartition = false;
continue;
}
if (!removePartition) {
writer.write(currentLine + System.getProperty("line.separator"));
}
}
}
FileUtils.forceDelete(file);
FileUtils.moveFile(temp, file);
}
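For context, here is a hedged sketch of how the resolver above might be wired to a JGit merge (not part of the original answer; ConflictResolver is a hypothetical wrapper class holding the resolve() method and the acceptHead flag shown above):
private void mergeAcceptingMine(Git git) throws GitAPIException, IOException {
    // Merge the remote tracking branch into the current branch (assumes "master" is checked out).
    ObjectId remoteMaster = git.getRepository().resolve("refs/remotes/origin/master");
    MergeResult mergeResult = git.merge().include(remoteMaster).call();
    if (mergeResult.getConflicts() != null) {
        // acceptHead = true keeps "mine" (the HEAD side) of every conflicting hunk.
        ConflictResolver resolver = new ConflictResolver(true); // hypothetical wrapper around resolve()
        for (String path : mergeResult.getConflicts().keySet()) {
            resolver.resolve(new File(git.getRepository().getWorkTree(), path));
        }
        git.add().addFilepattern(".").call();   // stage the resolved files
        git.commit().setMessage("Merge origin/master, conflicts resolved as 'mine'").call();
    }
}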

ServiceStack REST API path variables from root throwing exception

I am trying to write a REST web service using ServiceStack that accepts variable paths off of the root. For example:
[Route("/{group}"]
public class Entity : IReturn<SomeType> {}
This throws a NotSupportedException: "RestPath '/{collection}' on type Entity is not supported". However, if I change the path as follows (along with the associated path in the AppHost configuration) to:
[Route("/rest/{group}")]
It works just fine. In order to integrate with the system that I am working with, I need to use /{group}.
ServiceStack now allows you to add a fallback route from the / root path to handle any unmatched request that isn't handled by a catch-all handler and doesn't refer to an existing static file. So in v3.9.56 you can now do:
[FallbackRoute("/{group}")]
public class Entity : IReturn<SomeType> {}
An alternative option is to register an IAppHost.CatchAllHandlers handler to handle unmatched routes, but you would need to return your own IHttpHandler to handle the request, or alternatively return a RedirectHttpHandler to redirect to a different route that is managed by ServiceStack.
My current work in progress, a plugin to serve the default page to all 'not found' routes without changing the url in the browser, includes most of what you'll need to handle a global wildcard route. Use it to get you started.
To understand what this code is doing, it helps to understand ServiceStack's routing priority, and how CatchAllHandlers fit into the process. ServiceStack calls ServiceStackHttpHandlerFactory.GetHandler to get the handler for the current route.
ServiceStackHttpHandlerFactory.GetHandler returns:
A matching RawHttpHandler, if any.
If the request is for the domain root, the handler returned by GetCatchAllHandlerIfAny(...), if any.
If the route matches a metadata uri (I'm skipping over the exact logic here, as it's not important for your question), the relevant handler, if any.
The handler returned by ServiceStackHttpHandlerFactory.GetHandlerForPathInfo if any.
NotFoundHandler.
ServiceStackHttpHandlerFactory.GetHandlerForPathInfo returns:
If the url matches a valid REST route, a new RestHandler.
If the url matches an existing file or directory, it returns the handler returned by GetCatchAllHandlerIfAny(...), if any; otherwise, if it's a supported filetype, a StaticFileHandler, and if it's not a supported filetype, the ForbiddenHttpHandler.
The handler returned by GetCatchAllHandlerIfAny(...), if any.
null.
The CatchAllHandlers array contains functions that evaluate the url and either return a handler, or null. The functions in the array are called in sequence and the first one that doesn't return null handles the route. Let me highlight some key elements:
First, the plugin adds a CatchAllHandler to the appHost.CatchAllHandlers array when registered.
public void Register(IAppHost appHost)
{
appHost.CatchAllHandlers.Add((string method, string pathInfo, string filepath) =>
Factory(method, pathInfo, filepath));
}
Second, the CatchAllHandler. As described above, the function may be called for the domain root, an existing file or directory, or any other unmatched route. Your method should return a handler, if your criteria are met, or return null.
private static Html5ModeFeature Factory(String method, String pathInfo, String filepath)
{
var Html5ModeHandler = Html5ModeFeature.Instance;
List<string> WebHostRootFileNames = RootFiles();
// handle domain root
if (string.IsNullOrEmpty(pathInfo) || pathInfo == "/")
{
return Html5ModeHandler;
}
// don't handle 'mode' urls
var mode = EndpointHost.Config.ServiceStackHandlerFactoryPath;
if (mode != null && pathInfo.EndsWith(mode))
{
return null;
}
var pathParts = pathInfo.TrimStart('/').Split('/');
var existingFile = pathParts[0].ToLower();
var catchAllHandler = new Object();
if (WebHostRootFileNames.Contains(existingFile))
{
var fileExt = Path.GetExtension(filepath);
var isFileRequest = !string.IsNullOrEmpty(fileExt);
// don't handle directories or files that have another handler
catchAllHandler = GetCatchAllHandlerIfAny(method, pathInfo, filepath);
if (catchAllHandler != null) return null;
// don't handle existing files under any event
return isFileRequest ? null : Html5ModeHandler;
}
// don't handle non-physical urls that have another handler
catchAllHandler = GetCatchAllHandlerIfAny(method, pathInfo, filepath);
if (catchAllHandler != null) return null;
// handle anything else
return Html5ModeHandler;
}
In the case of the wildcard at the root domain, you may not want to hijack routes that can be handled by another CatchAllHandler. If so, to avoid infinite recursion, you'll need a custom GetCatchAllHandlerIfAny method.
//
// local copy of ServiceStackHttpHandlerFactory.GetCatchAllHandlerIfAny, prevents infinite recursion
//
private static IHttpHandler GetCatchAllHandlerIfAny(string httpMethod, string pathInfo, string filePath)
{
if (EndpointHost.CatchAllHandlers != null)
{
foreach (var httpHandlerResolver in EndpointHost.CatchAllHandlers)
{
if (httpHandlerResolver == Html5ModeFeature.Factory) continue; // avoid infinite recursion
var httpHandler = httpHandlerResolver(httpMethod, pathInfo, filePath);
if (httpHandler != null)
return httpHandler;
}
}
return null;
}
Here's the complete, and completely untested, plugin. It compiles. It carries no warranty of fitness for any specific purpose.
using ServiceStack;
using ServiceStack.Common.Web;
using ServiceStack.Razor;
using ServiceStack.ServiceHost;
using ServiceStack.Text;
using ServiceStack.WebHost.Endpoints;
using ServiceStack.WebHost.Endpoints.Formats;
using ServiceStack.WebHost.Endpoints.Support;
using ServiceStack.WebHost.Endpoints.Support.Markdown;
using System;
using System.Collections.Generic;
using System.IO;
using System.Text;
using System.Web;
namespace MyProject.Support
{
public enum DefaultFileFormat
{
Markdown,
Razor,
Static
}
public class Html5ModeFeature : EndpointHandlerBase, IPlugin
{
private FileInfo fi { get; set; }
private DefaultFileFormat FileFormat { get; set; }
private DateTime FileModified { get; set; }
private byte[] FileContents { get; set; }
public MarkdownHandler Markdown { get; set; }
public RazorHandler Razor { get; set; }
public object Model { get; set; }
private static Dictionary<string, string> allDirs;
public string PathInfo { get; set; }
public void Register(IAppHost appHost)
{
appHost.CatchAllHandlers.Add((string method, string pathInfo, string filepath) =>
Factory(method, pathInfo, filepath));
}
private Html5ModeFeature()
{
foreach (var defaultDoc in EndpointHost.Config.DefaultDocuments)
{
if (PathInfo == null)
{
var defaultFileName = Path.Combine(Directory.GetCurrentDirectory(), defaultDoc);
if (!File.Exists(defaultFileName)) continue;
PathInfo = (String)defaultDoc; // use first default document found.
}
}
SetFile();
}
private static Html5ModeFeature instance;
public static Html5ModeFeature Instance
{
get { return instance ?? (instance = new Html5ModeFeature()); }
}
public void SetFile()
{
if (PathInfo.EndsWith(MarkdownFormat.MarkdownExt) || PathInfo.EndsWith(MarkdownFormat.TemplateExt))
{
Markdown = new MarkdownHandler(PathInfo);
FileFormat = DefaultFileFormat.Markdown;
return;
}
if (PathInfo.EndsWith(Razor.RazorFormat.RazorFileExtension)) {
Razor = new RazorHandler(PathInfo);
FileFormat = DefaultFileFormat.Razor;
return;
}
FileContents = File.ReadAllBytes(PathInfo);
FileModified = File.GetLastWriteTime(PathInfo);
FileFormat = DefaultFileFormat.Static;
}
//
// ignore request.PathInfo, return default page, extracted from StaticFileHandler.ProcessResponse
//
public void ProcessStaticPage(IHttpRequest request, IHttpResponse response, string operationName)
{
response.EndHttpHandlerRequest(skipClose: true, afterBody: r =>
{
TimeSpan maxAge;
if (r.ContentType != null && EndpointHost.Config.AddMaxAgeForStaticMimeTypes.TryGetValue(r.ContentType, out maxAge))
{
r.AddHeader(HttpHeaders.CacheControl, "max-age=" + maxAge.TotalSeconds);
}
if (request.HasNotModifiedSince(fi.LastWriteTime))
{
r.ContentType = MimeTypes.GetMimeType(PathInfo);
r.StatusCode = 304;
return;
}
try
{
r.AddHeaderLastModified(fi.LastWriteTime);
r.ContentType = MimeTypes.GetMimeType(PathInfo);
if (fi.LastWriteTime > this.FileModified)
SetFile(); //reload
r.OutputStream.Write(this.FileContents, 0, this.FileContents.Length);
r.Close();
return;
}
catch (Exception ex)
{
throw new HttpException(403, "Forbidden.");
}
});
}
private void ProcessServerError(IHttpRequest httpReq, IHttpResponse httpRes, string operationName)
{
var sb = new StringBuilder();
sb.AppendLine("{");
sb.AppendLine("\"ResponseStatus\":{");
sb.AppendFormat(" \"ErrorCode\":{0},\n", 500);
sb.AppendFormat(" \"Message\": HTML5ModeHandler could not serve file {0}.\n", PathInfo.EncodeJson());
sb.AppendLine("}");
sb.AppendLine("}");
httpRes.EndHttpHandlerRequest(skipClose: true, afterBody: r =>
{
r.StatusCode = 500;
r.ContentType = ContentType.Json;
var sbBytes = sb.ToString().ToUtf8Bytes();
r.OutputStream.Write(sbBytes, 0, sbBytes.Length);
r.Close();
});
return;
}
private static List<string> RootFiles()
{
var WebHostPhysicalPath = EndpointHost.Config.WebHostPhysicalPath;
List<string> WebHostRootFileNames = new List<string>();
foreach (var filePath in Directory.GetFiles(WebHostPhysicalPath))
{
var fileNameLower = Path.GetFileName(filePath).ToLower();
WebHostRootFileNames.Add(Path.GetFileName(fileNameLower));
}
foreach (var dirName in Directory.GetDirectories(WebHostPhysicalPath))
{
var dirNameLower = Path.GetFileName(dirName).ToLower();
WebHostRootFileNames.Add(Path.GetFileName(dirNameLower));
}
return WebHostRootFileNames;
}
private static Html5ModeFeature Factory(String method, String pathInfo, String filepath)
{
var Html5ModeHandler = Html5ModeFeature.Instance;
List<string> WebHostRootFileNames = RootFiles();
// handle domain root
if (string.IsNullOrEmpty(pathInfo) || pathInfo == "/")
{
return Html5ModeHandler;
}
// don't handle 'mode' urls
var mode = EndpointHost.Config.ServiceStackHandlerFactoryPath;
if (mode != null && pathInfo.EndsWith(mode))
{
return null;
}
var pathParts = pathInfo.TrimStart('/').Split('/');
var existingFile = pathParts[0].ToLower();
var catchAllHandler = new Object();
if (WebHostRootFileNames.Contains(existingFile))
{
var fileExt = Path.GetExtension(filepath);
var isFileRequest = !string.IsNullOrEmpty(fileExt);
// don't handle directories or files that have another handler
catchAllHandler = GetCatchAllHandlerIfAny(method, pathInfo, filepath);
if (catchAllHandler != null) return null;
// don't handle existing files under any event
return isFileRequest ? null : Html5ModeHandler;
}
// don't handle non-physical urls that have another handler
catchAllHandler = GetCatchAllHandlerIfAny(method, pathInfo, filepath);
if (catchAllHandler != null) return null;
// handle anything else
return Html5ModeHandler;
}
//
// Local copy of private StaticFileHandler.DirectoryExists
//
public static bool DirectoryExists(string dirPath, string appFilePath)
{
if (dirPath == null) return false;
try
{
if (!ServiceStack.Text.Env.IsMono)
return Directory.Exists(dirPath);
}
catch
{
return false;
}
if (allDirs == null)
allDirs = CreateDirIndex(appFilePath);
var foundDir = allDirs.ContainsKey(dirPath.ToLower());
//log.DebugFormat("Found dirPath {0} in Mono: ", dirPath, foundDir);
return foundDir;
}
//
// Local copy of private StaticFileHandler.CreateDirIndex
//
static Dictionary<string, string> CreateDirIndex(string appFilePath)
{
var indexDirs = new Dictionary<string, string>();
foreach (var dir in GetDirs(appFilePath))
{
indexDirs[dir.ToLower()] = dir;
}
return indexDirs;
}
//
// Local copy of private StaticFileHandler.GetDirs
//
static List<string> GetDirs(string path)
{
var queue = new Queue<string>();
queue.Enqueue(path);
var results = new List<string>();
while (queue.Count > 0)
{
path = queue.Dequeue();
try
{
foreach (string subDir in Directory.GetDirectories(path))
{
queue.Enqueue(subDir);
results.Add(subDir);
}
}
catch (Exception ex)
{
Console.Error.WriteLine(ex);
}
}
return results;
}
//
// local copy of ServiceStackHttpHandlerFactory.GetCatchAllHandlerIfAny, prevents infinite recursion
//
private static IHttpHandler GetCatchAllHandlerIfAny(string httpMethod, string pathInfo, string filePath)
{
if (EndpointHost.CatchAllHandlers != null)
{
foreach (var httpHandlerResolver in EndpointHost.CatchAllHandlers)
{
if (httpHandlerResolver == Html5ModeFeature.Factory) continue; // avoid infinite recursion
var httpHandler = httpHandlerResolver(httpMethod, pathInfo, filePath);
if (httpHandler != null)
return httpHandler;
}
}
return null;
}
public override void ProcessRequest(IHttpRequest httpReq, IHttpResponse httpRes, string operationName)
{
switch (FileFormat)
{
case DefaultFileFormat.Markdown:
{
Markdown.ProcessRequest(httpReq, httpRes, operationName);
break;
}
case DefaultFileFormat.Razor:
{
Razor.ProcessRequest(httpReq, httpRes, operationName);
break;
}
case DefaultFileFormat.Static:
{
fi.Refresh();
if (fi.Exists) ProcessStaticPage(httpReq, httpRes, operationName); else ProcessServerError(httpReq, httpRes, operationName);
break;
}
default:
{
ProcessServerError(httpReq, httpRes, operationName);
break;
}
}
}
public override object CreateRequest(IHttpRequest request, string operationName)
{
return null;
}
public override object GetResponse(IHttpRequest httpReq, IHttpResponse httpRes, object request)
{
return null;
}
}
}
