Hi, we all know how InfoPath works: when we enter details in an InfoPath form, the data is stored as an XML file, and when we open it the data from the XML is merged with the InfoPath form design (template).
I need to show the data that is present in the XML with my own designed form (template) at runtime. My question is: how can I read the fields that are in the InfoPath form (in the .xml)? Only then can I design my own form at runtime.
I don't know if I understood your question correctly, but if you want to display XML data in another form you have to transform the XML, or at least include it as a data source in your own designed form.
// Saves the sub-tree at sourceXpath from the form's main data source to a file on disk.
public void SaveXML(string filePath, string sourceXpath)
{
    XPathNavigator myRoot = MainDataSource.CreateNavigator();
    XPathNavigator node = myRoot.SelectSingleNode(sourceXpath, NamespaceManager);
    using (StreamWriter sw = new StreamWriter(filePath))
    {
        // OuterXml already returns the node's markup as a string.
        sw.Write(node.OuterXml);
    }
}
// Usage: make sure you read a bit about XPath first, and you will easily know what to put in the second parameter. You can right-click on any node in the data source tree of your InfoPath form and select "Copy XPath" to copy the string to the clipboard.
SaveXML(_AuditsFolder + "\\" + fname, "/my:AuditForm/my:Audit");
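If you then want to render that saved XML with your own design rather than the InfoPath template, one option (a minimal sketch only, assuming a hypothetical MyTemplate.xslt stylesheet that you write yourself; requires System.Xml.Xsl) is to run it through an XSLT transform:
XslCompiledTransform transform = new XslCompiledTransform();
transform.Load(@"C:\Templates\MyTemplate.xslt");      // your own design (hypothetical path)
transform.Transform(_AuditsFolder + "\\" + fname,     // the form XML saved above
                    @"C:\Output\MyForm.html");         // rendered output with your layout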
We are looking to add Microsoft Reports - SSRS to one of our internal websites.
The database has all the reporting features installed.
The website is using Entity Framework 4 for all data.
I have been able to create a report the old-fashioned way, by creating a DataSet (*.XSD), and this works well.
My question, though: is it possible to utilise the existing Entity Framework in the site for the data required by the reports, rather than having to re-invent the wheel and build a whole DataSet, along with relationships etc.?
It's a website and not an application, so this tutorial (http://weblogs.asp.net/rajbk/archive/2010/05/09/creating-an-asp-net-report-using-visual-studio-2010-part-1.aspx) doesn't seem to apply; I don't see the DataSource (in part 2 of the tutorial).
Update
As a side-note, we would like to steer clear of expensive third-party controls etc.
Also, another way to look at the issue might be to generate the *.XSD from the Entity Framework entity model; is this possible? It's not ideal, though it would get us up and running.
Below is a quick sample of how I set the report data source in one of my .NET WinForms applications.
public void getMyReportData()
{
    using (myEntityDataModel v = new myEntityDataModel())
    {
        var reportQuery = (from r in v.myTable
                           select new
                           {
                               r.ID,
                               r.LeaveApplicationDate,
                               r.EmployeeNumber,
                               r.EmployeeName,
                               r.StartDate,
                               r.EndDate,
                               r.Supervisor,
                               r.Department,
                               r.Col1,
                               r.Col2,
                               // ... any other columns the report needs ...
                               r.Address
                           }).ToList();

        reportViewer1.LocalReport.DataSources.Clear();
        ReportDataSource datasource = new ReportDataSource("nameOfReportDataset", reportQuery);
        reportViewer1.LocalReport.DataSources.Add(datasource);

        // Option 1: load the report definition from an embedded resource.
        Stream rpt = loadEmbeddedReportDefinition("Report1.rdlc");
        reportViewer1.LocalReport.LoadReportDefinition(rpt);
        reportViewer1.RefreshReport();

        // Option 2 (another way of setting the report viewer's report source):
        // point at the .rdlc file on disk.
        string exeFolder = Path.GetDirectoryName(Application.ExecutablePath);
        string reportPath = Path.Combine(exeFolder, @"rdlcReports\Report1.rdlc");
        reportViewer1.LocalReport.ReportPath = reportPath;

        ReportParameter p = new ReportParameter("DeptID", deptID.ToString());
        reportViewer1.LocalReport.SetParameters(new[] { p });
    }
}

public static Stream loadEmbeddedReportDefinition(string reportName)
{
    Assembly _assembly = Assembly.GetExecutingAssembly();
    Stream _reportStream = _assembly.GetManifestResourceStream("ProjectNamespace.rdlcReportsFolder." + reportName);
    return _reportStream;
}
My approach has always been to use RDLC files with object data sources and run them in 'local' mode. These data sources are ... my entities! This way, I'm using all of the same business logic, string formatting, culture awareness, etc. that I use for my web apps. There are some quirks, but I've been able to live with them:
RDLC files don't like to live in web projects. We create a separate dummy winform project and add the RDLC files there.
I don't show reports in a viewer. I let the user download a PDF, Word, or Excel file and choose to save or open in the native viewer. This saves a bunch of headaches, but can put some folks off, depending on requirements. For mobile devices, it's pretty nice.
Since you are not using SSRS, you don't get the nice subscription feature. You'll have to build that yourself, if required. In many ways, though, I prefer this.
However, the benefits are really nice:
I'm using all of the same business logic goodness that I've already written for my views.
I have a custom ReportActionResult and DownloadReport controller method that allows me to essentially run any report via a single URL (sketched after this list). This can be VERY handy. It sure makes a custom subscription component easier.
Report development seems to go pretty quickly now that I only need to adjust entity partial classes to tweak a little something here or there. Also, if I need to shape the data just a bit differently, I have LINQ.
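For what it's worth, a rough sketch of such a download action. This is only an illustration: it assumes ASP.NET MVC and the Microsoft.Reporting.WebForms LocalReport API, and the names OrdersReport.rdlc, OrdersDataSet, and GetOrders are hypothetical.
public ActionResult DownloadReport(int deptId)
{
    var report = new LocalReport();
    report.ReportPath = Server.MapPath("~/Reports/OrdersReport.rdlc");   // hypothetical report path

    // The entities themselves are the object data source.
    report.DataSources.Add(new ReportDataSource("OrdersDataSet", GetOrders(deptId)));

    string mimeType, encoding, extension;
    string[] streams;
    Warning[] warnings;

    // Render locally to PDF; Word and Excel formats are also supported.
    byte[] bytes = report.Render("PDF", null, out mimeType, out encoding,
                                 out extension, out streams, out warnings);

    // Returning a FileResult lets the user save the file or open it in the native viewer.
    return File(bytes, mimeType, "OrdersReport." + extension);
}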
We too use SSRS as "local" reports. We create views in SQL Server, then create that object in our application along with the other EF domain models, and query that object using our DbContext. We use an ASPX page and its code-behind (Page_Load) to get the data passed to the report.
Here is an example of how we query it in the Page_Load Event:
var person = MyDbContext
    .Query<ReportModel>()
    .Where(x => x.PersonId == personId)
    .Where(x => x.Year == year)
    .Select(x => new
    {
        PersonId = x.PersonId,
        Year = x.Year,
        Name = x.Name
    });
var datasource = new ReportDataSource("DataSet1", person.ToList());
if (!Page.IsPostBack)
{
myReport.Visible = true;
myReport.ProcessingMode = ProcessingMode.Local;
myReport.LocalReport.ReportPath = @"Areas\Person\Reports\PersonReport.rdlc";
}
myReport.LocalReport.DataSources.Clear();
myReport.LocalReport.DataSources.Add(datasource);
myReport.LocalReport.Refresh();
The trick is to create a report (.rdlc) with a blank data source connection string, a blank query block, and a blank DataSetInfo (I had to modify the XML manually). They must exist in the file and be blank, as follows:
SomeReport.rdlc (viewing as xml)
...
<DataSources>
<DataSource Name="conx">
<ConnectionProperties>
<DataProvider />
<ConnectString />
</ConnectionProperties>
<rd:DataSourceID>19f59849-cdff-4f18-8611-3c2d78c44269</rd:DataSourceID>
</DataSource>
</DataSources>
...
<Query>
<DataSourceName>conx</DataSourceName>
<CommandText />
<rd:UseGenericDesigner>true</rd:UseGenericDesigner>
</Query>
<rd:DataSetInfo>
<rd:DataSetName>SomeDataSetName</rd:DataSetName>
</rd:DataSetInfo>
Now, in a page event (here the SelectedIndexChanged of a DropDownList), I bind the report data source as follows:
protected void theDropDownList_SelectedIndexChanged(object sender, EventArgs e)
{
if (theDropDownList.SelectedIndex == 0)
return;
var ds = DataTranslator.GetRosterReport(Int64.Parse(theDropDownList.SelectedValue));
_rvReport.LocalReport.ReportPath = "SomePathToThe\\Report.rdlc";
// Clear any data source added by a previous selection before adding the new one.
_rvReport.LocalReport.DataSources.Clear();
_rvReport.LocalReport.DataSources.Add(new ReportDataSource("SomeDataSetName", ds));
_rvReport.Visible = true;
_rvReport.LocalReport.Refresh();
}
You can use a WCF service as a data source and so reuse your application data and logic for your report. I believe this requires at least SQL Server Standard edition, so it's a no-go with the free SQL Express edition.
You can use LINQ with an RDLC report, which is quite easy to do:
LinqNewDataContext db = new LinqNewDataContext();
var query = from c in db.tbl_Temperatures
where c.Device_Id == "Tlog1"
select c;
var datasource = new ReportDataSource("DataSet1", query.ToList());
ReportViewer1.Visible = true;
ReportViewer1.ProcessingMode = ProcessingMode.Local;
ReportViewer1.LocalReport.ReportPath = @"Report6.rdlc";
ReportViewer1.LocalReport.DataSources.Clear();
ReportViewer1.LocalReport.DataSources.Add(datasource);
ReportViewer1.LocalReport.Refresh();
I am using ASP.NET with VB and I have an XML file containing a set of data. I would like to use it in something like a DataList: where you would usually use a database, I would like to use the XML file to produce the information.
Does anyone know how to do this? I have read about transform files, but surely I will format the information in the control?
The file has multiple records, so in some cases I would need to perform queries on the information through the data source.
I would maybe look into XML serialization and de-serialization. Using de-serialization you could read your XML into a List(Of T) containing your own class objects and use that as a data source for your application.
Here's a link that you may find useful:
http://msdn.microsoft.com/en-us/library/ms731073.aspx
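As a rough illustration of the de-serialization idea (shown in C#, but the same types work from VB; the <People>/<Person> layout and the Person class here are made-up examples, not your actual schema):
using System.Collections.Generic;
using System.IO;
using System.Xml.Serialization;

public class Person
{
    public string Name { get; set; }
    public int Age { get; set; }
}

public static class PersonLoader
{
    // Reads <People><Person>...</Person>...</People> into a List<Person>.
    public static List<Person> Load(string path)
    {
        var serializer = new XmlSerializer(typeof(List<Person>), new XmlRootAttribute("People"));
        using (FileStream stream = File.OpenRead(path))
        {
            return (List<Person>)serializer.Deserialize(stream);
        }
    }
}
The resulting list can then be bound to your DataList, and filtered with LINQ where you would otherwise have written a database query.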
Hope this helps.
Dim ds As New DataSet()
ds.ReadXml(MapPath("data.xml"))
First you have to parse the XML and store it in a custom C# object, or you can pass the XML directly to your stored procedure and do the coding there to save it into the DB.
Passing the XML to a stored procedure and manipulating it there is a bit difficult, so what I suggest is to parse it in C# and get a custom object out of it. Once you have that, you can do whatever you want with it.
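For completeness, the stored-procedure route can look roughly like the sketch below. The procedure name SaveProductFeed and its @ProductXml parameter are made up for illustration, connectionString and xmlFilePath stand in for your own values, and the procedure itself would still have to shred the XML in T-SQL.
// Requires System.Data, System.Data.SqlClient and System.IO.
using (var connection = new SqlConnection(connectionString))
using (var command = new SqlCommand("SaveProductFeed", connection))
{
    command.CommandType = CommandType.StoredProcedure;

    // Pass the whole document as a single xml-typed parameter.
    command.Parameters.Add("@ProductXml", SqlDbType.Xml).Value = File.ReadAllText(xmlFilePath);

    connection.Open();
    command.ExecuteNonQuery();
}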
Below is sample code that parses an XML file and generates a custom C# object from it.
public CatSubCatList GenerateCategoryListFromProductFeedXML()
{
    string path = System.Web.HttpContext.Current.Server.MapPath(_xmlFilePath);
    XDocument xDoc = XDocument.Load(path);
    XElement xElement = xDoc.Root; // no need to re-parse the document we just loaded

    List<Category> lstCategory = xElement.Elements("Product").Select(d => new Category
    {
        Code = d.Element("CategoryCode").Value,
        CategoryPath = d.Element("CategoryPath").Value,
        Name = GetCateOrSubCategory(d.Element("CategoryPath").Value, 0),            // category
        SubCategoryName = GetCateOrSubCategory(d.Element("CategoryPath").Value, 1)  // sub-category
    }).GroupBy(x => new { x.Code, x.SubCategoryName }).Select(x => x.First()).ToList();

    CatSubCatList catSubCatList = GetFinalCategoryListFromXML(lstCategory);
    return catSubCatList;
}
Good afternoon, developers.
This is my function to generate the Crystal Report. It takes the report name and report query as input parameters and generates a DataSet. With this DataSet, how can I design my report? Because it is generated at runtime, how can I access it to create my report design?
public void fnLoadDataToReport(string rptName, string rptQuery)
{
try
{
DataSet myDS=new DataSet();
// crReportDocument.Load(Server.MapPath("Reports\" & RptName), OpenReportMethod.OpenReportByTempCopy);
crReportDocument.Load(Server.MapPath("Reports\\" + rptName ));
SqlConnection myConnection=new SqlConnection(ConfigurationManager.ConnectionStrings["mycon"].ConnectionString);
SqlCommand myCommand=new SqlCommand(rptQuery,myConnection);
SqlDataAdapter MyDA=new SqlDataAdapter(myCommand);
MyDA.Fill(myDS,"ReportTable");
crReportDocument.SetDataSource(myDS);
crvReportGeneration.ReportSource=crReportDocument;
crvReportGeneration.ShowFirstPage();
}
catch(Exception ex)
{
Response.Write(ex.Message);
}
}
Any help is appreciated!
All you need is an XML schema definition of the DataSet to be able to design the report. This can be achieved with the following code.
private void WriteSchemaToFile(DataSet thisDataSet)
{
    // Set the file path and name. Modify this for your purposes.
    string filename = "mySchema.xml";

    // Write the schema to the file.
    thisDataSet.WriteXmlSchema(filename);
}
Now add the file created above to your project.
While designing the report, open the Database Expert, select the XML file under Project, and add it (just like you add tables or views).
Now you are good to go. The rest of the steps are the same.
Just make sure the name of the XSD schema and the table name in your DataSet match exactly.
Why are you using a DataSet as the source for the report? If you use a stored procedure as the source instead, it will solve your problem and maintenance will be much easier.
I have a lot of XSL files in my ASP.NET web app. A lot. I generate a bunch of AJAX HTML responses using this kind of generic transform method:
public void Transform(XmlDocument xml, string xslPath)
{
...
XslTransform myXslTrans = new XslTransform();
myXslTrans.Load(xslPath);
myXslTrans.Transform(xml,null, HttpContext.Current.Response.Output);
}
I'd like to move the XSL definitions into SQL Server, using a column of type xml.
I would store an entire XSL file in a single row in SQL, and each XSL is self-contained (no imports). I would read out the XSL definition from SQL into my XslTransform object.
Something like this:
public void Transform(XmlDocument xml, string xslKey)
{
...
SqlCommand cmd = new SqlCommand("GetXslDefinition");
cmd.Parameters.Add("@xslKey", SqlDbType.VarChar).Value = xslKey;
// where the result set has a single column of XSL: "<xslt:stylesheet>..."
...
SqlDataReader dr = cmd.ExecuteReader();
if(dr.Read()) {
SqlXml xsl = dr.GetSqlXml(0);
XslTransform myXslTrans = new XslTransform();
myXslTrans.Load(xsl.CreateReader());
myXslTrans.Transform(xml,null, HttpContext.Current.Response.Output);
}
}
It seems like a straightforward way to:
add metadata to each XSL, like lastUsed, useCount, etc.
bulk update/search capabilities
prevent lots of disk access
avoid referencing relative paths and organizing files
allow XSL changes without redeploying (I could even write an admin page that selects/updates the XSL in the database)
Has anyone tried this before? Are there any caveats?
EDIT
Caveats that responders have listed:
disk access isn't guaranteed to diminish
this will break xsl:includes
The two big issues I can see are:
We use a lot of includes to ensure that we only do things once; storing the XSLT in the database would stop us from doing that.
It makes updating XSLs more interesting: we've been quite happy to drop new .xsl files into deployed sites without doing a full update of the site. For that matter, we've got bits of code that look for client-specific XSL in a folder, and those bits of code can reach back up to common code (templates) in the root. So I'm not sure about the redeploy point at all, but this will depend very much on the particular use case; yours is certainly different from ours.
In terms of disk access, hmm... the DB still has to go to disk to pull the data, and if you're talking about caching, the DB isn't a requirement for enabling caching (see the sketch below).
Have to agree about the update/search options - you can do stuff with Powershell but that needs to be run on the server and that's not always a good idea.
Technically I can see no reason why not (excepting the wish to do includes as above) but practically it seems to be fairly balanced with good arguments either way.
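On the caching point, a minimal sketch (on .NET 4+) of keeping compiled transforms in memory so neither the disk nor the database is hit on every request; GetXslFromDatabase is a hypothetical helper that runs your GetXslDefinition query and returns an XmlReader over the stored stylesheet.
// Requires System.Collections.Concurrent, System.Xml and System.Xml.Xsl.
private static readonly ConcurrentDictionary<string, XslCompiledTransform> _xslCache =
    new ConcurrentDictionary<string, XslCompiledTransform>();

public void Transform(XmlDocument xml, string xslKey)
{
    // Compile each stylesheet once and reuse it; the database is only hit on a cache miss.
    XslCompiledTransform transform = _xslCache.GetOrAdd(xslKey, key =>
    {
        var xslt = new XslCompiledTransform();
        using (XmlReader reader = GetXslFromDatabase(key))   // hypothetical DB lookup
        {
            xslt.Load(reader);
        }
        return xslt;
    });

    transform.Transform(xml, null, HttpContext.Current.Response.Output);
}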
I store XSLTs in a database in my application, dbscript. (However, I keep them in an NVARCHAR column, since it also runs on SQL Server 2000.)
Since users are able to edit their XSLTs, I needed to write a custom validator which loads the text of the TextBox into a .NET XslCompiledTransform object like this:
args.IsValid = true;
if (args.Value.Trim() == "")
    return;

try
{
    using (System.IO.TextReader rd = new System.IO.StringReader(args.Value))
    using (System.Xml.XmlReader xrd = System.Xml.XmlReader.Create(rd))
    {
        System.Xml.Xsl.XslCompiledTransform xslt = new System.Xml.Xsl.XslCompiledTransform();
        System.Xml.Xsl.XsltSettings xslts = new System.Xml.Xsl.XsltSettings(false, false);
        xslt.Load(xrd, xslts, new System.Xml.XmlUrlResolver());
    }
}
catch (Exception ex)
{
    // Note: the parentheses matter here; otherwise ex.Message gets swallowed by the ternary.
    this.ErrorMessage = (string.IsNullOrEmpty(sErrorMessage) ? "" : sErrorMessage + "<br/>") +
        ex.Message;
    if (ex.InnerException != null)
    {
        ex = ex.InnerException;
        this.ErrorMessage += "<br />" + ex.Message;
    }
    args.IsValid = false;
}
As for your points:
file I/O will be replaced by database-generated disk I/O, so no gains there
deployment changes to providing an INSERT/UPDATE script containing the new data
I am looking for a solution or recommendation to a problem I am having. I have a bunch of ASPX pages that will be localized and have a bunch of text that needs to be supported in 6 languages.
The people doing the translation will not have access to Visual Studio, and the easiest tool for them is likely Excel. If we use Excel, or even export to CSV, we need to be able to import the results into .resx files. So, what is the best method for this?
I am already aware of the question Convert a Visual Studio resource file to a text file? and of the Resx Editor, but an easier solution would be preferred.
I'm not sure how comprehensive an answer you're looking for, but if you're really just using [string, string] pairs for your localization, and you're just looking for a quick way to load resource (.resx) files with the results of your translations, then the following will work as a fairly quick, low-tech solution.
The thing to remember is that .resx files are just XML documents, so it should be possible to manually load your data into the resource from an external piece of code. The following example worked for me in VS2005 and VS2008:
using System.Collections.Generic;
using System.Xml;

namespace SampleResourceImport
{
    class Program
    {
        static void Main(string[] args)
        {
            XmlDocument doc = new XmlDocument();
            string filePath = @"[file path to your resx file]";
            doc.Load(filePath);

            XmlElement root = doc.DocumentElement;
            XmlElement datum = null;
            XmlElement value = null;
            XmlAttribute datumName = null;

            // The following mocks the actual retrieval of your localized text
            // from a CSV or ?? document...
            // CSV parsers are common enough that it shouldn't be too difficult
            // to find one if that's the direction you go.
            Dictionary<string, string> d = new Dictionary<string, string>();
            d.Add("Label1", "First Name");
            d.Add("Label2", "Last Name");
            d.Add("Label3", "Date of Birth");

            foreach (KeyValuePair<string, string> pair in d)
            {
                datum = doc.CreateElement("data");

                datumName = doc.CreateAttribute("name");
                datumName.Value = pair.Key;

                // Each <data> element needs its own xml:space="preserve" attribute node.
                XmlAttribute datumSpace = doc.CreateAttribute("xml:space");
                datumSpace.Value = "preserve";

                value = doc.CreateElement("value");
                value.InnerText = pair.Value;

                datum.Attributes.Append(datumName);
                datum.Attributes.Append(datumSpace);
                datum.AppendChild(value);
                root.AppendChild(datum);
            }

            doc.Save(filePath);
        }
    }
}
Obviously, the preceding method won't generate the code-behind for your resource; however, opening the resource file in Visual Studio and toggling the accessibility modifier for the resource will (re)generate the static properties for you.
If you're looking for a completely XML-based solution (vs. CSV or Excel interop), you could also instruct your translators to store their translated content in Excel, saved as XML, then use XPath to retrieve your localization info. The only caveat is that the file sizes tend to get pretty bloated.
Best of luck.
I ran into a similar problem and realized that the simplest way to create a .resx file from an Excel file is to use Excel's CONCATENATE function to generate the <data>..</data> nodes for the .resx file, and then manually copy the generated rows into the .resx file in any text editor. So let's say that you have the name in column A of an Excel document and the value in column B. Using the following formula in column C
=CONCATENATE("<data name=","""",A14,""" xml:space=""preserve"">","<value>", B14, "</value>", "</data>")
you will get the data node for the resource. You can then copy this formula to all the rows and then copy the contents of column C into your .resx file.
If it's in CSV, here's a quick Ruby script to generate the data elements.
require 'csv'
require 'builder'
file = ARGV[0]
builder = Builder::XmlMarkup.new(:indent => 2)
CSV.foreach(file) do |row|
builder.data(:name => row[0], "xml:space" => :preserve) {|d| d.value(row[1]) }
end
File.open(file + ".xml", 'w') { |f| f.write(builder.target!) }