I’m creating an application that has a data dependency on another group’s data feed. They can give me a daily XML dump of the data that I can simply load into cache once a day, OR I can make calls to a web service to get the data that way. If the data provider doesn't care which I use (it's the same work for them either way), which should I ask for, and why? This is an ASP.NET web site, and I'd be loading the data into the ASP.NET cache.
This depends entirely on the API exposed by their web service. If you are in a situation where you will need all the information, all at once, once a day, and there is no need to ever ask for anything more, then a simple XML reader may be all that you need.
On the other hand, the much more extensible solution is to hook into their web service, because then you can customize what kind of information you are gathering. If your requirements could change during the course of a day, or you only ever need a subset of the information at any given time, then going through the web service may be better.
Ultimately, the better option depends on the business requirements. Would their web service be able to give you information that is formatted in a more useful way, and filtered in a relevant way? If so, you should go that route. If all you need is ALL the data, then XML might be simpler.
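If you do go with the daily XML dump, here's a minimal sketch of loading it into the ASP.NET cache the question mentions; the file path and cache key are assumptions, not anything from the question:

```csharp
using System.Web;
using System.Web.Caching;
using System.Web.Hosting;
using System.Xml;

public static class FeedCache
{
    private const string CacheKey = "DailyFeed"; // placeholder name

    public static XmlDocument GetFeed()
    {
        var doc = HttpRuntime.Cache[CacheKey] as XmlDocument;
        if (doc == null)
        {
            string path = HostingEnvironment.MapPath("~/App_Data/feed.xml");
            doc = new XmlDocument();
            doc.Load(path);

            // The CacheDependency evicts the entry as soon as a new dump
            // is dropped, so you never serve data older than the daily cycle.
            HttpRuntime.Cache.Insert(CacheKey, doc, new CacheDependency(path));
        }
        return doc;
    }
}
```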
For this I'd choose to get an Atom feed, which gives you the benefits of HTTP for caching and updating, and is also valid XML. That way you don't have to process more data than you actually need. XML dumps can be terrible for updates, analysis, and implementation, while web services are terrible in general because you need to grok the WS-* stack of standards until your requirements are met (and possibly pollute your development environment with tools you rarely need). Alternatively, I'd ask for a basic REST interface into the data, say one URL per day of updates, or whatever you think you need.
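To illustrate the HTTP caching benefit, here's a minimal sketch of polling the feed with a conditional GET; the feed URL and the ProcessFeed handler are hypothetical:

```csharp
using System;
using System.IO;
using System.Net;

public class FeedPoller
{
    // Hypothetical feed URL; the Last-Modified validator is kept between polls.
    private const string FeedUrl = "http://provider.example.com/updates.atom";
    private DateTime _lastModifiedUtc = DateTime.MinValue;

    public void Poll()
    {
        var request = (HttpWebRequest)WebRequest.Create(FeedUrl);
        if (_lastModifiedUtc > DateTime.MinValue)
            request.IfModifiedSince = _lastModifiedUtc; // makes this a conditional GET

        try
        {
            using (var response = (HttpWebResponse)request.GetResponse())
            {
                _lastModifiedUtc = response.LastModified;
                ProcessFeed(response.GetResponseStream()); // hypothetical parser
            }
        }
        catch (WebException ex)
        {
            var http = ex.Response as HttpWebResponse;
            if (http == null || http.StatusCode != HttpStatusCode.NotModified)
                throw;
            // 304 Not Modified: the copy we already have is still current.
        }
    }

    private void ProcessFeed(Stream body)
    {
        // Parse the Atom entries here.
    }
}
```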
Related
I need to access some data on my asp.net website. The data relates to around 50 loan providers.
I could simply build it into the web page at the moment; however, I know that I will need to re-use it soon, so it's probably better to make it more accessible.
The data will probably only change once in a while, maybe once a month at most. I was looking at the best method of storing the data (database or XML file), and then how to persist it in my site (the cache, perhaps).
I have very little experience so would appreciate any advice.
It's hard to beat a database, and by placing it there, you could easily access it from anywhere you wanted to reuse it. Depending on how you get the updates and what DBMS you are using, you could use something like SSIS (for MS SQL Server) to automate updating the data.
ASP.NET also has a robust API for interacting with a database and using it as a data source for many of its UI controls.
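As a sketch of how that might look given data that changes roughly monthly, you could read it once and park it in the cache; the table and column names here are made up:

```csharp
using System;
using System.Data;
using System.Data.SqlClient;
using System.Web;
using System.Web.Caching;

public static class LoanProviderRepository
{
    private const string CacheKey = "LoanProviders"; // placeholder

    public static DataTable GetProviders(string connectionString)
    {
        var table = HttpRuntime.Cache[CacheKey] as DataTable;
        if (table == null)
        {
            table = new DataTable();
            using (var connection = new SqlConnection(connectionString))
            using (var adapter = new SqlDataAdapter(
                "SELECT Id, Name, Rate FROM LoanProvider", connection)) // made-up schema
            {
                adapter.Fill(table);
            }

            // The data changes about once a month, so a 24-hour absolute
            // expiration keeps database round-trips rare without going stale.
            HttpRuntime.Cache.Insert(CacheKey, table, null,
                DateTime.UtcNow.AddHours(24), Cache.NoSlidingExpiration);
        }
        return table;
    }
}
```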
Relational databases are tools for storing data when access to the data needs to be carefully controlled to ensure that it is atomic, consistent, isolated, and durable (ACID). To accomplish this, databases include significant additional infrastructure overhead and processing logic. If you don't need this overhead, why subject your system to it? There is a broad range of other data storage options at your disposal that might be more appropriate, and they should at least be considered in your decision process.
Using ASP.NET, you have access to several other options, including text files, custom configuration files (stored as XML), custom XML, and .NET classes serialized to binary or XML files. The fact that your data changes so infrequently may make one of these options more appropriate. Using one of them also reduces system coupling: functions dependent on this data are no longer dependent on the existence of a functioning database.
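As a sketch of the serialized-classes option (the LoanProvider shape is an assumption, not something from the question):

```csharp
using System.Collections.Generic;
using System.IO;
using System.Xml.Serialization;

// Hypothetical shape of one loan-provider record.
public class LoanProvider
{
    public string Name { get; set; }
    public decimal Rate { get; set; }
}

public static class LoanProviderStore
{
    public static void Save(List<LoanProvider> providers, string path)
    {
        var serializer = new XmlSerializer(typeof(List<LoanProvider>));
        using (FileStream stream = File.Create(path))
        {
            serializer.Serialize(stream, providers);
        }
    }

    public static List<LoanProvider> Load(string path)
    {
        var serializer = new XmlSerializer(typeof(List<LoanProvider>));
        using (FileStream stream = File.OpenRead(path))
        {
            return (List<LoanProvider>)serializer.Deserialize(stream);
        }
    }
}
```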
I am implementing an IIS-hosted WCF web service to accept leads from third parties. There are plenty of operations that may happen before and after saving the information, so I am thinking of implementing this as a plug-in-based architecture.
Examples of pre-save operations are:
duplicate checking before saving
making sure the information is valid (not Mickey Mouse data)
Post-save operations are:
zip-code-based routing to the correct warehouse
lead assignment.
I have been reading about MEF, but I have been unable to decide whether implementing MEF is actually worth it here, since loading and unloading plugins on every call will likely increase the overhead. Is there a way to just load all your plugins in some magic Application_Start?
I agree with Steven that you don't need a plugin architecture for this. You are good to go with just properly designed services. There are some very good hints about this in Steven's blog post - Writing Highly Maintainable WCF Services.
Nonetheless, to answer the second part of the question: there is nothing stopping you from initializing the MEF composition container in your Application_Start and storing it statically (other than that it introduces global state, which is often a bad design decision). It would then be shared across requests, and you could use it to compose parts as needed without the overhead of repeated export discovery.
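A minimal sketch of that approach, assuming plugins live in a folder under bin and expose a hypothetical ILeadValidator contract:

```csharp
using System.ComponentModel.Composition.Hosting;
using System.Web;
using System.Web.Hosting;

public class Global : HttpApplication
{
    // Composed once at startup and shared by every request. Note this is
    // exactly the global state mentioned above, with the trade-offs that implies.
    public static CompositionContainer Container { get; private set; }

    protected void Application_Start()
    {
        // Hypothetical plugin folder; the directory is scanned once here
        // rather than on every service call.
        var catalog = new DirectoryCatalog(HostingEnvironment.MapPath("~/bin/Plugins"));
        Container = new CompositionContainer(catalog);
    }
}
```

A service operation can then ask the shared container for its parts, e.g. `Global.Container.GetExportedValues<ILeadValidator>()`, where `ILeadValidator` is a hypothetical pre-save plugin contract.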
We’re currently evaluating development with Sitecore 6 for a project. The client already bought it, so using another CMS isn't an option. The proposed setup would have Sitecore as our site’s content data provider, consumed by a site built in ASP.NET MVC 3. We’d use Sitecore’s libraries to retrieve data from the Sitecore database on the server side.
In some cases, we also may want to consume content data via client-side AJAX calls. I’ve been working on prototypes for this to see what data I can get back from a custom proxy service. This service calls GetOuterXml on the item, converts the XML to JSON, and sends the JSON back to the calling script. So far, I’m finding this method limiting, as it appears GetOuterXml only returns fields and values for fields that were set on the specific item, ignoring, for example, the template’s standard-value fields and their default values. I tried Item.Fields.ReadAll(), which still wouldn’t return the standard values. Also, there are circular references in the Item graph (item.Fields[0].Item.Fields[0]...), which has made serialization quite difficult without writing something totally custom.
Needless to say, I've been running into many roadblocks on my path down this particular road and am definitely leaning toward doing things the Sitecore way. However, my team really wants to use MVC for this project, so before I push back on this, I feel it's my responsibility to do some due diligence and reach out to the community to see if anyone else has tried this.
So my question is: as a Sitecore developer, have you ever used Sitecore purely as a content data provider on the client side and/or server side? If you have, did you encounter similar issues, and were you able to resolve them? I know that by using Sitecore in this way you lose a lot of features, such as content routing/aliasing, OMS, and the rendering and layout engine, among others. I’m not saying we’re definitely going down this path; we’re just at the R&D phase of using Sitecore and determining how it would best be utilized by our team and our development practices. Any constructive input is greatly appreciated.
Cheers,
Frank
I don't have experience with trying to use Sitecore solely as a data provider, but my first reaction to what you're suggesting is DON'T!
Sitecore offers extremely rich functionality which is directly integrated into ASP.NET and configured from within the Sitecore UI. Stripping that off and rebuilding it in MVC is not so much reinventing the wheel as reinventing the car.
I think that in 6.4 you can use some MVC alongside Sitecore, so you may be able to provide a sop to your colleagues with that.
Let me start off by stating that I am a novice developer, so please excuse the elementary nature of my question(s).
I am currently working on a Flex application and am getting more and more confused about when to use server-side scripting and when to develop web services. For most of the functionality I am working on, I take various files from the user (client), upload them to the server for processing/conversion, then send them back to the client in the new format.
I am accomplishing most of this using ASP.NET generic handlers (.ashx files), but I'm not very confident this is best practice. At the same time, would building web services make any more sense? What would be considered best practice for this? Any suggestions would be greatly appreciated.
The way I look at it is as follows:
Web Services mean Established Best Practice.
For most of our development, we don't need to create "Web Services", or what I think of when I think REST, SOAP, and the Twitter API. You only need to start doing that once you've got something you're going to be using every day for years.
Clean and DRY Code Will Lead You to Creating a Web Service
If you spend the time to clearly define the parts of your upload-process-render architecture, and you find that it can be applied to almost everything you are doing, then all you need to do to make it a web service is define a clear, 1-2-3 set of rules for using the system (GET/POST data, etc.). As long as you are consciously building an architecture the whole way, you'll end up creating a web service if it's worthy. Otherwise there's no need.
It sounds like you have a clear workflow going; I don't know anything about ASP.NET, though.
As far as it being confusing sometimes, and best practices, I suggest the following:
Create a Flex Library Project for your "generic ashx file handling" Flex classes. Give it a cool, simple name.
Create a .NET library project that encapsulates all the logic for your server-side file processing. Host it online and make it open source; I recommend GitHub. Test it as you go, and document it, its purpose, and the theory behind it.
If you don't have to do any more work at this point, and it's just plug and chug, then you've probably arrived at something that might become a web service, though that's probably a few years down the road.
I don't think you should try to create a web service right off the bat. Just make some clean and reusable code, make a few examples, get it online and open source, have others contribute and give feedback, and if it solves a specific problem, then make it a web service. You can probably just use REST for now and build your system around that. RestfulX is a great library for that.
Best,
Lance
Making web services without any sense makes no sense ;)
Now, in the world of Flex/AS3 with Flash Player 10, you can easily read local files, modify them with whatever algorithm you like, and save them locally without pinging the server.
You only have to use web services if you want to get some data from the server or send some data to it. That's all.
RSTanvir
Flash / Flex uses a simple HTTP POST approach for file uploads, so trying to do that using SOAP web services will be problematic. Your approach of using ASHX here sounds reasonable to me.
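For reference, a minimal sketch of such a handler; "Filedata" is the default form-field name used by Flash's FileReference.upload, and the save folder is a placeholder:

```csharp
using System.IO;
using System.Web;

public class UploadHandler : IHttpHandler
{
    public void ProcessRequest(HttpContext context)
    {
        HttpPostedFile upload = context.Request.Files["Filedata"];
        if (upload == null || upload.ContentLength == 0)
        {
            context.Response.StatusCode = 400; // nothing was posted
            return;
        }

        // Strip any client-supplied path before saving.
        string fileName = Path.GetFileName(upload.FileName);
        upload.SaveAs(context.Server.MapPath("~/App_Data/Uploads/" + fileName));

        context.Response.ContentType = "text/plain";
        context.Response.Write("OK: " + fileName);
    }

    public bool IsReusable { get { return true; } }
}
```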
To send / receive data that isn't file based (e.g. a list of files the user has uploaded previously), I would recommend looking at the open source Fluorine FX library. Fluorine uses AMF which is a highly performant way of doing data transfer with Flash. It's also purely configuration-based, which means you don't need to code against any of its APIs, just configure Fluorine to expose your .NET service classes. You could easily add attributes to those same classes to expose them as SOAP web services via WCF if you need that in the future. I would not recommend using SOAP with Flex however, due to the performance losses and also because the Flex implementation of SOAP has a history of bugs and interoperability problems.
I am in the early stages of planning a conversion of a large classic ASP database application to ASP.Net and I'm having trouble picking out which data access method to use. I have played around with Linq To SQL, Dynamic Data, strongly typed datasets, Enterprise Library (Data Access Application Blocks), and a tiny bit with Entity Framework, but none of them have jumped out to me as "the one". There are just too many choices - my head is swimming, help me choose!
Perhaps it would help to give some background on the application that I am converting along with the priorities...
The back end is Microsoft SQL Server (2005 or later) and we are committed to that, so I don't need to worry about ever supporting a different database platform.
The database is very mature and contains a great deal of the business logic. It is highly normalized and makes extensive use of stored procedures, triggers, and views. I would rather not reinvent two wheels at the same time, so I'd like to make as few changes to the database as possible. So, I need to choose a data access method that is flexible enough to let me work around any quirks in the database.
The application has many data entry forms and extensive searching and reporting capabilities (reports are another beast which I will tackle later).
The application needs to be flexible enough to deal with minor changes to the database structure. The application (and database) may be installed at different sites where minor custom modifications are made to the database. Ideally the application could identify the database extensions and react appropriately. In other words, if I need to store an O/R mapping in the application, I need to be able to swap that out (or refresh it easily) when installing the application and database at a new site.
Rapid application development is critical. Since the database is already done and the user interface is going to closely match the existing application, I'm hoping to find something we can crank out fairly quickly. I am willing to forgo the absolute latest and greatest technology if it will save time in development. In other words, if there is a steep learning curve to something like Entity Framework, I'm fine with going with something like strongly typed DataSets and a custom DAL if it will speed up the process.
I am a total newbie to ASP.Net but am intimately familiar with Classic ASP, T-SQL and the old ADO (e.g. disconnected recordsets). If any of the data access methods is better suited for someone coming from my background, I might lean in that direction.
Thanks for any advice that you can offer!
Look at all three articles in this series:
High Performance Data Access Layer Architecture Part 1
Great advice.
You may want to look at decoupling the database layer from the ASP layer; that not only gives you more flexibility in making the decision, but when you have to make changes to a customer's database you can just swap in a new DLL without changing anything else.
By using dependency injection you can use xml to tell the framework which concrete class to use for an interface.
The advantage of doing this is that you can go with one database approach now, and if you later decide to change to another, you can just change the DLL without making any changes to the other layers.
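A minimal sketch of that idea, with a hypothetical repository interface and the concrete type picked from web.config:

```csharp
using System;
using System.Configuration;

// Hypothetical abstraction over the data layer.
public interface ICustomerRepository
{
    string GetCustomerName(int id);
}

public static class RepositoryFactory
{
    // web.config decides which concrete class is used, e.g.:
    //   <appSettings>
    //     <add key="customerRepository"
    //          value="MyApp.Data.LinqCustomerRepository, MyApp.Data" />
    //   </appSettings>
    public static ICustomerRepository CreateCustomerRepository()
    {
        string typeName = ConfigurationManager.AppSettings["customerRepository"];
        Type type = Type.GetType(typeName, true); // throws if the type can't be found
        return (ICustomerRepository)Activator.CreateInstance(type);
    }
}
```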
Since you are more familiar with it, why not just go directly to the database for the moment by making your own connections? Then you can port the rest of your code, and along the way you can decide which of the myriad technologies to use.
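A minimal sketch of that direct approach, which should feel familiar coming from classic ADO; the stored procedure and parameter names are placeholders:

```csharp
using System.Data;
using System.Data.SqlClient;

public static class CustomerData
{
    // The procedure and parameter are stand-ins for whatever the existing,
    // heavily stored-procedure-based database already exposes.
    public static DataTable GetCustomersByRegion(string connectionString, int regionId)
    {
        using (var connection = new SqlConnection(connectionString))
        using (var command = new SqlCommand("dbo.GetCustomersByRegion", connection))
        {
            command.CommandType = CommandType.StoredProcedure;
            command.Parameters.AddWithValue("@RegionId", regionId);

            // A DataTable is the closest modern analogue to the old
            // disconnected ADO recordset.
            var table = new DataTable();
            using (var adapter = new SqlDataAdapter(command))
            {
                adapter.Fill(table);
            }
            return table;
        }
    }
}
```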
For a new application I am working on, I am starting with LINQ to SQL, mainly because development will be quicker; later, if I decide it won't meet my needs, I will just swap it out.
NHibernate might be a good fit. You can store the mapping in external configuration files, which would address your per-site customization needs. Another option might be using ActiveRecord, which is built on top of NHibernate.
NHibernate has a neat feature which you might find helpful: the dynamic component, which is basically a name/value pair collection populated by pulling the column names from the mapping file. So when you add a column at your client site, you update the mapping file and you'll be able to access the data through a collection on the object.
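A hypothetical hbm.xml fragment to show the shape of it; the class and column names are made up:

```xml
<class name="MyApp.Customer, MyApp" table="Customer">
  <id name="Id">
    <generator class="native" />
  </id>
  <dynamic-component name="Extensions">
    <!-- Add client-site columns here; only the mapping file changes. -->
    <property name="Region" column="Region" type="String" />
  </dynamic-component>
</class>
```

On the C# side, `Extensions` is mapped as a plain `IDictionary`, so the client-specific column is read with `customer.Extensions["Region"]` without recompiling the application.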