I want to consume a .NET web service that will accept an SQL statement, for example: select * from my_table order by name, return the resulting dataset to my Delphi ClientDataSet / disconnected TADODataSet, and display the result in a TDBGrid.
Part 2) After I update a single record, I want to be able to update the .NET dataset via the web service.
How can I do that? (Code Please)
1) .NET datasets use XML to transfer their data, so you can read them as XML and then convert them to a Delphi dataset. Look at these articles:
Use ADO.NET Datasets in Delphi
Working with .NET data in Delphi
2) As I understand it, you will be using web services, so it will be better to add an update method to your service and call it to update the data.
One note: IMO, sending raw SQL to web services the way you would like to is a bad design. I would prefer that you define your business logic as a group of methods, then call them as your application needs.
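As a rough illustration of that advice, here is a minimal ASMX-style sketch (the class name, table, columns, and connection string are assumptions, not taken from the question) that exposes fixed business methods instead of accepting raw SQL; the DataSet is serialized as XML that a Delphi client can load:

using System.Data;
using System.Data.SqlClient;
using System.Web.Services;

[WebService(Namespace = "http://tempuri.org/")]
public class CustomerService : WebService
{
    // Placeholder connection string - adjust for your environment.
    const string ConnStr = "Data Source=.;Initial Catalog=MyDb;Integrated Security=True";

    // Returns a DataSet; the runtime serializes it as XML for the Delphi client.
    [WebMethod]
    public DataSet GetCustomers()
    {
        var ds = new DataSet();
        using (var da = new SqlDataAdapter("SELECT id, name FROM my_table ORDER BY name", ConnStr))
        {
            da.Fill(ds, "Customers");
        }
        return ds;
    }

    // Update method the client calls after editing a single record.
    [WebMethod]
    public void UpdateCustomer(int id, string name)
    {
        using (var cn = new SqlConnection(ConnStr))
        using (var cmd = new SqlCommand("UPDATE my_table SET name = @name WHERE id = @id", cn))
        {
            cmd.Parameters.AddWithValue("@id", id);
            cmd.Parameters.AddWithValue("@name", name);
            cn.Open();
            cmd.ExecuteNonQuery();
        }
    }
}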
Also, you can use Delphi Prism for easier and better .NET integration.
Right now I'm converting a WCF web service result to a DataTable (we know this can get messy), then planning to convert it to a SQL DB type. I was thinking: can't I just consume the WCF service as a SQL DB type? I've searched and couldn't find a solution to this. What I'm planning to do is send the DataTable over as a SQL data type.
There is this approach: http://sharpfellows.com/post/Returning-a-DataTable-over-SqlContextPipe. However, that's a 2006 article and I'd like to skip the .NET DataTable.
A code example of how to read WCF data as a SQL data type would be much appreciated, thanks!
A DataReader is a class that keeps a connection open with the database, so you cannot consume a service that returns a DataReader.
And returning a DataTable is an equally bad idea; you should read the data and return a class containing just the data.
Maybe a good solution for your scenario could be WCF Data Services, which do what you want: data access.
You can read more here: https://msdn.microsoft.com/en-us/library/cc668794(v=vs.110).aspx
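As a rough sketch of the "class with just the data" advice (the type and member names below are assumptions, not from the question), the service would return a plain data-contract class instead of a DataTable or DataReader:

using System.Collections.Generic;
using System.Runtime.Serialization;
using System.ServiceModel;

// Plain, data-only class that serializes cleanly over the wire.
[DataContract]
public class CustomerDto
{
    [DataMember] public int Id { get; set; }
    [DataMember] public string Name { get; set; }
}

[ServiceContract]
public interface ICustomerService
{
    // Returns finished data; no open connection or DataReader crosses the wire.
    [OperationContract]
    List<CustomerDto> GetCustomers();
}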
I am already using the standard WebAPI and returning JSON objects to my client. Now I saw an application that returned OData.
Can someone explain if there is any reason for me to use OData if I do not want to query my data from anything other than my own client running in the browser? Are there advantages that I could get by using OData?
If you are only using your data in your own browser application, there are only a few advantages to using OData in your situation:
OData is able to provide metadata about your service interface that can be used to generate client code to access the service. So if you have lots of client classes that you need to create, this could speed up your process. On the other hand, if you can share your classes between the server and an ASP.NET based client or if you only have a few classes, this might not be relevant in your situation.
Another - bigger - advantage in your situation is the support for generic queries against the service data. OData supports IQueryable so that you can decide on the client side on how to filter the data that the service provides. So you do not have to implement various actions or use query parameters to provide filtered data. This also means that if you need a new filter for your client, it is very likely that you do not have to change the server and can just put up the query on the client side. Possible filters include $filter expressions to filter the data, but also operations like $skip and $top that are useful when paging data. For details on OData and queries, see this link.
For a complete overview about OData and Web API see this link.
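A minimal sketch of the second point (the controller, model, and in-memory data source are assumptions; the exact attribute name and namespace depend on the Web API OData package version):

using System.Collections.Generic;
using System.Linq;
using System.Web.Http;
using System.Web.Http.OData;   // OData query support; older packages use [Queryable], newer ones [EnableQuery]

public class Product
{
    public int Id { get; set; }
    public string Name { get; set; }
    public decimal Price { get; set; }
}

public class ProductsController : ApiController
{
    // Stand-in for an EF DbSet or any other IQueryable source.
    private static readonly IQueryable<Product> Products = new List<Product>().AsQueryable();

    // Returning IQueryable and marking the action queryable lets the framework
    // apply $filter, $orderby, $skip and $top from the request URL.
    [Queryable]
    public IQueryable<Product> GetProducts()
    {
        return Products;
    }
}

The client can then issue requests like /api/products?$filter=Price gt 10&$top=5 without any change on the server.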
Here are a few advantages of OData.
OData is an open protocol started by Microsoft. It is based on REST services, so we can get data based on a URL.
It supports various protocols like HTTP, Atom, and AtomPub, and also supports the JSON format.
No need to create proxy classes, which we used to do with web services.
You will be able to write your own custom methods.
It is very lightweight, so the interaction between client and server will be fast compared to web services and other technologies.
Very simple to use.
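For instance, a minimal sketch of the "no proxy classes" point (the service URL, entity set, and filter below are made up for illustration): the client just requests the feed as JSON with the query expressed in the URL.

using System;
using System.Net;

class Program
{
    static void Main()
    {
        using (var client = new WebClient())
        {
            // Ask for JSON and push the filtering into the URL itself;
            // no generated proxy classes are needed on the client.
            client.Headers[HttpRequestHeader.Accept] = "application/json";
            string url = "http://example.com/odata/Customers?$filter=Country eq 'US'&$top=10";
            Console.WriteLine(client.DownloadString(url));
        }
    }
}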
Here are a few reference links.
http://sandippatilprogrammer.wordpress.com/2013/12/03/what-is-odata-advantages-and-disadvantages/
http://geekswithblogs.net/venknar/archive/2010/07/08/introduction-odata.aspx
http://www.zdnet.com/blog/microsoft/why-microsofts-open-data-protocol-matters/12700
I agree with the answers already posted, but as an additional insight...
You mentioned that:
... if I do not want to query my data from anything other than my own
client running in the browser...
You may not wish to run it normally through anything but your own client, but using OData you could use other querying tools for debugging. For example, LINQPad allows you to use OData endpoints (such as the one provided by Stack Overflow).
It's probably not a good enough reason to implement OData if you don't have another reason to do so, but it's an added bonus.
I'm evaluating some technologies for a new web application, which should use EF5 and Knockout JS with Web API. I wanted to take advantage of the OData feature when returning IQueryable, but am currently running into the problem of how to convert my EF models to my business models.
As far as I've read, if I want to have a more complex DB (computed columns, stored procedures, ...), I should use the DB-First approach. (Correct me if I'm wrong.)
Because I need to use the DB-First approach and want my models to be independent of the DB, I need to create them in addition to the EF models. And when I return my business model from the data layer as IQueryable, I lose the possibility of executing additional queries directly on the DB; instead they are executed directly on the ASP.NET server.
Of course I don't plan to run complex queries over OData and would anyway implement those as additional actions, but it might be useful on the slower clients (smartphones, ...) to limit the returned data and perform additional filters directly on the server.
Is there any way out of this dilemma, so that I can still use OData?
Regards
Peter
You can try using Code First and EF Migrations to create/upgrade the database. With migrations you can create custom migrations that can be just SQL scripts, to achieve what can't be done automatically with Code First itself. The Database First approach is fine as well.
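For example, a minimal sketch of such a custom migration (the migration name, table, computed column and stored procedure are invented for illustration):

using System.Data.Entity.Migrations;

// Custom migration: things Code First can't generate automatically
// (computed columns, stored procedures, ...) can be done with raw SQL.
public partial class AddComputedFullName : DbMigration
{
    public override void Up()
    {
        Sql("ALTER TABLE dbo.Customers ADD FullName AS (FirstName + ' ' + LastName)");
        Sql("CREATE PROCEDURE dbo.GetTopCustomers AS SELECT TOP 10 * FROM dbo.Customers ORDER BY FullName");
    }

    public override void Down()
    {
        Sql("DROP PROCEDURE dbo.GetTopCustomers");
        DropColumn("dbo.Customers", "FullName");
    }
}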
Ask yourself if you really want to/need to support multiple backends. It is possible with EF but hard to maintain. In this case I assume your conceptual model (csdl) would be the same for all databases but you would have multiple store specific models (ssdl files). Since your model would be the same for all databases you would have the same C# types regardless of the database you are using.
When supporting multiple databases you won't be able to run SQL queries against the database (or, to be more specific, you will get exceptions if you run an SQL query specific to one database against a different database), but ideally you should not need to. In the worst case you could enclose the logic you would like to code in SQL in a stored procedure that would exist in all databases. Again, I don't know when this would be needed (the only thing that comes to mind is performance), but since you are planning on using OData you wouldn't be able to run these queries anyway unless you start using Service Operations.
Since your conceptual model would be the same regardless of the database, you would have the same types regardless of the database. You could try using these for both the DataLayer and the Business Model (especially if you go with POCO). An alternative would be to use POCO/DTOs. (I have not tried OData support in Web API, but with WCF Data Services the service itself would actually use EF types, so you would not even be able to tell the service to use a different set of types.)
You actually don't lose the ability with DB-First models to execute queries against the business model, as long as your transforms aren't too bad. For example, I have an OData service which has a PersistedCustomer (DB model) and a Customer (business model). With EF5, I can write the LINQ which transforms the IQueryable<PersistedCustomer> to IQueryable<Customer>, query against the IQueryable<Customer>, and EF can translate the criteria back against the original database.
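A rough sketch of that kind of transform (the property names and the FirstName/LastName mapping are assumptions): because the projection stays an IQueryable, a later filter written against Customer is still composed into the SQL that EF sends to the database.

using System.Linq;

// DB model and business model - property names are illustrative.
public class PersistedCustomer
{
    public int Id { get; set; }
    public string FirstName { get; set; }
    public string LastName { get; set; }
}

public class Customer
{
    public int Id { get; set; }
    public string FullName { get; set; }
}

public static class CustomerQueries
{
    // The Select projection is translatable, so EF can push later criteria
    // (e.g. Where(c => c.FullName.StartsWith("A"))) back to the database.
    public static IQueryable<Customer> ToBusinessModel(IQueryable<PersistedCustomer> source)
    {
        return source.Select(p => new Customer
        {
            Id = p.Id,
            FullName = p.FirstName + " " + p.LastName
        });
    }
}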
I'm very new at WCF (and .NET in general), so I apologize if this is common knowledge.
I'm designing a WCF solution (currently using Entity Framework to access the database). I want to grab a (possibly very large) set of data from the database, and return it to the client, but I don't want to serialize the entire set of data over the wire all at once, due to performance concerns.
I'd like the operation to return some sort of object to the client that represents the resulting data, and I'd like to deal with that data on the client, being able to navigate through it backwards and forwards and retrieve the actual data over the wire as needed.
I don't want to write a lot of client code to individually find out what rows meet my search criteria, then make separate calls to get each record, if I can help it. I'm trying to keep the client as simple as possible.
Ideally, I'd like to write the client code similar to something like the below pseudocode:
Reference1.Service1Client MyService = new Reference1.Service1Client("Service1");
DelayedDataSet<MyRecordType> MyResultSet = MyService.GetAllCustomers();
MyResultSet.First();
while (!MyResultSet.Eof)
{
    Console.WriteLine(MyResultSet.CurrentRecord().CUSTFNAME + " " + MyResultSet.CurrentRecord().CUSTLNAME);
    Console.WriteLine("Press Enter to see the next customer");
    Console.ReadLine();
    MyResultSet.Next();
}
Of course, DelayedDataSet is something I just made up, and I'm hoping something like it exists in .NET.
The call to MyService.GetAllCustomers() would return this DelayedDataSet object, which would not actually contain the actual records. The actual data wouldn't come over the wire until CurrentRecord() is called. Next() and Previous() would simply update a cursor on the server side to point to the appropriate record. I don't want the client to have any direct visibility to the database or Entity Framework.
I'm guessing that the way I wrote the code probably won't work over WCF, and that the functions like CurrentRecord(), Next(), First(), etc. would have to be separate service contract operations. I guess I'm just looking for a way to do this without having to write all my own code to cache the results on the server, somehow persist the data sets server side, write all the retrieval and navigation code in my service library, etc. I'm hoping most of this is already done for me.
It seems like this would be a very commonly needed function. So, does something like this exist?
-Joe
No, that's not what WCF is designed to do.
In WCF, the very basic core architecture is that you have a client and a server, and nothing but (XML-)serialized data going between the two over the wire.
WCF is not a remote-procedure call method, or some sort of remote object mechanism - there is no connection between the client and the server except the serialized message that conforms to the service (and data) contracts defined between the two.
WCF is not designed to handle huge data volumes - it's designed to handle individual messages (GetCustomerByID(42) and such). Since WCF is designed from the ground up to be interoperable with other platforms (non-.NET, too - like Java, Ruby, etc.), you should definitely not be using heavyweight .NET-specific types like DataSet anyway - use proper objects.
Also, since WCF ultimately serializes everything to XML and send it across a wire, all the data being passed must be expressible in XML schema - which excludes interfaces and/or generics.
From what I'm reading in your post, what you're looking for is more of an "in-proc" data access layer - not a service level. So if you want to keep going down this path, you should investigate the repository and unit-of-work patterns in conjunction with Entity Framework.
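As one possible illustration of that direction (the interface, class and member names below are a generic sketch, not an existing library), a repository that pages data server-side and hands back plain lists fits WCF's message-based model much better than a server-side cursor would:

using System.Collections.Generic;
using System.Linq;

public class Customer
{
    public int Id { get; set; }
    public string Name { get; set; }
}

// Repository that exposes paging instead of a server-side cursor.
public interface ICustomerRepository
{
    IList<Customer> GetPage(int pageIndex, int pageSize);
    int Count();
}

public class CustomerRepository : ICustomerRepository
{
    private readonly IQueryable<Customer> _customers;   // e.g. an EF DbSet (assumption)

    public CustomerRepository(IQueryable<Customer> customers)
    {
        _customers = customers;
    }

    public IList<Customer> GetPage(int pageIndex, int pageSize)
    {
        // Each call returns one self-contained, page-sized result - something a
        // WCF operation like GetCustomers(pageIndex, pageSize) can serialize.
        return _customers.OrderBy(c => c.Id)
                         .Skip(pageIndex * pageSize)
                         .Take(pageSize)
                         .ToList();
    }

    public int Count()
    {
        return _customers.Count();
    }
}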
More info:
MSDN: What is Windows Communication Foundation?
WCF Essentials—A Developer's Primer
Picture of the very basic WCF architecture from that Primer - there's only a wire with a serialized message connecting client and server - nothing more; but serialization will always happen
I used to create normal web services in my websites and call these services from JavaScript to make Ajax calls.
Now I am learning about ADO.NET Data Services.
My question is:
Can these ADO.NET Data Services replace my normal web services in the new sites I will create?
And if yes,
Can I put these ADO.NET Data Services in a separate project (local on the same server) and just reference them from my website? (To use the same services for my website's internal use and also offer the same services to other websites or services, the same as Twitter does, for example.)
It depends on what you want to do. I suggest you read my conversation with Pablo Castro, the architect of ADO.NET Data Services:
Data Services - Lacking
Here are, basically, Pablo's words:
I agree that some of these things are quite inconvenient and we're looking at fixing them (e.g. use of custom types in addition to types defined in the input model in order to produce custom result-sets). However, some others are just intrinsic to the nature of Data Services.
The Data Services framework is not a gateway to a database, and in general if you need something like that then Data Services will just get in the way. The goal of Data Services is to create a resource model out of an input data model and expose it through a RESTful, uniform interface, such that every unit of data in the underlying model ("entities") becomes an addressable resource that can be manipulated with the standard verbs.
Often the actual implementation of a RESTful interface includes more sophisticated behaviors than just doing CRUD over the data under the covers, which need to be defined in a way that doesn't break the uniform interface. That's why the Data Services server runtime has hooks for business logic and validation in the form of query/change interceptors and others. We also acknowledge that it's not always possible or maybe practical to model absolutely everything as resources operated on with standard verbs, so we included service operations as an escape hatch.
Things like joins dilute the abstraction we're trying to create. I'm not saying that they are bad or anything (relational databases without them wouldn't be all that useful); it's just that if what's required for a given application scenario is the full query expressiveness of a relational database at the service boundary, then you can simply exchange queries over the wire (and manage the security implications of that). For joins that can be modeled as association traversals, Data Services already has support for them.
I guess this is a long way to say that Data Services is not a solution for every problem that involves exposing data to the web. If you want a RESTful interface over a resource model that matches your underlying data model, then it usually works out well and it will save you a lot of work. If you need a custom interface or direct access to a database, then Data Services is typically not the right tool, and other framework components such as WCF's SOAP and REST support do a great job at that.
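For reference, here is a rough sketch of what those hooks (query interceptors and service operations) look like in an ADO.NET Data Services class; MyEntities, Customer, and IsActive are placeholders for an Entity Framework context and one of its entity types, not anything from the discussion above:

using System;
using System.Data.Services;
using System.Linq;
using System.Linq.Expressions;
using System.ServiceModel.Web;

// The service exposes the entities of an EF model as addressable resources.
public class MyDataService : DataService<MyEntities>
{
    public static void InitializeService(IDataServiceConfiguration config)
    {
        config.SetEntitySetAccessRule("Customers", EntitySetRights.AllRead);
        config.SetServiceOperationAccessRule("ActiveCustomerCount", ServiceOperationRights.All);
    }

    // Query interceptor: business logic applied to every read of Customers.
    [QueryInterceptor("Customers")]
    public Expression<Func<Customer, bool>> OnQueryCustomers()
    {
        return c => c.IsActive;
    }

    // Service operation: the "escape hatch" for things that don't map to plain resources.
    [WebGet]
    public int ActiveCustomerCount()
    {
        return CurrentDataSource.Customers.Count(c => c.IsActive);
    }
}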