I am trying to rebuild a website designed in ASP.NET. Being an informative site, it has a lot of content that I don't want to enter into the database manually.
Can anyone suggest a Drupal module to migrate the ASP.NET data from SQL Server into a MySQL database in the form of the Drupal architecture?
Say, for example, an ASP.NET page displays data from a form. Can I migrate it as CCK content and get the page data into the node tables?
The comparison table of all available solutions is useful, but I would strongly recommend the Migrate module for this. It should work with MS SQL source data (I've only used it with MySQL and Postgres source databases), and is quite flexible with regard to mapping the source data to the Drupal schema. For Drupal 6 it supports most CCK field types (including images and files), and for Drupal 7 it supports the new Field API quite nicely.
The approach the module takes allows you to do continuous migrations: if your source ASP.NET site is still adding or changing content, it will keep track of what has been migrated, and it even allows updating of previously migrated content.
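As a rough illustration (not the only way to set it up), a Migrate 2.x migration class for Drupal 7 can look like the sketch below. The 'legacy' connection key, the articles table and its column names are placeholders for your SQL Server data, and the exact constructor signature varies a little between Migrate versions:

    <?php
    // Minimal Migrate 2.x sketch (Drupal 7). Connection, table and column
    // names are placeholders for the legacy ASP.NET data.
    class ExampleArticleMigration extends Migration {

      public function __construct($arguments) {
        // Older Migrate releases use a no-argument constructor instead.
        parent::__construct($arguments);

        // Pull rows from the legacy database declared in settings.php.
        $query = Database::getConnection('default', 'legacy')
          ->select('articles', 'a')
          ->fields('a', array('id', 'title', 'body', 'created'));

        // The map table lives in Drupal's database, so it usually can't be
        // joined against a source query in a different database.
        $this->source = new MigrateSourceSQL($query, array(), NULL,
          array('map_joinable' => FALSE));

        // Create (or update) nodes of an 'article' content type.
        $this->destination = new MigrateDestinationNode('article');

        // Track which legacy rows became which nodes, so re-running the
        // migration updates content instead of duplicating it.
        $this->map = new MigrateSQLMap($this->machineName,
          array('id' => array('type' => 'int', 'unsigned' => TRUE, 'not null' => TRUE)),
          MigrateDestinationNode::getKeySchema()
        );

        $this->addFieldMapping('title', 'title');
        $this->addFieldMapping('body', 'body');
        $this->addFieldMapping('created', 'created');
      }
    }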
This page has a comparison table of the main modules that may help you import the data into Drupal. But keep in mind that you'll have some work to do before actually importing, since there's no magical way for Drupal to map the columns of your database to the Drupal columns.
We have an old site using Umbraco 4, and we are trying to migrate the blog posts (around 1,000 of them) to a new site in Umbraco 7.
We've tried searching online, but the results are mostly paid services, which can't migrate into Umbraco either.
Anyone have any ideas?
Appreciate the help.
We had the same problem exactly a year ago. I can't script this for you but the solution is this:
Go into the database and check the tables. Go into the current Umbraco 4 system and check which fields exist and which ones have content that you want. Then go to the Umbraco 7 system and create the exact same doctypes, fields, tabs, etc. with the same names.
Do basically the same with the database. Then pull the data from the old one to the new one with a database script.
Wish you luck, took me quite a while.
There's no tool I know that will do this for you effectively.
Soeteman Software's CMS Import tool is perfect for this - there is a free version capable of importing 500 records at a time that may suit your purposes:
https://soetemansoftware.nl/cmsimport
I'm a bit of a Drupal newbie but have committed myself to porting some quite complex bespoke websites.
One of these is a surf report site which uses a database with millions of time/location rows to get the local conditions. Obviously I can't write all of this to a Drupal table every day, so I am looking for a way to connect and retrieve live data.
I'll need to use the external data in a number of displays which would normally be nodes, views, blocks etc.
The ideal solution would be if the external data just appeared as local data, but I'm guessing that is a rather big ask.
I'm open to any approach. If it involves writing a module some pointers on what to look at (preferably code examples) would be much appreciated.
Thanks,
Chris
Maybe you can use the Data module.
From the documentation:
The Data module lets you use database tables that are foreign to Drupal.
You can 'adopt' a table in your database that ordinarily Drupal would not be aware of. This might be external data you have imported, or a table that another application has created.
It also provides Views integration.
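If the table lives in a separate database rather than inside Drupal's own, it first has to be declared in settings.php so Drupal can reach it. A minimal sketch; the 'legacy' key, database name and credentials are placeholders:

    // settings.php, Drupal 7 syntax: register the external database
    // under its own key so modules can query it by name.
    $databases['legacy']['default'] = array(
      'driver'   => 'mysql',
      'database' => 'surf_reports',   // placeholder database name
      'username' => 'dbuser',
      'password' => 'dbpass',
      'host'     => 'localhost',
      'prefix'   => '',
    );

    // Drupal 6 equivalent: an extra entry in $db_url, e.g.
    // $db_url['legacy'] = 'mysqli://dbuser:dbpass@localhost/surf_reports';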
I had to show info from a legacy staff database in my Drupal 6 site. I explored three ways:
Importing database rows as nodes in a nightly cronjob
Accessing the external database in my own module (see the sketch after this list)
Accessing the external database using hook_views_data()
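For the second option, the pattern is basically switching database connections around a query. A sketch in Drupal 7 syntax; the 'legacy' key and the staff table are placeholders, and in Drupal 6 you would iterate with db_fetch_object() instead:

    // Temporarily switch to the external connection declared in settings.php.
    db_set_active('legacy');
    $result = db_query('SELECT name, department FROM {staff} ORDER BY name');
    // Always switch back to Drupal's own database afterwards.
    db_set_active();

    foreach ($result as $row) {
      // Render, cache or map $row->name / $row->department here.
    }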
If I started again I would use hook_views_data(); it's more flexible, especially if you want to match the external database information with Drupal nodes.
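And a minimal sketch of the third option, exposing a hypothetical surf_conditions table to Views as a base table (all table, column and group names are assumptions, and getting the table into a database Drupal can reach is a separate step):

    /**
     * Implements hook_views_data().
     */
    function mymodule_views_data() {
      $data['surf_conditions']['table']['group'] = t('Surf conditions');
      $data['surf_conditions']['table']['base'] = array(
        'field' => 'id',
        'title' => t('Surf conditions'),
        'help'  => t('Time/location rows from the external surf database.'),
      );

      // Expose one column; repeat for the other columns you need.
      $data['surf_conditions']['wave_height'] = array(
        'title'  => t('Wave height'),
        'help'   => t('Wave height in metres.'),
        'field'  => array('handler' => 'views_handler_field_numeric'),
        'sort'   => array('handler' => 'views_handler_sort'),
        'filter' => array('handler' => 'views_handler_filter_numeric'),
      );

      return $data;
    }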
Best.
Have a look at the Forena module, which can be used to display (i.e. query, not update) data stored in databases external to Drupal. It comes with a full suite of supported database connections, such as MySQL, MS SQL, Oracle, Postgres, or any PDO-compliant variation.
For more details about Forena, 2 types of documentation are available:
Community documentation.
Documentation that comes with Forena, which you can access right after installing and enabling the module. Check out the demo site for an online example of the Forena documentation.
The sample reports and graphs are fully functional, such as the drill-downs available on the "Master Summary with drill down" report. Using Forena "skins", you can also use the amazing dataTables library (and its widgets, such as sorting by selected columns).
Forena also includes a UI for creating your reports (the WYSIWYG Report Editor) and/or your SQL queries (the WYSIWYG Query Builder).
Disclosure: I'm a co-maintainer of Forena.
We have several Drupal 6 sites. On most of the sites we will have some content editors, who will optimally log into a dev/test site and create content. Mostly just plain vanilla pages. What is the best method for migrating these newly created pages to the live server? I know there is the Deploy module and also Backup and Migrate. Are these the de facto standards? I was wondering if there is anything else I might be overlooking or if there is a better/easier solution. I am ONLY interested in moving content. We will be using GIT to move code and the Features module to migrate admin changes.
THANKS
In the past I've used node_export, and recently I've found (in Drupal 7) that the UUID module is invaluable. It means you can keep track of content even if the nids change as you move it over.
There are some legacy modules hanging around too that did an okay job but their functionality has really been surpassed by the modules you mention and the ones above.
If you're creating all your content using fields added with CCK, you shouldn't have a problem. I personally use node_export along with Features to allow easy importing and updating.
One gotcha to look out for with node_export: when importing on the new site, if a node is found with the same UUID (i.e. you're doing an update), the default is to create a new node. I prefer to create a new revision. It's worth tracking down that setting (it's there in D7, so likely in D6 too).
EDIT: node_export doesn't currently export Panels very well, if at all - just in case.
It depends on the structure and diversity of the nodes (how many fields of which type, how many node types). My first try would be to generate a view with XML output (views_datasource.module or views_data_export) on the dev site and use the feeds.module for XML import on the live site.
The above XML export modules are available for D6 and D7. In Drupal 6 I used the views_bonus module for the XML export.
I'm curious if a CMS like Drupal or WordPress is a suitable approach for creating a data-centric web application. It's nothing fancy, but would require a login to access, and essentially have search parameters, results pages, and detail pages. I can program this pretty easily but would prefer to just theme Drupal or WP, since I'm not much of a designer. Thanks for any insight, as well as any examples of where this might currently exist!
-e-
Sure, Drupal is suited for data-centric websites. Development Seed is a company that uses Drupal to make data-centric websites, though they also use other technologies such as Node.js. As suggested by Marek Sebera, it really depends on your actual needs and what exactly your site will be doing with the data.
You basically have two ways to integrate your data into Drupal. One is to import the data into Drupal as entities (the generic concept for any piece of data in Drupal 7) or nodes (the concrete concept for content in Drupal). The other is to directly query an external data source from Drupal.
To import data, the Feeds module is a very solid and flexible solution. Out of the box it only supports CSV, Atom and RSS from local files or HTTP, but it is flexible and extensible, and there are already many contributed modules that extend it.
To query the data, whether external or imported, you can use the Views module to build query pages. For external data source support, you will need additional backend modules and will probably have to write your own.
If you import the data as entities, you can also use core's EntityFieldQuery to write your own queries without using Views. And you can also use the Search API to index the imported data with a search engine such as Apache Solr.
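For example, a minimal EntityFieldQuery that fetches the 20 most recently published nodes of a hypothetical "report" content type looks roughly like this:

    // Drupal 7: query imported content without building a View.
    $query = new EntityFieldQuery();
    $result = $query
      ->entityCondition('entity_type', 'node')
      ->entityCondition('bundle', 'report')        // placeholder content type
      ->propertyCondition('status', NODE_PUBLISHED)
      ->propertyOrderBy('created', 'DESC')
      ->range(0, 20)
      ->execute();

    if (!empty($result['node'])) {
      $nodes = node_load_multiple(array_keys($result['node']));
      // Hand $nodes to a theme function, block, page callback, etc.
    }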
My next assignment is to build two information portals for customers. These portals will be login-protected sites and contain a set of pages displaying information like orders, invoices, PDF files ... for the authenticated user (all presented as lists with links to detail pages). The users and the data are stored in an Oracle database. The portals differ in some of the features and in the layout.
My standard approach is to build an individual ASP.net Web Application for every portal.
But this is not the best way to get something reusable. So for these two projects my idea is to create a set of WCF services to get the data from the Oracle database and to build user controls to display the different elements in Umbraco. This way I hope to get a set of independent, reusable “modules” which can be used to build these portals.
Now my question: is Umbraco a good platform for this type of projects? And is my “concept” a valid approach?
Kind regards
Volkmar
Umbraco is very flexible. On the one hand there is the question of security: with Umbraco you can use any Membership Provider you want for all visitors (also with member roles).
On the other hand there is the question of integration: with Umbraco you can create user controls, XSLTs, or Razor files as macros (which can be seen as the reusable modules).
For XSLT you can implement your own XsltExtension which pulls the external content in as an XPathNodeIterator that you can use in every XSLT macro. For .ascx files or Razor you can use LINQ2Umbraco, your own objects, etc. to connect to the Oracle database.
You can also use some sort of caching to reduce the database calls. On the other hand, one of Umbraco's biggest advantages is that it stores all the content as an XML and object tree in memory, so it is very fast at rendering content. With every database call you lose a little bit of that advantage.
hth, Thomas
Ruben Verbourgh began the Oracle4Umbraco project to create an abstracted fork of the Datalayer to support running on an Oracle DB. You can find it at http://oracle4umbraco.codeplex.com/, although it has no active releases, so build from source and YMMV.
Volkmar, your concept is perfectly sound - although you might want to consider using the Umbraco data store as the persistence layer for your data rather than the Oracle DB itself. You get XML content versioning, caching, and all the benefits of the content-management side of things, in a robust and flexible framework which you can expose to other apps later, should you need to, through the Umbraco APIs and web services.
HTH,
Benjamin
Content management of a website becomes simpler with Umbraco.
But if you are planning to use Oracle as the backend, Umbraco does not have support for it.
So decide carefully which requirements can be compromised.
Good luck.