I have a content type in Drupal 7 that accounts for 95% of the nodes on my site. I have no use for those nodes now, and they have all been unpublished.
My question is: could those unpublished nodes be slowing my site down? Would the site run faster if they were all deleted? I have no use for them at all now, but there are over 100,000 pieces of content and I don't know how to delete them except 50 at a time. Thanks.
Yes, they probably can slow down your site; it depends on which MySQL queries you are running. If you are really sure you will not need them, use this module: https://drupal.org/project/delete_all. It deletes nodes by content type the "Drupal way", i.e. correctly.
If you don't want to delete them, you should optimize your queries instead: log slow queries for a few hours or days, then work through them. Badly optimized queries can slow every page down by more than a second.
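If you would rather script the deletion than use a module, something like this works too. A minimal sketch, assuming Drupal 7 and a bootstrapped environment (run it with drush php-script, for example); 'old_article' is a placeholder machine name:

    // Delete all nodes of one content type in batches of 200.
    do {
      $nids = db_query_range(
        'SELECT nid FROM {node} WHERE type = :type',
        0, 200,
        array(':type' => 'old_article')
      )->fetchCol();

      if ($nids) {
        // node_delete_multiple() fires all delete hooks and cleans up
        // field data and comments, so nodes go away "the Drupal way".
        node_delete_multiple($nids);
      }
    } while ($nids);

Batching keeps memory use bounded; trying to delete 100,000 nodes in a single call would likely exhaust PHP's memory limit.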
Not necessarily; they may not be slowing it down at all. If no views or custom code are looking for these nodes, and they are not showing up anywhere, they are probably not causing the problem. As a simple comparison, it's like saying your computer is slow because you have a lot of files in a folder.
But, if you aren't using them and aren't planning to do so, you should probably delete them.
To discover what may be slowing down your site, you can use XHProf, the Drupal Devel module, and/or any MySQL profiling tool.
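As an illustration of the XHProf route, here is a minimal sketch, assuming the XHProf PHP extension and its bundled UI library are installed (the include path may differ on your system):

    // Profile a whole page request to see where the time actually goes.
    xhprof_enable(XHPROF_FLAGS_CPU | XHPROF_FLAGS_MEMORY);

    // ... let Drupal bootstrap and render the page here ...

    $run_data = xhprof_disable();

    // Save the run so the XHProf HTML viewer can display it later.
    include_once 'xhprof_lib/utils/xhprof_runs.php';
    $runs = new XHProfRuns_Default();
    $run_id = $runs->save_run($run_data, 'drupal');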
I've built a complex site for a client, who wants it duplicated and re-skinned so it can be used for other purposes.
What is the best way of doing this? I'm concerned about copying every file, as this means any bugs must be fixed twice and any improvements must be implemented twice.
I'd look to refactor your code.
Move common functions into a library you can reference from both projects. Since you mention that the new site is for a different purpose, you are likely to see divergence, and you don't want to hamper yourself later. So extract the common parts, then modify copies (or, where appropriate, new files) of the remainder to complete your fork; a sketch of the idea follows below.
If you haven't applied good practice already then now is the time to do it and it'll make your work on both sites easier moving forward.
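As a rough, framework-agnostic sketch of that split (the file names and the helper function are hypothetical):

    // shared/lib/format.php: one canonical implementation used by both
    // sites, so a bug fixed here is fixed once for both.
    function format_headline($title, $max = 60) {
      return strlen($title) > $max ? substr($title, 0, $max - 3) . '...' : $title;
    }

    // site_a/page.php (site_b/page.php does the same):
    require_once __DIR__ . '/../shared/lib/format.php';
    echo format_headline('Some very long headline that needs trimming to fit');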
If all the functionality is the same and only the layout is different, you could just create a new CSS file. Two websites can share exactly the same code base yet have different stylesheets and look completely different.
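A minimal sketch of how a single code base might pick a stylesheet per domain (the host names and file names are placeholders):

    $stylesheets = array(
      'www.site-a.example' => 'theme-a.css',
      'www.site-b.example' => 'theme-b.css',
    );
    $host = $_SERVER['HTTP_HOST'];
    // Fall back to a default look for any unrecognized host.
    $css = isset($stylesheets[$host]) ? $stylesheets[$host] : 'default.css';
    echo '<link rel="stylesheet" href="/css/' . htmlspecialchars($css) . '">';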
I think that using a version control system like Subversion or, preferably, Git is a good way to duplicate your website. You will be able to track the changes that you make and revert to older versions if things do not work out.
You should implement some kind of instantiation, so that look and feel, content, and data are served depending on which instance of the application is accessed.
In other words, each application accesses the code with a different application identifier, and content is served according to it.
Each application identifier points to different settings, so stylesheets and content are completely isolated, while both domains live in the same IIS application.
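The answer above assumes an IIS stack, but the idea is language-neutral: a map from application identifier to a settings bundle. A minimal sketch in PHP, with hypothetical identifiers and settings:

    $instances = array(
      'original.example.com' => array(
        'stylesheet' => 'original.css',
        'database'   => 'site_original',
      ),
      'reskinned.example.com' => array(
        'stylesheet' => 'reskinned.css',
        'database'   => 'site_reskinned',
      ),
    );
    $app_id = $_SERVER['HTTP_HOST'];  // the application identifier
    $config = isset($instances[$app_id])
      ? $instances[$app_id]
      : $instances['original.example.com'];
    // Everything downstream (theme, DB connection, content) reads $config.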
If you want to duplicate a whole site, it's probably best to copy the whole thing and amend as necessary. Obviously, take great care not to copy large portions of text, or you may be penalised by the search engines.
There are ways you could put the new site onto the same shared host (say within a subdirectory of the original site) and literally 'share' some files. If a unique change is required, you could instead reference a 'local' version of a particular file.
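In code, that share-with-local-overrides pattern might look like this (the paths are purely illustrative):

    // Prefer a site-local copy when one exists, otherwise fall back to
    // the shared original that both sites reference.
    $override = __DIR__ . '/local/header.php';
    if (file_exists($override)) {
      require $override;
    }
    else {
      require __DIR__ . '/../shared/header.php';
    }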
However, that sounds like a recipe for a headache to me. I'd prefer to duplicate the whole site: it would be much easier to replace one or two functions on separate websites than to work out which website(s) are affected by a particular change to your shared source.
Three associates and I want to integrate our individual Drupal websites so that a user can move fairly seamlessly between them. We're all new at Drupal, so our planned approach avoids "doing it the right way" by combining modules and database tables.
Rather, we plan on simply having each site's menu system include links to the other sites, and loading the selected site via iframes so that the overall user experience is more like that of a single, integrated system. We'll adopt a common theme for all sites and pass the user ID through the HTML call (then process it via normal Drupal code) to avoid the need for more than one logon.
What are the negatives of this simple approach and are they so severe that a more traditional site-integration approach should be used?
To be honest, that sounds like a rather nasty can of worms you're looking at opening there. The mere mention of IFrames has me shuddering!
It seems to me like you'd be better off simply having one Drupal instance, with you and your associates as different content authors on the same site.
If you're looking at having the same theme across the three integrated sites, how will the users know which one they're on? And if the aim is to tightly integrate them, why not have the four of you simply contribute to the same core site?
If I had to make the decision, I would use the Drupal multisite feature. You can even use the "single sign on" module to get all your users logged in to all sites. It is a bit of work, but I think it is well worth it.
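For reference, a minimal sketch of how multisite is wired up, assuming Drupal 7: each domain gets its own directory under sites/ with its own settings.php, while all of them share one code base.

    // sites/site-a.example.com/settings.php (database name, credentials,
    // and host names are illustrative):
    $databases['default']['default'] = array(
      'driver'   => 'mysql',
      'database' => 'site_a',
      'username' => 'drupal',
      'password' => 'secret',
      'host'     => 'localhost',
    );

A second site would live in sites/site-b.example.com/ with its own settings.php pointing at its own database.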
Once you start throwing things into frames, your users/visitors will lose the ability to bookmark the correct page. For example, if they find a page they like and bookmark it, they will get 'www.site.com/index.php' rather than 'www.site.com/article/article.php?Id=12345'. When they come back, they'll land on the default page of wherever the frame lives rather than the page they expected.
Since all three of your sites are based on the same data schema, it would probably be better to 'do it right' the first time around rather than hacking something together that will, in the end, cause more headaches than it solves.
Good luck on your project and hope this helps some.
I want to know the easiest way to export ALL content of a specific user in Drupal, using only the database. Is this even possible?
The reason being: I moved a site, and clients have unknowingly been creating content on the old server, which I now need to move to the new server. Unfortunately, the old site can't be accessed anymore (due to the move), so I only have access to the database.
Should I perhaps look at finding a way to make the installation accessible and then use an export module, or is there an easy way to export using phpMyAdmin?
Ouch
You could take the DB and set it up on a sandbox somewhere, which would give you access to see what changed. Generally, if you have access to a DB (and know which version of Drupal and which modules you had), you can run a Drupal site from it.
In an earlier question I suggested the migrate module for getting content from one Drupal DB to another; I think that applies here if you can't get another code base to look at the old DB.
Instead of trying to pinpoint the content that a specific user has created, it will probably be a lot easier to get the stuff you need based on the datetime. But it really depends on what kind of content has been created. If it's just nodes, it should be fairly simple to load the nodes from the one database and save them to the other.
Another thing worth mentioning is that Drupal supports having more than one database in your settings. You can switch the DB connection relatively easily if the databases are of the same type (e.g. MySQL); see the db_set_active function.
You can also try to make use of the migrate modules like Jeremy suggested. Which way to go depends a bit on how good your Drupal/PHP/SQL skills are and how tricky it is to get the data you need.
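To make the db_set_active suggestion concrete, here is a minimal sketch, assuming Drupal 7 and a second connection defined in settings.php under the key 'legacy'; the user ID is illustrative, and fields stored outside the node table may need extra handling:

    $uid = 42;

    // Read the user's nodes from the old database.
    db_set_active('legacy');
    $nids = db_query('SELECT nid FROM {node} WHERE uid = :uid',
      array(':uid' => $uid))->fetchCol();
    $nodes = node_load_multiple($nids);
    db_set_active();  // switch back to the default connection

    // Save them into the current site's database.
    foreach ($nodes as $node) {
      // Drop the identifiers so node_save() inserts rather than updates.
      unset($node->nid, $node->vid);
      $node->is_new = TRUE;
      node_save($node);
    }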
I think the data_export_import module will help you.
I know you already fixed this, but what about the backup_migrate module?
Today I'm on an asking spree :P
Anyhow... right now I am developing a free WordPress theme. I want to make it as flexible as possible, which is why I will use theme options to set some CSS colors, widths, and so on. My question is this:
If I assume that one of these themes will be used on a heavy-traffic blog, how will this affect server performance? Will I see an increase in SQL queries, or some other problem?
I do not think there will be an increase in the number of SQL queries, unless, of course, you decide to extend the WordPress functionality by making the theme somewhat data driven.
However, the size of your templates/images/CSS/JavaScript files may have some impact on the performance of the application.
As a general rule of thumb, if you are concerned about the performance of a web-based app, it is always good to keep your files as light as possible.
Anyone using WordPress for a high-traffic blog is almost certain to be using WP Super Cache, which means almost all pageviews will cause (depending on whether the super bit is being used) either 0 or 1 SQL queries, regardless of what your theme does.
WordPress isn't renowned for being gentle on the database, although I guess there might be intentions to improve over time.
So you're not causing big problems by adding an extra query.
But keep it to one query: store all your options so that one SELECT will get them all, and run it once per page load.
Alternatively, don't store the options in the database. Have a config file that lives in your theme directory.
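A minimal sketch of the one-SELECT approach, keeping every theme setting in a single serialized option row (the option name 'mytheme_options' and the setting keys are placeholders):

    function mytheme_get_options() {
      $defaults = array(
        'link_color'    => '#0066cc',
        'content_width' => 600,
      );
      // get_option() costs at most one query, and WordPress caches the
      // value after the first call on a page load.
      return wp_parse_args(get_option('mytheme_options', array()), $defaults);
    }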
You shouldn't really add anything to your theme beyond styling if you want to keep database load low. (Of course, I mean other than the default data-retrieval functionality found in pretty much all themes.)
The only parts of your theme that should really be doing any significant querying are:
The regular bits and bobs in the loop (for the main blog page and archive pages)
The comment retrieval (for single-view posts)
The sidebar (presuming you're going for a widget-enabled sidebar)
I'd suggest leaving extra functionality (and, therefore, extra DB queries) to plugins; a theme should focus purely on aesthetics.
I built an online news portal a while back which is working fine for me, but some say the home page is a little slow. When I think about it, I can see a reason why.
The home page of the site displays
Headlines
Spot news (sub-headlines)
Spots with pictures
Most read news (as titles)
Most commented news (as titles)
5 news titles from each news category (11 in total, e.g. sports, economy, local, health, etc.)
Now, each of these is a separate query to the DB. I have TableAdapters, DataSets, and DataTables (standard data-access scenarios). For the headlines, for example, I call the business logic in my news class, which returns the DataTable filled by the TableAdapter. From there, I either bind the DataTable directly to the controls or (most of the time) convert it to a List(Of News) and consume it from there.
Doing this for each of the above seems to work fine, though; at least it does not put a huge load on the server. But it makes me wonder if there is a better way.
For example, the project I describe above is a highly dynamic website: news items are inserted as they arrive from agencies, 24 hours non-stop, so caching might not sound like a good fit there. On the other hand, I now have another, similar project for a local newspaper whose site will only be updated once a day. In this case:
Can I run just one query that returns a DataTable containing all the news items inserted today, then query that DataTable and place headlines, spots, and the other items in their respective places on the page? Or is there a better alternative? I just wonder how other people carry out similar tasks in the most efficient way.
I think you should use Firebug to find out which elements are taking time to load. Sometimes large images can ruin the show (and the size of an image on screen isn't always proportional to its download size).
Secondly, you could download the Yahoo Firefox plugin YSlow and investigate whether you have any slow scripts.
But Firebug should give you the best overview. After loading Firebug, click on the 'Net' tab to view the load time of each element on the page.
If you've got poor performance, your first step isn't to start mucking around. Profile your code. Find out exactly why it is slow. Is the slowdown in transmitting the page, rendering it, or actually dynamically generating the page? Is a single query taking too long?
Find out exactly where the bottleneck is and attack the problem at its heart.
Caching is also a very good idea, even in cases where content is updated fairly quickly. As long as your caching mechanism is intelligent, you'll still save a lot of generation time. With a news portal or a blog, as opposed to a forum, you're likely to improve performance greatly with a caching system.
If you find that your delays come from the DB, check your tables, make sure they're properly indexed, clustered, or whatever else you need depending on the amount of data in the table. Also, if you're using dynamic queries, try stored procedures instead.
If you want to get several queries done in one database request, you can. Since you won't be showing any data until all the queries are done anyhow, and barring any other issues, you'll at least save the cost of a separate DB round trip for every single query.
DataSets hold a collection of tables, and those tables can be filled by several queries in the same request.
ASP.NET already provides you with a pretty nice caching mechanism (HttpContext.Cache) that you can wrap to make it easier to use. Since you can set a lifespan on your cached objects, you don't really have to worry about articles and titles not being up to date.
If you're using WebForms for this website, disable ViewState for the controls that don't really need it, just to make the page that little bit faster to load. There are also plenty of other tweaks to make a page load faster (gzipping, minifying scripts, etc.).
Still, before doing any of that, do as Anthony suggested and profile your code. Find out what the true problem is.