I've just started playing around with the new "Website" feature in Azure that allows you to create websites with just one step - and also allows you to create websites from a "Gallery", including Drupal. And I can get my Drupal site up and running, no problem. But if I try to add a third-party module (for instance, Mindtree's ODataDrupal), then I get this error message:
Installation failed! See the log below for more information.
odata_support
Error installing / updating
File Transfer failed, reason: Cannot chmod /DWASFiles/Sites/theparentsunion/VirtualDirectory0/site/wwwroot/sites/all/modules/odata_support.
More-or-less the same thing happens if I try to update some of the existing modules (which Drupal warns, with big red flashing letters, are out of date), except then my Drupal install is left crippled, with no way to fix it that I've been able to find.
Is this as-designed, or some limitation of the beta website integration? (Because a Drupal installation is kinda worthless if you can't add new modules to it, or update existing ones.) Or am I doing something wrong?
If you are trying to use plugins and third-party modules with a Drupal-based Windows Azure Website, your results may vary. This is mainly because the configuration a specific module or plugin needs may or may not be supported by the Windows Azure Websites model; not every kind of custom configuration will work on Windows Azure Websites, and in those cases you would need to move to Windows Azure Virtual Machines.
As for the application-specific structure: you can open the website's FTP folder, and whatever you see there is user-configurable, so you can set it up the way you want. However, if your application tries to make changes outside that limited scope, you will hit errors like the one above.
Here is a case study where an Azure VM was used for a Drupal-based migration; it shows that for a complex application you may need an Azure VM rather than Azure Websites.
I am new to Git and want to use GitHub for version control on a website that I have already created but will still need to modify now and in the future. My institution uses GitLab and I would like to make this repository available to them internally, but I also want to host the repository for our team's website privately on my GitHub. This is mainly to use GitHub Copilot, have an easy integration with VSCode, and grow my proof of work on GitHub.
I would like to make an automated pipeline as follows:
VSCode/IDE (only direct development environment) → auto-push to GitHub (viewable: private) → sync to GitLab (viewable: internal organization) → modify Plesk file manager (my host control panel, integrated if needed) → WordPress (live changes auto-deployed due to the Plesk modification)
The idea is that I would just push commits locally from my IDE/VSCode and have auto-deployment on the website, with GitHub and GitLab functioning as intermediaries for version control and internal organizational access.
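From the guides I've skimmed so far, I think the GitHub → GitLab leg could just be a push mirror configured in my local clone. Something like the sketch below is what I have in mind (all repo URLs and names here are made-up placeholders), but I'd love confirmation that this is the right starting point:

    # turn the existing site files (downloaded from Plesk) into a repo
    git init
    git add -A
    git commit -m "Import existing site from Plesk"
    git branch -M main   # default branch may be 'master' on older git

    # private GitHub repo as the primary remote (placeholder URL)
    git remote add origin git@github.com:myuser/team-website.git
    git push -u origin main

    # make every push go to the internal GitLab repo as well (placeholder URL)
    git remote set-url --add --push origin git@github.com:myuser/team-website.git
    git remote set-url --add --push origin git@gitlab.example.org:team/team-website.git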
(For reference: right now I am using Plesk to modify the code on the site, but I have to copy and paste back and forth with VSCode because I prefer its feature-rich environment, and my only very low-level "versioning" is just saving the file. It's an extremely inefficient system that does not tolerate errors well, and it doesn't allow for organizational access. The site isn't live yet, it is still "Coming Soon", but I will need to retain all of the files currently on the site and be able to modify them before and after the site goes live from now on.)
Does anyone have experience setting up integrations/automations like the one outlined above? Do you have any tips, or know of any guides/tutorials for creating such a workflow? I'm new to all of these technologies other than the IDE, so I don't want to mess anything up while setting it all up.
Thanks wonderful humans! :D
I've made accounts for all the technologies listed above but haven't started the integration process on any of them, because I can't find a consistent best place to start given that I already have a fully developed website. I don't know if I should copy the Plesk files into Git, which service I should start with, etc. I really need a step-by-step. Thank you again!
I created a desktop app for Windows (running mostly on Win 10) using the Qt libraries. Explicitly in my code, I don't perform any operations that require administrator rights, especially writing to "Program Files" etc.; the application uses the local app data folder structure (I double-checked this, going deeper and deeper into the matter).
In my manifest file the application also doesn't request admin privileges (it's asInvoker).
However, my application still requires admin rights to run.
My question is not about how to solve my specific case, because I established that it's caused by deep dependencies hidden in the Qt libs on the Windows API, and these calls often require admin rights for operations that don't seem to need them, like drag & drop or a network connection to a specific IP address.
I traced this with the Microsoft Standard User Analyzer (SUA) tool run on my executable.
Here is an example log from the SUA investigation:
In the detailed info for positions 1-2 I can see it's because:
The 3rd position, however, is an even more complex problem, related to PROCESS_QUERY_INFORMATION access being allowed only for elevated processes. Example stack trace (one of many, many more):
Summarizing - my question:
You can believe me that I don't perform any operations that require admin rights from a "normal", common-sense point of view. Moreover, my customer has an old application written in .NET that doesn't need admin rights and, in general, does the same things (I mean nothing "special").
What is a general way to overcome such problems in the Qt development environment?
Or does everyone using Qt take the risk that their application will most likely require admin rights?
I have been developing a Drupal 6 site on my PC using XAMPP. I'm done now, and everything looks peachy.
Problem is, I need to put all my content (including custom modules and themes) up onto a staging server which only has a fresh Drupal 6 install on it. I can't imagine having to set up all my custom content types and whatnot all over again on the staging server.
So I ask: how does one go about doing what I need to do, which is essentially duplicating my Drupal install from my PC to the staging server?
The staging server is running Linux, and I develop on a Windows PC, if that helps.
Thanks in advance.
Copy up all the files from development to live, mysqldump your database, and run that dump on the live server. Then all you have to do is change the settings.php file to point at the right database, if for some reason your MySQL database isn't also on 'localhost'.
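For example (hostnames, paths, and credentials below are purely illustrative):

    # on the dev box: dump the Drupal database
    mysqldump -u root -p drupal > drupal.sql

    # copy the site files and the dump to the live server
    rsync -av /path/to/drupal/ user@live.example.com:/var/www/drupal/
    scp drupal.sql user@live.example.com:~

    # on the live server: load the dump into the live database
    mysql -u drupaluser -p drupal < drupal.sql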
The quickest solution is probably the backup_migrate module. It is only a tool to copy your database; you could also use phpMyAdmin or similar instead if you wanted. The backup_migrate module does have some good default settings as to which tables to skip (like the cache tables). All the settings etc. that are not defined in code are stored in your db, so you only need to copy the db to be set. You can choose to exclude some tables, like the node or user table, if you don't want to bring over your test data.
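If you go the manual route instead, the same "skip the rebuildable tables" idea looks like this with mysqldump (database and table names are illustrative; adjust to your install):

    # dump everything except tables Drupal can rebuild by itself
    mysqldump -u root -p drupal \
      --ignore-table=drupal.cache \
      --ignore-table=drupal.cache_menu \
      --ignore-table=drupal.cache_page \
      --ignore-table=drupal.watchdog \
      > drupal_no_cache.sql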
If you don't use subversion, then you gotta manually copy the files (rsync, scp, whatever) and the db (mysqldump).
what we usually do is have a hierarchy of independent subversion repos as follows:
core
sites/all/modules/contributed
sites/all/modules/custom
sites/all/themes/ (we develop our own and don't use contributed themes)
sites/all/libraries
then we use the svn:externals property so that if you check out "core" you get every associated repo.
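setting that up is a one-time property edit on the "core" working copy; roughly like this (the repo URLs are placeholders):

    # externals.txt contents - each line maps a local dir to its own repo ("dir URL"):
    #   sites/all/modules/contributed http://svn.example.com/drupal/contributed
    #   sites/all/modules/custom http://svn.example.com/drupal/custom
    #   sites/all/themes http://svn.example.com/drupal/themes
    #   sites/all/libraries http://svn.example.com/drupal/libraries
    svn propset svn:externals -F externals.txt .
    svn commit -m "wire the child repos into core via svn:externals"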
we have 2 main developers, with 4 other guys that may also contribute code to the site. each has their own local dev environment and we all share a common sandbox - where we make sure the stuff we wrote doesn't break someone else's module (it has happened before!).
we use svn commit hooks to update the beta/staging/sandbox site upon commit.
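the hook itself can be tiny; ours boils down to something like this (paths are examples):

    #!/bin/sh
    # post-commit hook: svn passes in the repository path and new revision
    REPOS="$1"
    REV="$2"
    # pull the freshly committed code into the staging checkout
    svn update /var/www/staging --non-interactive -q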
with all that set up, [re]deploying a site is a simple matter of going to the proper folder, issuing a "svn co http://repolocation/reponame ." and then updating the DB.
two last things to consider:
we are moving from svn to git
the features module will allow you to save the configuration you build (views, content types, etc) and package all of that into a deployable module so you don't have to duplicate your efforts. we are also looking into using this for ourselves.
I hope this helps you.
I second using backup_migrate. It's great.
When I'm installing a fresh site from development to production, I:
backup the site using backup_migrate module
copy all the files up to the server
edit the sites/default/settings.php to have the right database path and account info
do an import of the last backup_migrate dump (usually using mysql < backupfilename.sql, unless I already have Drupal set up and backup_migrate installed, in which case I use the GUI)
But take a look here for the official version:
http://drupal.org/node/776864
Now, you didn't ask, but when the site is live and users are contributing content, moving future development versions of your site from development/staging to production without blowing away live content is a whole different problem, and one that Drupal doesn't have a good answer for...
Andy-
Can you help me understand how to handle Drupal website deployment and development?
Suppose I developed version 1.0 of the Berty&Frank website. I copied everything to their production server and it is alive and kicking now. The site is already full of content and is growing.
I am asked to add additional features to the website. I am now experimenting with how to implement them in a dev version: creating/deleting content types, filling the created nodes with demo data just to see how they look, etc. Now I have found the way and I want to upgrade the production website to the same structure as my dev version. How do I do that?
Is the only way to manually repeat on production every change I made in the dev version?
I would explore the Aegir project for the future management of your website. It allows you to clone a site and then upgrade the clone to a new "platform", which could be the next release of Drupal or another Drupal system (such as OpenAtrium).
More can be found at the aegir wiki.
You can export/import views and content types, but a lot of settings etc. are stored in the db. This gives a few options:
One option is to use something like backup & migrate to import your settings from dev. This won't work if you have test data, though, as you would overwrite the db.
Another option is to repeat on the live site what you did in dev.
A third option could be to take a fresh dump of the live site, make all the settings changes in that db in your dev environment, and overwrite the live db with it. You could lose some comments etc., but that shouldn't be a big deal.
I use Subversion, and just do an update on my production server when I am satisfied with the code on my development server (actually, I have a staging server that is a duplicate of the production machine, so I update that before production; there I can see any bugs that might pop up).
For database changes, I haven't found anything better than just keeping track of my changes (usually adding/modifying CCK fields) and performing the same changes to the production database. I also download my production database regularly, so that dev and staging have almost the same content. That helps to minimize the confusion.
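That refresh can be a one-liner run from the dev box, e.g. (hostnames, credentials, and db names are placeholders):

    # pipe a dump of the production db straight into the dev db
    ssh user@prod.example.com "mysqldump -u drupal -pSECRET drupal" | mysql -u root drupal_dev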
read http://www.drupal.org/upgrade/
I've always deployed my web applications via FTP (sometimes even xcopy), and then manually run database scripts myself.
I started deploying this way in the '90s, but lately I've seen a few web apps with installers. I'm starting to question whether I'm locked into an outdated process. I'm a consultant and my apps are usually internal, so I don't worry about distributing them and having others install them.
But I'm curious; does anybody create installers to deploy internal asp.net web applications?
If so, why? (Voluntarily, mandated, or part of an automation process)
And have you had any problems doing it this way?
Absolutely. We use installers for all of our apps. That way we create the installer and run it on the QA and UAT environments to test, so we know exactly what is going to happen in production. There are no guesses as to what order someone might do something in, or whether they missed a step. It makes things a lot easier.
Ooh, I forgot about the automated process too. We have systems in place (AnthillPro) which automatically deploy to the proper environments. The QA people don't have to wait for something to be done, because it's all done at 2 am. If they need to rerun the build with updates, the devs check the code in, we push a button, and it's automatically deployed. No waiting for the build engineer because he's in a meeting or sick or whatever.
You always want to have an automated way to build and deploy - it greatly reduces the chances of a one-off error if you forget a certain step. Also, it allows you to offload the deploy to someone else easily without having to teach them 100 customized steps. Whether the project is internal or not, all applications should follow best practices.
Personally I'm a bit like the OP; generally I just deploy using FTP, but that said, my applications are typically internal or, in the case of other projects, 100% managed by me.
I've also been thinking about this lately, however, and have started to consider how proper deployment tooling might improve the process - having to document a detailed install process can be a real pain.
I use PowerShell and have found it really easy to automate lots of tasks. It will probably feel a bit different at the very beginning, but in the end you will see that it's all about the power of the .NET libraries!
I have used the "Web Setup Project" to create an MSI that installed the output of a "Web Deployment Project" for an internal app. Our server admin wasn't up to the task of doing a 50-step manual install. For my current app, my server admin doesn't like the "black box" feel of MSI installers and prefers getting a pile of files and a 50-step deployment manual. (See a pattern here? Ask your server admin what he wants.)
The Web Setup Project doesn't make it immediately obvious how to install to anything other than the "Default Website"; other than that, it made the installation process repeatable and created a built-in way to roll back (by just running the installer from one version ago).
This of course assumes that your virtual directory doesn't hold any user-modified content; I wouldn't trust an MSI to properly merge user-created and new files.
We use the "XCopy" deploy model here, since the Ops folks have their own method of setting up security on a new web application on the server.
However, we did need to use an installer when we had to install a web application that used a newer version of Crystal Reports, since it had to do something special with a key and we didn't have a full-blown version of CR on the server itself. So keep that in mind when working with third-party apps; they may need some kind of merge module, which an MSI handles easily.
Yep... we have an app that needs a lot of prerequisites set up: web service, Windows service, user accounts, security, folder creation, GAC bits, etc. I rolled it all up into a nice MSI with custom actions that can install and uninstall cleanly. It saved about an hour's worth of work when deploying to a new box.
A lot of the other smaller apps are just deployed by doing Publish Website to a local folder then ftp'ing the contents to the target.
It greatly depends upon the scale of your project, your environment, and your internal user base. I rarely deploy with an MSI because we are too small an operation to have multiple environments (except for SharePoint; that's different altogether). We develop and use VS to deploy web apps to a development box; once they are approved, we use VS again to deploy to the live box.
The only proviso is that we have multiple copies of the web.config (suffixed with test, dev, and live), and we then delete the suffix off the relevant file depending upon where it's been deployed.
It's probably not the best methodology (I know it's not), but it works and it aids rapid deployment of small to medium sized solutions in a small-scale user environment.
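For what it's worth, the suffix trick is just a file copy at deploy time; in shell terms it's something like this (environment name is illustrative, and on Windows the same idea works with copy/xcopy):

    # pick the config that matches the target environment
    ENV=live   # or: test, dev
    cp "web.config.$ENV" web.config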
F5ToDebug...
You're saying it's OK to take shortcuts if you don't have time to do it properly?
"who's going to test the code on the test environment?" You said it yourself that you have config files for _test - why would that not be a suitable test?