I just started a new position, replacing a developer who left abruptly, on a project that is built on the Kentico CMS. I am completely unfamiliar with ASP.NET and Kentico, so the answer here needs to be tailored for a total beginner. I am familiar with other languages (PHP, Ruby, SQL, etc.) but have no idea where to begin with this.
So, what I want to do is copy everything from our production site (db and all) to my local machine so I can develop on it more easily. I have already exported the db into an SQL file, copied all the files in our Kentico Instance folder to GitHub, and cloned the repo on my local machine. I assume that since Kentico is already "set up," going through the installation process in their documentation is not the way to go about this.
Any help would be incredibly appreciated!
David, basically there are a few "pieces" to running Kentico locally. Since, as you mentioned, Kentico is already set up, you should have an easier go of it.
A database with the necessary Kentico tables (it sounds like you already have this).
The codebase (all of the code files that you copied to GitHub).
A valid license for any domain you want to run Kentico on. Was the site already public facing? Do you know what licenses you have, or can you log into the CMS Desk on the site that you copied everything from?
Set up IIS for your local website. If you are unsure on this one I can explain further, but basically you need to add a new site, point it to the root code folder for your site, and set the domain to be a domain you have a Kentico license key for. You'll also need to change the app pool settings to "integrated" mode (most likely) and also set the appropriate version of .NET (if it's a recent version of Kentico you'll want .NET 4.0).
Next you'll need to edit your hosts file to add the domain and point it to your localhost IP address. So add a line like "127.0.0.1 dev.yourdomain.com" or the equivalent.
Edit the web.config file so that your code can connect to your database. You will need to edit the connection string accordingly to point to the database on your machine (see the sketch after this list).
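For the web.config step, the relevant piece is the connection string entry. In recent Kentico versions it is usually named CMSConnectionString, but check your own file for the exact name; the server, database, and credential values below are placeholders for whatever your local SQL Server instance uses.

    <connectionStrings>
      <add name="CMSConnectionString"
           connectionString="Data Source=localhost;Initial Catalog=YourKenticoDb;Integrated Security=True;Connection Timeout=60" />
    </connectionStrings>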
Once you have done these steps, your site should start to run just as it had before. I didn't give great detail on all of these pieces so let me know what problems you encounter so I can further clarify. More information about the current situation would also help.
One other note I would make: if you need the client to be able to review your work, it will most likely be more efficient and easier for you to leave the original database on the web server, and (if possible) connect to it remotely from your local machine. Since almost any change you make will result in a database change in Kentico, I find it much easier to be working on 1 database for development with distributed codebases. Otherwise you will probably need to overwrite the other database with your changes constantly and this can be annoying. If you leave the database on the server and just connect remotely, you can just ftp (or use git) to push files to the server that you have edited locally.
Related
I have a simple WordPress website that was created with the Twenty Twelve theme. I need to transfer it from the server to my PC so I can edit it, and then transfer it back to the server. Now that I have copied the WordPress directory from the server to my PC, it asks me to reinstall it. How can I do this? Am I missing something? Is it possible to work on the website locally and, once it is done, transfer it to the server?
Reasons for doing this are:
- I don't want to lose data by doing things wrong.
- I want a copy of the website on my local machine.
- It's easier to work on the local machine offline rather than being online and accessing the server.
From my point of view, I would suggest you back up the full website and database from the server (every hosting control panel has an option to do this).
Connect via FTP and edit your files directly; there are lots of changes to make whenever you want to move from local to server or server to local.
Cheers!
Editing the files on a live site directly is a terrible, terrible idea. It invites any number of points of failure. If you're actively developing a site things will break at some point - it's part of the developing process - and you definitely want to break things locally, not on a live site.
There are actually several steps involved, all of which are too long to go into in great detail, but here's an overview of what is required along with a few links to get you started. The first time you do it, it seems laborious, but once you figure it all out it actually only takes 10 minutes.
Firstly, you will need a local MySQL and PHP environment to install it into. As you're on a PC, look at WAMP; here's a fairly good tutorial on installing it.
If you have a lot of content on the live site you might want to import it into your local environment; you'll need to export the database (probably using phpMyAdmin) and then import it into your local database. You will need to update a couple of database options so that they point towards your localhost. It's basically the reverse of the process detailed here: you'll be changing your-site.com to localhost:8888. If your site is relatively simple you could skip this stage.*
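As a sketch of that options change, assuming the default wp_ table prefix and a local URL of localhost:8888, the two options to update in the imported database are siteurl and home:

    UPDATE wp_options
    SET option_value = 'http://localhost:8888'
    WHERE option_name IN ('siteurl', 'home');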
Now you should be able to update your local copy of wp-config.php with the database connection details for your local database (usually it's just localhost for the host name and root for the user and password). With that in place you should now be able to install WP.
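For reference, the relevant lines in wp-config.php look something like this; the database name is whatever you called your local import, and your WAMP defaults may differ (root often has an empty password):

    // wp-config.php -- local database connection details
    define('DB_NAME', 'mylocalwp');   // the database you created/imported locally
    define('DB_USER', 'root');
    define('DB_PASSWORD', 'root');    // often empty ('') on a default WAMP install
    define('DB_HOST', 'localhost');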
Now that it's installed you can edit away to your heart's content on your local copy, safe in the knowledge that anything you do on your local copy doesn't affect your live copy. When you're ready to push your changes live you can use FTP to copy your local files to your live environment.
* For a lot of projects I don't actually go to the trouble of syncing databases - if there's a live site that the client can change the content on, it often becomes a futile exercise attempting to sync a database with a moving target. In those instances I'll just use a comprehensive set of test content that contains every conceivable type of content that could possibly be inserted into the live site.
You're going to love working on your project locally. Here's an article I found on wpmudev http://premium.wpmudev.org/blog/how-to-install-wordpress-locally-for-pcwindows-with-xampp/ which gives you a step-by-step guide for PC / Windows.
I am currently using Komodo Edit for coding, and have a setup where I am using MAMP, a local install of Drupal, and SASS to build my site offline.
Once it's ready to test online, I upload it onto the remote site. However, I am then working sometimes on the remote site and sometimes on the local one, and finding some problems.
I'm not using SASS on the remote site, so I am working in the CSS file. I don't have all the Drupal panels in code yet, so I'm having to rebuild them on the remote site, and I'm making tweaks and changes on the fly.
I end up with two slightly different versions of the site and need to keep track of the changes I want to keep from both. What can I do to clean up my workflow?
It would be better if I could work entirely locally and then sync that to the remote environment.
With something like NetBeans I think I could have a local copy of the site running and then right-click and upload each file onto the remote server, so there are two copies of the file.
I could do with some advice as to what the cleanest setup is.
I have an actual dev server with its own IP and a live server with its own IP, but they both connect to the same MySQL server. There are two databases set up, db1 and db2.
I use a PHP script with basic SQL instructions to:
Check which db is in use.
Back up that db (say, db1).
Sync the databases
Import that db into db2.
Once that is done, I use this:
cp webdb2.settings.php settings.php
So I always have two databases available and can roll right back (in this case with cp webdb1.settings.php settings.php) if something went wrong.
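A minimal sketch of that whole sequence, assuming db1 is currently the active database and MySQL credentials are handled elsewhere (e.g. in a .my.cnf file):

    mysqldump db1 > db1-backup.sql        # back up the active database
    mysqldump db1 | mysql db2             # copy db1 into db2
    cp webdb2.settings.php settings.php   # point Drupal at db2
    # to roll back: cp webdb1.settings.php settings.php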
This seems to be a pretty good system. I only ever work on the dev, and then push it to the live with the process above.
The best thing that you can do is work offline and then sync the content (themes/modules) when you have to.
If you work on both you could lose data or lose track of code changes.
I'm trying desperately to move from VSS to a real source control system. Options include TFS and SVN.
My designers need to keep their ability to modify source files and instantly preview their changes in a browser without having to commit their changes. Using FPSE with VSS, this works flawlessly, since saving a file causes the copy in the working folder on the dev server to be updated, so they can just save and refresh their browser which is pointed at the dev server.
The site in question consists of 350k+ lines of classic ASP code and some new ASP.NET MVC. They only need to be able to modify views within the MVC code, not C#.
Though Expression includes a version of Cassini for local debugging, Cassini does not support classic ASP.
Surely someone has solved this problem before. It can't be necessary to install IIS on each designer's machine (this is absolutely untenable). I need a way to have a common working folder on a dev webserver updated whenever someone saves a file locally, just like using FPSE.
I'd rather not write an FPSE proxy that knows how to talk to TFS/SVN. Any suggestions?
(I know I've asked this question in the past, but I haven't yet found a solution.)
Why the need to copy the source files when they are saved? Why not simply save the files to a network share and work on them directly? If the dev server is constantly being overwritten after every save anyway, surely the effect is the same?
This probably won't be as instantaneous as you'd like, but with TFS you could set up a Continuous Integration (CI) build that builds and deploys the project to a test server on check-in. If you do this, you'll want them checking in to a QA-type branch; then, once they are happy with how things look, they can merge to the mainline branch for the real build and integration.
I have been developing a Drupal 6 site on my PC using XAMPP. I'm done now, and everything looks peachy.
Problem is, I need to put all my content (including custom modules and themes) up onto a staging server which only has a fresh Drupal 6 install on it. I can't imagine having to set up all my custom content types and whatnot all over again on the staging server.
So I ask, how does one go about doing what I need to do? Which is essentially duplicating my Drupal install from my PC, to the staging server.
The staging server is running Linux, and I develop on a Windows PC, if that helps.
Thanks in advance.
Copy up all the files from development to live, and mysqldump your database and run that on the live server. Then all you have to do is change the settings.php file to point at the right database, if for some reason 'localhost' is not also your MySQL host there.
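Roughly, with made-up database names, users, and server, that looks like:

    mysqldump -u devuser -p drupal_dev > site.sql      # on your PC (XAMPP)
    scp site.sql you@staging.example.com:/tmp/         # copy the dump (and rsync/FTP the code files)
    mysql -u liveuser -p drupal_live < /tmp/site.sql   # on the live server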
The quickest solution is probably the backup_migrate module. It is only a tool to copy your database; you could also use phpMyAdmin or similar instead if you wanted. The backup_migrate module does have some good default settings as to which tables to skip (like cache tables). All the settings etc. that are not defined in code are stored in your db, so you only need to copy the db to be set. You can choose to exclude some tables, like the node or user table, if you don't want to bring over your test data.
If you don't use subversion, then you gotta manually copy the files (rsync, scp, whatever) and the db (mysqldump).
What we usually do is have a hierarchy of independent Subversion repos as follows:
core
sites/all/modules/contributed
sites/all/modules/custom
sites/all/themes/ (we develop our own and don't use contributed themes)
sites/all/libraries
Then we use the svn:externals property so that if you check out "core" you get every associated repo.
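As a hypothetical example (the repository URLs are made up), running svn propedit svn:externals . in the core working copy and entering lines like these wires the other repos in; after committing the property, a checkout or update of core pulls them all down automatically:

    sites/all/modules/contributed   http://svnserver/repos/contributed
    sites/all/modules/custom        http://svnserver/repos/custom
    sites/all/themes                http://svnserver/repos/themes
    sites/all/libraries             http://svnserver/repos/libraries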
We have about 2 main developers, with 4 other guys who may also contribute code to the site. Each has their own local dev environment, and we all have a common sandbox where we make sure the stuff we wrote doesn't break someone else's module (it has happened before!).
We use SVN commit hooks to update the beta/staging/sandbox site upon commit.
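A minimal post-commit hook on the SVN server can do that; the path to the staging checkout and the log file are assumptions here:

    #!/bin/sh
    # hooks/post-commit -- Subversion passes the repository path and revision as $1 and $2
    REPOS="$1"
    REV="$2"
    # update the staging site's working copy with whatever was just committed
    svn update /var/www/staging --non-interactive >> /var/log/svn-deploy.log 2>&1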
With all that set up, [re]deploying a site is a simple matter of going to the proper folder, issuing "svn co http://repolocation/reponame .", and then updating the DB.
Two last things to consider:
We are moving from SVN to Git.
The Features module will allow you to save changes you make to your own modules (views, content types, etc.) and package all that into a deployable module, so you don't have to duplicate your efforts. We are also looking into using this ourselves.
I hope this helps you.
I second using backup_migrate. It's great.
When I'm installing a fresh site from development to production, I:
backup the site using backup_migrate module
copy all the files up to the server
edit the sites/default/settings.php to have the right database path and account info (see the sketch after this list)
do an import of the last backup_migrate dump (usually using mysql < backupfilename.sql, unless I already have Drupal set up and backup_migrate installed, in which case I use the GUI)
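For a Drupal 6 site, that settings.php edit is a single $db_url line (the credentials and database name here are placeholders):

    // sites/default/settings.php (Drupal 6)
    $db_url = 'mysqli://dbuser:dbpass@localhost/drupal_live';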
But take a look here for the official version:
http://drupal.org/node/776864
Now, you didn't ask, but when the site is live and users are contributing content, moving future development versions of your site from development/staging to production without blowing away live content is a whole different problem, and one that Drupal doesn't have a good answer for...
Andy-
As part of my overall development practices review I'm looking at how best to streamline and automate our ASP.net web development practices.
At the moment, our process goes something like this:
Designer builds frontend as static HTML/CSS on a network share. This gets tweaked until signed off. (e.g. http://myserver/acmesite_design)
Once signed off, developer takes over and copies over frontend HTML/CSS to a new directory on the same server (e.g. http://myserver/acmesite_development)
Multiple developers work on local copy until project is complete.
Developer publishes code to an external publicly accessible server for a client to review/signoff.
Edits made locally based on feedback.
Republish to external server.
Signoff
Developer publishes to live public server
What goes wrong? Lots of things!
Version Control — this is obviously a must and is being introduced
Configuration errors — many, many times there are environment-specific paths and variables (such as DB names, image upload directories, web server paths, etc.) which incorrectly get copied from local to staging to live, with very embarrassing results.
I'm pretty confident I've got no. 1 under control. What about configuration management? Does anyone have any advice as to how best to manage an application's structure within ASP.NET apps to minimize these kinds of problems?
I found that using SVN, NAnt and NUnit with CruiseControl.NET solves a lot of the issues you describe. I think it works well for small groups and it's all free. You just need to learn how to use them.
CruiseControl.net helps you put together builds and continuous integration.
Use NAnt or MSBuild to do different environment builds (DEV, TEST, PROD, etc.); a rough NAnt sketch follows below.
http://confluence.public.thoughtworks.org/display/CCNET/Welcome+to+CruiseControl.NET
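For the NAnt route, a minimal and simplified sketch of a per-environment build might look like this; the file names and property values are assumptions, not a drop-in configuration:

    <?xml version="1.0"?>
    <project name="AcmeSite" default="build">
      <!-- override on the command line: nant -D:environment=prod -->
      <property name="environment" value="dev" overwrite="false" />
      <target name="build">
        <!-- swap in the environment-specific config before deploying -->
        <copy file="web.${environment}.config" tofile="web.config" overwrite="true" />
      </target>
    </project>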
You got the most important part right. Use version control. Subversion is a good choice.
I usually store configuration along with the site; i.e. when coding a PHP-based site I have a file named config.php-dist. If you want the site to work at all you'll have to copy it and edit in all the required parameters (this avoids storing passwords in version control). The -dist file should have reasonable defaults.
Upload directories should be relative if possible; actually all directories should be relative. I'm not experienced in ASP.net, but if it's anything like PHP the current directory is always the directory of the file being requested. If you channel all requests through a single file (i.e. index.asp), then this can even be found programmatically. Or you could find it programmatically by using the equivalent of dirname(__FILE__) in your configuration file.
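Putting those two points together, a sketch of what such a -dist file might contain (all names and defaults here are made up):

    <?php
    // config.php-dist -- committed to version control with safe defaults;
    // each environment copies this to config.php and fills in real values.
    define('DB_HOST', 'localhost');
    define('DB_NAME', 'acmesite');
    define('DB_USER', 'acme');
    define('DB_PASS', 'changeme');                          // never commit the real password
    define('UPLOAD_DIR', dirname(__FILE__) . '/uploads');   // keep paths relative to this file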
I also recommend installing IIS (or whatever web server you are using) on all development workstations (including the designers'). It makes life easier as no one can step on each other's toes. What one has to do is simply add test hosts to the hosts file (\windows\system32\drivers\etc\hosts, iirc) in addition to adding a site to the local IIS. This plays well with version control (checkout, add site to IIS and hosts file, edit edit edit, commit).
One thing that really helps is making sure you keep your paths relative where you can and centralise them where you can't. When I've been working with ASP.NET I have tended to use web.config to store any configuration and path-related data that can't be found programmatically. It is quite possible to find information like your current application path programmatically through the Request object - it's worth looking in some detail at what the environment makes available to you.
One way to make sure you don't end up with something that is dependent on the path name is having a continuous integration server execute your test suite against your application. Each time this happens you create a random file path. As soon as someone introduces a dependency on the file path, the test run will fail.