Hi,
What are some good ways of streamlining updates to a website? For example, say I have a software project in version control and I want to release a new version. My current process is:
1. Build an archive containing the new software. (software-1.1.zip)
2. SCP this file to my server.
3. SSH into my server.
4. Use sudo to copy the file into the correct location (since the website runs as the apache user and my user doesn't have permission to write to that location).
5. Use sudo to update a symlink which always points to the latest version of the software (i.e. remove 'software.zip', which points to software-1.0.zip, and then create a link to software-1.1.zip).
6. Use sudo to change the ownership of the new files to the correct owner (apache).
7. Update the web page with the information about the newest version.
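To make the goal concrete, steps 1-6 above could be wrapped in a single script along these lines. This is only a sketch: the host name, remote directory, and apache owner are assumptions, and DRY_RUN lets you print the commands instead of running them.

```shell
# Minimal sketch of a one-shot deploy script for steps 1-6 above.
# host/remote_dir/apache owner are assumptions -- adjust for your server.
deploy() {
    version="$1"                        # e.g. 1.1
    host="dan@example.com"              # assumed server address
    remote_dir="/var/www/downloads"     # assumed install location
    archive="software-$version.zip"

    # With DRY_RUN=1 each command is printed instead of executed,
    # so the script can be sanity-checked before touching the server.
    if [ -n "$DRY_RUN" ]; then run() { echo "+ $*"; }; else run() { "$@"; }; fi

    run scp "$archive" "$host:/tmp/$archive"
    # One ssh session performs the copy, the symlink flip, and the chown.
    run ssh "$host" "sudo cp /tmp/$archive $remote_dir/ \
        && sudo ln -sfn $remote_dir/$archive $remote_dir/software.zip \
        && sudo chown apache:apache $remote_dir/$archive"
}
```

Usage would be `deploy 1.1` (or `DRY_RUN=1 deploy 1.1` to preview). It still depends on sudo working non-interactively on the server, which is part of what the options below are meant to solve.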
This is fairly tedious and I'd like to automate at least some of it. The most annoying part is the permissions on the web directory. What are the recommended options? Here are some of my ideas:
1. Permit SSH key access as the apache user, so I can copy the file straight into place and update the link directly. Alternatively, allow my user to run commands as apache via sudo.
2. Have some sort of git post-receive hook:
http://stackoverflow.com/questions/3838 ... te-staging
3. Change the group ownership on /var/www/......... to a group which has both my user and apache as members.
4. Have a cronjob running as apache which looks for files in a directory owned by my user.
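Idea 2 can be demonstrated end to end with throwaway local repos. This is only a sketch of the standard post-receive recipe: here the bare repo and web root are temp dirs, whereas on a real server they would be fixed paths (and the hook would check out into the apache-served directory).

```shell
# Demonstrates a git post-receive deploy hook (idea 2) with local repos.
# All paths are temporary stand-ins for the real server layout.
set -e
base=$(mktemp -d)
target="$base/webroot"              # stands in for the web directory
mkdir -p "$target"

# 1. A bare repo on the "server", with a hook that checks the pushed
#    branch out into the web root on every push.
git init -q --bare -b master "$base/site.git"
cat > "$base/site.git/hooks/post-receive" <<EOF
#!/bin/sh
GIT_WORK_TREE="$target" git checkout -f master
EOF
chmod +x "$base/site.git/hooks/post-receive"

# 2. A working clone standing in for the workstation checkout.
git init -q -b master "$base/work"
cd "$base/work"
git config user.email dan@example.com
git config user.name Dan
echo "version 1.1" > index.html
git add index.html
git commit -qm "release 1.1"
git push -q "$base/site.git" master
```

After the push, $target/index.html exists: the deploy is just `git push`. The remaining wrinkle is still permissions, since the hook runs as whichever user pushed.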
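For idea 3, the one-time setup might look like the following. The group name and path are assumptions; the setgid bit on the directory makes new files inherit the group automatically, which would remove the chown in step 6.

```shell
# One-time setup sketch for idea 3 (group name 'web' and path assumed).
sudo groupadd web
sudo usermod -aG web dan                 # your own user
sudo usermod -aG web apache              # the web server's user
sudo chgrp -R web /var/www/downloads
sudo chmod -R g+rwX /var/www/downloads   # group rw; X marks dirs executable
sudo chmod g+s /var/www/downloads        # setgid: new files inherit the group
```

Note that group membership changes only take effect on your next login, and files created by scp are still owned by your user (readable via the group) rather than by apache.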
Ideally, I'd like to get to the point where I can run one deploy script on my workstation and it deploys the new software release.
Any suggestions/recommendations?
Thanks,
Dan