
My Python & Django deployment workflow and tools


If you’re new to Python web development with Django, there are some things the tutorials don’t teach. Deploying to a server once you finish your local development can be a frustrating task. Here is a rundown of the tools and workflow I use for deploying a Django-based website.

A few days ago, when I launched the new version of Notasbit.com, I ended up in a discussion with a friend about deployment methods for Ruby on Rails and Django websites. My friend is used to Rails development and deployment, something I haven't been familiar with since Rails 1.8 (a looong time ago by now), and he insisted that the Ruby Gems approach is way easier while Python deployment is a hassle. Yes, for people not familiar with the right tools it can be: juggling different Python versions and library dependencies can make maintaining a Django website a nightmare.

Here is what I use for deploying Django websites on a live server:

The tools

Git

You cannot be serious about programming these days if you're not using a version control system. In Linus Torvalds's words: "if you're not using Git, you're an idiot". I keep all my projects under version control, even if I'm the only developer and even when the code is for personal use only. It's not only a good practice to keep; it also works as a backup. When my computers were stolen, I didn't lose any project data because it was all backed up in several Git repositories in different places. For deployment you need Git installed on your server, and the server you deploy to then works as yet another backup.

Virtualenv

All of that version and dependency hell can be isolated with Python's virtualenv tool. It creates an encapsulated environment with a given Python version and all the libraries you want to use in a project, without messing with the system-wide versions. That way you can have an old Django 1.4 project running on the same server as another project on a later Django 1.8 or 1.9 version, each with its own dependencies.

Fabric

The only reason I still haven't moved my projects to Python 3 is Fabric. This tool automates all the maintenance tasks a project needs, including deployment. It is similar to a Makefile, but you write your tasks in Python, so you don't need to learn a whole new syntax.

Now to the implementation details…

On your local development machine, the project structure looks like this. The idea is to keep the virtualenv outside the version-controlled directory; you can also use virtualenvwrapper and keep all your virtualenvs in a separate location, but for now I like having each one next to its project:

mainproject.com/
 |
 +-- env/ (virtualenv)
 |
 +-- django_project/
     |
     +-- manage.py
     +-- .git/
     +-- fabfile.py
     +-- deploy_tools/
     +-- ... all other Django project files & dirs
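
Setting this up from scratch only takes a few commands. This is just a sketch: the repository URL is a placeholder for your own, and it assumes you keep a requirements.txt at the root of the repository (the deploy task further below expects one):

mkdir mainproject.com && cd mainproject.com
virtualenv env
git clone git@yourgitserver.com:you/django_project.git
source env/bin/activate
pip install -r django_project/requirements.txt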

Deployment scripts

In the deploy_tools directory we’ll place the configuration files for Apache, Nginx, Gunicorn, uWSGI or whatever other server software the deployment needs.

Here’s an example of the Nginx configuration I use:

upstream mywebsite {  #the upstream component nginx needs to connect to
    server unix:///tmp/mywebsite.sock; # for a file socket
}

server {    
    listen      80;   # the port your site will be served on
    server_name www.mywebsite.com;   # the domain name it will serve for
    charset     utf-8;

    client_max_body_size 250M;   # max upload size, adjust to taste

    location /static {
        alias /var/www/mywebsite/static; # your Django project's static files - amend as required
    }

    location / {     # Finally, send all non-media requests to the Django server.
        uwsgi_pass  mywebsite;
        uwsgi_param QUERY_STRING $query_string;
        uwsgi_param REQUEST_METHOD $request_method;
        uwsgi_param CONTENT_TYPE $content_type;
        uwsgi_param CONTENT_LENGTH $content_length;

        uwsgi_param REQUEST_URI $request_uri;
        uwsgi_param PATH_INFO $document_uri;
        uwsgi_param DOCUMENT_ROOT $document_root;
        uwsgi_param SERVER_PROTOCOL $server_protocol;
        uwsgi_param HTTPS $https if_not_empty;

        uwsgi_param REMOTE_ADDR $remote_addr;
        uwsgi_param REMOTE_PORT $remote_port;
        uwsgi_param SERVER_PORT $server_port;
        uwsgi_param SERVER_NAME $server_name;
    }
}
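
The deploy_tools directory can also hold the matching uWSGI configuration. Here is a minimal sketch of what such an ini file might look like; the paths, module name and process count are assumptions you'll want to adapt to your own project:

[uwsgi]
# project checkout and virtualenv on the server (placeholder paths)
chdir = /var/www/mywebsite
home = /var/www/mywebsite/env
# the WSGI module of your Django project
module = mysite.wsgi:application
# must match the upstream socket declared in the Nginx config above
socket = /tmp/mywebsite.sock
chmod-socket = 664
master = true
processes = 4
# clean up the socket when the process exits
vacuum = true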

Settings handling

There are many ways to handle settings management in Django applications. I like to have a general settings file plus a local one for each environment. To achieve this, add a local_settings.py file and put it on the Git ignore list.

Then add the following at the end of your general settings.py file:

# Import local settings
try:
    from local_settings import *
except ImportError:
    pass

On your remote server, create a local_settings.py file with DEBUG = False and any other settings you need to override specifically for that machine. You can have different settings for local, staging, testing, production or any other environment you need.
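
As a rough sketch, a production local_settings.py could look like this; every value here is a placeholder:

# local_settings.py on the production server, kept out of Git
DEBUG = False
ALLOWED_HOSTS = ['www.mywebsite.com']

DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.mysql',
        'NAME': 'MY-DATABASE-NAME',
        'USER': 'MY-DB-USER',
        'PASSWORD': 'MY-DB-PASSWORD',
        'HOST': 'localhost',
        'PORT': '',
    }
}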

Fabric tasks

For example, you might need to download a copy of your production database to use for development tests. Instead of typing the same mysqldump command every time, you can automate it like this:

def backup_db():
    """
    Gets a database dump from the remote server
    """
    production1()  # a helper task (not shown) that points Fabric at the production host
    date = time.strftime('%Y-%m-%d-%H%M%S')
    dbname = 'MY-DATABASE-NAME'
    path = os.path.join(os.path.dirname(__file__), 'db_backups')
    fname = "{dbname}_backup_{date}.sql.gz".format(date=date,dbname=dbname)

    run("mysqldump -u {dbuser} -p'{password}' --add-drop-table -B {database} | gzip -9 > {filename}".format(
        database=dbname,
        dbuser='MY-DB-USER',
        password='MY-SERVER-PASSWORD',
        filename=os.path.join('/tmp', fname))
    )
    get(remote_path=os.path.join('/tmp', fname),
        local_path=os.path.join(path, fname))
    run("rm {filename}".format(filename=os.path.join('/tmp', fname)))

Likewise, you can create a deploy task that gets everything onto the server. Here’s a fabfile.py example:

from fabric.api import *
from fabric.colors import green, red
import os
import sys
import time
from fabric.contrib import django
import datetime

sys.path.append(os.path.join(os.path.dirname(__file__),
                             'mysite'))

django.settings_module('mysite.settings')
from django.conf import settings

# Hosts (placeholder deployment user and host, change to your own)
production = 'user@mywebsite.com'

# Branch to pull from
env.branch = 'master'

@hosts(production)
def deploy():
    """Uploads files and runs deployment actions to a given server"""
    # path to the directory on the server where your vhost is set up
    path = "/var/www/mysite"
    # name of the application process
    process = "uwsgi"

    print green("Beginning Deploy:")
    with cd(path):
        run("pwd")
        print green("Pulling master from Git server...")
        run("git pull origin %s" % env.branch)
        # use server's virtualenv for commands
        with prefix("source %s/env/bin/activate" % path):
            print green("Installing requirements...")
            run("pip install -r requirements.txt")
            print green("Collecting static files...")
            run("python mysite/manage.py collectstatic --noinput")
            print green("Migrating the database...")
            run("python mysite/manage.py migrate")
        print green("Restart the uwsgi process")
        sudo("service %s restart" % process)
    print green("DONE!")

With these tools, all you need to do when deploying new code to the server is to run one fabric command:

fab deploy

And you’re done.

I took most of these ideas from the book Test-Driven Development with Django. You can read it online for free to check out more details and the parts I didn’t use in this example.

This is not a perfect solution for every case, but I hope it gives you some ideas for a workflow that fits yours. Share your own deployment workflow in the comments, along with any suggestions to improve this one.


Using Django models in external Python scripts

Since Django 1.4 the syntax for importing your models in external scripts has changed. The setup_environ method was deprecated for a while and no longer exists in Django 1.6.

This assumes you’re using a virtualenv and that your script sits one directory above the main Django project structure:

my_directory/
  |
  |---> virtualenv/
  |---> my_project/   <--- Django project files
  |---> my_script.py

I use this for Django 1.6 integration in my_script.py:

import os
import sys

sys.path.append(os.path.join(os.path.dirname(__file__), 'my_project'))
os.environ.setdefault("DJANGO_SETTINGS_MODULE", "my_project.settings")
from django.conf import settings

from my_app.models import MyModel
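
If you later move to Django 1.7 or newer, the same script additionally needs a django.setup() call before any models are imported. A minimal sketch for those versions:

import os
import sys

import django

sys.path.append(os.path.join(os.path.dirname(__file__), 'my_project'))
os.environ.setdefault("DJANGO_SETTINGS_MODULE", "my_project.settings")
django.setup()  # populates the app registry (required since Django 1.7)

from my_app.models import MyModel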


PHP Debugging with Xdebug


Web development with PHP can get frustrating without adequate tools. The most common thing a new web developer does when attempting PHP debugging is to add var_dump() calls to print the value of a variable at a certain point in the program’s execution, usually combined with an exit() or die() call.

Doing this reveals the values of the variables you’re trying to check, but it requires adding and removing lines in your code, and more often than not those lines get forgotten and end up as bugs in the code.

A good way to develop your code is test-driven development, using unit tests to verify the output of your methods. But sometimes there are parts of the code that cannot be easily tested, and for those cases a proper step-by-step debugger is the better tool for the job. Xdebug is a PHP extension that does exactly that. With Xdebug you can set breakpoints in your code, display the contents of superglobals like $_POST and $_SESSION, get a trace of the execution around a breakpoint or an uncaught exception, and follow the values of any variables you want. You can also profile your application to detect performance bottlenecks.

There are several ways to install Xdebug on your development system. On a Debian-based system you install it with the following command:

sudo aptitude install php5-xdebug

For other systems and custom installations there is an installation wizard; following the steps on that page will give you the instructions for your particular setup.

Then edit your php.ini file to configure the Xdebug extension to your liking; the documentation covers all the features and options. Here’s a sample of the options:

xdebug.remote_enable=1
xdebug.remote_handler=dbgp
xdebug.remote_host=127.0.0.1
xdebug.remote_port=9000
xdebug.remote_mode=req
xdebug.idekey=default
xdebug.remote_log="/var/log/xdebug/xdebug.log"
xdebug.show_exception_trace=0
xdebug.show_local_vars=9
xdebug.var_display_max_depth=10
xdebug.show_mem_delta=0
xdebug.trace_format=0
xdebug.profiler_enable=1
xdebug.profiler_output_dir="/tmp/xdebug/"

The xdebug.show_local_vars option makes Xdebug display all the local variables in scope when an error occurs, so there’s no need for a separate var_dump() call for every variable you want to inspect. xdebug.var_display_max_depth controls how many nested levels of arrays and objects are displayed; by default, var_dump() output only goes 3 levels down. There are more interesting data display options in the documentation.

Make sure the remote log directory and profiler output directories exist in your system. Restart your web server and you’re good to go.

Your phpinfo() output should now show an Xdebug section listing the extension’s version and settings, confirming it is loaded.

From here, configure your editor or IDE of choice to talk to Xdebug; most popular development environments have DBGp client plugins or built-in support, so check the documentation for the one you use.

Hope this helps and happy coding!


The importance of Web frameworks in startup development teams


In a start-up environment, things happen so fast that there’s very little time to polish your tools. Code gets written without documentation, development team members come and go with different programming styles and tools, and the chaos just builds up as the business grows.

At some point you can’t finish or maintain something because a team member left the company, taking the knowledge of the code with them without documenting it. You either have to spend time figuring out what was done and how it works, or rewrite all that unmanageable code. I’ve seen this scenario happen too many times.

As the saying goes, “stand on the shoulders of giants”: using an open source development framework is essential, for the following reasons:

– Access to lots of documentation, tutorials, forums, etc.
– Code that is proven to work by a large user community
– Everyone knows where to look when a certain change is needed
– Development teams scale faster, because new hires or people switching project teams can catch up and integrate into the team quickly
– You benefit from patches and upgrades coming from outside your development team
– Better team productivity, since nobody has to write the basic common components from scratch every time

Even when using a closed source framework, you still get most of these benefits. Some companies prefer to build their own tools and frameworks; this approach works as long as they stay consistent and everybody agrees to use them in a standard way. But unless you’re addressing a very specific problem or a special situation, I recommend not reinventing the wheel and using one of the many existing open source development frameworks out there.

Here’s an old video of a chat with Steve Jobs, where he talks about improving productivity and saving time and resources in a development team by eliminating lines of code to write and taking advantage of what is already out there.

Photo: SPNP #52 is Creative Commons by J_P_D on Flickr

Easy PHP code test development with SimpleTest unit testing framework


Testing is a task every developer eventually has to do on any programming project. You can do it manually after writing all your code to see if it works as intended, or better yet, before writing your code, using test-driven development techniques that will save you time and frustration down the road. One of those techniques is unit testing.

Ideally, unit testing takes place before writing any code. It’s a way to plan out how your functions, classes and objects will behave. The basic idea is that you define the expected outcome of each part of your program, then write those parts to produce that outcome. Think of each piece as a black box that you give some input and expect some output from. This way you know how objects will be organized, what methods to write in each class, what their input needs to be and how they are going to output the results. Having all your code covered by tests helps you add new features, change or refactor parts of your code faster, without the fear of breaking something else in your software. If you change something and all your tests still pass, you’re good to go; if not, you can easily track down where the error is and what was affected.

PHP is famous for spaghetti code, and it is flexible enough that you can mess things up pretty fast, especially when you’re not an experienced programmer, or when you have a team of developers with different programming styles and levels of experience working on several components of the same project.

Although there are several unit testing frameworks for PHP, I’ll talk about SimpleTest since it’s very light, easy to install and easy to learn. Also, if you’re a Drupal developer, it’s the testing framework of choice.

Setup

To install it, go to the SimpleTest download page and get the latest version. Extract the files somewhere on your PHP include path, or point a require_once() call at the extracted autorun.php file, as shown in the example below.

Usage

Here’s an example test case for a class (testfile.php); the file needs the SimpleTest autorun include at the top, and the class and method names here are only illustrative:

<?php
require_once('simpletest/autorun.php');
require_once('myclass.php');

class TestMyClass extends UnitTestCase {

    function testGetList() {
        $obj = new MyClass();
        $result = $obj->getList(5);

        /* test result is an array */
        $this->assertTrue(is_array($result));

        /* test correct number of output elements */
        $this->assertEqual(count($result), 5);

        /* make a different call with different parameter */
        $result = $obj->getList();

        /* confirm that output is always an array, even with 0 elements */
        $this->assertTrue(is_array($result));
    }
}

Now that you have a basic skeleton of your methods and their expected behavior, you can go ahead and write the real code (myclass.php).

Run it

Once we’ve written the class, we can test its methods by running the test script, either from the command line or by loading it through a web server in a browser window.

Personally, I prefer the command line; it’s more convenient and faster, and you don’t even need to set up a web server. Running it looks like this:

$ php testfile.php
testfile.php
OK
Test cases run: 1/1, Passes: 3, Failures: 0, Exceptions: 0

Do more

This is a very small and simple test, but you can also test your project’s interface. SimpleTest has a browser class that can simulate browser visits, click the links in your interface and check the HTML structure of the web server’s responses.

Check out the full documentation, which is laid out as one big, easy-to-follow tutorial, and start testing all your code!


Command line tools for web developers


Many people are afraid of the terminal. Yes, it might look scary to some and retro to others, but for the practical, busy programmer the terminal is the best tool you can have.

Lately my day job has required me to work with lots of static web pages, as I’ve mentioned in several of my previous posts. So for my daily tasks I’ve been using a lot of command line tools on the terminal that make my work a lot faster.

Here are some of the tools that I’ve been using and how I’ve used them:

  • find

    Helps me list and filter certain types of files for processing. For example, find . -name "*.html" gives me a list of all files with the .html extension under the current directory and its subdirectories (the quotes keep the shell from expanding the pattern itself).

  • sed

    GNU sed is very handy for doing all kinds of text manipulation without having to write a whole script for it. A common task is searching and replacing a text or regular expression pattern in a file. Example: sed -e "s/My Search/My replace/g" myfile.html (add -i to edit the file in place instead of printing to standard output).

  • xargs

    This is a ‘piping’ command: it takes the output of one tool and passes it along as arguments to the next tool in the pipeline. Example: find . -name "*.temp" -print0 | xargs -0 sed -i -e "s/Hello/Goodbye/g" This finds all .temp files and, in each of them, replaces the word “Hello” with the word “Goodbye” (note the -print0/-0 pair, which keeps file names with spaces intact, and sed -i, which actually edits the files).

  • tidy

    When you have a bunch of legacy HTML code or “messy” (X)HTML documents to parse, it’s a good idea to clean up the code before working with it. Tidy is a command line tool that helps with cleaning, reformatting and indenting any messy (X)HTML document. It even does a good job cleaning up MS Word-generated HTML files!

  • GNU make

    This is an “old school” tool for those of us who grew up doing web development rather than C/C++ development. Make automates tasks, runs them in a given order and checks their dependencies. In my web development process I use make for repetitive chores: deploying files to the testing server, tagging a release in my version control system, publishing the site on the production server, cleaning up temporary files, and so on. I write a Makefile with these tasks, and every time I have to upload my code to the testing server I just run something like make upload. For example, cleaning up all the temporary files in my project would normally mean typing find . -name "*.temp" | xargs rm -rf. Instead, I can put it in a Makefile target (a fuller Makefile sketch appears after this list):

    clean:
    	find . -name "*.temp" | xargs rm -rf

    then every time I need to clean up my codebase, I simply run make clean. Hope you get the idea 😉

  • git

    My preferred version control system for the past 4 years has been Git. It’s a distributed version control system that is simple and very practical to use, because it’s extremely fast and doesn’t get in your way while programming. It has lots of features and tools for everyday tasks, and it’s a very good practice to keep *all* your projects under version control, even if you’re their only developer. Since it’s distributed, you don’t need to set up a server for it, and you can replicate your repository on any media, with as many copies as you like. Version controlling your projects saves you from troubles like accidentally deleted files, and with local branches you can easily experiment with new features without affecting the main “stable” version of your code. There’s a lot more to say about version control and Git, and apparently I haven’t written about it before (strange, since it’s a big topic for me), so I’ll leave the details for another post. Just take my advice: use Git and put all your projects under version control. You’ll thank me later.

  • rsync

    Rsync is a great tool for synchronizing files and directories from one location to another, whether on the same machine or between different (remote) machines. The typical use of rsync is automated backups, and you can use it that way, but you can also use it to mirror your website to another folder or machine. I use it to deploy my files to the testing and/or production servers; that way I don’t have to worry about forgetting to upload a file, since the whole project can be synchronized with a single command across multiple machines. You can also configure rsync to connect over ssh (more on this below) so your files travel as a secure, encrypted transfer.

  • ssh & scp

    You definitely don’t want your files traveling across the network in plain sight. I know, some might say “who cares?”, but really, it’s better to be paranoid and careful about your data; you never know. The best way to move files from one machine to another is through a secure, encrypted channel, and that is what SSH gives you. With ssh you can connect securely to your server’s command line and run commands there, and with scp you can securely copy files from one machine to another.
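
To tie a few of these tools together, here is a minimal Makefile sketch for the kind of workflow described above. The host names, paths and file patterns are placeholders to adapt to your project, and remember that recipe lines must start with a tab:

# Example Makefile for a static site workflow (placeholder paths and hosts)
SITE_DIR  = public_html/
TEST_HOST = user@testing.example.com:/var/www/mysite/
PROD_HOST = user@www.example.com:/var/www/mysite/

clean:
	find . -name "*.temp" | xargs rm -rf

tidy:
	find $(SITE_DIR) -name "*.html" -exec tidy -q -m -i {} \;

upload: clean
	rsync -avz --delete -e ssh $(SITE_DIR) $(TEST_HOST)

publish: clean
	rsync -avz --delete -e ssh $(SITE_DIR) $(PROD_HOST)

.PHONY: clean tidy upload publish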

There might be several other tools I use daily, but these are the ones most present in my mind, since I’ve been using them constantly.

What command line tools do you use for your web development tasks? Do you have other ideas on how the tools listed here can be used? Send me your comments; this might get interesting and useful for all of us.