What can Ansible do that Fabric can't?

Server management with Python and Fabric

The two tools Chef and Puppet [1] are ideally suited to administering large server landscapes, but they demand considerable configuration effort. If you regularly need to execute certain commands on multiple servers, it is worth taking a look at Fabric [2], an impressive Python module that manages such tasks like flexible recipes.

Turning daily chores into Fabric tasks is a great help for anyone who works with different systems. These tasks are run with the "fab" command. Originally, Fabric offered only a few simple features, but over the years it has grown so much that in some cases it can replace Chef and Puppet.

Relies on SSH

If an SSH configuration for logging in to another computer already exists, Fabric can make use of it. Of course, Fabric also works locally. Fabric can include or exclude individual hosts from specific tasks, either directly by name or through assigned roles. It can execute tasks in parallel or one after the other. For better control, Fabric shows what it is doing on request - this helps, for example, with troubleshooting. Fabric also offers simple capabilities for managing files in the "fabric.contrib.files" module. Fabric is easy to install using Pip:
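As a sketch of these capabilities, a hypothetical Fabric 1.x task file might combine roles, parallel execution, and the file helpers like this - the host names, the path, and the config line are made up for illustration:

```python
# Hypothetical fabfile sketch using the Fabric 1.x API;
# hosts, the path, and the appended line are examples.
from fabric.api import env, parallel, run, task
from fabric.contrib.files import append, exists

# Group hosts under a role so tasks can target "web" as a whole.
env.roledefs = {
    'web': ['web1.example.com', 'web2.example.com'],
}

@task
@parallel  # run on all selected hosts at the same time
def ensure_config():
    # Create the file if missing, then make sure the line is present.
    if not exists('/etc/myapp.conf'):
        run('touch /etc/myapp.conf')
    append('/etc/myapp.conf', 'debug = off')
```

Called as "fab -R web ensure_config", the task would then run on both role members in parallel.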

pip install fabric

If you don't have Pip installed, you can try Easy-Install instead:

easy_install fabric

Alternatively, the installation from the source code usually works without any major problems.

If you call "fab", the tool looks for a file named "fabfile.py" in the current directory or moves up the directory hierarchy until it finds one. This makes it easy to manage different tasks in different directories. A task file can also be specified directly with the "-f" switch.
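For example, a task file stored under a different name could be invoked like this (the file and task names here are made up):

```shell
# Use a task file other than the default fabfile.py
fab -f mytasks.py -H localhost uptime
```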

A first simple fabric task looks something like this:

from fabric.api import run

def uptime():
    run("uptime")

It executes the uptime command on every computer that is listed on the command line when it is called, for example:

fab -H localhost uptime

The "fab -l" command shows a list of all available Fabric tasks, but these are the so-called "old style" tasks, which include every defined function. It is better to use new-style tasks, which display only the tasks meant to be called, not functions that serve purely internal purposes. If you redefine the above task as follows:

from fabric.api import run, task

def uptime():
    run("uptime")

@task
def get_uptime():
    uptime()

then "fab -l" shows only "get_uptime" and ignores the "uptime" function. Listing 1 shows the result when the task is called with two hosts.

[web1.example.com] Executing task 'get_uptime'
[web1.example.com] run: uptime
11:22 up 17 days, 12:13, 5 users, load averages: 1.03 1.88 1.32
[web2.example.com] Executing task 'get_uptime'
[web2.example.com] run: uptime
11:22 up 12 days, 22:11, 3 users, load averages: 1.43 1.38 1.34

Hosts in the file

Specifying the hosts on the command line over and over again becomes tedious in the long run. As the next example shows, they can also be defined directly in the task file. The example uses Fabric to deploy a Git-based website. Because the hosts are defined in the file, it is sufficient to call "fab" with just the task name on the command line.
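A minimal sketch of such a task file, assuming the Fabric 1.x API (the host names are placeholders):

```python
from fabric.api import env, run, task

# Hosts defined in the file itself, so "fab get_uptime"
# needs no -H switch on the command line.
env.hosts = ['web1.example.com', 'web2.example.com']

@task
def get_uptime():
    run('uptime')
```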

First, the script uses Python's datetime module to compose a Git tag string that reflects the deployment time.
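The exact tag format is not shown in the article; a plausible sketch using the datetime module might look like this (the "deploy-" prefix and the timestamp format are assumptions):

```python
from datetime import datetime

def deploy_tag(now=None):
    """Build a Git tag string that encodes the deployment time."""
    now = now or datetime.now()
    # e.g. "deploy-2024-05-01-1122"
    return now.strftime('deploy-%Y-%m-%d-%H%M')
```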

This task should only be executed once; the "@runs_once" decorator takes care of that. The deployment task first runs the tagging task and then changes into the project directory - a with block using Fabric's "cd()" context manager - to start the Git command there. At the end of the with block, the script automatically switches back to the previous directory and restarts the Apache web server. Figure 1 shows the process.
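Putting the steps together, a hedged reconstruction of such a deployment task might look like this - the task names, repository path, and Apache restart command are assumptions, not the article's original code:

```python
# Sketch of the deployment fabfile described above (Fabric 1.x API).
# Host, path, and the restart command are placeholders.
from datetime import datetime

from fabric.api import cd, env, local, run, runs_once, sudo, task

env.hosts = ['web1.example.com']

@task
@runs_once
def tag():
    # One Git tag per deployment run, even when deploying to several hosts.
    local('git tag deploy-%s' % datetime.now().strftime('%Y-%m-%d-%H%M'))

@task
def deploy():
    tag()
    with cd('/var/www/mysite'):       # assumed repository location
        run('git pull')               # assumed Git command
    # Directory reverts automatically after the with block.
    sudo('service apache2 restart')   # assumed restart command
```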

Figure 1: Deploying a web application using Fabric.