
The PAF framework


What is PAF?

PAF stands for the “Python automation framework.” It is a rather tiny tool for creating and running automation scenarios.


External dependencies

The PAF framework relies on Python 3.

On top of that, it requires the following set of external Python modules:

| Module | Version |
| ------ | ------- |
| coloredlogs | 15.0.1+ |
| paramiko | 3.0.0+ |

The same list of dependencies can be found here. Check the packages section.

To install the dependencies, you can choose one of the following ways:
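
For example, one straightforward option (a sketch, assuming pip3 is available on your system) is to install both modules directly from PyPI:

pip3 install coloredlogs paramiko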

That should be enough to proceed.


PAF workflow

  1. Place the root paf folder, which contains the framework’s implementation, anywhere you want
  2. Create a folder for your developed automation scripts inside the root “./paf” folder. Let’s call it “my_scenarios”.
  3. Create any number of “*.py” files inside the “./paf/my_scenarios/” folder; these Python modules will contain your automation scripts.
  4. Develop automation scripts using the “./paf/paf/paf_impl.py” module. Import it into your scripts and develop the tasks. More information on how to do it is located here
  5. Create an XML file with any name in any location. This file will contain parameters and declarations of scenarios and phases. Let’s call it scenarios.xml and place it at “./paf/my_scenarios/scenarios.xml” (see the example layout after this list).
  6. Fill the file with the content. More information on how to do it is located here
  7. Once both Python code and XML configuration are ready, start the paf tool, feeding your developed artifacts to it. More information on how to do it is located here
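
Assuming the example names used in the steps above, the resulting layout might look like this (an illustrative sketch, not a mandatory structure):

```
paf/                      # root framework folder (step 1)
├── paf_main.py           # PAF entry point
├── paf/
│   └── paf_impl.py       # framework implementation imported by your tasks (step 4)
└── my_scenarios/         # folder with your automation scripts (step 2)
    ├── my_scenarios.py   # a module with your tasks, e.g. the echo_test class below (step 3)
    └── scenarios.xml     # XML configuration with parameters, phases and scenarios (steps 5-6)
```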

How do I declare a task?

A task is an essential building block consisting of one or more commands. You can execute a single task. Besides that, you can group one or more tasks inside a phase and one or more phases inside a scenario. More information on how to do it is located here.

But, as of now, let’s go back to the task. Here is an example of a dummy task, which calls the echo console command:

from paf.paf_impl import logger
from paf.paf_impl import SSHLocalClient

class echo_test(SSHLocalClient):

    def __init__(self):
        super().__init__()
        self.set_name(echo_test.__name__)

    def execute(self):
        output = self.ssh_command_must_succeed("echo ${ECHO_PHRASE}")
        if self.ECHO_PHRASE != output:
            raise Exception(f"Actual output '{output}' is not equal to the expected output '{self.ECHO_PHRASE}'")
        else:
            logger.info(f"Assertion passed. Actual output '{output}' is equal to the expected output '{self.ECHO_PHRASE}'")

As you can see, your task is a simple Python class, which inherits from the framework’s base classes. The requirements for such a class are relatively simple:

That’s all. That simple.

The commands inside your task can be executed using one of the following supported principles:

  1. by creating a sub-process on the machine that executes the PAF scenario
  2. via an SSH connection from the machine that executes the PAF scenario to one or more target machines

Note! If you use SSH communication, make sure that an SSH server is up and running on the target machine.

The API of the paf_impl.Task class, which the scenario developer should use, consists of the following methods:



The PAF framework allows you to globally set default values for some of the API’s parameters:

If your automation requires default values other than those specified in the API description, you can set your own using the following methods:

The most appropriate way to use this feature is to pack these calls into the init section of your Base task:

# Note: Task, Config and the *Mode enumerations are assumed here to be
# importable from the paf.paf_impl module, like the other framework classes.
from paf.paf_impl import Task, Config, ExecutionMode, InteractionMode, CommunicationMode

class SomeBaseTask(Task):
    def __init__(self):
        super().__init__()

    def init(self):
        # Set the framework-wide defaults once, so that individual API calls
        # do not need to repeat these parameters.
        Config.set_default_execution_mode(ExecutionMode.COLLECT_DATA)
        Config.set_default_interaction_mode(InteractionMode.IGNORE_INPUT)
        Config.set_default_communication_mode(CommunicationMode.PIPE_OUTPUT)

That saves you from specifying those parameters in each PAF API call.
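
As a hypothetical illustration (the task name and command below are made up), any task that inherits from such a base class will then pick up those defaults:

```python
from paf.paf_impl import logger

class disk_usage(SomeBaseTask):

    def __init__(self):
        super().__init__()
        self.set_name(disk_usage.__name__)

    def execute(self):
        # The defaults configured in SomeBaseTask.init() apply here, so the
        # call below does not need to repeat those parameters.
        output = self.ssh_command_must_succeed("df -h")
        logger.info(f"Disk usage:\n{output}")
```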


The content of the XML configuration file

The PAF framework is configured using XML files of a specific format. You can feed any number of XML configuration files to the PAF framework, and it will consider all of them during the execution phase. Use the “-c”/“--config” parameter to specify a single path to a configuration file:

python ./paf/paf_main.py -imd ./paf/my_scenarios -c ./paf/my_scenarios/scenarios.xml -s echo_test

Multiple occurrences of the parameter are treated as multiple files, all of which are applied during the execution. Configuration files are parsed in the order of their occurrence on the command line.
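
For example, assuming a hypothetical second configuration file called base_params.xml, two files could be applied in order like this:

python ./paf/paf_main.py -imd ./paf/my_scenarios -c ./paf/my_scenarios/base_params.xml -c ./paf/my_scenarios/scenarios.xml -s echo_test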

Here is the content of the XML file for our “echo_test” task:

<paf_config>

    <!--Predefined PAF parameters for connection to the local system-->
    <param name="LOCAL_HOST_IP_ADDRESS" value="127.0.0.1"/>
    <param name="LOCAL_HOST_USER_NAME" value="vladyslav_goncharuk"/>
    <param name="LOCAL_HOST_SYSTEM_SSH_KEY" value="/home/vladyslav_goncharuk/.ssh/id_rsa"/>
    <!-- Better not to use password parameter. Use RSA instead. Still, it is supported. -->
    <!-- <param name="LOCAL_HOST_SYSTEM_PASSWORD" value="xyz"/> -->

    <param name="ECHO_PHRASE" value="Hello, world!"/>
    <phase name="echo_test">
        <task name="my_scenarios.my_scenarios.echo_test"/>
    </phase>
    <scenario name="echo_test">
        <phase name="echo_test"/>
    </scenario>
</paf_config>

The following tags are currently supported:

The “param” tag specifies a “key-value” pair that can be accessed from within the task. In our case, we have a single parameter called ECHO_PHRASE. You can declare as many of them as you need.

You can access your parameters in multiple ways from the code of your tasks:

* Via accessing it as a field:
  ```python
  self.ECHO_PHRASE
  ```
  That is possible as all tasks will have all parameters injected into their `self.__dict__` collection.

* Via using the bash-like parameters substitution inside the PAF's API:
  ```python
  output = self.ssh_command_must_succeed("echo ${ECHO_PHRASE}")
  ```
  That is possible because of the 'string.Template' abstraction from the Python standard library, which we use inside the framework's implementation.

As you can see above, the “param” tag supports the following two mandatory attributes:

* name - the name of the parameter
* value - the value of the parameter


How to execute PAF scenarios?

Currently, the tool supports the following set of command line options:

| Parameter | Comment | Allowed number of occurrences |
| --------- | ------- | ----------------------------- |
| -t, --task | Task to be executed | Multiple |
| -s, --scenario | Scenario to be executed | Multiple |
| -ph, --phase | Phase to be executed | Multiple |
| -c, --config | Apply this XML configuration file | Multiple |
| -p, --parameter | Add a parameter to the execution context | Multiple |
| -imd, --import_module_dir | Load all Python modules from the specified directory recursively. It also adds the specified directories to sys.path | Multiple |
| -ld, --log-dir | Store the output in the specified directory | Last one wins |

The typical command to execute the PAF scenario would be:

python ./paf/paf_main.py -imd ./paf/my_scenarios -c ./paf/my_scenarios/scenarios.xml -s echo_test -p ECHO_PHRASE="Overridden echo phrase!" -ld="./"

More examples

You can find an example automation project on the following page.


FAQ

Q: Does PAF support interactive input?

A: PAF supports interactive input for both subprocesses and SSH commands. Please consider the following nuance for the SSH commands: By default, each call of the “ssh_command_must_succeed” and “exec_ssh_command” methods will be interactive. But between the SSH calls, the channel will be re-opened, which would cause certain limitations, e.g., environment variables are not saved, or the sudo password is not remembered.

Note! You can change the default behavior of handling user input by specifying the ‘InteractionMode.IGNORE_INPUT’ value for the ‘interaction_mode’ parameter.
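
As a hedged sketch (assuming InteractionMode is importable from the paf.paf_impl module), such a call could look like this:

```python
from paf.paf_impl import InteractionMode

# Ignore any interactive input requests for this specific command.
output = self.ssh_command_must_succeed("apt list --installed",
                                       interaction_mode = InteractionMode.IGNORE_INPUT)
```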

Q: Why is the sudo password asked on each SSH command execution? Is the connection re-opened on each command?

A: No, the connection is not re-opened on each call. So, there are no multiple attempts to log in to your machine. Still, the channel is re-opened on each command, which causes the “sudo password not remembered” use case. It is a limitation of SSH. To reliably get the exit code of each command, we need to use the exec mode of SSH communication. In that mode, the channel is closed after the command execution.
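
As the channel is re-opened for each call, one common shell-level workaround (not a PAF-specific feature) is to chain dependent commands into a single call so that they share one channel:

```python
# Both commands run within the same SSH channel invocation, so the working
# directory change is still in effect for the second command.
self.ssh_command_must_succeed("cd /tmp && ls -la")
```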

Q: Does the framework use any predefined parameters?

A: Yes, there are a couple of such parameters:

| Parameter | Comment |
| --------- | ------- |
| LOCAL_HOST_IP_ADDRESS | IP address of the local system. Usually 127.0.0.1, but you might need to run your scenarios on another machine, so it is configurable. |
| LOCAL_HOST_USER_NAME | User name to be used for authentication to the local system |
| LOCAL_HOST_SYSTEM_SSH_KEY | The full path to the private SSH key, which will be used for authentication to the local system |
| LOCAL_HOST_SYSTEM_PASSWORD | The password used for authentication to the local system. It is better not to use this parameter and to prefer the SSH key |

Note! The above-mentioned local host parameters are used by the paf_impl.SSHLocalClient class. If you are not using it, you do not need to specify them.


PAF dependencies

The diagram below contains the dependencies of the PAF framework on other Python modules:

[Diagram: PAF module dependencies]


PAF execution workflow

The diagram below contains a high-level description of PAF’s execution flow:

[Diagram: PAF execution workflow]