Dusty Phillips: Alternatives to Single Function Classes

In Stop Writing Classes, a talk every Python engineer should watch, Jack Diederich tells us to avoid classes that have a single method, such as:

class ExampleCommand:
    def __init__(self, name, description):
        self.name = name
        self.description = description

    def execute(self):
        print(self.name, self.description)

Jack suggests, rightly, that you can just define and call a function like this:

def example_command(name, description):
    print(name, description)

Sometimes, however, your function needs to accept a specific set of arguments. For example, perhaps you have some kind of executor that calls the execute method on a Command object (or on any object duck-typed to look like one, that is, anything with an execute method).
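
To make that concrete, a minimal sketch of such an executor (the execute_commands name is just illustrative), working with the ExampleCommand class above, might be:

def execute_commands(commands):
    for command in commands:
        command.execute()

execute_commands([
    ExampleCommand("first", "the first command"),
    ExampleCommand("second", "the second command"),
])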

Idiomatic Python suggests that such an interface should just be a callable. That way you can easily pass first class functions into the executor. As a concrete example, let’s set up a very simple executor that loops over a list of commands and calls them. Bear in mind that a more complicated executor might use more arguments, might run the code in a separate thread or process, or might sit and wait for user input before defining the command. But a basic executor might look like this:

def execute_all(commands):
    for command in commands:
        command()

So, any function that takes no parameters can be a command that would run in this executor:

def hello_command():
    print("hello world")

Note that your executor could define a different interface; for example, functions might take three parameters or arbitrary *args and **kwargs, depending on the application. The point is that you can still pass first class functions around to fulfill this pattern. All you have to do is define the duck-typed requirements for the function signature.
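
As a rough sketch of one such variant (the names here are invented for illustration), an executor that forwards arbitrary arguments to every command might look like this:

def execute_all_with(commands, *args, **kwargs):
    for command in commands:
        command(*args, **kwargs)

def verbose_command(*args, **kwargs):
    print("called with", args, kwargs)

execute_all_with([verbose_command], "a positional value", flag=True)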

However, after a certain amount of time you’ll find that you need commands that maintain a bit of state. Our opening example is such a command. If we rewrite it to define a __call__ method instead of execute, we can still use the callable pattern:

class ExampleCommand:
    def __init__(self, name, description):
        self.name = name
        self.description = description

    def __call__(self):
        print(self.name, self.description)

example_command = ExampleCommand("A Name", "Something about it")
example_command()

This pattern is more useful if your state is likely to change over time. For example, if you were writing a text or image editor and you wanted to have an undo command, you might need to record the details of the most recent change in the command object.
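
As a rough sketch of what that might look like (the names and the dict-based document are invented for illustration), an undo command could record the document’s state before a change and restore it when invoked:

class UndoCommand:
    def __init__(self, document):
        self.document = document
        self.previous_text = None

    def record(self):
        # remember the current state before a change is applied
        self.previous_text = self.document["text"]

    def __call__(self):
        # restore the most recently recorded state
        if self.previous_text is not None:
            self.document["text"] = self.previous_text

document = {"text": "original"}
undo_command = UndoCommand(document)
undo_command.record()
document["text"] = "edited"
undo_command()
print(document["text"])  # original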

The Callable interface is fulfilled by any object that implements the __call__ method. This conveniently includes function objects, which allows us to define “simple” command objects from functions when no state is required, while more complex command objects instantiated from classes can be used when additional instance variables need to be managed.
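
That means the executor defined earlier doesn’t care which flavour it receives; both kinds can be mixed in a single list:

execute_all([
    hello_command,                                   # a plain function
    ExampleCommand("A Name", "Something about it"),  # a stateful callable object
])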

However, we could often make use of something between these two extremes. The full-blown class above has a lot of lines of boilerplate considering how little it is doing. Classes are required when the state might be dynamically updated for some reason. However, we often encounter cases where we do not need to change state, but do need to store it until the command is called later. The functools module in the Python standard library includes a partial function that can take care of this for us:

from functools import partial

def example(name, description):
    print(name, description)

example_command = partial(example, "name", "description")
another_example_command = partial(example, "another_name", "another_description")

example_command()
another_example_command()

The example_command created by partial is a Callable object that can be called as example_command(). That means it can be run from inside the executor defined above. The partial essentially allows parameters to be “baked” into a separate function call. In this case, the name and description are stored and passed into example whenever it is called. If you were trying to define a different interface that takes parameters in the callable, you’re still in luck: partial proxies positional and keyword arguments through to the underlying function when it is called. For example:

def with_arguments(name, description, required_arg):
    print(name, description, required_arg)

argument_command = partial(with_arguments, "name", "description")

argument_command("the required arg")

partial is particularly useful if you need to make an existing function that requires arguments behave like a command that does not require arguments. However, it can also be used, as shown above, to store fixed state that gets passed through to the underlying function.
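
It’s worth noting that partial also accepts keyword arguments, so state can be baked in by name rather than by position; a small sketch reusing the example function from above:

keyword_command = partial(example, description="a baked-in description", name="a baked-in name")
keyword_command()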

For many intermediate Python coders, there is nothing new in the above introduction. The fact that functions are first class objects, that any object can be made callable, and that partial can be used to bake parameters into a function call is well documented in a variety of idiomatic Python references. Now, I’d like to highlight another way to achieve this effect: using inner functions as factories. Inner functions are not a (remotely) new concept, but I have not seen them used widely to mimic “lightweight classes”. Here’s our example once again:

def ExampleCommand(name, description):
    def example():
        print(name, description)
    return example

example_command = ExampleCommand("name", "description")
another_example_command = ExampleCommand("another_name", "another_description")

If you’ve ever written a Python decorator, this pattern should look quite familiar. The callable (a function) returned by ExampleCommand behaves very similarly to an object instantiated from the earlier class that defined a __call__ method. However, it is a little cleaner because it doesn’t require the repetitive self.varname = varname pattern.
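
The commands produced by the factory are used exactly like the earlier ones:

example_command()            # prints: name description
another_example_command()    # prints: another_name another_description
execute_all([example_command, another_example_command])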

This implementation is more flexible than using partial, as you have the opportunity to manipulate the arguments in the outer function before ultimately returning the inner function. Since the addition of the nonlocal keyword in Python 3, this concept is even more powerful: nonlocal allows us to manipulate the variables in the outer function from inside calls to the inner function. Here’s an example:

def CounterCommand(name, count=0):
    def command():
        nonlocal count
        count += 1
        print(name, count)
    return command
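
Each call updates the count captured in the closure, and the new value is still there the next time the same command runs:

counter_command = CounterCommand("counter")
counter_command()               # counter 1
counter_command()               # counter 2
execute_all([counter_command])  # counter 3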

If you want to manipulate state from outside the function call, you could go even further and define a coroutine. I’m not going to explain how to do that, though, because every time (with one exception, which is covered in Python 3 Object Oriented Programming) I’ve tried to solve a problem by using coroutines, my solution looked better when I eventually reduced it to more normal function calls or classes. A coroutine also has to be extensively documented in order to explain how it works to my future self or anyone else reading my code. This further bloats the code, and it’s usually not worth it. It is generally much harder to write simple, readable, maintainable code than to write clever code that shows off what you know. Challenge yourself to do the former.

So which of these patterns should you choose? When defining anything resembling a callback or command pattern, design your API so that you can support all five of the options summarized below. This means that your executor should accept any callable object. You are welcome to impose restrictions on what arguments and keyword arguments the callable can accept, but make sure that the executor will accept any callable that has the correct function signature.

Then for each callback or command you need to implement, I suggest using the simplest one that is flexible enough to do the job. For the most part, the simpler the solution, the fewer lines of code. Here are the options summarized in increasing order of flexibility:

  1. First class functions can maintain no state and must fulfill the exact interface required by the executor. They have no overhead in terms of lines of code.
  2. Partial functions can maintain state that doesn’t change between calls, but that state cannot be manipulated before it is baked into the partial. They can be used to adapt an existing function to the interface required by the executor. They have minimal overhead: importing functools and defining the partial. They can potentially reduce the number of lines of code if you are able to reuse one function with different baked parameters rather than defining a separate function for each command.
  3. Inner functions can maintain state that doesn’t change between or within calls. The state can be modified once, before defining the function, but not afterwards. They require a bit of overhead to define and return the inner function.
  4. Inner functions with nonlocal can maintain state that changes during calls and persists between them, though it cannot be manipulated from outside the function. They require a bit more overhead due to the nonlocal declaration inside the inner function.
  5. Callable classes can do anything. They can maintain state that is changed during or after command invocations. They require a lot of boilerplate and frequently require a dummy __init__ that does nothing more than set up instance variables from parameters.
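
To make this concrete, here is a minimal sketch of a single executor accepting all five flavours. It reuses names defined earlier in the post, with a StatefulCommand class standing in for the __call__-based example (since the ExampleCommand name was later reused for the factory function):

class StatefulCommand:
    # 5. a callable class, equivalent to the __call__-based example above
    def __init__(self, name, description):
        self.name = name
        self.description = description

    def __call__(self):
        print(self.name, self.description)

commands = [
    hello_command,                                  # 1. a first class function
    partial(example, "partial", "baked-in state"),  # 2. a partial function
    ExampleCommand("inner", "function factory"),    # 3. an inner function factory
    CounterCommand("counter"),                      # 4. an inner function using nonlocal
    StatefulCommand("class", "instance state"),     # 5. a callable class instance
]

execute_all(commands)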

In the end, however, you should always choose the pattern that you, personally, and the team you work with feel is most readable.

