Closures are a technique available in programming languages that support passing functions as arguments to other functions (i.e., languages with first-class functions). This is the case for Python (and, in general, for languages allowing a functional programming style). However, numerical computing references for physicists typically employ imperative (C, Fortran) or object-oriented (C++) programming styles. In practice these patterns often propagate also to programs written in languages that in principle allow a functional style.

First, we briefly recall the difference between local and global variables. Then a simple example of a closure is illustrated in Python, focusing on the role of free variables. Python is chosen because its syntax already reads like good pseudo-code, which makes the closure structure easier to understand. Finally, an overview of closure usage in languages commonly employed for scientific computing (especially Python and Julia) is given.

Global and Local Variables

First, it is useful to recall that local variables, as opposed to global variables, are bound to a local scope. For instance, if a variable is defined only within the body of a function, then it cannot be accessed outside the function itself. A global variable, instead, can be accessed from anywhere in the source code.

In the following example the variable a (whose value is the string "global") is defined as a global variable, hence it can be accessed from anywhere in the code. The body of the function foo() retrieves its value and prints it. The function bar() does the same, but there a is subsequently re-assigned in the same function body.

a = "global"

def foo():
    """Print global variable."""
    print(a)

def bar():
    """Attempt to print global variable while re-defining it locally."""
    print(a)
    a = "local"

foo()                          # Retrieve the global variable 'a'.
# -> global
bar()                          # Error: 'a' has local scope here.
# ->
# Traceback (most recent call last):
#   File "<stdin>", line 1, in <module>
#   File "<stdin>", line 2, in bar
# UnboundLocalError: local variable 'a' referenced before assignment

Calling foo() prints the string global as expected. However, calling bar() raises an exception. In this second case, we first try to retrieve the global variable a, but in the same function body the variable is re-assigned. To avoid ambiguity, the Python interpreter treats a as a local variable here. An exception is then raised because we try to access its value before it has been assigned.

Similarly, a new variable b assigned only within a function body has a local scope and it is not possible to access it from outside that scope.

def baz():
    b = "local"

baz()
print(b)                       # Error: 'b' is local to baz().
# ->
# Traceback (most recent call last):
#   File "<stdin>", line 1, in <module>
# NameError: name 'b' is not defined

Closures and Free Variables

Let's now focus on the role of functions. If a language implements functions as first-class objects, then they can be passed as arguments to other functions, and a function can return another function, too. The following example illustrates a structure taking advantage of these possibilities in Python. Below, a function is defined such that it returns a nested function defined in its own body. The nested function has access to its own argument, but also to a variable defined in the body of the enclosing function. This design defines a closure.

In the example we aim to first initialize a counter variable to zero, and then increase and print its value at every subsequent call.

def counter():
    """Define and update a counter."""

    # Here the closure starts.
    x = 0
    def add_step(step):
        """Increase counter value by a given step and print it."""
        nonlocal x
        x += step
        print(x)
    # Here the closure ends.

    return add_step

Calling counter() returns the function add_step(), which can then be called by specifying the argument step. The interesting point is that while step is a local variable, the nested function add_step() also has access to the variable x. The latter is not a global variable, since it is defined within the scope of the enclosing function. Nor is it a local one, since it is defined outside the nested function that still has access to it. It is called a non-local or free variable.

Thanks to the free variable, the function returned by counter() carries memory of the environment where it was defined. Hence, every subsequent call takes into account the current value of the counter x and increases it. In the following example the function returned by counter() is assigned to the variable mycounter.

mycounter = counter()
mycounter(2)
# -> 2
mycounter(2)
# -> 4
mycounter(2)
# -> 6

Since mycounter is bound to the nested add_step() function, it can be called as a function itself, specifying the argument step (in this example equal to two). Every subsequent call to mycounter() updates the value of the free variable, increasing it by the specified step.

When defining the closure we had to explicitly declare the free variable as nonlocal x within the nested function. Without this declaration x would be interpreted as a local variable, because in this example x is re-assigned (x += step) within the nested function. This would cause an error because a local variable cannot carry memory of the value it had before the new function call: at each call the nested function would try to evaluate x = x + step, where x on the right-hand side is not defined yet. However, the nonlocal declaration is an implementation detail proper to Python 3 (Python 2 requires a more involved approach; the code above only works in Python 3). For more details about the Python implementation see, e.g., Fluent Python. Other languages may not require such an explicit declaration.
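One common workaround, which also works in Python 2, stores the counter in a mutable container: mutating the container's contents is not a re-assignment of the free variable, so no nonlocal declaration is needed. The function name counter_list() below is a hypothetical choice for illustration.

```python
def counter_list():
    """Counter closure avoiding 'nonlocal' via a mutable container."""
    x = [0]                    # A one-element list instead of a plain int.
    def add_step(step):
        """Increase counter value by a given step and print it."""
        x[0] += step           # Mutation, not re-assignment: no declaration needed.
        print(x[0])
    return add_step

mycounter = counter_list()
mycounter(2)
# -> 2
mycounter(2)
# -> 4
```

The nonlocal keyword makes this trick unnecessary in Python 3, and the plain-variable version reads more naturally.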

When to Use Closures

Pure functional languages such as Haskell or some Lisp dialects rely heavily on closures, but they are not widespread in physics. Imperative programming languages such as C or Fortran do not support closures (although there are libraries implementing similar patterns). Other common languages such as C++ or Python do support closures, but alternative constructs are often more idiomatic. For instance, the previous example can easily be written in an object-oriented style as follows.

class Counter:
    def __init__(self):
        self.x = 0

    def add_step(self, step):
        """Increase counter value by a given step and print it."""
        self.x += step
        print(self.x)

mycounter = Counter()
mycounter.add_step(2)
# -> 2
mycounter.add_step(2)
# -> 4
mycounter.add_step(2)
# -> 6

In this example the closure has been replaced by the Counter class. The counter variable x is initialized when the class is instantiated. The class method add_step() updates the counter by the specified step.

Hence, closures can be used as a more compact implementation of classes containing a single method. On the other hand, the class-based implementation can be clearer (and more extensible), even to programmers with a background in imperative languages only. (Prominent Python developers themselves often discourage relying heavily on functional programming patterns.)

Closures can also be useful when a function needs to be partially evaluated at different steps. This is sometimes the case when a function has to be passed to another one implemented in an external library, if the latter expects fewer arguments than the function actually has (see the Julia example below). Still, more appropriate tools are often available. E.g., the Python functools module provides, among other things, functools.partial() for partial function evaluation. Sometimes it is necessary to rely on the latter option only, as nested functions (needed for closures) may be incompatible with some tools. This is the case for the multiprocessing approach to parallel computing in Python, which is only compatible with functions defined at the top level of a module (since the functions must be pickled).
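As a minimal sketch of functools.partial(), consider a two-argument function (power() below is a hypothetical example): fixing one argument yields a one-argument function, with no nested function definition involved.

```python
from functools import partial

def power(base, exponent):
    """Raise 'base' to the given 'exponent'."""
    return base ** exponent

# Fix 'exponent' once, obtaining a function of 'base' only.
square = partial(power, exponent=2)

print(square(5))
# -> 25
print(square(9))
# -> 81
```

The resulting square object is a plain callable and can be passed to any routine expecting a one-argument function.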

Useful design patterns such as function decorators (e.g., @classmethod) rely on closures in Python. However, scientific algorithms rarely need to define decorators from scratch: they are usually imported from external libraries, and their usage does not require an understanding of the underlying closure mechanism.
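To see the connection, note that a decorator is typically just a closure whose free variables live across calls to the wrapped function. The count_calls() decorator below is a hypothetical example keeping a call counter as a free variable.

```python
import functools

def count_calls(func):
    """Decorator printing how many times the wrapped function was called."""
    calls = 0                           # Free variable shared across calls.
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        nonlocal calls
        calls += 1
        print(f"call {calls}")
        return func(*args, **kwargs)
    return wrapper

@count_calls
def greet():
    return "hello"

greet()
# -> call 1
greet()
# -> call 2
```

The @count_calls line is only syntactic sugar for greet = count_calls(greet), i.e., exactly the enclosing-function pattern shown earlier.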

Newer, promising languages such as Julia (aimed specifically at scientific computing) tend to rely on functional paradigms more than Python or C++. In that case closure patterns may turn out to be more idiomatic. Closures are discussed in the Scope of Variables section of the official Julia documentation. In the following example we integrate the Bessel function besselj(nu, x) of order nu over x. The standard library provides the function quadgk(f, xmin, xmax) for the numerical integration of the function f(x) from xmin to xmax. The integrand f(x) must have only one argument (the integration variable), but here we also have the extra parameter nu. The following solution seems a fairly idiomatic implementation for Julia.

function integral(xmin::Real, xmax::Real, nu::Real)
    # Compute the integral of the Bessel function J_nu(x) over x.

    function f(x::Real)
        besselj(nu, x)      # nu is a free variable.
    end

    quadgk(f, xmin, xmax)
end

println(integral(0.0, 1.0, 0.5))
# -> (0.4951165756309393,5.672530775980319e-9)

Here we extended the quadgk() function simply by introducing an enclosing function, integral(), and treating the extra parameter nu as a free variable within a nested function f(x) that, having only one argument, can now be passed to quadgk(). The code above has been tested with Julia version 0.4.7. (Note that, since Julia is dynamically typed, similarly to Python it is not necessary to specify the variable types. For instance, x::Real could be replaced by x. The difference is in the execution time: the code snippet above took 1.358 s to run, while the same code without any type specification takes 5.710 s.)
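For comparison, the same free-variable pattern can be sketched in pure Python. Since the standard library provides neither Bessel functions nor a general-purpose integrator, the trapezoid() helper and the sine integrand below are hypothetical stand-ins; the structure (an integrator expecting a one-argument function, and an extra parameter nu captured as a free variable) is the same as in the Julia example.

```python
import math

def trapezoid(f, xmin, xmax, n=10000):
    """Integrate f over [xmin, xmax] with the composite trapezoid rule."""
    h = (xmax - xmin) / n
    total = 0.5 * (f(xmin) + f(xmax))
    for i in range(1, n):
        total += f(xmin + i * h)
    return total * h

def integral(xmin, xmax, nu):
    """Integrate sin(nu * x) over x; nu enters only as a free variable."""
    def f(x):
        return math.sin(nu * x)   # 'nu' is a free variable here.
    return trapezoid(f, xmin, xmax)

print(integral(0.0, math.pi, 1.0))   # Exact value: 2.
```

In a real application the hand-written trapezoid rule would be replaced by a library routine such as scipy.integrate.quad, but the closure pattern stays identical.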