Functional Programming
in Python
David Mertz
May 2015: First Edition, First Release
The O'Reilly logo is a registered trademark of O'Reilly Media, Inc. Functional Programming in Python, the cover image, and related trade dress are trademarks of O'Reilly Media, Inc.
While the publisher and the author have used good faith efforts to ensure that the information and instructions contained in this work are accurate, the publisher and the author disclaim all responsibility for errors or omissions, including without limitation responsibility for damages resulting from the use of or reliance on this work. Use of the information and instructions contained in this work is at your own risk. If any code samples or other technology this work contains or describes is subject to open source licenses or the intellectual property rights of others, it is your responsibility to ensure that your use thereof complies with such licenses and/or rights.
978-1-491-92856-1
[LSI]
Table of Contents

Preface. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . v

(Avoiding) Flow Control. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 1
    Encapsulation                        1
    Comprehensions                       2
    Recursion                            5
    Eliminating Loops                    7

Callables. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 11
    Named Functions and Lambdas          12
    Closures and Callable Instances      13
    Methods of Classes                   15
    Multiple Dispatch                    19

Lazy Evaluation. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 25
    The Iterator Protocol                27
    Module: itertools                    29

Higher-Order Functions. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 33
    Utility Higher-Order Functions       35
    The operator Module                  36
    The functools Module                 36
    Decorators                           37
Preface
Resources
There are a large number of other papers, articles, and books written about functional programming, in Python and otherwise. The Python standard documentation itself contains an excellent introduction called "Functional Programming HOWTO," by Andrew Kuchling, that discusses some of the motivation for functional programming styles, as well as particular capabilities in Python.
A Stylistic Note
As in most programming texts, a fixed font will be used both for inline and block samples of code, including simple command or function names. Within code blocks, a notional segment of pseudocode is indicated with a word surrounded by angle brackets (i.e., not valid Python), such as <code-block>. In other cases, syntactically valid but undefined functions are used with descriptive names, such as get_the_data().
Encapsulation
One obvious way of focusing more on "what" than "how" is simply to refactor code, and to put the data construction in a more isolated place, i.e., in a function or method. For example, consider an existing snippet of imperative code that looks like this:
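For instance, such a snippet might look like the following (a hypothetical sketch; the names data_set and collection and the filtering condition are illustrative, not from the original listing):

```python
# Imperative style: the accumulation logic is spelled out inline
data_set = [1, -2, 3, -4]
collection = []
for datum in data_set:
    if datum > 0:                 # stand-in for any real condition
        collection.append(datum)

# Refactored: the construction is named and isolated in a function,
# so call sites express the "what" while the "how" is encapsulated
def positives(data):
    return [d for d in data if d > 0]

collection = positives(data_set)
```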
Comprehensions
Using comprehensions is often a way both to make code more compact and to shift our focus from the "how" to the "what." A comprehension is an expression that uses the same keywords as loop and conditional blocks, but inverts their order to focus on the data rather than on the procedure. Far more important than simply saving a few characters and lines is the mental shift enacted by thinking of what collection is, and by avoiding needing to think about or debug "What is the state of collection at this point in the loop?"
List comprehensions have been in Python the longest, and are in some ways the simplest. We now also have generator comprehensions, set comprehensions, and dict comprehensions available in Python syntax. As a caveat though, while you can nest comprehensions to arbitrary depth, past a fairly simple level they tend to stop clarifying and start obscuring. For genuinely complex construction of a data collection, refactoring into functions remains more readable.
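To make the contrast concrete, here is a small illustrative pair (not from the original text): the same collection built with a loop and with comprehensions:

```python
data = range(10)

# Loop version: we track the intermediate state of the list ourselves
squares_of_evens = []
for x in data:
    if x % 2 == 0:
        squares_of_evens.append(x * x)

# Comprehension: the same keywords, inverted to describe the data
squares_of_evens2 = [x * x for x in data if x % 2 == 0]

# set and dict comprehensions follow the same pattern
even_square_set = {x * x for x in data if x % 2 == 0}
square_map = {x: x * x for x in data}
```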
Generators
Generator comprehensions have the same syntax as list comprehensions (other than that there are no square brackets around them, although parentheses are needed syntactically in some contexts, in place of brackets) but they are also lazy. That is to say that they are merely a description of "how to get the data" that is not realized until one explicitly asks for it, either by calling next() on the object, or by looping over it. This often saves memory for large sequences and defers computation until it is actually needed. For example:

log_lines = (line for line in read_line(huge_log_file)
             if complex_condition(line))
For typical uses, the behavior is the same as if you had constructed a list, but runtime behavior is nicer. Obviously, this generator comprehension also has imperative versions, for example:

def get_log_lines(log_file):
    line = read_line(log_file)
    while True:
        try:
            if complex_condition(line):
                yield line
            line = read_line(log_file)
        except StopIteration:
            raise

log_lines = get_log_lines(huge_log_file)
Yes, the imperative version could be simplified too, but the version shown is meant to illustrate the behind-the-scenes "how" of a for loop over an iterable, more details we also want to abstract from in our thinking. In fact, even using yield is somewhat of an abstraction from the underlying iterator protocol. We could do this with a class that had .__next__() and .__iter__() methods. For example:

class GetLogLines(object):
    def __init__(self, log_file):
        self.log_file = log_file
        self.line = None
    def __iter__(self):
        return self
    def __next__(self):
        if self.line is None:
            self.line = read_line(self.log_file)
        while not complex_condition(self.line):
            self.line = read_line(self.log_file)
        return self.line

log_lines = GetLogLines(huge_log_file)
Aside from the digression into the iterator protocol and laziness more generally, the reader should see that the comprehension focuses attention much better on the "what," whereas the imperative version (although successful as refactorings, perhaps) retains the focus on the "how."
Recursion
Functional programmers often put weight in expressing flow control through recursion rather than through loops. Done this way, we can avoid altering the state of any variables or data structures within an algorithm, and more importantly get more at the "what" than the "how" of a computation. However, in considering using recursive styles we should distinguish between the cases where recursion is just "iteration by another name" and those where a problem can readily be partitioned into smaller problems, each approached in a similar way.

There are two reasons why we should make the distinction mentioned. On the one hand, using recursion effectively as a way of marching through a sequence of elements is, while possible, really not "Pythonic." It matches the style of other languages like Lisp, definitely, but it often feels contrived in Python. On the other hand, Python is simply comparatively slow at recursion, and has a limited stack depth. Yes, you can change this with sys.setrecursionlimit() to more than the default 1000; but if you find yourself doing so it is probably a mistake. Python lacks an internal feature called tail call elimination that makes deep recursion computationally efficient in some languages. Let us find a trivial example where recursion is really just a kind of iteration:

def running_sum(numbers, start=0):
    if len(numbers) == 0:
        print()
        return
    total = numbers[0] + start
    print(total, end=" ")
    running_sum(numbers[1:], total)
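By contrast, recursion earns its keep where a problem partitions naturally into smaller instances of itself. A classic illustration (a sketch, not the text's own listing) is a functional quicksort:

```python
def quicksort(lst):
    "Quicksort over a list, in a classic divide-and-conquer style"
    if len(lst) == 0:
        return lst
    pivot = lst[0]
    small = [x for x in lst[1:] if x <= pivot]   # partition around the pivot
    large = [x for x in lst[1:] if x > pivot]
    return quicksort(small) + [pivot] + quicksort(large)
```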
Some names are used in the function body to hold convenient values, but they are never mutated. It would not be as readable, but the definition could be written as a single expression if we wanted to do so. In fact, it is somewhat difficult, and certainly less intuitive, to transform this into a stateful iterative version.
As general advice, it is good practice to look for possibilities of recursive expression (and especially for versions that avoid the need for state variables or mutable data collections) whenever a problem looks partitionable into smaller problems. It is not a good idea in Python, most of the time, to use recursion merely for "iteration by other means."
Eliminating Loops
Just for fun, let us take a quick look at how we could take out all loops from any Python program. Most of the time this is a bad idea, both for readability and performance, but it is worth looking at how simple it is to do in a systematic fashion as background to contemplate those cases where it is actually a good idea.

If we simply call a function inside a for loop, the built-in higher-order function map() comes to our aid:

for e in it:          # statement-based loop
    func(e)

list(map(func, it))   # map()-based "loop"
Of course, looking at the example, one suspects the result one really wants is actually to pass all the arguments to each of the functions rather than one argument from each list to each function. Expressing that is difficult without using a list comprehension, but easy enough using one:

>>> do_all_funcs = lambda fns, *args: [
...     list(map(fn, *args)) for fn in fns]
>>> _ = do_all_funcs([hello, bye],
...                  ['David', 'Jane'], ['Mertz', 'Doe'])
Hello David Mertz
Hello Jane Doe
Bye David Mertz
Bye Jane Doe
# statement-based while loop
while <cond>:
    <pre-suite>
    if <break_condition>:
        break
    else:
        <suite>

# FP-style recursive while loop
def while_block():
    <pre-suite>
    if <break_condition>:
        return 1
    else:
        <suite>
        return 0

while_FP = lambda: (<cond> and while_block()) or while_FP()
while_FP()
Now let's remove the while loop for the functional version:

# FP version of "echo()"
def identity_print(x):    # "identity with side-effect"
    print(x)
    return x

echo_FP = lambda: identity_print(input("FP -- ")) == 'quit' or echo_FP()
echo_FP()
Eliminating Recursion
As with the simple factorial example given above, sometimes we can perform "recursion without recursion" by using functools.reduce() or other folding operations (other folds are not in the Python standard library, but can easily be constructed and/or occur in third-party libraries). A recursion is often simply a way of combining something simpler with an accumulated intermediate result, and that is exactly what reduce() does at heart. A slightly longer discussion of functools.reduce() occurs in the chapter on higher-order functions.
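As a sketch of the idea (an illustrative pairing, not the text's own listing), a recursive factorial collapses directly into a fold:

```python
from functools import reduce
from operator import mul

def factorialR(n):
    "Recursive factorial: combine n with the result for n-1"
    return 1 if n <= 1 else n * factorialR(n - 1)

def factorialHOF(n):
    "The same computation as a fold over 1..n"
    return reduce(mul, range(1, n + 1), 1)
```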
Callables
Any method that accesses the state of an instance (in any degree) to determine what result to return is not a pure function. Of course, all the other types of callables we discuss also allow reliance on state in various ways. The author of this report has long pondered whether he could use some dark magic within Python explicitly to declare a function as pure, say by decorating it with a hypothetical @purefunction decorator that would raise an exception if the function can have side effects, but consensus seems to be that it would be impossible to guard against every edge case in Python's internal machinery.
The advantage of a pure function and side-effect-free code is that it is generally easier to debug and test. Callables that freely intersperse statefulness with their returned results cannot be examined independently of their running context to see how they behave, at least not entirely so. For example, a unit test (using doctest or unittest, or some third-party testing framework such as py.test or nose) might succeed in one context but fail when identical calls are made within a running, stateful program. Of course, at the very least, any program that does anything must have some kind of output (whether to console, a file, a database, over the network, or whatever) in it to do anything useful, so side effects cannot be entirely eliminated, only isolated to a degree when thinking in functional programming terms.
Named Functions and Lambdas
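The session that follows presumes a pair of equivalent "hello" callables defined as a named function and as a lambda; a definition along these lines (an assumed sketch, consistent with the outputs shown) would produce it:

```python
def hello1(name):
    print("Hello", name)

hello2 = lambda name: print("Hello", name)
```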
>>> hello2('David')
Hello David
>>> hello1.__qualname__
'hello1'
>>> hello2.__qualname__
'<lambda>'
>>> hello3 = hello2      # can bind func to other names
>>> hello3.__qualname__
'<lambda>'
>>> hello3.__qualname__ = 'hello3'
>>> hello3.__qualname__
'hello3'
One of the reasons that functions are useful is that they isolate state lexically, and avoid contamination of enclosing namespaces. This is a limited form of nonmutability in that (by default) nothing you do within a function will bind state variables outside the function. Of course, this guarantee is very limited in that both the global and nonlocal statements explicitly allow state to "leak out of" a function. Moreover, many data types are themselves mutable, so if they are passed into a function that function might change their contents. Furthermore, doing I/O can also change the state of the world and hence alter results of functions (e.g., by changing the contents of a file or a database that is itself read elsewhere).
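The mutable-argument caveat deserves a minimal illustration (a sketch, not from the original text):

```python
def append_item(lst, x):
    lst.append(x)        # mutates the caller's list in place

data = [1, 2]
append_item(data, 3)
# data is now [1, 2, 3]: state "leaked" despite no global/nonlocal
```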
Notwithstanding all the caveats and limits mentioned above, a programmer who wants to focus on a functional programming style can intentionally decide to write many functions as pure functions to allow mathematical and formal reasoning about them. In most cases, one only leaks state intentionally, and creating a certain subset of all your functionality as pure functions allows for cleaner code. They might perhaps be broken up by "pure" modules, or annotated in the function names or docstrings.
Closures and Callable Instances
Let us construct a toy example that shows this, something just past a "hello world" of the different styles:

# A class that creates callable adder instances
class Adder(object):
    def __init__(self, n):
        self.n = n
    def __call__(self, m):
        return self.n + m

add5_i = Adder(5)              # "instance" or "imperative"
add5_f = lambda m, n=5: m + n  # "functional"
So far these seem to amount to pretty much the same thing, but the mutable state in the instance provides an attractive nuisance:

>>> add5_i(10)
15
>>> add5_f(10)
15
>>> add5_i.n = 10
>>> add5_i(10)
20
Methods of Classes
All methods of classes are callables. For the most part, however, calling a method of an instance goes against the grain of functional programming styles. Usually we use methods because we want to reference mutable data that is bundled in the attributes of the instance, and hence each call to a method may produce a different result that varies independently of the arguments passed to it.
import math
class RightTriangle(object):
    "Class used solely as namespace for related functions"
    @staticmethod
    def hypotenuse(a, b):
        return math.sqrt(a**2 + b**2)
    @staticmethod
    def sin(a, b):
        return a / RightTriangle.hypotenuse(a, b)
    @staticmethod
    def cos(a, b):
        return b / RightTriangle.hypotenuse(a, b)

RightTriangle.hypotenuse(3, 4)   # 5.0
rt = RightTriangle()
rt.sin(3, 4)                     # 0.6
rt.cos(3, 4)                     # 0.8
Calling such a method through an instance without the @staticmethod decorator accidentally binds the instance as the first positional argument, producing an error like:

TypeError                                 Traceback (most recent call last)
<ipython-input-5-e1de62cf88af> in <module>()
----> 1 m.product(3,4,5)

<ipython-input-2-535194f57a64> in product(*nums)
      2 class Math(object):
      3     def product(*nums):
----> 4         return functools.reduce(operator.mul, nums)
      5     def power_chain(*nums):
      6         return functools.reduce(operator.pow, nums)

TypeError: unsupported operand type(s) for *: 'Math' and 'int'
Generator Functions
A special sort of function in Python is one that contains a yield statement, which turns it into a generator. What is returned from calling such a function is not a regular value, but rather an iterator that produces a sequence of values as you call the next() function on it or loop over it. This is discussed in more detail in the chapter entitled "Lazy Evaluation."

While, like any Python object, there are many ways to introduce statefulness into a generator, in principle a generator can be "pure" in the sense of a pure function. It is merely a pure function that produces a (potentially infinite) sequence of values rather than a single value, but still based only on the arguments passed into it. Notice, however, that generator functions typically have a great deal of internal state; it is at the boundaries of call signature and return value that they act like a side-effect-free "black box." A simple example:

>>> def get_primes():
...     "Simple lazy Sieve of Eratosthenes"
...     candidate = 2
...     found = []
...     while True:
...         if all(candidate % prime != 0 for prime in found):
...             yield candidate
...             found.append(candidate)
...         candidate += 1
...
>>> primes = get_primes()
>>> next(primes), next(primes), next(primes)
(2, 3, 5)
>>> for _, prime in zip(range(10), primes):
...     print(prime, end=" ")
...
7 11 13 17 19 23 29 31 37 41
Every time you create a new object with get_primes() the iterator is the same infinite lazy sequence (another example might pass in some initializing values that affected the result) but the object itself is stateful as it is consumed incrementally.
Multiple Dispatch
A very interesting approach to programming multiple paths of execution is a technique called "multiple dispatch" or sometimes "multimethods." The idea here is to declare multiple signatures for a single function and call the actual computation that matches the types or properties of the calling arguments. This technique often allows one to avoid or reduce the use of explicitly conditional branching, and instead substitute the use of more intuitive pattern descriptions of arguments.

A long time ago, this author wrote a module called multimethods that was quite flexible in its options for resolving "dispatch linearization" but is also so old as only to work with Python 2.x, and was even written before Python had decorators for more elegant expression of the concept. Matthew Rocklin's more recent multipledispatch is a modern approach for recent Python versions, albeit it lacks some of the theoretical arcana I explored in my ancient module. Ideally, in this author's opinion, a future Python version would include a standardized syntax or API for multiple dispatch (but more likely the task will always be the domain of third-party libraries).

To explain how multiple dispatch can make more readable and less bug-prone code, let us implement the game of rock/paper/scissors in three styles. Let us create the classes to play the game for all the versions:

class Thing(object): pass
class Rock(Thing): pass
class Paper(Thing): pass
class Scissors(Thing): pass
Many Branches
First a purely imperative version. This is going to have a lot of repetitive, nested, conditional blocks that are easy to get wrong:

def beats(x, y):
    if isinstance(x, Rock):
        if isinstance(y, Rock):
            return None            # No winner
        elif isinstance(y, Paper):
            return y
        elif isinstance(y, Scissors):
            return x
        else:
            raise TypeError("Unknown second thing")
    elif isinstance(x, Paper):
        if isinstance(y, Rock):
            return x
        elif isinstance(y, Paper):
            return None            # No winner
        elif isinstance(y, Scissors):
            return y
        else:
            raise TypeError("Unknown second thing")
    elif isinstance(x, Scissors):
        if isinstance(y, Rock):
            return y
        elif isinstance(y, Paper):
            return x
        elif isinstance(y, Scissors):
            return None            # No winner
        else:
            raise TypeError("Unknown second thing")
    else:
        raise TypeError("Unknown first thing")

rock, paper, scissors = Rock(), Paper(), Scissors()

# >>> beats(paper, rock)
# <__main__.Paper at 0x103b96b00>
# >>> beats(paper, 3)
# TypeError: Unknown second thing
Delegating to the Object
class DuckRock(Rock):
    def beats(self, other):
        if isinstance(other, Rock):
            return None            # No winner
        elif isinstance(other, Paper):
            return other
        elif isinstance(other, Scissors):
            return self
        else:
            raise TypeError("Unknown second thing")

class DuckPaper(Paper):
    def beats(self, other):
        if isinstance(other, Rock):
            return self
        elif isinstance(other, Paper):
            return None            # No winner
        elif isinstance(other, Scissors):
            return other
        else:
            raise TypeError("Unknown second thing")

class DuckScissors(Scissors):
    def beats(self, other):
        if isinstance(other, Rock):
            return other
        elif isinstance(other, Paper):
            return self
        elif isinstance(other, Scissors):
            return None            # No winner
        else:
            raise TypeError("Unknown second thing")

def beats2(x, y):
    if hasattr(x, 'beats'):
        return x.beats(y)
    else:
        raise TypeError("Unknown first thing")

rock, paper, scissors = DuckRock(), DuckPaper(), DuckScissors()

# >>> beats2(rock, paper)
# <__main__.DuckPaper at 0x103b894a8>
# >>> beats2(3, rock)
# TypeError: Unknown first thing
Pattern Matching
As a final try, we can express all the logic more directly using multiple dispatch. This should be more readable, albeit there are still a number of cases to define:

from multipledispatch import dispatch

@dispatch(Rock, Rock)
def beats3(x, y): return None

@dispatch(Rock, Paper)
def beats3(x, y): return y

@dispatch(Rock, Scissors)
def beats3(x, y): return x

@dispatch(Paper, Rock)
def beats3(x, y): return x

@dispatch(Paper, Paper)
def beats3(x, y): return None

@dispatch(Paper, Scissors)
def beats3(x, y): return y    # scissors cut paper

@dispatch(Scissors, Rock)
def beats3(x, y): return y

@dispatch(Scissors, Paper)
def beats3(x, y): return x

@dispatch(Scissors, Scissors)
def beats3(x, y): return None

@dispatch(object, object)
def beats3(x, y):
    if not isinstance(x, (Rock, Paper, Scissors)):
        raise TypeError("Unknown first thing")
    else:
        raise TypeError("Unknown second thing")
Predicate-Based Dispatch
A really exotic approach to expressing conditionals as dispatch decisions is to include predicates directly within the function signatures (or perhaps within decorators on them, as with multipledispatch). I do not know of any well-maintained Python library that does this, but let us simply stipulate a hypothetical library briefly to illustrate the concept. This imaginary library might be aptly named predicative_dispatch:

from predicative_dispatch import predicate

@predicate(lambda x: x < 0, lambda y: True)
def sign(x, y):
    print("x is negative; y is", y)

@predicate(lambda x: x == 0, lambda y: True)
def sign(x, y):
    print("x is zero; y is", y)

@predicate(lambda x: x > 0, lambda y: True)
def sign(x, y):
    print("x is positive; y is", y)
Lazy Evaluation
This report is not the place to try to teach Haskell, but the classic Haskell definition of the primes sequence contains a comprehension, which is in fact the model that Python used in introducing its own comprehensions. There is also deep recursion involved, which is not going to work in Python.

Apart from syntactic differences, or even the ability to recurse to indefinite depth, the significant difference here is that the Haskell version of primes is an actual (infinite) sequence, not just an object capable of sequentially producing elements (as was the primes object we demonstrated in the chapter entitled "Callables"). In particular, you can index into an arbitrary element of the infinite list of primes in Haskell, and the intermediate values will be produced internally as needed based on the syntactic construction of the list itself.
Mind you, one can replicate this in Python too, it just isn't in the inherent syntax of the language and takes more manual construction. Given the get_primes() generator function discussed earlier, we might write our own container to simulate the same thing, for example:

from collections.abc import Sequence

class ExpandingSequence(Sequence):
    def __init__(self, it):
        self.it = it
        self._cache = []
    def __getitem__(self, index):
        while len(self._cache) <= index:
            self._cache.append(next(self.it))
        return self._cache[index]
    def __len__(self):
        return len(self._cache)
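Such a container can wrap any infinite iterator; here is a self-contained usage sketch around itertools.count() (illustrative, not from the original text):

```python
from collections.abc import Sequence
from itertools import count

class ExpandingSequence(Sequence):
    def __init__(self, it):
        self.it = it
        self._cache = []
    def __getitem__(self, index):
        # Realize elements lazily, only up to the index requested
        while len(self._cache) <= index:
            self._cache.append(next(self.it))
        return self._cache[index]
    def __len__(self):
        return len(self._cache)

evens = ExpandingSequence(count(0, 2))
evens[5]      # realizes elements 0..5 on demand; value is 10
len(evens)    # reports only what has been realized so far: 6
```

Indexing drives the underlying generator exactly far enough, and len() reflects how much of the infinite sequence has been realized rather than any fixed size.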
The Iterator Protocol
The above remarks are a bit abstract, so let us look at a few concrete examples:

>>> lazy = open('06-laziness.md')   # iterate over lines of file
>>> '__iter__' in dir(lazy) and '__next__' in dir(lazy)
True
>>> plus1 = map(lambda x: x+1, range(10))
>>> plus1                           # iterate over deferred computations
<map at 0x103b002b0>
>>> '__iter__' in dir(plus1) and '__next__' in dir(plus1)
True
>>> def to10():
...     for i in range(10):
...         yield i
...
>>> '__iter__' in dir(to10)
False
>>> '__iter__' in dir(to10()) and '__next__' in dir(to10())
True
>>> l = [1, 2, 3]
>>> '__iter__' in dir(l)
True
>>> '__next__' in dir(l)
False
>>> li = iter(l)                    # iterate over concrete collection
>>> li
<list_iterator at 0x103b11278>
>>> li == iter(li)
True
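The session that follows exercises a stateful Fibonacci iterator that also keeps a running sum; the class itself is not shown here, but a definition consistent with the outputs would be (an assumed sketch):

```python
class Fibonacci:
    "Iterator over Fibonacci numbers that also tracks a running sum"
    def __init__(self):
        self.a, self.b = 0, 1
        self.total = 0
    def __iter__(self):
        return self
    def __next__(self):
        self.a, self.b = self.b, self.a + self.b
        self.total += self.a
        return self.a
    def running_sum(self):
        return self.total
```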
>>> fib = Fibonacci()
>>> fib.running_sum()
0
>>> for _, i in zip(range(10), fib):
...     print(i, end=" ")
...
1 1 2 3 5 8 13 21 34 55
>>> fib.running_sum()
143
>>> next(fib)
89
Module: itertools
The module itertools is a collection of very powerful (and carefully designed) functions for performing "iterator algebra." That is, these allow you to combine iterators in sophisticated ways without having to concretely instantiate anything more than is currently required. As well as the basic functions in the module itself, the module documentation provides a number of short, but easy to get subtly wrong, recipes for additional functions that each utilize two or three of the basic functions in combination. The third-party module more_itertools mentioned in the Preface provides additional functions that are likewise designed to avoid common pitfalls and edge cases.

The basic goal of using the building blocks inside itertools is to avoid performing computations before they are required, to avoid the memory requirements of a large instantiated collection, to avoid potentially slow I/O until it is strictly required, and so on. Iterators are lazy sequences rather than realized collections, and when combined with functions or recipes in itertools they retain this property.
Here is a quick example of combining a few things. Rather than the stateful Fibonacci class to let us keep a running sum, we might simply create a single lazy iterator to generate both the current number and this sum:

>>> def fibonacci():
...     a, b = 1, 1
...     while True:
...         yield a
...         a, b = b, a+b
...
>>> from itertools import tee, accumulate
>>> s, t = tee(fibonacci())
>>> pairs = zip(t, accumulate(s))
>>> for _, (fib, total) in zip(range(7), pairs):
...     print(fib, total)
...
1 1
1 2
2 4
3 7
5 12
8 20
13 33
Chaining Iterables
The itertools.chain() and itertools.chain.from_iterable() functions combine multiple iterables. Built-in zip() and itertools.zip_longest() also do this, of course, but in manners that allow incremental advancement through the iterables. A consequence of this is that while chaining infinite iterables is valid syntactically and semantically, no actual program will exhaust the earlier iterable. For example:

from itertools import chain, count
thrice_to_inf = chain(count(), count(), count())
Higher-Order Functions
In the last chapter we saw an iterator algebra that builds on the itertools module. In some ways, higher-order functions (often abbreviated as HOFs) provide similar building blocks to express complex concepts by combining simpler functions into new functions. In general, a higher-order function is simply a function that takes one or more functions as arguments and/or produces a function as a result. Many interesting abstractions are available here. They allow chaining and combining higher-order functions in a manner analogous to how we can combine functions in itertools to produce new iterables.

A few useful higher-order functions are contained in the functools module, and a few others are built-ins. It is common to think of map(), filter(), and functools.reduce() as the most basic building blocks of higher-order functions, and most functional programming languages use these functions as their primitives (occasionally under other names). Almost as basic as map/filter/reduce as a building block is currying. In Python, currying is spelled as partial(), and is contained in the functools module; this is a function that will take another function, along with zero or more arguments to pre-fill, and return a function of fewer arguments that operates as the input function would when those arguments are passed to it.
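For instance, a minimal sketch of partial() in use (the power() function is illustrative, not from the text):

```python
from functools import partial

def power(base, exponent):
    return base ** exponent

# partial() pre-fills arguments, returning a function of the rest
square = partial(power, exponent=2)
cube = partial(power, exponent=3)

square(4)   # power(4, exponent=2) == 16
cube(2)     # power(2, exponent=3) == 8
```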
The built-in functions map() and filter() are equivalent to comprehensions (especially now that generator comprehensions are available) and most Python programmers find the comprehension versions more readable. For example, here are some (almost) equivalent pairs:

# Classic "FP-style"
transformed = map(transformation, iterator)
# Comprehension
transformed = (transformation(x) for x in iterator)

# Classic "FP-style"
filtered = filter(predicate, iterator)
# Comprehension
filtered = (x for x in iterator if predicate(x))
It may or may not be obvious that map() and filter() are also special cases of reduce(). That is:

>>> from functools import reduce
>>> add5 = lambda n: n + 5
>>> reduce(lambda l, x: l + [add5(x)], range(10), [])   # simple map()
[5, 6, 7, 8, 9, 10, 11, 12, 13, 14]
>>> isOdd = lambda n: n % 2
>>> reduce(lambda l, x: l + [x] if isOdd(x) else l, range(10), [])   # simple filter()
[1, 3, 5, 7, 9]
The library toolz has what might be a more general version of this called juxt() that creates a function that calls several functions with the same arguments.
The utility higher-order functions shown here are just a small selection to illustrate composability. Look at a longer text on functional programming (or, for example, read the Haskell prelude) for many other ideas on useful utility higher-order functions.
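A typical utility HOF of this sort is function composition; a minimal sketch (illustrative, not from the text):

```python
from functools import reduce

def compose(*funcs):
    "Return a function applying funcs right-to-left, like mathematical composition"
    def inner(x):
        return reduce(lambda value, func: func(value), reversed(funcs), x)
    return inner

times2 = lambda x: x * 2
minus3 = lambda x: x - 3

f = compose(times2, minus3)   # f(x) == times2(minus3(x))
f(7)                          # times2(4) == 8
```

Note that the order of composition matters: compose(minus3, times2) is a different function from compose(times2, minus3).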
Decorators
Although it is (by design) easy to forget it, probably the most common use of higher-order functions in Python is as decorators. A decorator is just syntax sugar that takes a function as an argument, and if it is programmed correctly, returns a new function that is in some way an enhancement of the original function (or method, or class). Just to remind readers, these two snippets of code defining some_func and other_func are equivalent:

@enhanced
def some_func(*args):
    pass

def other_func(*args):
    pass
other_func = enhanced(other_func)
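The enhanced decorator above is a placeholder; a minimal concrete decorator along these lines (a hypothetical sketch, using functools.wraps to preserve the wrapped function's metadata) might be:

```python
import functools

def enhanced(func):
    "A toy decorator: announces each call, then delegates to the wrapped function"
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        print("calling", func.__name__)
        return func(*args, **kwargs)
    return wrapper

@enhanced
def some_func(*args):
    return args
```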
Decorators serve many useful and worthwhile purposes, but they are also more in the spirit of making the plumbing of Python programming easier in a general, almost syntactic, way rather than the composable higher-order functions this chapter focuses on.

Decorators in general are more useful when you want to poke into the guts of a function than when you want to treat it as a pluggable component in a flow or composition of functions, often done to mark the purpose or capabilities of a particular function.
This report has given only a glimpse into some techniques for programming Python in a more functional style, and only some suggestions as to the advantages one often finds in aspiring in that direction. Programs that use functional programming are usually shorter than more traditional imperative ones, but much more importantly, they are also usually both more composable and more provably correct. A large class of difficult-to-debug errors in program logic are avoided by writing functions without side effects, and even more errors are avoided by writing small units of functionality whose operation can be understood and tested more reliably.

A rich literature on functional programming as a general technique (often in particular languages which are not Python) is available and well respected. Studying one of many such classic books, some published by O'Reilly (including very nice video training on functional programming in Python), can give readers further insight into the nitty-gritty of functional programming techniques. Almost everything one might do in a more purely functional language can be done with very little adjustment in Python as well.