Source From Here
Preface
Take your Python functions to the next level with functools (notebook of this article)
Last month, I wrote an article about some modules in the Python standard library that I have found incredibly useful in my programming and Data Science career. The article received such great feedback that I decided to double down on the idea and write another piece on the same topic, covering many more standard library tools that everyone should be familiar with. It turns out that Python's standard library is actually quite comprehensive and includes great tools for all sorts of programming challenges. If you would like to read either of those articles, you can check them out here:
When going over so many tools at once, it seemed like some of them deserved an entire article of their own rather than the brief overview the format allowed. The tool that most gave me that impression is the functools module. It is an incredibly powerful module that can improve nearly any function in Python using simple, classic techniques, such as trading memory for processor time. While in some circumstances that trade-off can be a real drawback, there are certainly plenty of cases where it pays off.
Cache
Probably the coolest thing that the functools module provides is the ability to cache certain calculations inside of memory rather than throwing them away just to recalculate them later.
This is a great way to save processing time, especially if you find yourself in a time-out scenario, for example a coding challenge where your Python code cannot finish within the allotted time. While this comes with the trade-off of using more memory, it can certainly make sense in many different situations. Python is a high-level, garbage-collected language, and usually the interpreter handles all of the memory management for us. While this is less efficient than managing memory by hand, it also removes a lot of the hassle of allocation and deallocation. Using functools, we can take back a little of that control by deciding ourselves which results stay in memory and which get recalculated.
What is great about the caching that functools provides is that it is both easy to use and gives you better control over what the interpreter keeps around for your code. Taking advantage of this feature is as simple as applying a decorator above your function. Here is a factorial example that I think shows off quite well what it can accomplish:
import sys
from functools import lru_cache

# Raise the recursion limit so deep recursive calls do not fail.
sys.setrecursionlimit(100000)

# Plain recursive factorial: every call recomputes everything from scratch.
def factorial(n):
    return n * factorial(n - 1) if n else 1

# Cached version: previously computed results are stored and reused.
@lru_cache
def factorial_with_cache(n):
    return n * factorial_with_cache(n - 1) if n else 1
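As a quick usage sketch (not from the original article), you can call the cached version defined above and inspect the statistics that lru_cache exposes through cache_info():

# First call populates the cache with every intermediate result.
factorial_with_cache(500)

# A second call for any value at or below 500 is answered from the cache.
factorial_with_cache(499)

# cache_info() reports hits, misses, and the current cache size.
print(factorial_with_cache.cache_info())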
That being said, if you are going to be using recursion like this, it might be a great idea to start getting familiar with functools. This standard library tool can bring a lot of speed to problems that are typically very slow for Python to solve. In a way, it reminds me of the Numba Python compiler, where one simple decorator instantly makes your code faster. If you would like to read the article I wrote all about that a while back, you can check it out here:
Key Functions
Has there ever been a time when you really wanted to reuse a function from some really old Python code, but the function was written as an old-style comparison function? Comparison functions are no longer supported by sorting in modern Python 3, which expects a key function instead, and translating one form into the other by hand can be tedious. That being said, functools is here to easily fix that with another simple call, cmp_to_key.
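The original section does not include code here, so the following is a minimal sketch of cmp_to_key; the compare_length function is just an illustrative example:

from functools import cmp_to_key

# Old-style comparison function: returns a negative, zero, or positive number.
def compare_length(a, b):
    return len(a) - len(b)

# cmp_to_key wraps it into a key function that sorted() understands.
words = ["banana", "fig", "apple"]
print(sorted(words, key=cmp_to_key(compare_length)))  # ['fig', 'apple', 'banana']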
Partial
The partial function returns a new partial object which, when called, behaves like the original func with some of its positional and keyword arguments already filled in. Inside the standard library, the implementation looks roughly like this:
def partial(func, /, *args, **keywords):
    # Build a new function that remembers the pre-supplied arguments.
    def newfunc(*fargs, **fkeywords):
        newkeywords = {**keywords, **fkeywords}
        return func(*args, *fargs, **newkeywords)
    newfunc.func = func
    newfunc.args = args
    newfunc.keywords = keywords
    return newfunc
from functools import partial

def add(a, b):
    return a + b

# Pre-fill b=1; calling add_one(5) is equivalent to add(5, 1).
add_one = partial(add, b=1)
add_one(5)  # Returns 6
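As a small follow-up sketch (my own addition, not from the original article), partial is especially handy when an API expects a one-argument callable, for example map():

from functools import partial

# int() accepts a base keyword; pre-filling base=2 gives a one-argument
# binary-string parser that map() can use directly.
parse_binary = partial(int, base=2)
print(list(map(parse_binary, ["10", "101", "111"])))  # [2, 5, 7]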
Reduce
The reduce function applies a function of two arguments cumulatively to the items of an iterable, collapsing it down to a single value. With that in mind, the second argument needs to be an iterable, such as a list or a range. Here is a simple example that sums a short list:
from functools import reduce

# ((((1 + 2) + 3) + 4) + 5) = 15
reduce(lambda x, y: x + y, [1, 2, 3, 4, 5])  # Output: 15
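Since the paragraph above mentions range, here is the same reduction written over a range object, plus the optional initializer argument (a small illustrative sketch):

from functools import reduce

# range(1, 6) yields 1 through 5, so this is the same sum as above.
reduce(lambda x, y: x + y, range(1, 6))  # 15

# An optional third argument supplies a starting value for the accumulation.
reduce(lambda x, y: x + y, range(1, 6), 100)  # 115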
Dispatch
Okay — so this is really, really cool.
Readers of my blog will likely note that I am a huge fan of the Julia programming language. This is not only because Julia is an awesome, high-performance scientific programming language, but also because I have a particular attraction to multiple dispatch. It is a concept I genuinely miss when it is not around, and it makes programming feel almost more natural. In a way, it is great to have functions be more type-based, rather than writing different calls to handle different situations.
The functools module gives Python at least a taste of this: the singledispatch decorator selects an implementation based on the type of the first argument. Below is an example of using the singledispatch decorator:
from functools import singledispatch

# The base implementation is used when no registered type matches.
@singledispatch
def add(a, b):
    raise NotImplementedError('Unsupported type')

# Implementation chosen when the first argument is an int.
@add.register(int)
def _(a, b):
    print("First argument is of type ", type(a))
    print(a + b)

# Implementation chosen when the first argument is a str.
@add.register(str)
def _(a, b):
    print("First argument is of type ", type(a))
    print(a + b)

# Implementation chosen when the first argument is a list.
@add.register(list)
def _(a, b):
    print("First argument is of type ", type(a))
    print(a + b)
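To see the dispatch in action, here is a short usage sketch that assumes the registrations above have been defined:

add(1, 2)                # int implementation: prints the type and 3
add("Hello, ", "world")  # str implementation: prints the type and the joined string
add([1, 2], [3, 4])      # list implementation: prints the type and the combined list

# No handler is registered for float, so the base implementation raises.
try:
    add(1.5, 2.0)
except NotImplementedError as err:
    print(err)  # Unsupported type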
Supplement
* Python 3 – Function Overloading with singledispatch