I tried to rewrite this to allow for tail-call optimization (TCO). I believe this code should have been successful if TCO had taken place.
    def trisum(n, csum):
        if n == 0:
            return csum
        else:
            return trisum(n - 1, csum + n)

    print(trisum(1000, 0))
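Run as posted, the call does in fact fail under CPython's default settings, which is what motivates the question. A quick illustrative check (pinning the limit explicitly so the behaviour is reproducible):

```python
import sys

def trisum(n, csum):
    if n == 0:
        return csum
    else:
        return trisum(n - 1, csum + n)

# CPython's default recursion limit is 1000; trisum(1000, 0) needs
# 1001 nested frames, so the base case is never reached.
sys.setrecursionlimit(1000)
try:
    print(trisum(1000, 0))
except RecursionError as e:
    print("RecursionError:", e)
```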
Should I conclude that Python does not do any type of TCO, or do I just need to define it differently?
Answer 1
No, and it never will, since Guido van Rossum prefers to be able to have proper tracebacks:
You can manually eliminate the recursion with a transformation like this:
    >>> def trisum(n, csum):
    ...     while True:                    # Change recursion to a while loop
    ...         if n == 0:
    ...             return csum
    ...         n, csum = n - 1, csum + n  # Update parameters instead of tail recursion
    >>> trisum(1000, 0)
    500500
Answer 2
I have published a module performing tail-call optimization (handling both tail recursion and continuation-passing style): https://github.com/baruchel/tco
Optimizing tail-recursion in Python
It has often been claimed that tail recursion doesn't suit the Pythonic way of coding and that one shouldn't care about how to embed it in a loop. I don't want to argue with this point of view; sometimes, however, I like trying or implementing new ideas as tail-recursive functions rather than with loops for various reasons (focusing on the idea rather than on the process, having twenty short functions on my screen at the same time rather than only three "Pythonic" functions, working in an interactive session rather than editing my code, and so on).
Optimizing tail-recursion in Python is in fact quite easy. While it is said to be impossible or very tricky, I think it can be achieved with elegant, short and general solutions; I even think that most of these solutions don't use Python features otherwise than they should. Clean lambda expressions working along with very standard loops lead to quick, efficient and fully usable tools for implementing tail-recursion optimization.
As a personal convenience, I wrote a small module implementing such an optimization in two different ways. I would like to discuss my two main functions here.
The clean way: modifying the Y combinator
The Y combinator is well known; it allows lambda functions to be used in a recursive manner, but it doesn't by itself allow recursive calls to be embedded in a loop. Lambda calculus alone can't do such a thing. A slight change in the Y combinator, however, can protect the recursive call from actually being evaluated. Evaluation can thus be delayed.
Here is the famous expression for the Y combinator:
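In Python's lambda notation it is usually written in the following standard form (the applicative-order variant; the modified `bet` version below differs only in wrapping the recursive call in a `lambda:` thunk):

```python
# The Y combinator: Y(f) produces the fixed point of f, letting an
# otherwise anonymous lambda call itself through the `self` argument.
Y = lambda f: (lambda x: x(x))(lambda y: f(lambda *args: y(y)(*args)))

# Example: a recursive factorial built from a non-recursive lambda.
fact = Y(lambda self: lambda n: 1 if n == 0 else n * self(n - 1))
print(fact(5))  # → 120
```

Note that this plain form still consumes one stack frame per call; it is the thunk-wrapping change below that makes the loop possible.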
Instead of calling itself, the function f now returns a function performing the very same call, but since it returns it, the evaluation can be done later from outside.
My code is:
    def bet(func):
        b = (lambda f: (lambda x: x(x))(lambda y: f(lambda *args: lambda: y(y)(*args))))(func)
        def wrapper(*args):
            out = b(*args)
            while callable(out):
                out = out()
            return out
        return wrapper
The function can be used in the following way; here are two examples with tail-recursive versions of factorial and Fibonacci:
    >>> from recursion import *
    >>> fac = bet( lambda f: lambda n, a: a if not n else f(n-1, a*n) )
    >>> fac(5, 1)
    120
    >>> fibo = bet( lambda f: lambda n, p, q: p if not n else f(n-1, q, p+q) )
    >>> fibo(10, 0, 1)
    55
Obviously recursion depth isn't an issue any longer:
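An illustrative check (using `bet` and `fac` as defined above): with the trampoline in place, depths far beyond the interpreter's recursion limit complete without error, since each step runs in constant stack space.

```python
import sys

def bet(func):
    # Modified Y combinator: the recursive call is wrapped in a thunk,
    # so the while loop below can evaluate it step by step.
    b = (lambda f: (lambda x: x(x))(lambda y: f(lambda *args: lambda: y(y)(*args))))(func)
    def wrapper(*args):
        out = b(*args)
        while callable(out):
            out = out()
        return out
    return wrapper

fac = bet(lambda f: lambda n, a: a if not n else f(n - 1, a * n))

depth = sys.getrecursionlimit() * 10  # far beyond the normal frame limit
result = fac(depth, 1)                # completes; no RecursionError
print(result > 0)                     # → True
```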
This is of course the single real purpose of the function.
Only one thing can't be done with this optimization: it can't be used with a tail-recursive function evaluating to another function (this comes from the fact that callable returned objects are all handled as further recursive calls with no distinction). Since I usually don't need such a feature, I am very happy with the code above. However, in order to provide a more general module, I thought a little more in order to find some workaround for this issue (see next section).
Concerning the speed of this process (which isn't the real issue however), it happens to be quite good; tail-recursive functions are even evaluated much quicker than with the following code using simpler expressions:
    def bet1(func):
        def wrapper(*args):
            out = func(lambda *x: lambda: x)(*args)
            while callable(out):
                out = func(lambda *x: lambda: x)(*out())
            return out
        return wrapper
I think that evaluating one expression, even complicated, is much quicker than evaluating several simple expressions, which is the case in this second version. I didn't keep this new function in my module, and I see no circumstances where it could be used rather than the "official" one.
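As a sanity check of that comparison, here is an illustrative sketch showing that the two wrappers agree on results, with a rough relative timing (absolute numbers will vary by machine):

```python
import timeit

def bet(func):
    # "Official" version: one complicated expression built once.
    b = (lambda f: (lambda x: x(x))(lambda y: f(lambda *args: lambda: y(y)(*args))))(func)
    def wrapper(*args):
        out = b(*args)
        while callable(out):
            out = out()
        return out
    return wrapper

def bet1(func):
    # Simpler-expression version: func(...) is re-evaluated on every step.
    def wrapper(*args):
        out = func(lambda *x: lambda: x)(*args)
        while callable(out):
            out = func(lambda *x: lambda: x)(*out())
        return out
    return wrapper

spec = lambda f: lambda n, a: a if not n else f(n - 1, a * n)
fac, fac1 = bet(spec), bet1(spec)
assert fac(5, 1) == fac1(5, 1) == 120  # identical results

for name, fn in (("bet", fac), ("bet1", fac1)):
    t = timeit.timeit(lambda: fn(100, 1), number=1000)
    print(name, round(t, 3))
```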
Continuation passing style with exceptions
Here is a more general function; it is able to handle all tail-recursive functions, including those returning other functions. Recursive calls are distinguished from other return values by the use of exceptions. This solution is slower than the previous one; quicker code could probably be written by using some special values as "flags" to be detected in the main loop, but I don't like the idea of using special values or internal keywords. There is a funny interpretation of using exceptions: if Python doesn't like tail-recursive calls, an exception should be raised when a tail-recursive call does occur, and the Pythonic way will be to catch the exception in order to find some clean solution, which is actually what happens here...
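The snippet itself isn't reproduced above, but a minimal sketch consistent with this description (the name `_TailCall` and the exact structure of `bet0` are my assumptions, not necessarily the module's code: the callback raises an exception instead of recursing, and the wrapper's loop catches it) could look like this:

```python
class _TailCall(Exception):
    """Stands in for a tail-recursive call; carries the next arguments."""
    def __init__(self, args):
        self.args = args

def bet0(func):
    def signal(*args):
        # Instead of recursing, report the call to the wrapper's loop.
        raise _TailCall(args)
    f = func(signal)
    def wrapper(*args):
        while True:
            try:
                return f(*args)   # any normal value, even a function: done
            except _TailCall as e:
                args = e.args     # a tail call occurred: loop with new args
    return wrapper

# Returning a function now works, since only the exception marks recursion.
ident = bet0(lambda f: lambda n: (lambda x: x) if not n else f(n - 1))
print(ident(5)(42))  # → 42
```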
Now all functions can be used. In the following example, f(n) is evaluated to the identity function for any positive value of n:
    >>> f = bet0( lambda f: lambda n: (lambda x: x) if not n else f(n-1) )
    >>> f(5)(42)
    42
Of course, it could be argued that exceptions are not intended to be used for intentionally redirecting the interpreter (as a kind of goto statement or probably rather a kind of continuation passing style), which I have to admit. But, again, I find funny the idea of using try with a single line being a return statement: we try to return something (normal behaviour) but we can't do it because of a recursive call occurring (exception).
It can embed a lambda function written with a tail recursion style in another function which will evaluate it as a loop.
The most interesting feature of this small function, in my humble opinion, is that it doesn't rely on some dirty programming hack but on mere lambda calculus: the behaviour of the function is changed into another one when it is inserted into another lambda function which looks very much like the Y combinator.
Answer 3

I recently posted an entry in my Python History blog on the origins of Python's functional features. A side remark about not supporting tail recursion elimination (TRE) immediately sparked several comments about what a pity it is that Python doesn't do this, including links to recent blog entries by others trying to "prove" that TRE can be added to Python easily. So let me defend my position (which is that I don't want TRE in the language). If you want a short answer, it's simply unpythonic. Here's the long answer:
Answer 4
CPython does not and will probably never support tail call optimization based on Guido van Rossum's statements on the subject.
I've heard arguments that it makes debugging more difficult because of how it modifies the stack trace.