But why...?

We made some design choices that might appear dubious. This page documents them and explains the technical trade-offs that led to them.

Why is the task decorator so awkward?

If you want to define a Task through a decorator, this is the only way to do it:

from conatus.tasks import task
from my_pkg import CustomTask, custom_actions

@task(using=CustomTask, actions=custom_actions)
def fn(x: int) -> int:
    ...

But why can't we just do this instead?

@CustomTask
def custom_task_fn(x: int) -> int:
    ...

# Or, with arguments:
@CustomTask(actions=custom_actions)
def custom_task_fn2(x: int) -> int:
    ...

The problem

The reason has to do with __call__. When you use a decorator that is a class, either:

  1. The decorator is passed with no arguments, and the function is passed as an argument to __new__. The type of custom_task_fn here would be the output type of CustomTask.__new__, which, presumably, would be CustomTask.
  2. The decorator is passed with arguments, and the function is passed as an argument to __call__. The type of custom_task_fn2 here would be the output type of CustomTask.__call__. (Both forms are desugared in the sketch after this list.)
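
In plain terms, Python treats the two decorator forms like this:

# @CustomTask: the function is passed directly to the class
custom_task_fn = CustomTask(custom_task_fn)

# @CustomTask(actions=custom_actions): the class is instantiated first,
# and the resulting instance is then called with the function
custom_task_fn2 = CustomTask(actions=custom_actions)(custom_task_fn2)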

Simple, in appearance: we just need to overload CustomTask.__call__:

  1. If a function is passed to it, it returns a CustomTask instance.
  2. Otherwise, we assume the user wants to call the CustomTask instance like a function. We return the expected type of the underlying function (int in this case).

It would look like this:

# CustomTask.__new__ returns a CustomTask
custom_task_instance: CustomTask = CustomTask(actions=custom_actions)

# CustomTask.__call__ with a function argument
my_custom_task_fn: CustomTask = custom_task_instance(custom_task_fn)

# CustomTask.__call__ with the arguments of the underlying function
result: int = my_custom_task_fn(x=5)

But overloading __call__ properly (so that it can emulate the function call or serve as a decorator) is immensely complicated. Among other factors:

  • Putting P.args and P.kwargs in the function signature creates many constraints for the other arguments. (For instance, you cannot write something like *args: P.args | None, because ParamSpec arguments need to be unencumbered in function signatures.)
  • Having a __call__ function that can potentially endlessly replicate itself seems to create infinite loops in type inference (in some implementations, Pyright takes forever).
  • These problems are compounded if you want to allow the user to use the decorator either with or without arguments; but even a more restrictive version (where the class can be used as a decorator only without arguments) is still subject to the aforementioned issues.
  • In general, type inference rarely works properly. One limitation shared by all the options I tried, regardless of how well they performed, was that they forced the user to call their decorated functions with keyword arguments only. See why below:
from __future__ import annotations

from collections.abc import Callable
from typing import Generic, ParamSpec, TypeVar, cast, overload, reveal_type

P = ParamSpec("P")
R = TypeVar("R")
P2 = ParamSpec("P2")
R2 = TypeVar("R2")

ParamType = object | None


class Decorator(Generic[P, R]):
    func: Callable[P, R]

    def __init__(
        self, func: Callable[P, R] | None = None, **kwargs: ParamType
    ) -> None:
        """Initialize the decorator.

        Args:
            func: The function to decorate.
            **kwargs: Additional keyword arguments.
        """
        print(f"kwargs passed: {kwargs}")
        if func is not None and callable(func):
            self.func = func

    @overload # (1)!
    def __call__(self, func: Callable[P2, R2], /) -> Decorator[P2, R2]: ...

    @overload
    def __call__(
        self, func: None = None, *args: P.args, **kwargs: P.kwargs
    ) -> R: ...

    def __call__(  # pyright: ignore[reportInconsistentOverload]
        self,
        func: Callable[P2, R2] | None = None,
        *args: P.args,
        **kwargs: P.kwargs,
    ) -> Decorator[P2, R2] | R:
        """Single implementation that covers both cases."""
        if func is not None:
            self.func = cast("Callable[P, R]", func) # (2)!
            return cast("Decorator[P2, R2]", self)
        return self._run_decorated(*args, **kwargs)

    def _run_decorated(self, *args: P.args, **kwargs: P.kwargs) -> R:
        print("calling the function with real args")
        return self.func(*args, **kwargs)


@Decorator
def foo(x: int) -> str:
    return str(x)


reveal_type(foo)  # Shows: Decorator[(x: int), str]
foo_result = foo(x=10)
reveal_type(foo_result)  # Shows: str
foo_result_no_kwargs = foo(10) # (3)!
reveal_type(foo_result_no_kwargs)  # Shows: Decorator[..., Unknown]


@Decorator()
def zig(param: float) -> str:
    return f"zig {param}"

reveal_type(zig)  # Shows: Decorator[(param: float), str]
zig_result = zig(param=10)
reveal_type(zig_result)  # Shows: str
zig_result_no_kwargs = zig(10)
reveal_type(zig_result_no_kwargs)  # Shows: Decorator[..., Unknown]
zig_result_passing_none_first = zig(None, 10) # (4)!
reveal_type(zig_result_passing_none_first)  # Shows: str
  1. You can see here that we have two overloads: one that takes a function and returns a decorator, and one that takes no function and returns the result of the decorated function. But in order to distinguish between the two, we need to require the func argument to be passed as a positional-only argument.

  2. Unorthodox, but fine.

  3. You can see here that type inference fails: the checker expects the first positional argument to be None or a function, so it will not accept that 10 here is meant for x.

  4. This is fine though! But not what we want.

The choice

This means I've found no way to create a two-headed __call__ that plays nice with type inference. This leads to the following trade-off:

  • Option 1: We make the class decorator "clean", meaning that __call__ can only be used as a decorator, not as a function call. Users would then have to explicitly call something like Task.run to run the task (see the sketch after this list). But this would go against the Pythonic aesthetic of Conatus: why would you even decorate a function if you can't use it like a function later on?
  • Option 2: We constrain the user to a specific, more awkward syntax: the task(using=...) form shown at the top of this page. Given the importance of keeping a Pythonic aesthetic, this is the option we've chosen.
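
For illustration, Option 1 would have looked roughly like this (a hypothetical sketch; run stands in for whatever explicit method, like Task.run, would expose the underlying function call):

# Hypothetical: __call__ is reserved for decoration only
@CustomTask(actions=custom_actions)
def fn(x: int) -> int:
    ...

result = fn.run(x=5)  # an explicit .run(...) instead of the natural fn(x=5)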

Considered alternative

For the record, an alternative considered was one where we split the decorator logic between two classes:

  • FnDecorator - The class that would be used to decorate the function. In this class, __call__ would be used to process the function.
  • DecoratedFn - The class that would be a wrapper around the decorated function. In this class, __call__ would behave like the function itself.

This implementation passes muster with pyright, but not with mypy. The reason is that we essentially hijack __new__ and potentially return an object that is not of type Self. Not only does mypy throw an error when that happens; its type inference also essentially refuses to process __call__ with that other type.

Pyright-friendly alternative
from functools import wraps
from typing import (
    Generic,
    ParamSpec,
    Protocol,
    Self,
    TypeVar,
    overload,
    reveal_type,
)

# Type variables for generic parameters and return type
P = ParamSpec("P")
R = TypeVar("R")
R_co = TypeVar("R_co", covariant=True)
T = TypeVar("T")

ParamType = object | None


# Protocol for callable objects
class Callable_P_R(Protocol[P, R_co]):
    def __call__(self, *args: P.args, **kwargs: P.kwargs) -> R_co: ...


class DecoratedFn(Generic[T, P, R]):
    """
    A decorated function that maintains type information about
    its decorator, parameters, and return type.
    """

    def __init__(
        self,
        decorator: type[T],
        fn: Callable_P_R[P, R],
    ):
        self.decorator: type[T] = decorator
        self.fn: Callable_P_R[P, R] = fn

        # Preserve the original function's metadata
        _ = wraps(fn)(self)

    def __call__(self, *args: P.args, **kwargs: P.kwargs) -> R:
        """Calls the underlying function with the decorator's logic applied.
        Maintains full type safety for parameters and return type.
        """
        return self.fn(*args, **kwargs)


class FnDecorator:
    """A base class for creating function decorators
    that can be used with or without arguments
    and maintains type information.
    """

    @overload
    def __new__(cls, func: None = None, **kwargs: ParamType) -> Self: ...

    @overload
    def __new__(
        cls, func: Callable_P_R[P, R], /
    ) -> DecoratedFn[Self, P, R]: ...

    def __new__(
        cls, func: Callable_P_R[P, R] | None = None, **kwargs: ParamType
    ) -> Self | DecoratedFn[Self, P, R]:
        if func is None:
            return super().__new__(cls)
        return DecoratedFn[Self, P, R](cls, func)

    def __call__(self, fn: Callable_P_R[P, R]) -> DecoratedFn[Self, P, R]:
        return self._decorate(fn)

    def _decorate(self, fn: Callable_P_R[P, R]) -> DecoratedFn[Self, P, R]:
        """
        Creates a DecoratedFn instance with the given function and
        decorator arguments.
        """
        return DecoratedFn[Self, P, R](type(self), fn)


# Example usage
if __name__ == "__main__":
    # As a simple decorator
    @FnDecorator
    def greet(name: str) -> str:
        return f"Hello, {name}!"

    # As a decorator with arguments
    @FnDecorator(prefix="DEBUG")
    def add(a: int, b: int) -> int:
        return a + b

    # As a direct function call
    def multiply(a: int, b: int) -> int:
        return a * b

    decorated_multiply = FnDecorator(multiply)

    reveal_type(greet)
    reveal_type(add)
    reveal_type(decorated_multiply)

    reveal_type(greet("alice"))

    # Test the decorated functions
    print(greet("Alice"))  # Type-safe, expects str, returns str
    print(add(2, 3))  # Type-safe, expects int, returns int
    print(decorated_multiply(4, 5))  # Type-safe, expects int, returns int