
[BUG] Do not invoke synchronous call() on LLM from asynchronous workflow in _export_output / converter #5230

@andrewisplinghoff

Description


In an asynchronous workflow started via akickoff(), no synchronous calls should be made, because they block the event loop. When a task's output needs to be converted to the desired output format, the converter module (https://github.com/crewAIInc/crewAI/blob/c14abf1758dd3aafc57ad7f17569174c7cc1ea68/lib/crewai/src/crewai/utilities/converter.py) currently makes several synchronous invocations of self.llm.call(), which blocks the event loop.
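To illustrate why this matters, here is a minimal, self-contained sketch (not crewAI code; blocking_llm_call and heartbeat are stand-ins) showing that a synchronous call made directly from a coroutine starves every other task on the loop, while offloading the same call via asyncio.to_thread keeps the loop responsive:

```python
import asyncio
import time

def blocking_llm_call() -> str:
    # Stand-in for a synchronous llm.call(); sleeps to simulate network I/O.
    time.sleep(0.2)
    return "converted"

async def heartbeat(ticks: list) -> None:
    # Other work sharing the event loop; should keep ticking during the call.
    for _ in range(10):
        ticks.append(time.monotonic())
        await asyncio.sleep(0.02)

async def main() -> tuple[int, int]:
    # Case 1: calling the sync function directly starves the loop.
    ticks_blocked: list = []
    hb = asyncio.create_task(heartbeat(ticks_blocked))
    blocking_llm_call()                 # blocks the loop for 0.2 s
    before_blocked = len(ticks_blocked)  # heartbeat never got a chance to run
    await hb

    # Case 2: offloading to a worker thread keeps the loop responsive.
    ticks_free: list = []
    hb = asyncio.create_task(heartbeat(ticks_free))
    await asyncio.to_thread(blocking_llm_call)
    before_free = len(ticks_free)        # heartbeat ran concurrently
    await hb
    return before_blocked, before_free

blocked, free = asyncio.run(main())
```

In case 1 the heartbeat records no ticks while the synchronous call runs; in case 2 it keeps ticking throughout.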

Calls from async methods here:

pydantic_output, json_output = self._export_output(result)

pydantic_output, json_output = self._export_output(

_export_output then delegates to the converter module here:

model_output = convert_to_model(

Steps to Reproduce

Use an example workflow with a Task(output_json=...) or Task(output_pydantic=...) that requires a specific output format, e.g.:
https://github.com/crewAIInc/crewAI-examples/blob/b4c0ef6522c37d4148f8372a731fd2851fdcb864/flows/self_evaluation_loop_flow/src/self_evaluation_loop_flow/crews/x_post_review_crew/x_post_review_crew.py#L32

Invoke the workflow asynchronously using akickoff().

Expected behavior

Only calls to await llm.acall() should be made from an asynchronous workflow.
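A minimal sketch of how the converter could stay non-blocking (export_output_async is a hypothetical name, not an existing crewAI function): prefer the async API when the LLM wrapper exposes acall, as the expected behavior above assumes, and otherwise run the synchronous call() in a worker thread:

```python
import asyncio
from typing import Any

async def export_output_async(llm: Any, prompt: str) -> str:
    # Hypothetical sketch: use a native async method if the LLM wrapper
    # provides one; otherwise run the blocking call() in a worker thread
    # so the event loop is never blocked.
    acall = getattr(llm, "acall", None)
    if acall is not None:
        return await acall(prompt)
    return await asyncio.to_thread(llm.call, prompt)

# Dummy LLMs to exercise both branches of the sketch.
class SyncOnlyLLM:
    def call(self, prompt: str) -> str:
        return f"sync:{prompt}"

class AsyncLLM:
    async def acall(self, prompt: str) -> str:
        return f"async:{prompt}"

result_sync = asyncio.run(export_output_async(SyncOnlyLLM(), "x"))
result_async = asyncio.run(export_output_async(AsyncLLM(), "x"))
```

The to_thread fallback only mitigates the blocking; the cleaner fix is for _export_output and convert_to_model to have async counterparts that await llm.acall() directly.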

Screenshots/Code snippets

Operating System

Debian 12 Bookworm

Python Version

3.12

crewAI Version

1.12.2

crewAI Tools Version

Virtual Environment

Venv

Evidence

Possible Solution

Additional context


Labels

bug
