
Commit 7586376

vdusek and claude committed
fix(test): fix flaky adaptive playwright crawler timeout test
On slow Windows CI, Playwright browser startup exceeded the 10 s relaxed timeout, causing the browser fallback to fail. BasicCrawler then retried the request, and the retry's static crawl succeeded (using the mutated timeout), calling the static handler a second time.

Fix: increase the relaxed timeout to 120 s for slow CI, and set max_request_retries=0, since the test validates the fallback, not retries.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
1 parent f9a8ea3 commit 7586376
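The double invocation described in the commit message can be sketched with a hypothetical mini-model (plain Python, not the crawlee API; the timeout values and `attempt`/`run` names are illustrative). The key detail is that the attempt's time budget is captured before the handler runs, so relaxing the shared timeout inside the handler only helps the *next* attempt, which is why the retry's static crawl succeeded and the static handler fired twice:

```python
calls = {"static": 0}

class Crawler:
    """Toy stand-in for BasicCrawler's retry loop (hypothetical, for illustration)."""

    def __init__(self, max_request_retries: int) -> None:
        self.max_request_retries = max_request_retries
        self.timeout = 10.0  # the old 10 s relaxed timeout (seconds)

    def attempt(self) -> bool:
        budget = self.timeout   # budget is captured up front, like wait_for does
        calls["static"] += 1    # static crawl runs and calls the static handler
        self.timeout = 120.0    # handler relaxes the shared timeout (too late for this attempt)
        return 30.0 < budget    # browser startup (~30 s on slow CI) must fit the captured budget

    def run(self) -> None:
        # One initial attempt plus max_request_retries retries; stop on success.
        for _ in range(1 + self.max_request_retries):
            if self.attempt():
                return

# With one retry, the failed first attempt is retried under the mutated
# timeout, and the static handler is invoked a second time.
Crawler(max_request_retries=1).run()
assert calls["static"] == 2

# With zero retries, the request fails once and the handler fires exactly once.
calls["static"] = 0
Crawler(max_request_retries=0).run()
assert calls["static"] == 1
```

This is why the fix pins `max_request_retries=0`: the test asserts each mocked handler is called exactly once, and any retry breaks that invariant.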

File tree

1 file changed: +3 −3 lines changed


tests/unit/crawlers/_adaptive_playwright/test_adaptive_playwright_crawler.py

Lines changed: 3 additions & 3 deletions
@@ -579,7 +579,7 @@ async def test_adaptive_playwright_crawler_timeout_in_sub_crawler(test_urls: lis
     request_handler_timeout = timedelta(seconds=1)

     crawler = AdaptivePlaywrightCrawler.with_beautifulsoup_static_parser(
-        max_request_retries=1,
+        max_request_retries=0,
         rendering_type_predictor=static_only_predictor_no_detection,
         request_handler_timeout=request_handler_timeout,
     )
@@ -594,8 +594,8 @@ async def request_handler(context: AdaptivePlaywrightCrawlingContext) -> None:
             mocked_browser_handler()
         except AdaptiveContextError:
             mocked_static_handler()
-            # Relax timeout for the fallback browser request to avoid flakiness in test
-            crawler._request_handler_timeout = timedelta(seconds=10)
+            # Relax timeout for the fallback browser request to allow for slow browser startup on CI
+            crawler._request_handler_timeout = timedelta(seconds=120)
         # Sleep for time obviously larger than top crawler timeout.
         await asyncio.sleep(request_handler_timeout.total_seconds() * 3)
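The timeout mutation in the diff works because the top-level timeout is applied when the handler is scheduled, not re-read mid-flight. A minimal standalone sketch of that behavior, assuming a `wait_for`-style wrapper (the `MiniCrawler` class and handler here are hypothetical, not crawlee code):

```python
import asyncio

class MiniCrawler:
    """Toy crawler: runs a handler under a mutable per-request timeout."""

    def __init__(self) -> None:
        self.request_handler_timeout = 0.05  # strict default budget (seconds)

    async def run(self, handler) -> str:
        try:
            # wait_for captures the timeout value at call time.
            await asyncio.wait_for(handler(self), self.request_handler_timeout)
            return "ok"
        except asyncio.TimeoutError:
            return "timed out"

async def handler(crawler: MiniCrawler) -> None:
    # Relax the shared timeout for subsequent requests (as the test does)...
    crawler.request_handler_timeout = 2.0
    # ...but this attempt still runs under the 0.05 s budget captured above.
    await asyncio.sleep(0.2)

result = asyncio.run(MiniCrawler().run(handler))
assert result == "timed out"
```

The current attempt still times out under the old budget; only a later attempt sees the relaxed value. That is exactly the interaction the commit removes from the test by disabling retries and widening the relaxed timeout to 120 s.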
0 commit comments
