Commit a534b13

Authored by: pull[bot], renovate[bot], B4nan, Sajito, janbuchar

[pull] master from apify:master (#1)
* chore(deps): lock file maintenance (several rounds)
* ci: test on Node 22 (apify#2438)
* chore: use Node 20 in templates
* chore(deps): update yarn to v4.2.1
* fix: return `true` when `robots.isAllowed` returns `undefined` (apify#2439). `undefined` means there is no explicit rule for the requested route, and no rule means no disallow, therefore the route is allowed. Fixes apify#2437. Co-authored-by: Jan Buchar <Teyras@gmail.com>
* chore(deps): update patch/minor dependencies to v3.3.0, then v3.3.2
* docs: should be "Same Domain", not "Same Subdomain" (apify#2445). The docs were misleading: users who want "Same Subdomain" should actually use "Same Hostname". ![image](https://github.com/apify/crawlee/assets/10026538/2b5452c5-e313-404b-812d-811e0764bd2d)
* chore(docker): update docker state [skip ci]
* docs: fix two typos ("array or requests" -> "array of requests", "no much" -> "not much") (apify#2451)
* fix: sitemap `content-type` check breaks on `content-type` parameters (apify#2442). According to [RFC 1341](https://www.w3.org/Protocols/rfc1341/4_Content-Type.html), the `Content-Type` header can contain additional string parameters.
* chore(deps): lock file maintenance
* fix(core): fire local `SystemInfo` events every second (apify#2454). During local development, events about current system resources (memory, CPU) are fired for the AutoscaledPool. They were fired once a minute by default, but snapshots older than 30 s are discarded, so there was never anything to compare against and only the very latest measurement was used. The interval is now 1 s, matching how the Apify platform fires these events.
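The `Content-Type` fix (apify#2442) boils down to comparing only the media type. A minimal sketch, with a hypothetical helper name that is not Crawlee's actual code:

```typescript
// Hypothetical helper illustrating the fix: RFC 1341 allows parameters after
// the media type (e.g. "application/xml; charset=utf-8"), so strip them
// before comparing against the expected sitemap content types.
function mediaTypeOf(contentType: string): string {
    return contentType.split(';')[0].trim().toLowerCase();
}
```

With this, a parameterized header such as `application/xml; charset=utf-8` reduces to `application/xml` and no longer breaks an exact-match check.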
* chore(deps): lock file maintenance (several rounds)
* chore(deps): update dependency linkedom to ^0.18.0 (apify#2457)
* chore(docker): update docker state [skip ci]
* perf: optimize adding a large number of requests via `crawler.addRequests()` (apify#2456). This resolves three main issues with adding many requests to the queue:
  - Every request added to the queue was also added to the LRU request cache, which holds up to 1 million items. That makes sense when enqueuing a few items, but adding more than the limit overloads the cache for no benefit. Now only the first 1000 requests of a bulk add are cached (plus requests added via separate calls, e.g. when doing `enqueueLinks` from inside a request handler, again limited to the first 1000 links).
  - The whole request array used to be validated via `ow`, and since the shape can vary, that was very slow (e.g. 20 s just for the `ow` validation). A tailored validation of the array now does the same work but resolves within roughly 100 ms.
  - `Request` objects were always created for everything up front, with a significant impact on memory usage. This step is now skipped entirely and the objects are created later, when needed, by `RQ.addRequests()`, which receives only the current batch rather than the whole array.

  Related: https://apify.slack.com/archives/C0L33UM7Z/p1715109984834079
* perf: improve scaling based on memory (apify#2459). Previously only 70% of the available memory could be used; the limit is now 90%. Tested with low-memory options this had no adverse effect, while on large-memory setups, where the 30% buffer could mean 2 GB or so, it frees memory that was needlessly reserved. Also increases the scaling step from 5% to 10% to speed up scaling.
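The batching idea behind apify#2456 (only the current batch is materialized, never the whole array) can be sketched as follows; `toBatches` is an illustrative helper, not the actual Crawlee internal:

```typescript
// Split a large source array into fixed-size batches lazily, so only the
// batch currently being processed needs to be materialized. This mirrors the
// described change where RQ.addRequests() receives one batch at a time
// instead of the whole request array.
function* toBatches<T>(items: readonly T[], batchSize: number): Generator<T[]> {
    for (let i = 0; i < items.length; i += batchSize) {
        yield items.slice(i, i + batchSize);
    }
}
```

Because the generator yields one slice at a time, expensive per-item work (validation, object construction) can be deferred until a batch is actually consumed.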
  Related: [apify.slack.com/archives/C0L33UM7Z/p1715109984834079](https://apify.slack.com/archives/C0L33UM7Z/p1715109984834079)
* feat: make `RequestQueue` v2 the default queue; see more on the [Apify blog](https://blog.apify.com/new-apify-request-queue/) (apify#2390). Closes apify#2388. Co-authored-by: drobnikj <drobnik.j@gmail.com>, Martin Adámek <banan23@gmail.com>
* fix: do not drop statistics on migration/resurrection/resume (apify#2462). Fixes a bug introduced with apify#1844 and apify#2083: the persisted state for statistics and the session pool was reset every time a crawler started, which prevented their restoration. Co-authored-by: Martin Adámek <banan23@gmail.com>
* chore(deps): update patch/minor dependencies (apify#2450)
* chore(docker): update docker state [skip ci]
* fix: double tier decrement in tiered proxy (apify#2468)
* docs: scrapy-vs-crawlee blog (apify#2431)
* perf: optimize `RequestList` memory footprint (apify#2466). The request list now delays converting the source items into `Request` objects, resulting in a significantly smaller memory footprint. Related: https://apify.slack.com/archives/C0L33UM7Z/p1715109984834079
* fix: `EnqueueStrategy.All` erroring on links with unsupported protocols (apify#2389). `EnqueueStrategy.All` now filters out non-http and non-https URLs (`mailto:` links were causing the crawler to error):

  ```
  Request failed and reached maximum retries.
  Error: Received one or more errors
      at _ArrayValidator.handle (/path/to/project/node_modules/@sapphire/shapeshift/src/validators/ArrayValidator.ts:102:17)
      at _ArrayValidator.parse (/path/to/project/node_modules/@sapphire/shapeshift/src/validators/BaseValidator.ts:103:2)
      at RequestQueueClient.batchAddRequests (/path/to/project/node_modules/@crawlee/src/resource-clients/request-queue.ts:340:36)
      at RequestQueue.addRequests (/path/to/project/node_modules/@crawlee/src/storages/request_provider.ts:238:46)
      at RequestQueue.addRequests (/path/to/project/node_modules/@crawlee/src/storages/request_queue.ts:304:22)
      at attemptToAddToQueueAndAddAnyUnprocessed (/path/to/project/node_modules/@crawlee/src/storages/request_provider.ts:302:42)
      at RequestQueue.addRequestsBatched (/path/to/project/node_modules/@crawlee/src/storages/request_provider.ts:319:37)
      at RequestQueue.addRequestsBatched (/path/to/project/node_modules/@crawlee/src/storages/request_queue.ts:309:22)
      at enqueueLinks (/path/to/project/node_modules/@crawlee/src/enqueue_links/enqueue_links.ts:384:2)
      at browserCrawlerEnqueueLinks (/path/to/project/node_modules/@crawlee/src/internals/browser-crawler.ts:777:21)
  ```

* fix(core): use `createSessionFunction` when loading a `Session` from persisted state (apify#2444). The session pool in the core module now uses the configured `createSessionFunction`, if one is specified, when loading new sessions, so new sessions are instantiated with the user's custom session creation logic.
* fix(core): conversion between tough cookies and browser pool cookies (apify#2443). Fixes the conversion from tough cookies to browser pool cookies and vice versa by correctly handling cookies whose domain has a leading dot versus those without one.
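The `EnqueueStrategy.All` fix (apify#2389) amounts to dropping links whose protocol is not http(s) before they reach queue validation. A hedged sketch of that logic; the helper name is made up and Crawlee's real implementation may differ:

```typescript
// Keep only http/https links. URLs such as "mailto:user@example.com" (which
// previously caused validation errors during enqueuing) are dropped, as are
// strings that do not parse as URLs at all.
function keepHttpLinks(urls: string[]): string[] {
    return urls.filter((url) => {
        try {
            const { protocol } = new URL(url);
            return protocol === 'http:' || protocol === 'https:';
        } catch {
            return false; // not a parseable URL
        }
    });
}
```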
* test: fix e2e tests for zero concurrency
* chore(deps): update dependency puppeteer to v22.8.2
* chore(docker): update docker state [skip ci]
* docs: minor fixes (apify#2469)
* chore(deps): update dependency puppeteer to v22.9.0
* feat: implement `ErrorSnapshotter` for error context capture (apify#2332). Introduces the `ErrorSnapshotter` class to the crawlee package, which captures screenshots and HTML snapshots when an error occurs during web crawling. The functionality is opt-in and can be enabled via the crawler options:

  ```ts
  const crawler = new BasicCrawler({
      // ...
      statisticsOptions: {
          saveErrorSnapshots: true,
      },
  });
  ```

  Closes apify#2280. Co-authored-by: Martin Adámek <banan23@gmail.com>
* test: fix e2e tests for error snapshotter
* feat: add `FileDownload` "crawler" (apify#2435). Adds a new package, `@crawlee/file-download`, which overrides the `HttpCrawler`'s MIME type limitations and lets users download arbitrary files. Aside from the regular `requestHandler`, this crawler introduces `streamHandler`, which passes a `ReadableStream` with the downloaded data to the user handler. Co-authored-by: Martin Adámek <banan23@gmail.com>, Jan Buchar <jan.buchar@apify.com>
* chore(release): v3.10.0
* chore(release): update internal dependencies [skip ci]
* chore(docker): update docker state [skip ci]
* docs: add v3.10 snapshot
* docs: fix a broken link to moved content
* chore(deps): lock file maintenance
* docs: improve crawlee SEO ranking (apify#2472)
* chore(deps): lock file maintenance
* refactor: remove redundant fields from `StatisticsPersistedState` (apify#2475); those fields are duplicated in the base class anyway.
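A `streamHandler` receives a readable stream of the downloaded data; one common way to consume such a stream is to collect its chunks, sketched below. The helper is illustrative only and is not part of the `@crawlee/file-download` API:

```typescript
// Collect an async-iterable byte stream (Node.js Readable streams are async
// iterable) into a single Buffer. A real streamHandler might instead pipe the
// stream straight to disk to avoid buffering large files in memory.
async function collectStream(stream: AsyncIterable<Uint8Array>): Promise<Buffer> {
    const chunks: Buffer[] = [];
    for await (const chunk of stream) {
        chunks.push(Buffer.from(chunk));
    }
    return Buffer.concat(chunks);
}
```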
* chore(deps): lock file maintenance
* fix: provide URLs to the error snapshot (apify#2482). This respects the Actor SDK override automatically, since importing the SDK fires this side effect: https://github.com/apify/apify-sdk-js/blob/master/packages/apify/src/key_value_store.ts#L25
* docs: update keywords (apify#2481)
* docs: add feedback from the community (apify#2478)
* chore: use biome for code formatting (apify#2301). This takes ~50 ms on my machine 🤯. Closes apify#2366. Replacing spaces with tabs won't be done right here, right now; eslint and biome are reconciled.
* chore(docker): update docker state [skip ci]
* test: check that the proxy tier drops after a number of successful requests (apify#2490)
* chore: ignore docker state when checking formatting (apify#2491)
* chore: remove unused eslint ignore directives
* chore: fix formatting
* chore: run biome as a pre-commit hook (apify#2493)
* fix: adjust `URL_NO_COMMAS_REGEX` regexp to allow single-character hostnames (apify#2492). Closes apify#2487.
* fix: investigate and temporarily fix a possible 0-concurrency bug in RQv2 (apify#2494)
* test: add e2e test for zero concurrency with RQ v2
* chore: update biome
* chore(docker): update docker state [skip ci]
* chore(deps): lock file maintenance (apify#2495)
* chore(release): v3.10.1
* chore(release): update internal dependencies [skip ci]
* chore: add undeclared dependency
* chore(deps): update patch/minor dependencies to v1.44.1
* chore(deps): lock file maintenance
* chore(docker): update docker state [skip ci]
* feat: loading sitemaps from a string (apify#2496). Closes apify#2460.
* docs: fix homepage gradients (apify#2500)
* fix: autodetect sitemap filetype from content (apify#2497). Closes apify#2461.
* chore(deps): update dependency puppeteer to v22.10.0
* chore(deps): lock file maintenance

Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>, Martin Adámek <banan23@gmail.com>, Gigino Chianese <Sajito@users.noreply.github.com>, Jan Buchar <Teyras@gmail.com>, Connor Adams <connorads@users.noreply.github.com>, Apify Release Bot <noreply@apify.com>, Jiří Spilka <jiri.spilka@apify.com>, Jindřich Bär <jindrichbar@gmail.com>, Vlad Frangu <kingdgrizzle@gmail.com>, drobnikj <drobnik.j@gmail.com>, Jan Buchar <jan.buchar@apify.com>, Saurav Jain <souravjain540@gmail.com>, Saurav Jain <sauain@SauravApify.local>, davidjohnbarton <41335923+davidjohnbarton@users.noreply.github.com>, Stefan Sundin <git@stefansundin.com>, Gustavo Silva <silva95gustavo@gmail.com>, Hamza Alwan <ihamzaalwan@gmail.com>
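The sitemap filetype autodetection (apify#2497) presumably inspects the document content rather than trusting headers or extensions. A hedged sketch of such detection, with assumed logic that is not the actual Crawlee implementation:

```typescript
// Classify sitemap content: XML sitemaps start with an XML declaration or a
// <urlset>/<sitemapindex> root element; anything else is treated here as a
// plain-text sitemap (one URL per line).
function detectSitemapType(content: string): 'xml' | 'txt' {
    const head = content.trimStart().toLowerCase();
    if (head.startsWith('<?xml') || head.startsWith('<urlset') || head.startsWith('<sitemapindex')) {
        return 'xml';
    }
    return 'txt';
}
```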
1 parent bb97389 commit a534b13

File tree

552 files changed: +354,994 additions, -7,388 deletions


.editorconfig

Lines changed: 0 additions & 2 deletions

```diff
@@ -7,6 +7,4 @@ charset = utf-8
 trim_trailing_whitespace = true
 insert_final_newline = true
 end_of_line = lf
-# editorconfig-tools is unable to ignore longs strings or urls
-max_line_length = null
 quote_type = single
```

.eslintrc.json

Lines changed: 69 additions & 64 deletions

```diff
@@ -1,66 +1,71 @@
 {
-    "root": true,
-    "env": {
-        "browser": true,
-        "es2020": true,
-        "node": true
-    },
-    "extends": "@apify/eslint-config-ts",
-    "parserOptions": {
-        "project": "./tsconfig.eslint.json",
-        "ecmaVersion": 2022
-    },
-    "ignorePatterns": [
-        "node_modules",
-        "dist",
-        "coverage",
-        "**/*.d.ts"
-    ],
-    "overrides": [
-        {
-            "plugins": [
-                "@typescript-eslint"
-            ],
-            "files": [
-                "*.ts"
-            ],
-            "rules": {
-                "@typescript-eslint/array-type": "error",
-                "@typescript-eslint/ban-ts-comment": 0,
-                "@typescript-eslint/consistent-type-imports": ["error", {
-                    "disallowTypeAnnotations": false
-                }],
-                "@typescript-eslint/consistent-type-definitions": ["error", "interface"],
-                "@typescript-eslint/member-delimiter-style": ["error", {
-                    "multiline": { "delimiter": "semi", "requireLast": true },
-                    "singleline": { "delimiter": "semi", "requireLast": false }
-                }],
-                "@typescript-eslint/no-empty-interface": "off",
-                "no-empty-function": "off",
-                "@typescript-eslint/no-empty-function": "off",
-                "@typescript-eslint/no-explicit-any": "off",
-                "@typescript-eslint/no-floating-promises": "error",
-                "@typescript-eslint/no-unused-vars": "off",
-                "@typescript-eslint/comma-dangle": ["error", "always-multiline"]
-            }
-        }
-    ],
-    "rules": {
-        "quote-props": ["error", "consistent"],
-        "import/no-extraneous-dependencies": "off",
-        "max-classes-per-file": 0,
-        "no-console": "error",
-        "no-underscore-dangle": 0,
-        "no-void": 0,
-        "max-len": ["error", {
-            "code": 160,
-            "ignoreUrls": true,
-            "ignoreComments": true
-        }],
-        "import/order": ["error", {
-            "groups": ["builtin", "external", ["parent", "sibling"], "index", "object"],
-            "alphabetize": { "order": "asc", "caseInsensitive": true },
-            "newlines-between": "always"
-        }]
-    }
+    "root": true,
+    "env": {
+        "browser": true,
+        "es2020": true,
+        "node": true
+    },
+    "extends": ["@apify/eslint-config-ts", "prettier"],
+    "parserOptions": {
+        "project": "./tsconfig.eslint.json",
+        "ecmaVersion": 2022
+    },
+    "ignorePatterns": ["node_modules", "dist", "coverage", "**/*.d.ts"],
+    "overrides": [
+        {
+            "plugins": ["@typescript-eslint"],
+            "files": ["*.ts"],
+            "rules": {
+                "@typescript-eslint/array-type": "error",
+                "@typescript-eslint/ban-ts-comment": 0,
+                "@typescript-eslint/consistent-type-imports": [
+                    "error",
+                    {
+                        "disallowTypeAnnotations": false
+                    }
+                ],
+                "@typescript-eslint/consistent-type-definitions": [
+                    "error",
+                    "interface"
+                ],
+                "@typescript-eslint/member-delimiter-style": [
+                    "error",
+                    {
+                        "multiline": { "delimiter": "semi", "requireLast": true },
+                        "singleline": { "delimiter": "semi", "requireLast": false }
+                    }
+                ],
+                "@typescript-eslint/no-empty-interface": "off",
+                "no-empty-function": "off",
+                "@typescript-eslint/no-empty-function": "off",
+                "@typescript-eslint/no-explicit-any": "off",
+                "@typescript-eslint/no-floating-promises": "error",
+                "@typescript-eslint/no-unused-vars": "off",
+                "@typescript-eslint/comma-dangle": ["error", "always-multiline"]
+            }
+        }
+    ],
+    "rules": {
+        "quote-props": ["error", "consistent"],
+        "import/no-extraneous-dependencies": "off",
+        "max-classes-per-file": 0,
+        "no-console": "error",
+        "no-underscore-dangle": 0,
+        "no-void": 0,
+        "max-len": "off",
+        "import/order": [
+            "error",
+            {
+                "groups": [
+                    "builtin",
+                    "external",
+                    ["parent", "sibling"],
+                    "index",
+                    "object"
+                ],
+                "alphabetize": { "order": "asc", "caseInsensitive": true },
+                "newlines-between": "always"
+            }
+        ]
+    }
 }
```

.github/workflows/release.yml

Lines changed: 1 addition & 1 deletion

```diff
@@ -33,7 +33,7 @@ jobs:
       matrix:
         # We don't test on Windows as the tests are flaky
         os: [ubuntu-latest]
-        node-version: [16, 18, 20]
+        node-version: [16, 18, 20, 22]
 
     runs-on: ${{ matrix.os }}
```
.github/workflows/test-ci.yml

Lines changed: 4 additions & 1 deletion

```diff
@@ -23,7 +23,7 @@ jobs:
         # tests on windows are extremely unstable
         # os: [ ubuntu-latest, windows-2019 ]
         os: [ubuntu-latest]
-        node-version: [16, 18, 20]
+        node-version: [16, 18, 20, 22]
 
     steps:
       - name: Cancel Workflow Action
@@ -150,6 +150,9 @@ jobs:
       - name: ESLint
         run: yarn lint
 
+      - name: Biome format
+        run: yarn format:check
+
   release_next:
     name: Release @next
     if: github.event_name == 'push' && contains(github.event.ref, 'master') && (!contains(github.event.head_commit.message, '[skip ci]') && !contains(github.event.head_commit.message, 'docs:'))
```

.husky/pre-commit

Lines changed: 1 addition & 0 deletions

```diff
@@ -0,0 +1 @@
+yarn format
```

CHANGELOG.md

Lines changed: 47 additions & 0 deletions

```diff
@@ -3,6 +3,53 @@
 All notable changes to this project will be documented in this file.
 See [Conventional Commits](https://conventionalcommits.org) for commit guidelines.
 
+## [3.10.1](https://github.com/apify/crawlee/compare/v3.10.0...v3.10.1) (2024-05-23)
+
+
+### Bug Fixes
+
+* adjust `URL_NO_COMMAS_REGEX` regexp to allow single character hostnames ([#2492](https://github.com/apify/crawlee/issues/2492)) ([ec802e8](https://github.com/apify/crawlee/commit/ec802e85f54022616e5bdcc1a6fd1bd43e1b3ace)), closes [#2487](https://github.com/apify/crawlee/issues/2487)
+* investigate and temp fix for possible 0-concurrency bug in RQv2 ([#2494](https://github.com/apify/crawlee/issues/2494)) ([4ebe820](https://github.com/apify/crawlee/commit/4ebe820573b269c2d0a6eff20cfd7787debc63c0))
+* provide URLs to the error snapshot ([#2482](https://github.com/apify/crawlee/issues/2482)) ([7f64145](https://github.com/apify/crawlee/commit/7f64145308dfdb3909d4fcf945759a7d6344e2f5))
+
+
+
+# [3.10.0](https://github.com/apify/crawlee/compare/v3.9.2...v3.10.0) (2024-05-16)
+
+
+### Bug Fixes
+
+* `EnqueueStrategy.All` erroring with links using unsupported protocols ([#2389](https://github.com/apify/crawlee/issues/2389)) ([8db3908](https://github.com/apify/crawlee/commit/8db39080b7711ba3c27dff7fce1170ddb0ee3d05))
+* conversion between tough cookies and browser pool cookies ([#2443](https://github.com/apify/crawlee/issues/2443)) ([74f73ab](https://github.com/apify/crawlee/commit/74f73ab77a94ecd285d587b7b3532443deda07b4))
+* fire local `SystemInfo` events every second ([#2454](https://github.com/apify/crawlee/issues/2454)) ([1fa9a66](https://github.com/apify/crawlee/commit/1fa9a66388846505f84dcdea0393e7eaaebf84c3))
+* use createSessionFunction when loading Session from persisted state ([#2444](https://github.com/apify/crawlee/issues/2444)) ([3c56b4c](https://github.com/apify/crawlee/commit/3c56b4ca1efe327138aeb32c39dfd9dd67b6aceb))
+* do not drop statistics on migration/resurrection/resume ([#2462](https://github.com/apify/crawlee/issues/2462)) ([8ce7dd4](https://github.com/apify/crawlee/commit/8ce7dd4ae6a3718dac95e784a53bd5661c827edc))
+* double tier decrement in tiered proxy ([#2468](https://github.com/apify/crawlee/issues/2468)) ([3a8204b](https://github.com/apify/crawlee/commit/3a8204ba417936570ec5569dc4e4eceed79939c1))
+* fixed double extension for screenshots ([#2419](https://github.com/apify/crawlee/issues/2419)) ([e8b39c4](https://github.com/apify/crawlee/commit/e8b39c41764726280c995e52fa7d79a9240d993e)), closes [#1980](https://github.com/apify/crawlee/issues/1980)
+* malformed sitemap url when sitemap index child contains querystring ([#2430](https://github.com/apify/crawlee/issues/2430)) ([e4cd41c](https://github.com/apify/crawlee/commit/e4cd41c49999af270fbe2476a61d92c8e3502463))
+* return true when robots.isAllowed returns undefined ([#2439](https://github.com/apify/crawlee/issues/2439)) ([6f541f8](https://github.com/apify/crawlee/commit/6f541f8c4ea9b1e94eb506383019397676fd79fe)), closes [#2437](https://github.com/apify/crawlee/issues/2437)
+* sitemap `content-type` check breaks on `content-type` parameters ([#2442](https://github.com/apify/crawlee/issues/2442)) ([db7d372](https://github.com/apify/crawlee/commit/db7d37256a49820e3e584165fff42377042ec258))
+
+
+### Features
+
+* add `FileDownload` "crawler" ([#2435](https://github.com/apify/crawlee/issues/2435)) ([d73756b](https://github.com/apify/crawlee/commit/d73756bb225d9ed8f58cf0a3b2e0ce96f6188863))
+* implement ErrorSnapshotter for error context capture ([#2332](https://github.com/apify/crawlee/issues/2332)) ([e861dfd](https://github.com/apify/crawlee/commit/e861dfdb451ae32fb1e0c7749c6b59744654b303)), closes [#2280](https://github.com/apify/crawlee/issues/2280)
+* make `RequestQueue` v2 the default queue, see more on [Apify blog](https://blog.apify.com/new-apify-request-queue/) ([#2390](https://github.com/apify/crawlee/issues/2390)) ([41ae8ab](https://github.com/apify/crawlee/commit/41ae8abec1da811ae0750ac2d298e77c1e3b7b55)), closes [#2388](https://github.com/apify/crawlee/issues/2388)
+
+
+### Performance Improvements
+
+* improve scaling based on memory ([#2459](https://github.com/apify/crawlee/issues/2459)) ([2d5d443](https://github.com/apify/crawlee/commit/2d5d443da5fa701b21aec003d4d84797882bc175))
+* optimize `RequestList` memory footprint ([#2466](https://github.com/apify/crawlee/issues/2466)) ([12210bd](https://github.com/apify/crawlee/commit/12210bd191b50c76ecca23ea18f3deda7b1517c6))
+* optimize adding large amount of requests via `crawler.addRequests()` ([#2456](https://github.com/apify/crawlee/issues/2456)) ([6da86a8](https://github.com/apify/crawlee/commit/6da86a85d848cd1cf860a28e5f077b8b14cdb213))
+
+
+
 ## [3.9.2](https://github.com/apify/crawlee/compare/v3.9.1...v3.9.2) (2024-04-17)
```

biome.json

Lines changed: 26 additions & 0 deletions

```diff
@@ -0,0 +1,26 @@
+{
+    "formatter": {
+        "ignore": [
+            "website/**",
+            "packages/*/dist/**",
+            "package.json",
+            "scripts/actions/docker-images/state.json"
+        ],
+        "formatWithErrors": true
+    },
+    "javascript": {
+        "formatter": {
+            "quoteStyle": "single",
+            "semicolons": "always",
+            "trailingComma": "all",
+            "lineWidth": 120,
+            "indentStyle": "space",
+            "indentWidth": 4,
+            "quoteProperties": "preserve",
+            "lineEnding": "lf"
+        }
+    },
+    "linter": {
+        "enabled": false
+    }
+}
```

docs/examples/.eslintrc.json

Lines changed: 11 additions & 11 deletions (whitespace-only: the file content is unchanged, only the indentation was reformatted)

```diff
@@ -1,13 +1,13 @@
 {
-  "root": true,
-  "extends": "@apify/ts",
-  "parserOptions": {
-    "project": "./tsconfig.eslint.json",
-    "ecmaVersion": 2022
-  },
-  "rules": {
-    "import/extensions": 0,
-    "import/no-extraneous-dependencies": 0,
-    "no-console": "off"
-  }
+    "root": true,
+    "extends": "@apify/ts",
+    "parserOptions": {
+        "project": "./tsconfig.eslint.json",
+        "ecmaVersion": 2022
+    },
+    "rules": {
+        "import/extensions": 0,
+        "import/no-extraneous-dependencies": 0,
+        "no-console": "off"
+    }
 }
```

docs/examples/cheerio_crawler.ts

Lines changed: 1 addition & 3 deletions

```diff
@@ -57,8 +57,6 @@ const crawler = new CheerioCrawler({
 });
 
 // Run the crawler and wait for it to finish.
-await crawler.run([
-    'https://crawlee.dev',
-]);
+await crawler.run(['https://crawlee.dev']);
 
 log.debug('Crawler finished.');
```

docs/examples/crawl_multiple_urls_cheerio.ts

Lines changed: 1 addition & 5 deletions

```diff
@@ -9,8 +9,4 @@ const crawler = new CheerioCrawler({
 });
 
 // Run the crawler with initial request
-await crawler.run([
-    'http://www.example.com/page-1',
-    'http://www.example.com/page-2',
-    'http://www.example.com/page-3',
-]);
+await crawler.run(['http://www.example.com/page-1', 'http://www.example.com/page-2', 'http://www.example.com/page-3']);
```
