
Commit 0fe4a0e

Update faq.md (#1091)

Parent: 619cc36

File tree: 1 file changed (+2, −2 lines)


docs/faq.md

Lines changed: 2 additions & 2 deletions
```diff
@@ -14,11 +14,11 @@ The HTTP Archive crawls [millions of URLs](https://httparchive.org/reports/state
 
 ## How is the data gathered?
 
-The list of URLs is fed to our private instance of [WebPageTest](https://webpagetest.org) on the 1st of each month.
+The list of URLs is fed to our private instance of [WebPageTest](https://webpagetest.org) on the 2nd Tuesday of each month.
 
 As of March 1 2016, the tests are performed on Chrome for desktop and emulated Android (on Chrome) for mobile.
 
-The test agents are run from [Google Cloud regions](https://cloud.google.com/compute/docs/regions-zones) across the US. Each URL is loaded once with an empty cache ("first view") for normal metrics collection and again, in a clean browser profile, using [Lighthouse](https://developers.google.com/web/tools/lighthouse). The data is collected via a [HAR file](https://en.wikipedia.org/wiki/.har). The HTTP Archive collects these HAR files, parses them, and populates various tables in BigQuery.
+The test agents are run from [Google Cloud regions](https://cloud.google.com/compute/docs/regions-zones) across the US. Each URL is loaded once with an empty cache ("first view") for normal metrics collection and again, in a clean browser profile, using [Lighthouse](https://developers.google.com/web/tools/lighthouse). The data is collected via a [HAR file](https://en.wikipedia.org/wiki/.har). The HTTP Archive collects these HAR files, parses them, and populates [a public dataset in BigQuery](https://har.fyi/guides/getting-started/).
 
 ## How accurate is the data, in particular the time measurements?
```
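The paragraph changed above describes the pipeline: each test produces a HAR file (a JSON document), which is parsed before the results are loaded into BigQuery. As a rough illustration of what reading a HAR file involves (a minimal sketch using Python's standard `json` module and a made-up tiny HAR document, not the HTTP Archive's actual pipeline code):

```python
import json

# Made-up, minimal HAR document; real HAR files from WebPageTest
# contain many more fields (timings, headers, sizes, pages, ...).
HAR_TEXT = """
{
  "log": {
    "version": "1.2",
    "entries": [
      {"request": {"url": "https://example.com/"}, "response": {"status": 200}},
      {"request": {"url": "https://example.com/app.js"}, "response": {"status": 404}}
    ]
  }
}
"""

def summarize_har(har_text):
    """Return (url, status) pairs for every request entry in a HAR document."""
    har = json.loads(har_text)
    return [
        (entry["request"]["url"], entry["response"]["status"])
        for entry in har["log"]["entries"]
    ]

if __name__ == "__main__":
    for url, status in summarize_har(HAR_TEXT):
        print(status, url)
```

The `log.entries` array is the core of the HAR 1.2 format: one entry per request/response pair, which is why per-request tables can be populated from it.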
