This repository was archived by the owner on Mar 23, 2021. It is now read-only.
When trying to fetch more than 500,000 rows per day, this error pops up:

    0% Error: Server error: (500) Internal Server Error Internal error: There was an internal error

In the code, `max.results = NULL`.
To work around the issue I set `max.results = 500000` (per day) and it works, although I have almost 1,000,000 rows per week with `fetch.by = "week"`.
The message I get once the 500,000 rows are downloaded is:

    Warning messages:
    1: Only 500000 observations out of 1000000 were obtained. Set max.results = NULL (default value) to get all results.
    2: 1 failed to parse.
    3: Data contains sampled data. Used 473168 sessions (86% of sessions).

    > toc()
    524.15 sec elapsed
===
Question: is there any more effective way to download all 1,000,000 rows instead of just 500,000?
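For what it's worth, one workaround I've seen suggested is to split the date range into single-day requests yourself and bind the chunks, so each call stays under the ~500,000-row ceiling. This is only a sketch and is untested; the `get_ga()` argument names (`profile.id`, `start.date`, `end.date`, `max.results`, `fetch.by`) are taken from this package's usual interface, and the profile ID, dates, metrics, and dimensions below are placeholders, so please check `?get_ga` before running it:

```r
library(RGA)

# Hypothetical example values -- replace with your own.
days <- seq(as.Date("2016-01-01"), as.Date("2016-01-31"), by = "day")

chunks <- lapply(days, function(d) {
  get_ga(profile.id = "ga:XXXXXXXX",        # placeholder view (profile) ID
         start.date = d, end.date = d,       # one day per request
         metrics    = "ga:sessions",
         dimensions = "ga:date,ga:hour,ga:minute",
         max.results = NULL)                 # NULL is fine once each day fits
})

result <- do.call(rbind, chunks)
```

Smaller date windows may also reduce the sampling reported in warning 3, since Google Analytics samples per query.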