Title: | An Interface to Google's 'BigQuery Storage' API |
---|---|
Description: | Easily talk to Google's 'BigQuery Storage' API from R (<https://cloud.google.com/bigquery/docs/reference/storage/rpc>). |
Authors: | Bruno Tremblay [aut, cre], Google LLC [cph, fnd] |
Maintainer: | Bruno Tremblay <[email protected]> |
License: | Apache License (>= 2) |
Version: | 1.2.1 |
Built: | 2024-11-17 06:55:43 UTC |
Source: | CRAN |
Useful links: Report bugs at <https://github.com/meztez/bigrquerystorage/issues>
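The package is published on CRAN, so the standard installation route applies:

```r
install.packages("bigrquerystorage")
```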
Initialize and close the bigrquerystorage client

bqs_auth()
bqs_deauth()

bqs_auth() will attempt to reuse bigrquery credentials.
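A minimal usage sketch; what you do between the two calls is up to you:

```r
library(bigrquerystorage)

# Initialize the client; cached bigrquery credentials are reused when available
bqs_auth()

# ... issue reads, e.g. with bqs_table_download() ...

# Close the client and release its resources when done
bqs_deauth()
```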
About Credentials
If your application runs inside a Google Cloud environment that has a default service account, your application can retrieve the service account credentials to call Google Cloud APIs. Such environments include Compute Engine, Google Kubernetes Engine, App Engine, Cloud Run, and Cloud Functions. We recommend using this strategy because it is more convenient and secure than manually passing credentials.
Additionally, we recommend you use Google Cloud Client Libraries for your application. Google Cloud Client Libraries use a library called Application Default Credentials (ADC) to automatically find your service account credentials. ADC looks for service account credentials in the following order:
1. If the environment variable GOOGLE_APPLICATION_CREDENTIALS is set, ADC uses the service account file that the variable points to.
2. If the environment variable GOOGLE_APPLICATION_CREDENTIALS isn't set, ADC uses the default service account that Compute Engine, Google Kubernetes Engine, App Engine, Cloud Run, and Cloud Functions provide.
3. If ADC can't use either of the above credentials, an error occurs.
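For example, the first lookup rule can be exercised from R before initializing the client; the key-file path below is a placeholder, not a real location:

```r
# Placeholder path: point this at your own service account key file
Sys.setenv(GOOGLE_APPLICATION_CREDENTIALS = "/path/to/service-account-key.json")

# The client will now resolve credentials through ADC
bqs_auth()
```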
bqs_auth() and bqs_deauth() return no value; they are called for their side effects.
Download table data

bqs_table_download() retrieves blocks of rows as a stream using the gRPC protocol. It is most suitable for the results of larger queries (say, >100 MB).
bqs_table_download(
  x,
  parent = getOption("bigquerystorage.project", ""),
  snapshot_time = NA,
  selected_fields = character(),
  row_restriction = "",
  sample_percentage,
  n_max = Inf,
  quiet = NA,
  as_tibble = lifecycle::deprecated(),
  bigint = c("integer", "integer64", "numeric", "character"),
  max_results = lifecycle::deprecated()
)
Argument | Description
---|---
x | Table reference, e.g. "project.dataset.table_name".
parent | Project used as parent for the read session request. Defaults to getOption("bigquerystorage.project", "").
snapshot_time | Table modifier: read the table as of this timestamp.
selected_fields | Table read option: names of the fields to read.
row_restriction | Table read option: SQL filter applied server side.
sample_percentage | Table read option: percentage of rows to sample.
n_max | Maximum number of results to retrieve. Use Inf to retrieve all rows.
quiet | Should information be printed to the console?
as_tibble | Deprecated. Should data be returned as a tibble? The default (FALSE) is to return an arrow Table built from the raw IPC stream.
bigint | The R type that BigQuery's 64-bit integer types should be mapped to. The default is "integer"; use "integer64", "numeric", or "character" for values outside R's 32-bit integer range.
max_results | Deprecated; use n_max instead.
More details about table modifiers and table options are available from the API Reference documentation. (See TableModifiers and TableReadOptions)
This function returns an arrow Table or, optionally, a tibble. If you need a data.frame, leave as_tibble as FALSE and coerce the result with as.data.frame().
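A sketch of a typical call. The billing project, field selection, and filter below are assumed values chosen for illustration; the table is a BigQuery public dataset:

```r
library(bigrquerystorage)

bqs_auth()

# "my-billing-project" is a placeholder; substitute your own project ID
rows <- bqs_table_download(
  "bigquery-public-data.usa_names.usa_1910_current",
  parent = "my-billing-project",
  selected_fields = c("name", "number", "state"),  # read only these columns
  row_restriction = "state = 'WA'",                # filter rows server side
  n_max = 1000,                                    # stop after 1000 rows
  bigint = "integer64"                             # keep full 64-bit precision
)

# The default return is an arrow Table; coerce if a data.frame is needed
df <- as.data.frame(rows)

bqs_deauth()
```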