Build structures (table name, column names, and quoting strategy) needed to represent data from a remote table.
db_td(db, table_name, ..., qualifiers = NULL, limit_was = 6L)

dbi_table(db, table_name, ..., qualifiers = NULL, limit_was = 6L)
| Argument | Description |
|---|---|
| db | database connection |
| table_name | name of the table |
| ... | not used; forces later arguments to bind by name |
| qualifiers | optional named ordered vector of strings carrying additional database hierarchy terms, such as schema |
| limit_was | optional row limit used to produce head_sample; if NULL, no head_sample is produced and rq_colnames() is used to get the column names |
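The limit_was argument matters mostly for large tables: the head_sample is presumably built by reading a few rows, while limit_was = NULL skips that step and only records the column names. A minimal sketch, assuming DBI and RSQLite are installed; the table name "scores" is a hypothetical placeholder:

```r
library(rquery)

# In-memory SQLite database holding a small example table ("scores" is hypothetical).
my_db <- DBI::dbConnect(RSQLite::SQLite(), ":memory:")
DBI::dbWriteTable(my_db, "scores", data.frame(AUC = 0.6, R2 = 0.2))

# limit_was = NULL: no head_sample is built; column names come from rq_colnames().
scores <- db_td(my_db, "scores", limit_was = NULL)
print(column_names(scores))   # expect "AUC" "R2"

DBI::dbDisconnect(my_db)
```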
Returns a relop representation of the data (a table description).
Note: in the examples we use rq_copy_to()
to create data; this is only to keep the examples easy and portable.
With big data the data is usually already in the remote database or
Spark system, and the task is almost always to connect to and work with this pre-existing remote data.
The method to do this is db_td,
which builds a reference to a remote table given the table name, as sketched below.
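As a sketch of that pre-existing-data workflow (not runnable as-is: the connection object remote_db, the schema name "warehouse", and the table name "sales_fact" are all hypothetical placeholders):

```r
# Hypothetical: `remote_db` is an already-established DBI connection (or Spark
# handle) on which the table "warehouse"."sales_fact" already exists.
sales <- db_td(remote_db, "sales_fact",
               qualifiers = c(schema = "warehouse"))  # extra hierarchy term (schema)

# `sales` is now a table description (relop); downstream operators and to_sql()
# generate SQL against the remote table without copying the data locally.
```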
dbi_table
: old name for db_td
```r
if (requireNamespace("DBI", quietly = TRUE) &&
    requireNamespace("RSQLite", quietly = TRUE)) {
  my_db <- DBI::dbConnect(RSQLite::SQLite(), ":memory:")
  rq_copy_to(my_db, 'd',
             data.frame(AUC = 0.6, R2 = 0.2),
             overwrite = TRUE,
             temporary = TRUE)
  d <- db_td(my_db, 'd')
  print(d)
  sql <- to_sql(d, my_db)
  cat(sql)
  print(DBI::dbGetQuery(my_db, sql))
  cols <- columns_used(d)
  print(cols)
  sql2 <- to_sql(d, my_db, using = "AUC")
  cat(sql2)
  print(DBI::dbGetQuery(my_db, sql2))
  DBI::dbDisconnect(my_db)
}
#> [1] "mk_td(\"d\", c( \"AUC\", \"R2\"))"
#> AUC R2
#> 1 0.6 0.2
#> SELECT
#> `AUC`,
#> `R2`
#> FROM
#> `d`
#> AUC R2
#> 1 0.6 0.2
#> $d
#> [1] "AUC" "R2"
#>
#> SELECT
#> `AUC`
#> FROM
#> `d`
#> AUC
#> 1 0.6
```