
DB Writer

Bases: FrozenModel

Class that specifies the schema and table to write your DataFrame to. Supports hooks.

Added in 0.1.0

Changed in 0.8.0

Moved onetl.core.DBWriter → onetl.db.DBWriter

Parameters:

  • connection (DBConnection) –

    Class which contains DB connection properties. See DB Connections section.

  • target (str) –

    Table/collection/etc name to write data to.

If the connection has schema support, you need to specify the full name of the target, including the schema, e.g. schema.name.

    Changed in 0.7.0

Renamed table → target

  • options (dict | WriteOptions | None, default: None ) –

Spark write options, either as a special WriteOptions object or a plain dict.

    For example: {"if_exists": "replace_entire_table", "compression": "snappy"} or Hive.WriteOptions(if_exists="replace_entire_table", compression="snappy")

    Note

Some sources do not support write options.
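The dict and object forms above are interchangeable. A minimal sketch of how a writer might normalize both forms into one dict of options — the class and function names here are illustrative, not onetl's internals:

```python
from dataclasses import dataclass, asdict
from typing import Optional

# Illustrative stand-in for a WriteOptions-style object; not onetl's actual class.
@dataclass
class FakeWriteOptions:
    if_exists: str = "append"
    compression: Optional[str] = None

def normalize_options(options):
    """Accept either a dict, an options object, or None; return a plain dict."""
    if options is None:
        return {}
    if isinstance(options, dict):
        return dict(options)
    # Drop unset (None) fields so only explicit options are passed through.
    return {k: v for k, v in asdict(options).items() if v is not None}

print(normalize_options({"if_exists": "replace_entire_table"}))
print(normalize_options(FakeWriteOptions(if_exists="replace_entire_table", compression="snappy")))
```

Both calls produce the same kind of plain dict, which is why either form can be handed to the writer.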

Examples:

Create a writer with default options:

from onetl.connection import Postgres
from onetl.db import DBWriter

postgres = Postgres(...)

writer = DBWriter(
    connection=postgres,
    target="fiddle.dummy",
)

Create a writer with custom write options:

from onetl.connection import Postgres
from onetl.db import DBWriter

postgres = Postgres(...)

options = Postgres.WriteOptions(if_exists="replace_entire_table", batchsize=1000)

writer = DBWriter(
    connection=postgres,
    target="fiddle.dummy",
    options=options,
)

run(df)

Method for writing your DataFrame to the specified target. Supports hooks.

Note

This method supports only batch DataFrames.

Added in 0.1.0

Parameters:

  • df (DataFrame) –

Spark DataFrame to write.

Examples:

Write dataframe to target:

writer.run(df)
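Conceptually, run() hands the rows of the DataFrame to the connection, and the if_exists option decides whether existing rows in the target survive. A minimal in-memory sketch of that semantic — FakeWriter below is illustrative, not onetl's implementation:

```python
# Illustrative in-memory model of if_exists semantics; not onetl's actual code.
class FakeWriter:
    def __init__(self, target, if_exists="append"):
        self.target = target
        self.if_exists = if_exists
        self.storage = {}  # target name -> list of rows

    def run(self, rows):
        if self.if_exists == "replace_entire_table":
            # Replace everything currently stored under the target.
            self.storage[self.target] = list(rows)
        else:  # "append"
            self.storage.setdefault(self.target, []).extend(rows)
        return self.storage[self.target]

appender = FakeWriter("fiddle.dummy", if_exists="append")
appender.run([{"id": 1}])
appender.run([{"id": 2}])
print(appender.storage["fiddle.dummy"])  # both rows kept

replacer = FakeWriter("fiddle.dummy", if_exists="replace_entire_table")
replacer.run([{"id": 1}])
replacer.run([{"id": 2}])
print(replacer.storage["fiddle.dummy"])  # only the last batch survives
```

With append, each run() adds to what is already there; with replace_entire_table, each run() leaves only the latest batch.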