This repository was archived by the owner on Sep 20, 2023. It is now read-only.

Commit 6b610aa

Update README
1 parent 2a1a26e commit 6b610aa

File tree: 1 file changed (+31, -1 lines)


README.rst

Lines changed: 31 additions & 1 deletion
@@ -13,7 +13,6 @@
 .. image:: https://pepy.tech/badge/pyathenajdbc/month
     :target: https://pepy.tech/project/pyathenajdbc/month
 
-
 .. image:: https://img.shields.io/badge/code%20style-black-000000.svg
     :target: https://github.com/psf/black
 

@@ -309,6 +308,37 @@ The ``pyathena.util`` package also has helper methods.
 .. _`pandas.read_sql`: https://pandas.pydata.org/pandas-docs/stable/reference/api/pandas.read_sql.html
 .. _`DataFrame object`: https://pandas.pydata.org/pandas-docs/stable/reference/api/pandas.DataFrame.html
 
+To SQL
+^^^^^^
+
+You can use `pandas.DataFrame.to_sql`_ to write records stored in a DataFrame to Amazon Athena.
+`pandas.DataFrame.to_sql`_ uses `SQLAlchemy`_, so you need to install it.
+
+.. code:: python
+
+    from urllib.parse import quote_plus
+
+    import pandas as pd
+    from sqlalchemy import create_engine
+
+    conn_str = 'awsathena+jdbc://:@athena.{region_name}.amazonaws.com:443/' \
+               '{schema_name}?s3_staging_dir={s3_staging_dir}&s3_dir={s3_dir}&compression=snappy'
+    engine = create_engine(conn_str.format(
+        region_name='us-west-2',
+        schema_name='YOUR_SCHEMA',
+        s3_staging_dir=quote_plus('s3://YOUR_S3_BUCKET/path/to/'),
+        s3_dir=quote_plus('s3://YOUR_S3_BUCKET/path/to/')))
+
+    df = pd.DataFrame({'a': [1, 2, 3, 4, 5]})
+    df.to_sql('YOUR_TABLE', engine, schema='YOUR_SCHEMA',
+              index=False, if_exists='replace', method='multi')
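Since the connection string above is assembled with plain Python string formatting, you can sanity-check the resulting URL without any AWS access. A minimal sketch (the region, schema, and bucket values are placeholders, as in the diff):

```python
from urllib.parse import quote_plus

# Same template as in the diff above; building the URL needs no AWS access.
conn_str = ('awsathena+jdbc://:@athena.{region_name}.amazonaws.com:443/'
            '{schema_name}?s3_staging_dir={s3_staging_dir}'
            '&s3_dir={s3_dir}&compression=snappy')

# quote_plus percent-encodes ':' and '/' so the S3 URI survives as a
# single query-string value.
url = conn_str.format(
    region_name='us-west-2',
    schema_name='YOUR_SCHEMA',
    s3_staging_dir=quote_plus('s3://YOUR_S3_BUCKET/path/to/'),
    s3_dir=quote_plus('s3://YOUR_S3_BUCKET/path/to/'))
print(url)
```

Without the ``quote_plus`` calls, the slashes in the S3 URI would be parsed as part of the URL path rather than as query-string values.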
+
+The location of the Amazon S3 table is specified by the ``s3_dir`` parameter in the connection string.
+If ``s3_dir`` is not specified, the ``s3_staging_dir`` parameter is used instead. The table location
+follows this rule:
+
+.. code:: text
+
+    s3://{s3_dir or s3_staging_dir}/{schema}/{table}/
+
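The rule above amounts to simple path concatenation. A hypothetical helper (not part of pyathenajdbc, shown only to illustrate the precedence):

```python
def table_location(schema, table, s3_dir=None, s3_staging_dir=None):
    # Hypothetical helper that only illustrates the rule above:
    # s3_dir takes precedence, s3_staging_dir is the fallback.
    base = (s3_dir or s3_staging_dir).rstrip('/')
    return '{0}/{1}/{2}/'.format(base, schema, table)

print(table_location('default', 'YOUR_TABLE',
                     s3_staging_dir='s3://YOUR_S3_BUCKET/path/to/'))
```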
+Only the Parquet data format is supported. The compression format is specified by the
+``compression`` parameter in the connection string.
+
+.. _`pandas.DataFrame.to_sql`: https://pandas.pydata.org/pandas-docs/stable/reference/api/pandas.DataFrame.to_sql.html
+
 Credential
 ----------
