Native Delta Lake implementation in Rust
let table = deltalake::open_table("./tests/data/simple_table").await.unwrap();
println!("{:?}", table.get_files());
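As a slightly fuller sketch of the same API (assuming the `deltalake` crate is declared as a dependency and the table path exists; the path and version number here are illustrative), a table can also be loaded at an earlier version with `open_table_with_version`:

```rust
// Sketch, not a complete program from this repo: requires the `deltalake`
// and `tokio` crates in Cargo.toml, and a Delta table at the given path.
#[tokio::main]
async fn main() -> Result<(), deltalake::DeltaTableError> {
    // Load the latest version of the table.
    let table = deltalake::open_table("./tests/data/simple_table").await?;
    println!("latest files: {:?}", table.get_files());

    // Time travel: load the same table as of an earlier version
    // (version 2 is an illustrative choice).
    let old = deltalake::open_table_with_version("./tests/data/simple_table", 2).await?;
    println!("files at v2: {:?}", old.get_files());
    Ok(())
}
```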
❯ cargo run --bin delta-inspect files ./tests/data/delta-0.2.0
part-00000-cb6b150b-30b8-4662-ad28-ff32ddab96d2-c000.snappy.parquet
part-00000-7c2deba3-1994-4fb8-bc07-d46c948aa415-c000.snappy.parquet
part-00001-c373a5bd-85f0-4758-815e-7eb62007a15c-c000.snappy.parquet
❯ cargo run --bin delta-inspect info ./tests/data/delta-0.2.0
DeltaTable(./tests/data/delta-0.2.0)
version: 3
metadata: GUID=22ef18ba-191c-4c36-a606-3dad5cdf3830, name=None, description=None, partitionColumns=[], createdTime=1564524294376, configuration={}
min_version: read=1, write=2
files count: 3
The examples folder shows how to use the Rust API to manipulate Delta tables. Examples can be run with the cargo run --example command, for example:
cargo run --example read_delta_table
s3 - enable the S3 storage backend to work with Delta Tables in AWS S3.
s3-rustls - enable the S3 storage backend, but rely on rustls rather than OpenSSL (native-tls).
glue - enable the Glue data catalog to work with Delta Tables in AWS Glue.
azure - enable the Azure storage backend to work with Delta Tables in Azure Data Lake Storage Gen2 accounts.
gcs - enable the Google storage backend to work with Delta Tables in Google Cloud Storage.
datafusion-ext - enable the datafusion::datasource::TableProvider trait implementation for Delta Tables, allowing them to be queried using DataFusion.
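For instance, a downstream crate might opt into a subset of these features in its own Cargo.toml (a sketch; the version number is illustrative, pick the latest release):

```toml
[dependencies]
# Illustrative version; disable defaults and enable only the backends needed.
deltalake = { version = "0.4", default-features = false, features = ["s3", "datafusion-ext"] }
```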
To run the S3 integration tests from a local machine, we use docker-compose to stand up a local AWS stack. To spin up the test environment, run docker-compose up in the root of the delta-rs repo.
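Concretely, the flow looks something like this (the `s3` feature name comes from the list above; running the tests with that feature enabled is an assumption about how this repo wires them up):

```shell
# In the root of the delta-rs repo: start the local AWS stack in the background.
docker-compose up -d
# Then run the S3-backed tests with the s3 storage feature enabled.
cargo test --features s3
```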