Specify nonstandard delimiters for CSVs in Great Expectations

Great Expectations offers you direct access to the reader methods and options of your Execution Engine via the batch_spec_passthrough parameter.
Written by Austin Robinson · November 15, 2022
It’s just a simple parameter adjustment for GX to handle CSVs where the delimiter isn’t actually a comma. (📸: Connor Pope via Unsplash)

Great Expectations can use the Pandas and Spark Execution Engines to access and validate your CSV file data. To do this, you query your data by providing a Batch Request that contains all the details necessary, together with your Datasource configuration, to return the expected Batch of data.

For many files, the data is indeed separated by commas, and you can proceed with your normal GX workflow.

Somewhat frequently, though, supposedly comma-separated values are in fact separated by something else. In that case, a parser expecting a comma delimiter will generally collapse all of the columns into a single column: every row’s values become one entry in that row.
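To see the failure mode concretely, here’s a minimal sketch using plain pandas, outside of GX (the file contents here are hypothetical): reading semicolon-separated data with the default comma delimiter collapses each row into a single column, while an explicit sep=";" parses it correctly.

```python
import io

import pandas as pd

# A "CSV" whose values are actually separated by semicolons (hypothetical data)
data = "id;name;score\n1;alice;90\n2;bob;85\n"

# Default comma delimiter: everything lands in a single column
bad = pd.read_csv(io.StringIO(data))
print(bad.shape)  # (2, 1) -- each row collapsed into one entry

# Explicit semicolon delimiter: columns parse as expected
good = pd.read_csv(io.StringIO(data), sep=";")
print(good.shape)          # (2, 3)
print(list(good.columns))  # ['id', 'name', 'score']
```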

You can adjust the delimiters of a file using the reader methods and options of your Execution Engine. Great Expectations provides the batch_spec_passthrough parameter within a Batch Request to give you direct access to those reader methods and options.

To tell Great Expectations how to handle CSVs with non-comma delimiters, simply pass the reader_options appropriate to your file into the batch_spec_passthrough parameter of your Batch Request.

batch_request = RuntimeBatchRequest(
  datasource_name="my_filesystem_datasource",
  data_connector_name="default_runtime_data_connector_name",
  data_asset_name="example_data_asset",
  runtime_parameters={"path": "path/to/data.csv"},
  batch_identifiers={"default_identifier_name": 1234567890},
  batch_spec_passthrough={"reader_options": {"sep": ";"}},
)


Note that you can pass any reader options supported by the execution engine you’re using (Pandas or Spark). 

For example, a batch_spec_passthrough setting a semicolon as the delimiter and interpreting blank lines as null values, read with the Pandas Execution Engine, might look something like:

batch_request = RuntimeBatchRequest(
  datasource_name="my_filesystem_datasource",
  data_connector_name="default_runtime_data_connector_name",
  data_asset_name="example_data_asset",
  runtime_parameters={"path": "path/to/data.csv"},
  batch_identifiers={"default_identifier_name": 1234567890},
  batch_spec_passthrough={"reader_options": {"sep": ";", "skip_blank_lines": False}},
)
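For a sense of what those two reader_options do, here’s a minimal plain-pandas sketch, outside of GX (the data is hypothetical): with the default skip_blank_lines=True a blank line is dropped entirely, while skip_blank_lines=False keeps it as a row of nulls.

```python
import io

import pandas as pd

# Semicolon-delimited data with one blank line in the middle (hypothetical)
data = "a;b\n1;2\n\n3;4\n"

# Default behavior: the blank line is skipped entirely
skipped = pd.read_csv(io.StringIO(data), sep=";")
print(skipped.shape)  # (2, 2)

# skip_blank_lines=False: the blank line becomes a row of NaN values
kept = pd.read_csv(io.StringIO(data), sep=";", skip_blank_lines=False)
print(kept.shape)  # (3, 2) -- two data rows plus one all-NaN row
```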


A similar process with the Spark Execution Engine, with tabs as the delimiter, might look like:

batch_request = RuntimeBatchRequest(
  datasource_name="my_filesystem_datasource",
  data_connector_name="default_runtime_data_connector_name",
  data_asset_name="example_data_asset",
  runtime_parameters={"path": "path/to/data.csv"},
  batch_identifiers={"default_identifier_name": 1234567890},
  batch_spec_passthrough={"reader_options": {"delimiter": "\t", "mode": "PERMISSIVE"}},
)


For more on your options with batch_spec_passthrough, check out the Pandas pd.read_csv() documentation and the Spark DataFrameReader documentation.



Great Expectations is part of an increasingly flexible and powerful modern data ecosystem. This is just one example of the ways in which Great Expectations is able to leverage that ecosystem to give you greater control of your data quality processes.

We’re committed to supporting and growing the community around Great Expectations. It’s not enough to build a great platform; we want to build a great community as well. Join our public Slack channel, find us on GitHub, sign up for one of our weekly cloud workshops, or head to https://greatexpectations.io/ to learn more.
