In PySpark, DataFrame.fillna() or DataFrameNaFunctions.fill() is used to replace NULL/None values on all or selected DataFrame columns with zero(0), an empty string, a space, or any constant literal value.
While working on a PySpark DataFrame, we often need to replace null values, since certain operations on a null value return an error. Hence, we need to gracefully handle nulls as the first step before processing. Also, when writing to a file, it is best practice to replace null values; not doing so results in nulls in the output file.
As part of the cleanup, sometimes you may need to Drop Rows with NULL/None Values in PySpark DataFrame and Filter Rows by checking IS NULL/NOT NULL conditions.
In this article, we will use both fill() and fillna() to replace null/None values with an empty string, a constant value, and zero(0) on integer and string DataFrame columns, with Python examples.
Before we start, let's read a CSV file into a PySpark DataFrame. The file has no values on certain rows of the String and Integer columns, and PySpark assigns null values to these missing entries.
The file we are using here is small_zipcode.csv.
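A minimal sketch of reading the file, assuming small_zipcode.csv sits in the working directory and has a header row (the application name is arbitrary):

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("PySparkFillNaExample").getOrCreate()

# Read the CSV with a header and schema inference; rows with missing
# String/Integer values come through as null.
filePath = "small_zipcode.csv"
df = spark.read.options(header="true", inferSchema="true").csv(filePath)
df.printSchema()
df.show(truncate=False)
```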
Printing the schema and showing the rows reveals that the type, city, and population columns have null values.
Now, let’s see how to replace these null values.
PySpark fillna() & fill() Syntax
PySpark provides DataFrame.fillna() and DataFrameNaFunctions.fill() to replace NULL/None values. These two are aliases of each other and return the same results.
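As a sketch, both forms are called the same way on a DataFrame (using the df read above); the two lines below are equivalent:

```python
# subset=None (the default) means all columns of a matching data type are filled.
filled_df = df.fillna(value=0, subset=None)
filled_df = df.na.fill(value=0, subset=None)
```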
- value – The value should be of type int, long, float, string, or dict. The value specified here replaces the NULL/None values.
- subset – This is optional; when used, it should be a list of the column names where you want to replace NULL/None values.
PySpark Replace Null/None Values with Zero
The fill(value:Long) signature available in DataFrameNaFunctions is used to replace NULL/None values with a numeric value, either zero(0) or any constant value, for all integer and long datatype columns of a PySpark DataFrame or Dataset.
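The following two statements (a sketch against the df read earlier, with population as the integer column) fill the integer nulls with zero:

```python
# Replace null with 0 on every integer and long column.
df.na.fill(value=0).show()

# Restrict the fill to the population column only.
df.na.fill(value=0, subset=["population"]).show()
```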
Both statements above yield the same output, since population is our only integer column containing null values. Note that fill() replaces only the integer columns here, since our value is 0.
PySpark Replace Null/None Value with Empty String
Now let's see how to replace NULL/None values with an empty string or any constant String value on all DataFrame String columns.
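A sketch, again using the df from above:

```python
# Replace null with an empty string on all String type columns.
df.na.fill("").show(truncate=False)
```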
This replaces NULL values with an empty/blank string in all String type columns.
Now, let's replace NULLs on specific columns. The below example replaces the type column with an empty string and the city column with the value "unknown".
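A sketch of filling the two columns by chaining two fill() calls:

```python
# Fill type with an empty string, then city with "unknown".
df.na.fill("", ["type"]) \
  .na.fill("unknown", ["city"]) \
  .show(truncate=False)
```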
This replaces null values with an empty string in the type column and with the constant value "unknown" in the city column.
Alternatively, you can also write the above statement as
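One equivalent form (a sketch) passes a dict that maps each column name to its replacement value:

```python
# Dict form: column name -> replacement value.
df.na.fill({"type": "", "city": "unknown"}).show(truncate=False)
```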
Complete Code
Below is the complete code with a PySpark example. You can use it by copying it from here, or use GitHub to download the source code.
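A consolidated sketch of all the snippets above (file name and application name as assumed earlier):

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder \
    .appName("PySparkFillNaExample") \
    .getOrCreate()

# Read the sample CSV; missing String/Integer values become null.
filePath = "small_zipcode.csv"
df = spark.read.options(header="true", inferSchema="true").csv(filePath)
df.printSchema()
df.show(truncate=False)

# Replace null with 0 on all integer and long columns.
df.na.fill(value=0).show()
df.na.fill(value=0, subset=["population"]).show()

# Replace null with an empty string on all String columns.
df.na.fill("").show(truncate=False)

# Replace null on specific columns.
df.na.fill("", ["type"]).na.fill("unknown", ["city"]).show(truncate=False)

# The same replacement using a dict of column -> value.
df.na.fill({"type": "", "city": "unknown"}).show(truncate=False)
```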
Conclusion
In this PySpark article, you have learned how to replace Null/None values with zero or an empty string on integer and string columns respectively, using the fill() and fillna() transformation functions.