pyspark.sql.functions.nvl

pyspark.sql.functions.nvl(col1, col2)

Returns col2 if col1 is null, or col1 otherwise.

New in version 3.5.0.

Parameters
col1 : Column or str
col2 : Column or str

Examples

>>> from pyspark.sql.functions import nvl
>>> df = spark.createDataFrame([(None, 8), (1, 9)], ["a", "b"])
>>> df.select(nvl(df.a, df.b).alias('r')).collect()
[Row(r=8), Row(r=1)]
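
A literal default can also be supplied by wrapping it in lit (a minimal sketch, reusing the df defined above; the shown output assumes that same data):

>>> from pyspark.sql.functions import lit, nvl
>>> df.select(nvl(df.a, lit(0)).alias('r')).collect()
[Row(r=0), Row(r=1)]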