pyspark.sql.Column.over
Column.over(window: WindowSpec) → Column

Define a windowing column.
New in version 1.4.0.
Changed in version 3.4.0: Supports Spark Connect.
- Parameters
  - window : WindowSpec
- Returns
  - Column
Examples
>>> from pyspark.sql import Window
>>> window = (
...     Window.partitionBy("name")
...     .orderBy("age")
...     .rowsBetween(Window.unboundedPreceding, Window.currentRow)
... )
>>> from pyspark.sql.functions import rank, min, desc
>>> df = spark.createDataFrame(
...     [(2, "Alice"), (5, "Bob")], ["age", "name"])
>>> df.withColumn(
...     "rank", rank().over(window)
... ).withColumn(
...     "min", min('age').over(window)
... ).sort(desc("age")).show()
+---+-----+----+---+
|age| name|rank|min|
+---+-----+----+---+
|  5|  Bob|   1|  5|
|  2|Alice|   1|  2|
+---+-----+----+---+