Returns the number of items in a group.
Syntax
from pyspark.sql import functions as sf
sf.count(col)
Parameters
| Parameter | Type | Description |
|---|---|---|
| col | pyspark.sql.Column or column name | Target column to compute on. |
Returns
pyspark.sql.Column: column containing the computed count.
Examples
Example 1: Count all rows in a DataFrame
from pyspark.sql import functions as sf
df = spark.createDataFrame([(None,), ("a",), ("b",), ("c",)], schema=["alphabets"])
df.select(sf.count(sf.expr("*"))).show()
+--------+
|count(1)|
+--------+
|       4|
+--------+
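The result column is labeled count(1) because Spark rewrites count("*") to a count over the constant literal 1, which counts every row, including rows containing nulls.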
Example 2: Count non-null values in a specific column
from pyspark.sql import functions as sf
df.select(sf.count(df.alphabets)).show()
+----------------+
|count(alphabets)|
+----------------+
|               3|
+----------------+
Example 3: Count all rows in a DataFrame with multiple columns
from pyspark.sql import functions as sf
df = spark.createDataFrame(
[(1, "apple"), (2, "banana"), (3, None)], schema=["id", "fruit"])
df.select(sf.count(sf.expr("*"))).show()
+--------+
|count(1)|
+--------+
|       3|
+--------+
Example 4: Count non-null values in multiple columns
from pyspark.sql import functions as sf
df.select(sf.count(df.id), sf.count(df.fruit)).show()
+---------+------------+
|count(id)|count(fruit)|
+---------+------------+
|        3|           2|
+---------+------------+
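Example 5 (additional sketch): Count non-null values per group
count is an aggregate function, so it can also be applied per group. The following is a minimal sketch, assuming the same spark session as above and a hypothetical category/name DataFrame.
from pyspark.sql import functions as sf
df = spark.createDataFrame(
    [("fruit", "apple"), ("fruit", "banana"), ("veg", None)], schema=["category", "name"])
# count("name") skips the null name, so the "fruit" group yields 2 and "veg" yields 0
df.groupBy("category").agg(sf.count("name")).show()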