Decodes the first argument from binary into a string using the provided character set (one of 'US-ASCII', 'ISO-8859-1', 'UTF-8', 'UTF-16BE', 'UTF-16LE', 'UTF-16', 'UTF-32').
For the corresponding Databricks SQL function, see decode (character set) function.
Syntax
from pyspark.databricks.sql import functions as dbf
dbf.decode(col=<col>, charset=<charset>)
Parameters
| Parameter | Type | Description |
|---|---|---|
| `col` | `pyspark.sql.Column` or str | Target column to work on. |
| `charset` | literal string | Character set to use to decode the column. |
Returns
pyspark.sql.Column: a column containing the decoded string results.
Examples
from pyspark.databricks.sql import functions as dbf

# Create a single-row DataFrame with one binary column holding the
# UTF-8 bytes for "abcd", then decode it back into a string column.
df = spark.createDataFrame([(b"\x61\x62\x63\x64",)], ["a"])
df.select("*", dbf.decode("a", "UTF-8")).show()
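Per row, the conversion performed by `decode` matches Python's built-in `bytes.decode`: raw bytes are interpreted according to the named character set and returned as a string. The following plain-Python sketch (no Spark session required) illustrates the semantics the example above relies on; it is an illustration of the charset behavior, not the Spark implementation itself:

```python
# The bytes 0x61 0x62 0x63 0x64 are the UTF-8 (and ASCII)
# encoding of the string "abcd".
raw = b"\x61\x62\x63\x64"
decoded = raw.decode("UTF-8")
print(decoded)  # abcd

# Choice of character set matters: in UTF-16BE each character
# occupies two bytes, so four bytes decode to two characters.
wide = b"\x00\x61\x00\x62"
print(wide.decode("UTF-16BE"))  # ab
```

Note that decoding the same four bytes with a different charset can yield a different string (or raise an error for invalid byte sequences), which is why `charset` must match how the binary data was originally encoded.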