I'm trying to add a column containing an empty array of arrays of strings to a DataFrame, but I end up with a column of arrays of strings instead.
I tried this:
import pyspark.sql.functions as F
df = df.withColumn('newCol', F.array([]))
How can I do this in pyspark?
This is one way to do it:
>>> import pyspark.sql.functions as F
>>> myList = [('Alice', 1)]
>>> df = spark.createDataFrame(myList)
>>> df.schema
StructType(List(StructField(_1,StringType,true),StructField(_2,LongType,true)))
>>> df = df.withColumn('temp', F.array()).withColumn("newCol", F.array("temp")).drop("temp")
>>> df.schema
StructType(List(StructField(_1,StringType,true),StructField(_2,LongType,true),StructField(newCol,ArrayType(ArrayType(StringType,false),false),false)))
>>> df
DataFrame[_1: string, _2: bigint, newCol: array<array<string>>]
>>> df.collect()
[Row(_1=u'Alice', _2=1, newCol=[[]])]