How to serialize a collection of objects to an RDD in PySpark?

How can I serialize a collection of Python objects into an RDD in PySpark?

For example, the simplest operation:
from pyspark import SparkContext

class test:

    data = 1

    def __init__(self):
        self.property = 0

    def test2(self):
        print("hello")


if __name__ == "__main__":
    p1 = test()
    p2 = test()
    a = [p1, p2]
    sc = SparkContext("local[2]", "test")
    # parallelize pickles each element of the list and distributes it
    rdd = sc.parallelize(a)
    print(rdd.map(lambda x: x.property).collect())
    sc.stop()
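
For reference: sc.parallelize serializes each element with pickle, and instances of a class defined in the __main__ script can fail to unpickle inside the worker processes (typically an AttributeError, since the workers' __main__ is not your driver script). A common workaround is to define the class in its own module and ship it to the executors with sc.addPyFile. A minimal sketch, assuming a hypothetical module file testmod.py:

# testmod.py -- hypothetical module holding the class,
# so worker processes can import it when unpickling
class test:

    data = 1

    def __init__(self):
        self.property = 0

# driver.py
from pyspark import SparkContext
from testmod import test  # import from the module, not from __main__

if __name__ == "__main__":
    sc = SparkContext("local[2]", "test")
    sc.addPyFile("testmod.py")  # ship the module to the executors
    a = [test(), test()]
    rdd = sc.parallelize(a)
    print(rdd.map(lambda x: x.property).collect())  # prints [0, 0]
    sc.stop()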