I used Spark SQL to write a UDF that calculates commission based on performance, but the requirement has changed: the commission percentage is now determined by position and department. Here's an example of the current UDF:
spark.udf.register("mmyjtc", (yj: Float) => {
(yj) match {
case v if (v <= 6000) => {
v * 0.10
}
case v if (v > 6000 && v <= 10000) => {
v * 0.13
}
case v if (v > 10000 && v <= 20000) => {
v * 0.20
}
case v => -1.0
}
})
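For reference, this is how I currently call the UDF from SQL; the table name sales and the column yj here are just placeholders for illustration:

spark.sql("SELECT yj, mmyjtc(yj) AS commission FROM sales").show()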
These are the brackets hard-coded in the previous UDF: (0, 6000], (6000, 10000], (10000, 20000]. Now the upper and lower bounds of these intervals need to be obtained through a SQL query, and the commission ratios 0.10, 0.13, 0.20 also need to be obtained through a SQL query. Can a UDF still be used? Or is there an alternative?
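To make the question concrete, here is a rough sketch of the direction I am considering; the table commission_rules with columns lower_bound, upper_bound and rate is hypothetical, and I assume the columns are stored as doubles:

// Hypothetical rules table: commission_rules(lower_bound, upper_bound, rate)
val rules = spark.sql("SELECT lower_bound, upper_bound, rate FROM commission_rules")
  .collect()
  .map(r => (r.getDouble(0), r.getDouble(1), r.getDouble(2)))

// The UDF closes over the collected rules instead of hard-coded brackets
spark.udf.register("mmyjtc", (yj: Float) => {
  rules.find { case (lo, hi, _) => yj > lo && yj <= hi }
    .map { case (_, _, rate) => yj * rate }
    .getOrElse(-1.0)
})

I am not sure whether collecting the rules to the driver and re-registering the UDF every time they change is the right approach, which is what this question is about.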