The slaves have registered with the master, but the master cannot pass work to them (standalone mode).
If I open all inbound TCP ports, it works.
But I cannot do that, for security reasons.
2018-06-04 13:22:44 INFO DAGScheduler:54 - Submitting 100 missing tasks from ResultStage 0 (PythonRDD[1] at reduce at /usr/bin/spark/examples/src/main/python/pi.py:44) (first 15 tasks are for partitions Vector(0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14))
2018-06-04 13:22:44 INFO TaskSchedulerImpl:54 - Adding task set 0.0 with 100 tasks
2018-06-04 13:22:59 WARN TaskSchedulerImpl:66 - Initial job has not accepted any resources; check your cluster UI to ensure that workers are registered and have sufficient resources
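One workaround I am considering (a sketch, not a verified fix): instead of opening all inbound ports, pin the ports Spark would otherwise choose at random and open only those in the security group. `spark.driver.port`, `spark.blockManager.port`, and `spark.port.maxRetries` are real Spark configuration keys; the port numbers and the master hostname below are arbitrary examples.

```shell
# Pin Spark's otherwise-random ports, then open only these (plus 7077
# for the master RPC and 4040 for the driver UI) in the firewall.
# "master-host" and the port numbers are placeholders, not real values.
spark-submit \
  --master spark://master-host:7077 \
  --conf spark.driver.port=40000 \
  --conf spark.blockManager.port=40001 \
  --conf spark.port.maxRetries=16 \
  /usr/bin/spark/examples/src/main/python/pi.py 100
```

With `spark.port.maxRetries=16`, a port conflict makes Spark retry the next port up (40001, 40002, ...), so the opened range should cover that window. Is this the recommended way to run behind a restrictive security group, or is there a narrower set of ports that must be open between master, workers, and driver?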