In iOS, AVMetadataFaceObject in AVFoundation is used for liveness detection, but head-shake (yaw) information from the user cannot be obtained.

I want to implement simple liveness detection, which requires capturing the user's head-shake motion through the camera. My plan is to read the yawAngle of the AVMetadataFaceObject. However, I've found that if the face is already in front of the lens when [session startRunning] is called, yawAngle stays equal to 0 no matter how I shake my head. It only starts reporting values if the face leaves the frame and then re-enters the camera view. Is this a bug? How can I work around it?
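One thing worth checking before treating this as a bug: AVMetadataFaceObject exposes a hasYawAngle property, and yawAngle is only meaningful when that flag is true. Below is a minimal sketch (class name FaceYawReader and the logging are my own, and the setup is untested assumption, not a confirmed fix) of a face-metadata pipeline that guards the yawAngle read with hasYawAngle, so you can at least see whether the framework is reporting no yaw at all for a face that was present from the first frame:

```swift
import AVFoundation

// Sketch: capture face metadata from the front camera and log yaw angles.
final class FaceYawReader: NSObject, AVCaptureMetadataOutputObjectsDelegate {
    let session = AVCaptureSession()

    func start() throws {
        guard let camera = AVCaptureDevice.default(.builtInWideAngleCamera,
                                                   for: .video,
                                                   position: .front) else { return }
        let input = try AVCaptureDeviceInput(device: camera)
        session.addInput(input)

        let output = AVCaptureMetadataOutput()
        session.addOutput(output)
        // metadataObjectTypes must be set after the output is added to the
        // session, otherwise .face is not among availableMetadataObjectTypes.
        output.metadataObjectTypes = [.face]
        output.setMetadataObjectsDelegate(self, queue: .main)

        session.startRunning()
    }

    func metadataOutput(_ output: AVCaptureMetadataOutput,
                        didOutput metadataObjects: [AVMetadataObject],
                        from connection: AVCaptureConnection) {
        for case let face as AVMetadataFaceObject in metadataObjects {
            // yawAngle is only valid when hasYawAngle is true; log both
            // cases to see what the tracker reports for an always-present face.
            if face.hasYawAngle {
                print("faceID \(face.faceID) yaw: \(face.yawAngle)")
            } else {
                print("faceID \(face.faceID): no yaw angle reported")
            }
        }
    }
}
```

If hasYawAngle turns out to be false for the whole session, that would explain the constant 0 and point to the tracker state rather than your code; in that case a pragmatic (if crude) workaround matching your observation would be to briefly tear down and restart face tracking once a face is detected.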

Oct.10,2021