By Brendan Sasso - 10/22/12 04:07 PM EDT
The FTC recommended that companies using facial recognition technology take steps to protect the personal data they collect, delete unnecessary personal data and consider the sensitivity of the information before collecting it.
For example, advertisers should not use the technology in bathrooms, locker rooms, healthcare facilities or areas with lots of children, the commission wrote.
Social networking sites should allow users to opt out of the technology, and advertisers placing cameras in public areas should provide clear notice that the technology is in use, the report concluded.
The FTC identified two scenarios in which companies should get affirmative consent before using facial recognition technology: first, if they use the personal data in a different way than initially indicated, and second, if they share the information with a third party who would not otherwise be able to identify the person. The commission also considered the possibility of a mobile app that could identify a person's face and retrieve other sensitive information, such as his or her address.
"Given the significant privacy and safety risks that such an app would raise, only consumers who have affirmatively chosen to participate in such a system should be identified," the FTC wrote.
The commission held a workshop late last year to consider the privacy implications of facial recognition technology and accepted comments from the public about how the technology should be used. The commission's report drew upon the workshop and comments.
The guidelines are not legally binding, but the commission could sue companies that promise to follow the recommendations in their privacy policies but then violate them.
The commission adopted the report in a 4-1 vote. Commissioner J. Thomas Rosch, a Republican, issued a dissenting statement questioning the legal justification for the recommendations and arguing that it is premature for the commission to pressure businesses to adopt safeguards against misconduct that might never occur.