Image Analyzer 6.0: Market Leader in Explicit Image and Video Scanning Launches New Enhanced Technology
United Kingdom (PRWEB) April 22, 2015 -- Image Analyzer, the leader in explicit image and video detection, has launched an enhanced version of its groundbreaking technology. Image Analyzer 6.0 has the potential to revolutionize the content moderation sector: the latest version can identify up to 99 percent of sexually explicit images and videos, and its adjustable engine sensitivity allows users to reduce moderation queues by 50 to 95 percent, depending on their preferred level of risk.
"The primary goal of Image Analyzer 6.0 was to improve the accuracy of the technology, with a particular focus on reducing false positives when the technology is set on a very aggressive setting," said Stephen Tye, Product Manager at Image Analyzer. "The second goal was to increase detection beyond what was possible in the previous version. This focus was specifically taken to meet the needs of the content moderation market, which requires very high detection rates of explicit content."
Image Analyzer 6.0 produces up to 60 percent fewer false positives for the same level of detection as the previous version. This was achieved by optimizing the skin detection algorithm, widening the range of tones to better detect amateur pornography, which is often produced in less-than-ideal lighting conditions. Improved segmentation separates foreground subjects from the background, ensuring objects of interest are evaluated in the correct context. Considerable time was also spent enhancing face detection to cater for social images such as selfies, which are so prevalent on social networks and media-sharing sites.
Moderating explicit content is an ongoing battle for sites that depend on user-generated content to grow and thrive. Many of these companies rely solely on human moderators to review uploaded content. While human moderation is effective, it is not a scalable solution. As social networking sites gain popularity, the volume of user-generated content can become unmanageable. Companies are then faced with an important decision: should they risk degrading the user experience with a less-than-optimal content moderation process, or hire a larger team of human reviewers? Both options come at a cost.
Image Analyzer 6.0 is a scalable, cost-effective solution that simplifies the content moderation process by accurately identifying pornographic material. Once deployed, the technology scans uploaded images and videos, and then places only high-risk content into the queue for human review, significantly reducing the load on the moderation team. For example, if a social media company had 1,000,000 images to moderate each day, it would need to employ a team of approximately 20 human moderators working 10-hour shifts. With Image Analyzer 6.0 deployed, the moderation queue could be reduced to just 50,000 images per day. A single human moderator could review this queue over a 10-hour shift, translating to an immediate and measurable cost saving for the site provider.
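The staffing arithmetic above can be sketched as a small calculation. This is only an illustration of the example figures in the release; the function name, the per-moderator throughput constant, and the 5 percent pass rate are assumptions derived from those figures, not part of the Image Analyzer product or its API.

```python
import math

# From the release's example: 1,000,000 uploads/day handled by ~20
# moderators on 10-hour shifts implies each moderator reviews
# roughly 50,000 images per shift.
IMAGES_PER_MODERATOR_SHIFT = 50_000

def moderators_needed(daily_uploads: int, review_fraction: float = 1.0) -> int:
    """Estimate moderators required per day.

    review_fraction is the share of uploads still flagged for human
    review after automated pre-screening (1.0 = no pre-screening).
    """
    queue_size = daily_uploads * review_fraction
    return math.ceil(queue_size / IMAGES_PER_MODERATOR_SHIFT)

# Without pre-screening, the full queue needs a large team:
print(moderators_needed(1_000_000))        # 20 moderators
# If automated filtering passes only 5% to human review:
print(moderators_needed(1_000_000, 0.05))  # 1 moderator
```

The 0.05 review fraction corresponds to the release's 50,000-of-1,000,000 example; a site would tune this figure to its own risk tolerance.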
Whilst existing Image Analyzer clients will also benefit from the enhanced version of the technology, one of the significant new areas of deployment will be threat analysis. The new version of Image Analyzer can be added to a back-office technology stack to analyze still images, static video and streaming video content and determine whether it poses a threat to an organization.
For additional information about Image Analyzer’s photo and video scanning technology and associated patents, please contact:
info(at)image-analyzer.com
About Image Analyzer:
Image Analyzer is the market-leading solution for detecting sexually explicit image content. The technology can quickly and accurately analyze an image or video to determine whether it contains pornography. Image Analyzer is licensed on an OEM basis to software vendors and service providers across a broad range of market sectors.
Crispin Pikes, Image Analyzer, http://www.image-analyzer.com, +44 870 041 1166, [email protected]