A picture might be worth a thousand words, but for businesses it could also cause significant legal, reputational and financial damage. This is why Image Content Analysis is important.
Downloading inappropriate, NSFW images and videos is a very real concern for organisations in a world where work and home devices commonly overlap. Too often, employees put the organisations they work for at risk by carrying out personal activities on company hardware.
In fact, an employee who downloads illegal images onto a work laptop can expose the company's directors and officers to legal action, since employers may be held vicariously liable for the actions of their staff.
As with many information security problems, this issue has been amplified by the global pandemic, with vast swathes of the working population now at home.
Strangely, however, monitoring remote workers' activity is not a priority for many information security professionals, who are often more distracted by the dizzying array of premeditated cyber-attacks such as Business Email Compromise.
This doesn't mean it's not a problem. A survey carried out by a leading UK insurance provider found that 65% of people use their work device for non-work-related activities, and that 1.5 million people admitted to using theirs to look at pornography, roughly 5% of the entire national workforce.
It's not just adult content that presents a risk to companies, either. The internet exposes work devices to the full gamut of offensive imagery, including extremist and terrorist videos.
The issue matters because, in most countries, employers are liable for the actions of their employees and must demonstrate that they have taken all reasonable steps to protect their people.
Put simply, this means businesses could face harassment and offensive-content claims, and even criminal charges where illegal content is involved, unless they take proactive steps to mitigate the problem.
Legal considerations aside, employees caught downloading inappropriate imagery also present a reputational issue with the potential to drag a company’s carefully managed brand through the mud.
How can I address this issue?
It is obviously impossible to manually police every image and video that crosses the network perimeter, and employee education will only ever be partially effective.
Image Content Analysis (ICA) automates the problem away. Typically integrated into a web security solution, ICA protects employees from images deemed inappropriate by filtering the final portion of a web request before the content is downloaded.
A contemporary web security engine will be powered by AI and deep learning to understand the content of the image or video in context, ensuring a high level of accuracy by minimising false positives. Typically, such engines can be set to filter at different sensitivity levels to support pre-defined policies, while also providing an audit trail to ensure compliance.
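To make the idea concrete, here is a minimal sketch of how that kind of threshold-based filtering with an audit trail might look. It is illustrative only, not Censornet's implementation: score_image(), the policy names and the threshold values are all invented placeholders, with the classifier stubbed out where a deep-learning model would sit.

```python
import logging
from dataclasses import dataclass

# Audit trail: every decision is logged, whether the image is blocked or allowed.
logging.basicConfig(filename="ica_audit.log", level=logging.INFO,
                    format="%(asctime)s %(message)s")

# Hypothetical sensitivity levels mapped to the maximum acceptable score
# (values invented for illustration).
POLICY_THRESHOLDS = {"strict": 0.2, "moderate": 0.5, "relaxed": 0.8}


@dataclass
class Verdict:
    url: str
    score: float
    allowed: bool


def score_image(image_bytes: bytes) -> float:
    """Placeholder for the deep-learning classifier; returns a probability
    that the image is inappropriate (0.0 = safe, 1.0 = explicit)."""
    return 0.0  # stubbed for illustration


def filter_request(url: str, image_bytes: bytes, policy: str = "moderate") -> Verdict:
    """Score the image before it is released to the user and record an
    audit-trail entry either way."""
    score = score_image(image_bytes)
    allowed = score <= POLICY_THRESHOLDS[policy]
    logging.info("url=%s policy=%s score=%.2f allowed=%s", url, policy, score, allowed)
    return Verdict(url, score, allowed)


if __name__ == "__main__":
    verdict = filter_request("https://example.com/photo.jpg", b"...", policy="strict")
    print("blocked" if not verdict.allowed else "delivered")
```

The key design point is that the decision happens before the content reaches the user's device, and that every request leaves a record the compliance team can review later.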
Progressive ICA software is also adaptable enough to help protect the SaaS environment. The same technology can be deployed at other points where pictures and video enter an organisation, such as cloud email solutions, or run in-line with a Cloud Access Security Broker (CASB), to protect against images and videos in applications such as Box, Google Drive and OneDrive.
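A short sketch of the same scoring logic applied to files arriving in cloud storage, again purely illustrative: list_new_files(), quarantine() and the threshold are hypothetical placeholders, and a real deployment would rely on the vendor's own connectors or a CASB rather than code like this.

```python
from typing import Iterable, Tuple

SENSITIVITY_THRESHOLD = 0.5  # assumed policy setting


def score_image(image_bytes: bytes) -> float:
    """Placeholder for the deep-learning classifier (0.0 safe .. 1.0 explicit)."""
    return 0.0


def list_new_files() -> Iterable[Tuple[str, bytes]]:
    """Placeholder for a connector yielding (file name, content) pairs for
    newly uploaded files in an app such as Box, Google Drive or OneDrive."""
    return []


def quarantine(name: str) -> None:
    """Placeholder for whatever remediation the organisation's policy requires."""
    print(f"quarantined {name}")


def scan_cloud_storage() -> None:
    # Apply the same image scoring to content entering via SaaS applications.
    for name, content in list_new_files():
        if score_image(content) > SENSITIVITY_THRESHOLD:
            quarantine(name)


if __name__ == "__main__":
    scan_cloud_storage()
```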
Security teams keen to take a big-picture view of risk will factor ICA into their security posture, primarily as a way of reducing the spectre of litigation, but also to fulfil their duty of care and to guard against reputational damage.
To learn more about how to protect your business from inappropriate content, discover Censornet’s Web Security solution.