A team drawn from the BBC’s Technology Strategy and Architecture and Research & Development departments is now working, with a range of external technology and media partners, on a way of indelibly ‘marking’ content at the point it is published so that it can be identified wherever it ends up in that vast ecosystem we call the Internet.
Further detection techniques, which would show where ‘marked’ content has been manipulated, could then be added to the process. The idea is that these signals would be readable both by machines, so that automated actions can be taken to flag or even remove suspect content, and by humans: journalists and our audiences. We’ve called this work, which is still at an early stage, ‘Project Origin’.
The technology needed to make it work is complex and multi-faceted, drawing on techniques such as watermarking, hashing and fingerprinting. A key challenge is that any signal needs to be robust enough to survive the many non-malicious transformations a piece of content can undergo, such as compression and resizing.
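To illustrate why robustness matters, the sketch below contrasts an exact cryptographic hash with a simple average-based perceptual hash on a hypothetical 4×4 greyscale “frame” and a lightly re-compressed copy of it. This is a minimal, standard illustration of the fingerprinting idea, not Project Origin’s actual method; the pixel values and helper names are invented for the example.

```python
import hashlib

def average_hash(pixels):
    """Tiny perceptual hash: one bit per pixel, set if the pixel is above the mean."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return ''.join('1' if p > mean else '0' for p in flat)

def hamming(a, b):
    """Number of differing bits between two equal-length bit strings."""
    return sum(x != y for x, y in zip(a, b))

def raw_bytes(pixels):
    return bytes(p for row in pixels for p in row)

# Hypothetical 4x4 greyscale frame, and the same frame after lossy re-encoding
# has nudged each pixel value slightly.
original = [[200, 200, 40, 40],
            [200, 200, 40, 40],
            [40, 40, 200, 200],
            [40, 40, 200, 200]]
compressed = [[198, 201, 42, 39],
              [199, 202, 41, 38],
              [41, 39, 198, 203],
              [42, 38, 201, 199]]

# A cryptographic hash changes completely after harmless re-encoding...
print(hashlib.sha256(raw_bytes(original)).hexdigest() ==
      hashlib.sha256(raw_bytes(compressed)).hexdigest())          # False

# ...while the perceptual hash survives it: the two fingerprints match exactly.
print(hamming(average_hash(original), average_hash(compressed)))  # 0
```

Real systems use far stronger fingerprints than this toy average hash, but the trade-off is the same: an exact hash proves a file is byte-identical, while a perceptual fingerprint lets a signal survive the compression and resizing mentioned above.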
Other issues include editorial considerations such as deciding which content to mark:
- If we only mark potentially ‘sensitive’ content, does this create problems when a ‘signal’ cannot be found in other content, leaving that content less trusted because it is seen as not genuine?
- What will marking content do to our workflows in terms of added effort and complexity?
Project Origin sits alongside other work the BBC and others are doing in the disinformation space. Examples include:
- the range of strong editorial content we are creating about the dangers of disinformation such as the ‘Beyond Fake News’ strand
- the work we are doing on media literacy and the partnerships we are building to collaborate with other media and technology organisations.
The BBC is a member of the Partnership on AI’s media integrity steering group, which last year launched the Deepfake Detection Challenge with Facebook, AWS and others.
The Project Origin team currently aims to test its first solutions this summer, building on them as we develop and strengthen our partnerships in this area. The eventual ambition is a system that is simple to use, transparent, and based on open standards, so that it can be widely adopted for the public good.
Source: Project Origin: Securing trust in a complex media landscape – BBC Academy