Content Moderation - Technology Guru



Team Overview

We are spending more of our time online - to work, learn, entertain ourselves and keep in touch with others.


However, being online has its risks. Ofcom research found that a third of people feel the risks of being online have started to outweigh the benefits, that four in five adult internet users have concerns about going online, and that most people support tighter rules.


This is why the Government has appointed Ofcom as the regulator for online harms in the UK.


Our role in upholding broadcasting standards for TV and radio programmes means we’ve gained extensive experience of protecting audiences from harm while upholding freedom of expression. An important part of our job will be to make sure online platforms do the same with their systems and processes.


Our new responsibilities mean we are increasing our capacity and capability in our online technology team and are looking to recruit talented and diverse specialists in this exciting and growing area.


Our Technology Gurus will have a unique opportunity to directly shape a world-leading regime, as well as the chance to experiment with leading-edge technologies. There has never been a more exciting and meaningful time to join.

Purpose of the Role

To be a true “guru” in the Online Content Moderation space. This role will supply vital knowledge of the architecture and standards of the Internet; the structure, systems and technologies employed by both large and small-scale online media platforms; and the harms that such platforms can present to consumers.

Requirements of the Role

Reporting to the Director of the Online Technology Programme within a recently formed project team, the responsibilities of this role include:

  • Supporting the online technology team's subject matter experts across aspects of online content moderation.
  • Managing and supporting technical team projects that explore in detail the systems, platforms and standards of concern to Ofcom; preparing briefing documents; and building knowledge and understanding across Ofcom.
  • Working, as required, with government departments, industry, industry associations and internal Ofcom colleagues to ensure a 'joined-up' approach to the technological aspects of our online regulation regime.
  • Supporting practical work with subject matter experts to explore technological capabilities and to develop innovative approaches and techniques, ensuring Ofcom is at the forefront of technical developments.
  • Seeking ways to improve our visibility of the online industry and its underlying data, in line with best practice worldwide.
  • Developing operational relationships with universities and other regulatory or government bodies active in the online safety sector, where relevant.
  • Working with our People and Transformation Team to support and deliver specialist training to colleagues and, more generally, to improve the capability of all colleagues in the organisation to understand online technologies.
  • Working with our ICT team to ensure that the team's activities are aligned with Ofcom's ICT Strategy.
  • Supporting Ofcom's Public Policy Team to explore and implement efficient and secure ways to share data about online services with government and other policymakers.

Skills, knowledge and experience

  • A practical understanding of the architecture, systems and operation of several online platforms, and of the underlying Internet standards and how they are developed.
  • Awareness of policy debates and the associated technologies that will fall within Ofcom's regulatory remit.
  • Knowledge of content moderation systems, including the role of Artificial Intelligence techniques and an awareness of input and output reliability.
  • Knowledge of age verification systems.
  • Knowledge of the filtering, blocking and flagging of harmful content.
  • An understanding of how users interact with content management and how recommendations for action are managed.
  • Demonstrable knowledge of automation (bots), including the positive and negative uses of bots, as well as bot defence technologies (e.g. against DDoS) within platforms.
  • A thorough understanding of the underlying business models that drive the moderation and recommendation algorithms/bots used by platforms.
  • Knowledge and awareness of the market, and of developments in underlying technologies used by third-party safety tech vendors, e.g. big data, AI (Artificial Intelligence), attack vectors, etc.


We know that Online Content Moderation Gurus will have varying experience and backgrounds, and we are open to all. Please get in touch to find out how to be part of this journey.


Ofcom will be opening a new tech hub in Manchester this year and we are looking to expand the team across our London office and our new Manchester location.
