News automation: 10 recommendations to promote ethical practices

2 May 2020

These recommendations are grounded in an academic literature review, interviews with specialists in journalistic ethics, and guidelines published by practitioners. They should be read as a whole, since they involve journalists, the management of press companies, and social agents from the technological world. They can help resolve points of divergence between social worlds whose views may differ on how the other exercises its responsibility. They were published in my PhD thesis.


News automation can be used to produce a final product delivered to audiences without any journalistic mediation, or a first draft that journalists enrich with their expertise. These recommendations aim to promote ethical practices, considering that news is a public good that commits all stakeholders to the audiences.

1. News content generation should be considered, in all cases, as a tool to support journalism. It should never be developed out of a logic of cost reduction at the expense of journalists.

2. Tech people involved in the development of news automation should acknowledge that they are participating in an editorial production chain and that, in a sense, they perform a journalistic act. Therefore, they should share the same rights and duties as journalists, considering that they are also accountable to the audiences. To help journalists understand the systems they work with, they should inform them about the logic of the implemented processes.

3. As much as possible, journalists should be involved in the design of news automation artifacts, so that they remain active players as professionals and experts in editorial processes.

4. As the raw material of automated news production, the data used should be traceable, with, at a minimum, an explicit mention of their provider. In all cases, data should be accurate, reliable, and up to date, considering that only quality data lead to quality content. Errors should be prevented, and, as much as possible, fact-checking procedures should be implemented.

5. Following the fitness for use principle, the structure of the published stories should always be adapted to the types of data processed and to their application domain.

6. To guarantee that quality is maintained over time, regular human monitoring should be organized. It concerns both the data and the automated productions, and it should be part of a maintenance strategy that also includes error management.

7. Any automation activity that relies on strategies of news personalization should be explained to, and explicitly approved by, the end user.

8. Any news automation project should undergo preliminary testing with all stakeholders, audiences included, especially when the generated content is designed to be delivered as-is, without journalistic mediation.

9. When news automation is designed to support journalistic routines, the published content should, in all cases, be subject to human validation or mediation.

10. Any automated content should always be presented as such, with an explicit mention of the non-human nature of the author. This mention should not create confusion, for instance by naming only the technology provider.
