Q&A: Ethics and news automation (June 22, 2021)
[Transcription of an interview conducted for research work for the Finnish Press Council (2020)]
1. What are the different news automation applications in use today?
There are several kinds of news automation systems in use in Europe today. Here are a few examples:
- Automated content delivered as a final product, or as a draft that journalists enrich with their expertise. This content may be textual or graphical.
- Alerts triggered by data variations when something important is happening, leaving it to journalists to assess newsworthiness.
- Automated fact-checking to determine the reliability of a source or a piece of information.
- Personalized content, as used for election results. News companies also use another type of personalization, driven by marketing, which draws on the reader's interests or navigation history, but that is more about content suggestions.
2. How widespread are these applications, especially in Europe? Can you generalize about what kind of media practitioners use them?
Although there are more and more experiments in the field, news automation is still not widely used. Why? All these systems have a cost that not all news companies can afford. Media companies are pushed to innovate, as innovation is presented as the only possible answer to the economic crisis they are facing. That is why marketing-driven news automation may be more likely to spread in the future. There is also the question of return on investment.
3. What ethical issues are, or may be, associated with these different ways of automation?
In my PhD thesis, I proposed ten recommendations to encourage best practices in the context of news automation.
1. Automated news production should always be considered a support for journalism, and never a tool for cost reduction.
2. Any player in the technology world involved in the automated production of news must accept that, from the moment they participate in the journalistic production chain, they act as a journalist. They must therefore share its values, within a commonly agreed framework in which they hold the same rights and duties as journalists. To promote and facilitate dialogue, journalists should be kept informed of the processes at work.
3. Journalists should also be involved in designing news automation artifacts, so that they remain active players and can share their professional values and know-how.
4. Data, the primary material for news, should be traceable; at a minimum, the data producer should be mentioned. In all cases, sources must be accurate, reliable, and up to date, and it should be possible to implement fact-checking procedures.
5. The structure of the stories should always be adapted to the types of data processed and to their field of application, so as to fit their end uses.
6. To guarantee the quality of automated production over time, systems should be subject to regular human monitoring, covering both the data and the automatically generated output. This monitoring must lead to operational error management as well as appropriate maintenance.
7. Any automation activity based on the personalization of journalistic content according to the user's profile should be explained to the user and explicitly approved by them.
8. Any procedure for automating journalistic content should be subject to preliminary tests with all the players concerned, all the more so when a journalist does not rework the automated content.
9. When a news production automation system is designed to support journalistic routines, the published content should, in all cases, be subject to human validation or mediation.
10. Any automated news content should always disclose its non-human origin, leaving no room for confusion.
4. What kind of problems may occur?
When automation stems from a marketing logic, journalists are often excluded from the process: they are not even asked whether they agree. The purpose of marketers is less to serve the "public good" than to pursue economic goals. Another problem that might arise is an ever-greater focus on news personalization, with the danger of no longer fulfilling the press's core missions.
5. How much are they taken into account in the development work of news automation?
As I said, journalists are not often involved in these processes. Another problem is that tech companies do not present themselves as news media, even though they provide content. I think that is a danger: journalism carries rights and duties that would have to be shared. It is a matter of social responsibility.
6. Are current guidelines sufficient?
I do not think so, because news automation has brought new challenges, especially since non-journalists are now involved in the process. The phenomenon may be seen as marginal, which explains why there is less interest in these considerations. The basic guidelines still apply, but they need to be supplemented. We also know that when a technology is based on machine learning, it is more difficult to control the algorithmic paths.
7. If not, what kind of refinements could be needed?
I refer you to my ten recommendations, but I think the most critical point is transparency. It is about the commitment to audiences and about keeping journalists within the editorial chain. Journalism is a profession, and that must be repeated. It is not about technological performance but about serving audiences with the greatest possible transparency and honesty.
8. Is self-regulatory guidance the right way to address the ethics-related issues that news automation most likely brings out and highlights?
It is not a new debate. The existence of self-regulation does not mean that all journalists follow its principles; that is the difference between personal morals and professional ethics. But that does not mean good practices should not be encouraged, quite the contrary! I do think, however, that a new kind of regulation has to be set for non-journalistic actors: they should separate their commercial activities from their news activities, with a dedicated team. For several scholars and professionals, engaging in a prospective vision would even be a sine qua non for journalists to adapt to technological innovations (Gynnild, 2014; Lindén, 2017). However, given the cultural differences between two distinct social worlds, I would argue that "journalistic thinking" must also be promoted inside tech companies involved in editorial projects. Obviously, journalism cannot be reduced to the sum of its parts. It is also a process, a know-how, and a commitment to audiences.
9. Is there a risk that if the media industry does not regulate itself, someone else (legislators, the EU, platform companies) will do it sooner or later? Could that jeopardize press freedom?
This risk has always existed, but with AI, it could become more prominent.
10. Should the public be told about the use of automation?
Audiences have to be informed about it and, if possible, about the process behind news production. This transparency is essential, as the technology is often viewed as a black box. Journalists also have to be informed and have the right to participate in the design process. That means journalism schools have to be aware of it and develop new forms of training, as is already happening in the US. The phenomenon of news automation shows that the profession is evolving and that new profiles are now required.
11. It is relatively easy to add a sentence saying something like "this article was automatically written by a news robot," but what about other "milder" forms of automation that are more clearly based on collaboration between journalists and algorithms? A phrase like "the topic of this news story was identified through machine learning" would not open up the production process very well.
Yes, it has to be mentioned even if it is a human-machine collaboration. It is a matter of honesty. Machines are neither good nor bad; they are what humans do with them. Moreover, do not forget that humans embed their values and intentions in any technological system. If humans are biased, the automation will be biased too.