AI and the march of the machines
4 January 2018
For the avoidance of doubt, I would like to make it clear from the outset that this article has been written by a genuine, bona fide human being and not by a robot.
Whilst this comment may appear to be no more than a slightly glib opening to a piece on artificial intelligence (AI), it is actually rooted in fact. In October the UK’s Press Association announced that it would follow in the footsteps of the US news giant Associated Press by recruiting robot reporters to produce news content within the next few months.
The use of natural language generation engines to produce news content neatly encapsulates many of the current controversies linked to the increasing use of AI. In particular, it highlights many of the issues that surround the potential impact of AI on traditional job roles.
Proponents of the new technology argue that news content can now be generated with much greater efficiency and accuracy. A single bot named Xiaomingbot produced over 450 articles on the 2016 Olympics for the Chinese news syndication service Toutiao, whilst the Washington Post relied heavily on its in-house automated storyteller, Heliograf, for its own coverage of the 15-day event. Associated Press has been able to achieve a twelve-fold increase in the volume of its quarterly earnings reporting using Automated Insights’ Wordsmith programme, and has stated that this has freed up the equivalent of three full-time employees across the organisation. In 2015, Automated Insights itself ‘wrote’ a total of 1.5 billion individual articles. Not bad for a company with only 50 employees.
So, what has happened to those journalists whose time would previously have been spent producing content that is now automated? Associated Press has stated that automation has not displaced any reporters but has instead allowed them to re-direct their focus, to think more critically about the bigger picture and produce content which examines the nuances behind the numbers. Put bluntly, the humans can get on with producing quality, insightful journalism whilst the computers take care of the drudge work. In the short term, this looks like a ‘win-win’ situation, but one has to question whether it is a sustainable model in the longer term.
As the use of automated content becomes standard industry practice, and more professional reporters produce an increasing volume of quality content, will we reach a saturation point beyond which there is simply no further demand or outlet for high-quality human-generated content? This saturation point would, for most professional journalists, no doubt represent the tipping point – the moment at which AI ceases to be a positive resource to be relied on, and instead becomes, at best, a career-redefining hurdle and, at worst, a career-destroying hindrance. And to what extent will this situation be further exacerbated by continuing technological development?
As consumers of automatically generated news coverage, we are generally entirely unaware that the article we are reading has not been written by a human. Reg Chua, Executive Editor at Thomson Reuters (another proponent of the technology), reports that in blind testing, automated content actually came out as more readable than human-generated content. Whilst this is testament to the high standard of the existing technology, there is no dispute that, at present, the scope of that technology is limited.
The current generation of robot reporters are, in fact, simply software programmes that can process specific data sets to generate a fairly narrow range of essentially standardised reports on topics such as sports and finance. Those at the forefront of the technology are confident that, in time, AI will be able to produce increasingly complex ‘human’ content. Kris Hammond of Narrative Science predicts that “a machine will win a Pulitzer one day”. However, it is likely that there will be significant technological hurdles to overcome before we see a wholesale replacement of human journalists by cyber-hacks.
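To illustrate the point, the data-to-text approach described above can be sketched in a few lines of code. This is a purely illustrative, hypothetical example (not the actual workings of Wordsmith, Heliograf or any other vendor’s system): structured figures are slotted into pre-written sentence patterns, which is why the output is fluent but confined to narrow, standardised topics such as earnings results.

```python
def earnings_report(company, quarter, revenue, prior_revenue):
    """Generate a one-sentence earnings summary from structured figures,
    in the template-filling style of current 'robot reporters'."""
    change = revenue - prior_revenue
    if change == 0:
        movement = "was unchanged"
    else:
        verb = "rose" if change > 0 else "fell"
        pct = abs(change) / prior_revenue * 100
        movement = f"{verb} {pct:.1f}%"
    # The 'journalism' is entirely pre-authored; only the data varies.
    return (
        f"{company} reported revenue of ${revenue:,} for {quarter}, "
        f"which {movement} from ${prior_revenue:,} in the prior quarter."
    )

print(earnings_report("Acme Corp", "Q3 2017", 1250000, 1100000))
```

The limitation is plain to see: the software can only ever say what its templates allow, however large the data set – which is precisely why more complex ‘human’ content remains, for now, out of reach.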
Concerned journalists may take some comfort from the experience of Microsoft’s chatbot, Tay. Tay launched on Twitter in March 2016 and was taken offline within 24 hours after posting a variety of tweets containing racist and sexist content, promoting drug taking and denying the Holocaust. It subsequently made a brief return only to melt down and start tweeting out of control, spamming its 210,000 followers with messages reading, somewhat ironically, “you are too fast, please take a rest…”
According to recent research by Oxford University and Deloitte, about 35% of current jobs in the UK are at risk of computerisation over the next twenty years. Interestingly, journalism scored a relatively low 8% likelihood of automation, placing 285th on a list of 366 professions considered. The clergy came in at position 341. The job deemed most likely to be automated is that of the telephone salesperson. Those looking to guarantee long-term job security are advised to start looking for management opportunities in the hospitality sector. My own job, solicitor, came with a reassuringly low 3.5% likelihood of automation. For now, at least, I can feel some degree of job security as I return to considering whom I might sue when my client is defamed by a robo-scribe, whilst my lawbot gets busy dealing with my contract drafting.
For further information regarding AI, please contact a member of Birketts’ Technology and Intellectual Property Team.
This article is from the January 2018 issue of Upload, our monthly newsletter for professionals with an interest in technology. To download the latest issue, please visit the newsletter section of our website.
To keep up-to-date with the latest news, legal updates and seminar information, please register and select the areas that are of interest to you.
The content of this article is for general information only. It is not, and should not be taken as, legal advice. If you require any further information in relation to this article please contact the author in the first instance. Law covered as at January 2018.