Saturday 4 March 2023

Scientific Illiteracy, Decision Making and ChatGPT?

In an ideal democracy (I appreciate that this is entirely hypothetical), scientists would advise, policy makers would act accordingly, and journalists would hold both groups to account by pursuing the truth. Recent developments, however, appear to make any such arrangement inherently unlikely. Kit Yates (University of Bath) points to the scientific illiteracy of the UK's former PM, revealed in 100,000 leaked WhatsApp messages from the Covid-19 pandemic (https://www.theguardian.com/commentisfree/2023/mar/03/boris-johnson-science-covid-maths-whatapps-advisers). The former PM seemingly didn't know the difference between 0.04 (i.e. 4%) and 0.04% (i.e. 0.0004). That distinction was crucial when determining the fatality ratio for the disease.

The same PM was enthusiastic about the now-discredited concept of 'herd immunity'. He consequently suggested that isolating the over-65s should be made optional, claiming their risk of dying from a Covid-19 infection was similar to their risk of dying after falling downstairs. In the UK, circa 550 over-65s die after falling downstairs; in 2020, in England and Wales alone, more than 60,000 over-65s died after a Covid infection. The PM even claimed that getting Covid made people live longer, as the average age at death exceeded life expectancy. Life expectancy is, however, measured from birth, whereas Covid infections were riskier for older folk (who had already attained their age).

Yates suggests that, at the time, a similar level of scientific illiteracy was evident in the US President (who suggested injecting bleach to destroy the virus). Yates contrasts the UK and USA with Germany, whose science-trained leader appeared able to run a more effective programme. The transition from best scientific advice to sensible policy may well, however, be becoming even more problematic.
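The hundredfold difference between the two readings is easy to see with a little arithmetic. A minimal sketch (the infection count below is purely illustrative, not a figure from Yates's article):

```python
# Sketch of why confusing 0.04 with 0.04% matters.
# The population figure is hypothetical, chosen only for round numbers.
infections = 1_000_000

rate_as_misread = 0.04          # 0.04 taken as a proportion, i.e. 4%
rate_as_intended = 0.04 / 100   # 0.04% as a proportion, i.e. 0.0004

deaths_misread = infections * rate_as_misread    # roughly 40,000 deaths
deaths_intended = infections * rate_as_intended  # roughly 400 deaths

# The misreading inflates the expected death toll a hundredfold.
print(round(deaths_misread), round(deaths_intended),
      round(deaths_misread / deaths_intended))
```

The same slip, applied to national case numbers, is the difference between a manageable toll and a catastrophe, which is why the distinction mattered so much when setting policy.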
Emily Bell (Columbia University) suggests that the development of AI bots like ChatGPT will be disastrous for 'truth' in journalism (https://www.theguardian.com/commentisfree/2023/mar/03/fake-news-chatgpt-truth-journalism-disinformation). ChatGPT (and its relatives) mimic human writing with zero commitment to the truth. Bell claims this development will prove 'a gift' to those who benefit from misinformation. She thinks there's an urgent need to regulate 'journalism' (in papers or online) that uses AI technology, as its output is often convincing but can reach spectacularly wrong conclusions. As the supposed curse has it, 'May you live in interesting times'.


Avian 'Flu Flying In?

People get zoonotic diseases from other animals. There are, consequently, concerns, when a virus increases its range of hosts. This is esp...