Now I must begin this with the confession that I have used Artificial Intelligence (AI) in the past. I have even written for RockArtBlog with the aid of AI. My tongue-in-cheek posting of 10 December 2022, “Have We Ever Found A Cave Painting Of A Salad?”, was accompanied by illustrations generated by the AI program ChatGPT. I have also written about the use of AI in discovering new Nazca geoglyphs. My column for 29 February 2020 covered a team of Japanese researchers who used AI to scan aerial photographs of the Nazca plain for undiscovered imagery.
This column, however, will take a somewhat different approach to the question of AI and its use in rock art. This is a warning, a cautionary tale of how AI can lie to its user very plausibly and convincingly. The subject has been much in the news lately, and so it came easily to mind when the following occurred.
While researching a column for RockArtBlog on the so-called “Desert Kites” in Jordan, I found myself wondering how they got that name and where it came from. There has been a great deal of recent news about using AI for research, and so I asked my go-to program, ChatGPT, “why are the ancient stone hunting traps in Jordan named kites?” It came back with a short but very detailed answer: “The ancient stone hunting traps in Jordan are named "kites" due to their resemblance to the shape of a kite, the flying object. These structures, found predominantly in the arid regions of Jordan, were used for hunting purposes thousands of years ago. The term "kite" was coined by the British Royal Air Force pilot and archaeologist, Sir Tony Killick, who first identified and studied these structures in the 1920s.” (ChatGPT)
While this answer seems plausible, I decided to check it online. Despite some detailed searching, I failed to find a Sir Tony Killick in the RAF in Jordan in the 1920s. I also failed to find a British archaeologist named Tony Killick (this is not to say he does not exist, but I found no records). I even searched the UK National Archives of Royal Air Force Personnel but came up empty-handed.
So I went back to ChatGPT and asked the same question with slightly changed wording: “why are stone hunting traps in Jordan named kites?” This time the answer came back as “The term "kite" is believed to have originated from the early aerial photographs taken by British Royal Air Force pilots in the early 20th century. When these pilots flew over the arid landscapes of Jordan and other regions in the Middle East, they noticed these distinctive stone structures, which resembled the shape of kites when seen from above.” An answer equally (if not more) plausible, but one that I have also failed to confirm.
My point is this – ChatGPT gave two different answers to essentially the same question. Where did Sir Tony Killick go? And while these answers had much in common, they differed in important details. One (or both) of them may be right, but I have no way to confirm that. In 2020, Heather Roff wrote: “The first step towards preparing for our coming AI future is to recognize that such systems already do deceive, and are likely to continue to deceive. How that deception occurs, whether it is a desirable trait (such as with our adaptive swarms), and whether we can actually detect when it is occurring are going to be ongoing challenges.” (Roff 2020)
A recent headline on a science news website I visit regularly stated “Archaeologists Use AI To Identify New Archaeological Sites in Mesopotamia.” I would imagine that using AI like this is reasonably safe, but after my recent inquiries I would not recommend using AI for seining for data online. Admittedly, it is much easier to ask ChatGPT to search for your answer than it is to go through it yourself, but you may not be able to trust the answers.
REFERENCES:
Faris, Peter, 2020, New Nazca Geoglyphs Discovered, 29 February 2020, https://rockartblog.blogspot.com/2020/02/new-nazca-geoglyphs-discoverred.html.
Faris, Peter, 2022, Have We Ever Found A Cave Painting Of A Salad?, 10 December 2022, https://rockartblog.blogspot.com/search/label/salad.
OpenAI, ChatGPT, https://openai.com/blog/chatgpt/.
Roff, Heather, 2020, AI Deception: When Your Artificial Intelligence Learns To Lie, 24 February 2020, https://spectrum.ieee.org/ai-deception-when-your-ai-learns-to-lie. Accessed online 10 June 2023.
UK National Archives, Royal Air Force Personnel – National Archives, https://nationalarchives.gov.uk. Accessed online 9 June 2023.