Alan Morris
19 Sept 2025
Alan Morris considers the impact of AI on art practice
As the gallery quickly fills, wine, crystal cold, cuts through the summer evening fug. With the doors open, the insistent traffic punctuates the gentle chatter. A large chocolate cake sits proudly next to the diminishing row of glasses.
The occasion is the opening of Threshold at the Assembly Arts gallery. Held in June, the show is an exhibition of work produced and curated by second-year Fine Art students at Lancaster University. Thirteen undergraduates are showing a variety of artworks in the space dedicated to contemporary art. The first of its kind, the exhibition came about after a conversation with Tilly, one of the students Assembly Arts has been mentoring over the past year.
As art students begin to return to campus, many in their final year will be thinking about next year's impending degree show. For some, the exhibition will be the first time they have displayed work to a public audience. This lack of exposure in art and design programmes is perhaps something of an anomaly. Degree shows showcase students' best work and provide an excellent marketing opportunity, so art schools quite rightly attach great importance to them; yet they very rarely give first- and second-year students a chance to rehearse some of the skills necessary for success. Whilst this seems an inherent aspect of art and design courses, would such a lack of opportunity be acceptable for other degrees? Would it be appropriate for a music student, for example, to play to the public for the first time at their final assessed concert performance? Or what about an athlete at a sports academy who has never competed in public before graduating? Understanding that (as with musicians and athletes) it is essential creatives are given opportunities to rehearse important skills, Assembly Arts were happy to offer the gallery to Tilly and her peers. Indeed, self-funded and relying on no public funds or grant aid, Assembly Arts nevertheless places great emphasis on collegiality and mutual support. We value collaboration whilst also providing opportunities for others to develop their creative practice. This philosophy extends to emerging artists, and we were pleased when Tilly and her fellow students at Lancaster accepted our invitation to exhibit their work in our gallery.
Combining painting and drawing with 3D work and photography, Threshold was a vibrant and varied exhibition. Equal to the success of the show itself was the work undertaken in the preceding months. Meeting and planning independently of their tutors, the students successfully selected, organised, publicised and invigilated the exhibition, which coincided with the main Lancaster Institute for the Contemporary Arts (LICA) Festival at Lancaster University. Significantly, in addition to the hard skills required to plan and implement such a show, to their credit the students demonstrated the range of soft skills necessary to negotiate and mount a public-facing exhibition. One student, Tim, also demonstrated excellent culinary skills: the chocolate cake was delicious.
In an era when artificial intelligence is having a profound impact on our individual interactions and on our perception of a seemingly ever more complex and dangerous world, opportunities to create and interact with real artworks may become increasingly threatened. Creativity and the meaningful dissemination of art are becoming more problematic, not least because of the reduction in the value assigned to the arts generally but also because of the relentless adoption of AI. In the 2025 Spending Review, the Chancellor, Rachel Reeves, unveiled plans for how much public money each government department will have to spend over the next three years, asserting that the government's priority is "to ensure that renewal is felt in people's everyday lives". Despite this ambition there will in fact be further real-terms cuts to the Department for Culture, Media and Sport, combined with continued neglect of the Higher Education sector that disproportionately affects arts courses. Reflecting the collapse in local government investment in the arts, the Campaign for the Arts recently stated: "All this will make it harder for the artists and cultural organisations, teachers and researchers, local groups and venues that we need to unlock the potential of the arts for everybody".
Despite the ongoing fiscal constraints imposed on the arts by successive governments, following years of austerity since the 2008 financial crash, the impact of Brexit, the pandemic and now the fallout from a second Trump administration, the Campaign for the Arts nevertheless believes there are some reasons for optimism, asserting that real-terms boosts for schools and councils will go some way to alleviating the financial pressures that have caused the arts to be squeezed over recent decades. It asserts that "... using dormant assets to invest in school libraries and young people's access to the performing arts is a step in the right direction". In addition to such initiatives, the underlying message must be that if, as a society, we really value vibrant, equal, happy and healthy citizens, we must make appropriate investments in culture, and that the decade of national renewal to which Rachel Reeves aspires will only come about if we ensure meaningful access to the arts.
With freshers and returners settling into student life, it seems that future generations may look back at 2025 as the year when the widespread use of AI became part of accepted academic practice. This development is likely to have significant implications for art students such as Tilly and her peers as they embark on preparing for a career in the so-called 'creative industries'. In addition to the economic threats to the arts, conventional creative practice is being redefined by 'advancements' in artificial intelligence. Whilst commentators appear to fall into one of three camps - 'fearful Luddite', 'pro' or 'blissfully ambivalent' - perhaps all parties should take note of a recent study published by the Massachusetts Institute of Technology (MIT) Media Lab, which suggests that AI is demonstrably reducing our cognitive abilities.
The MIT study set out to discover the human effects caused by the repeated use of ChatGPT in particular. Dividing its subjects into three groups, the researchers asked each participant to write a series of essays. Crucially, one group was allowed to use ChatGPT, one group was allowed to use the Google search engine and the remaining group was defined as 'brain-only', meaning participants could use no tools at all. Moreover, when assessing the results of the experiment, the researchers didn't just examine the quality of the essays: they used an electroencephalogram (EEG) to measure and record the electrical activity of participants' brains in order to monitor the neural and behavioural consequences of each approach.
Shockingly, although perhaps unsurprisingly (full disclosure: I sit firmly in the 'fearful Luddite' camp...), the researchers found that the ChatGPT users "... consistently underperformed at neural, linguistic and behavioural levels". Compared to the brain-only group, the ChatGPT users showed reduced neural connectivity, impaired memory and quote recall, and a "reduced sense of ownership". By the end of the study many of the ChatGPT participants had resorted to simply copying and pasting AI content into their essays. Significantly, because of their enhanced sense of ownership, those who wrote the Google-assisted or brain-only essays experienced higher levels of satisfaction with their work. Whilst such conclusions have profound consequences for our sense and understanding of creativity, equally concerning is that the study showed the repeated use of the AI tool could have longer-lasting adverse impacts. For example, when the ChatGPT group were asked to produce brain-only written work, they did not return to the cognitive levels of the group that had been writing without the tool from the beginning. The researchers described this as "an accumulation of cognitive debt" whereby "... repeated reliance on external systems like large language models (LLMs) replaces effortful cognitive processes required for independent thinking, resulting in diminished performance when those external systems are removed". This finding has led the researchers to voice concerns that large language models could fundamentally restructure cognitive architecture. Or, put simply: AI is mashing our brains.
Whilst the results of this specific research are disturbing in themselves, it is in fact just the latest of a number of studies the MIT Media Lab has conducted into the social impact of the widespread use of AI. An earlier MIT study analysed millions of interactions and found that daily usage of ChatGPT correlated with higher levels of loneliness and dependence, along with problematic use and lower socialisation. Worryingly, it also found that the heaviest users were more likely to turn to LLMs for emotional support. Before doom-mongers like myself become too pessimistic, it is important to note that the study recognised that, in drawing its conclusions, it had found correlation, not causation. In other words, it could be that people who are already lonely are more likely to use AI to seek emotional bonds. Nevertheless, it is undeniably perplexing that the prevalence of LLMs is having such a significant impact on how some members of society navigate the world through their close relationship with these technologies.
Another significant finding from the MIT study relates to the "sense of satisfaction" felt by the participants in the recent research. The investigation found that whilst the people who had completed the Google-assisted or brain-only papers felt emotionally positive, accomplished, happy and proud, the participants who used ChatGPT not only felt indifferent, they were unable to recall what they had actually written. As the widespread (and as yet unregulated) adoption of AI continues to affect aspects of our lives, the MIT research seems to provide compelling evidence against the wisdom of relinquishing critical thinking skills to large language models. In making their assessment, the MIT researchers concluded that the brain-only and Google-assisted papers were generally more diverse, in terms of both their quality and approach and in relation to the creative synergies that went into the writing. The ChatGPT essays, on the other hand, were unsurprisingly very standardised in terms of quality and content. This kind of accepted, normalised undermining of knowledge, with everyone giving similar, standardised responses, must surely have adverse consequences for human endeavour as the dynamism of creative thought is diminished.
Whilst academic institutions like Lancaster University continue to grapple with the widespread adoption and misuse of AI, it is fair to say that, for the time being at least, art departments remain outside some of its most damaging impacts. Art students like Tilly and her peers, who contributed to the Threshold show and who will be proudly exhibiting their work next summer, simply cannot outsource their practice to such technologies. As the Threshold exhibition demonstrated, the consequence is that many art students remain socially engaged, confident and ethical, appreciative of the genuine, visceral connectivity that producing and sharing art in the public domain provides. Witnessing how the students helped one another, including at the end of the private view as they boxed up glasses and brushed away the remains of chocolate cake, it was evident that they were demonstrating their own form of AI: Artists' Intelligence.
