'It ultimately enhances the creative process': How Paperboyo used Adobe Firefly AI to invent brand new UK racing tracks
Check out the incredible before-and-after images.
AI has been used to imagine all sorts of weird and wonderful 'what ifs', and artist Rich McCor's latest collaboration with Adobe is no different. To celebrate the upcoming F1 season launch, McCor (known online as Paperboyo) has reimagined what F1 circuits would look like in famous locations across the UK, including Primrose Hill in London and the Library of Birmingham.
Paperboyo utilised Adobe Firefly’s Generative Fill tool to integrate famous tracks into these celebrated UK landmarks. Take a look at the finished images below (including before-and-after examples), and check out our exclusive interview with McCor about the process of using AI to create the pieces. And for the full lowdown on Adobe's AI tool, take a look at our guide to Adobe Firefly.
Tell us about the process of creating these images
I began by selecting several iconic locations across the UK that were recognisable yet surprising choices for a race track. As a passionate fan of the sport, I aimed for a blend of street and track circuits, allowing me to explore creative edits while drawing inspiration from existing circuits. Concepts like the Royal Liver Building in Liverpool and the Cliffs of Moher were undeniably influenced by the Monaco race. Initially, I conceptualised a rough layout for each circuit within the chosen location. For instance, with the Glenfinnan Viaduct, I envisioned the track interacting with the railway arches and extending into the distance.
With Photoshop as my main creative tool, my next step involved using text prompts in Generative Fill, such as "racing track," "grand prix circuit," and "asphalt," to initiate the track generation process. Each segment of the circuit, especially in the Glenfinnan edit, required separate generation and minor adjustments to achieve the final outcome. To create the appearance of a wet road, I used Firefly to generate the texture, then composited it onto the road back in Photoshop. Similarly, grandstands, often composed of separate seating and roofs, were crafted in Firefly and later combined in post-production.
The thing I loved most was refining the small details to create a genuine sense of occasion at each location. This involved adding elements like boats in the water at the Cliffs of Moher, crowds gathered under umbrellas in Birmingham, and the Red Arrows soaring through London skies.
How much of a part did Adobe Firefly play in this project?
Firefly proved invaluable throughout the project, exceeding my initial expectations. While I anticipated it helping to generate grandstands and larger objects, it was also a great tool for the finer details. For example, in the Birmingham edit, all splash effects were generated using Firefly and integrated into Photoshop using the lighten blend mode. It also excelled at producing accurate flags, significantly reducing the time I'd usually need to source flag imagery.
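For readers unfamiliar with the lighten blend mode McCor mentions, here's a minimal sketch of the idea outside Photoshop, assuming two hypothetical image files: each output pixel keeps the brighter of the two layers, which is why bright splash highlights show through while darker areas leave the street plate untouched. This illustrates the blend mode only, not McCor's actual workflow.

```python
# Sketch of a "lighten" blend: per-channel maximum of two layers.
# Filenames are hypothetical stand-ins for a background plate and a
# Firefly-generated splash layer.
from PIL import Image, ImageChops

base = Image.open("birmingham_street.jpg").convert("RGB")      # background plate
splashes = Image.open("firefly_splashes.jpg").convert("RGB")   # generated splashes

# Lighten blend: for every pixel, keep the brighter of the two layers.
composited = ImageChops.lighter(base, splashes.resize(base.size))
composited.save("birmingham_with_splashes.jpg")
```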
The more refined the prompts, the more specific the results, which made the editing process faster and more streamlined.
What made Adobe Firefly the right AI tool for this project compared with other tools?
Having a solid foundation in Photoshop, Firefly complemented my existing skills and simplified asset creation while giving me the exact finish I wanted. Its integration with Generative Fill made the workflow much smoother than other tools and reduced the complexity of compositing tasks. Through using Firefly and Generative Fill, I could dedicate more creative energy to developing concepts rather than getting bogged down in technical intricacies.
Has the advent of generative AI changed your approach to creating art, and if so, how?
Undoubtedly, Firefly helps with all stages of the creation process, from conceptualisation to finished assets. Its intuitive interface allows me to create assets quickly, change them to exactly how I see them in my mind, and experiment with layout and scenery at the touch of a button. For instance, while I initially considered placing the track atop the Cliffs of Moher, Firefly's spontaneous generation of a track through the cliffs prompted a complete conceptual rethink. This adaptability empowered me to explore and discard ideas more freely, ultimately enhancing the creative process.
Do you see AI playing an increasing role in your work in the future, and if so, how?
One of the most surprising things about Firefly was how useful it proved in the conceptualising stages. It was so easy to generate quick assets and move them around that I could figure out layouts for the tracks and how I was going to build the scenery.
Originally, for the Cliffs of Moher, I was going to put the track on top of the cliffs, but just by messing around in Firefly it generated a track that went through the cliffs, which inspired me to completely rethink the concept. I loved that about the tool. It's obviously known for creating realistic assets and compositing (via Generative Fill), but this project really showed me how useful Firefly can be for the actual creative process too. By making the early stages of the project so much easier, it gave me more time to spend on conceptualising, because it was so quick to throw away ideas and try new ones.
Daniel John is Design Editor at Creative Bloq. He reports on the worlds of design, branding and lifestyle tech, and has covered several industry events including Milan Design Week, OFFF Barcelona and Adobe Max in Los Angeles.