Researchers from Griffith University have developed an innovative digital archaeology framework to probe one of humanity's oldest artistic expressions: finger fluting. This early art form, in which fingers are dragged across cave walls coated with a soft mineral film called moonmilk, has long puzzled researchers, particularly regarding the identity of its creators. By combining tactile experimentation with cutting-edge virtual reality (VR) technology, the team sought to train artificial intelligence (AI) models to determine the sex of the individuals responsible for these ancient markings, illuminating aspects of early human behavior and cultural practices.
Finger flutings constitute a unique form of rock art wherein fingers are dragged across soft mineral deposits on cave surfaces, leaving behind linear impressions. Present in pitch-dark caves across Europe and Australia, these enigmatic traces date back tens of thousands—and in some cases hundreds of thousands—of years. The oldest discovered examples in France have been ascribed to Neanderthals, potentially extending human artistic activities back approximately 300,000 years. Despite their age and significance, fundamental questions about the makers of these marks—in particular, their sex—have remained unanswered, partly due to the limitations of traditional archaeological methodologies.
Historically, attempts to decode the identity of fluting artists relied on biometric analyses such as finger length ratios or hand size assessments. However, these approaches proved fraught with inconsistencies due to several confounding variables. Variability in finger pressure during fluting, diverse surface textures, and pigment distortions often compromised the reliability of measurements. Moreover, substantial morphological overlap between males and females diminished the effectiveness of purely anatomical markers. To address these challenges, the Griffith research team employed a digital archaeology approach that circumvents these anthropometric assumptions and applies machine learning techniques to the problem.
Central to this study were two controlled experiments involving 96 adult participants who each created a series of finger flutings under two conditions. The first involved a tactile session using an innovative moonmilk clay substitute designed to replicate the texture and responsiveness of real cave surfaces. The second took place in an immersive VR environment utilizing Meta Quest 3 headsets, allowing the team to simulate the act of fluting in a virtual setting. Each participant produced nine flutings twice—once physically and once virtually—generating a comprehensive dataset for subsequent analysis.
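To make the scale of that design concrete, the sketch below simply enumerates the expected samples (96 participants × 2 conditions × 9 flutings = 1,728 flutings in total). It is an illustration only; the field names and the assumption of one recorded image per fluting are not taken from the published study.

```python
# Sketch of how such an experimental dataset might be organised. Field names
# and the one-image-per-fluting assumption are illustrative only; the
# published study's actual data layout may differ.
from dataclasses import dataclass
from itertools import product

@dataclass
class FlutingSample:
    participant_id: int   # 1..96 adult participants
    condition: str        # "tactile" (moonmilk substitute) or "vr" (Meta Quest 3)
    fluting_index: int    # 1..9 flutings produced per condition
    sex_label: str        # ground-truth label used later for supervised training

def build_manifest(n_participants: int = 96, n_flutings: int = 9) -> list:
    """Enumerate every expected sample: 96 x 2 x 9 = 1,728 flutings in total."""
    conditions = ("tactile", "vr")
    return [
        FlutingSample(pid, cond, idx, sex_label="unknown")  # labels filled in from participant records
        for pid, cond, idx in product(
            range(1, n_participants + 1), conditions, range(1, n_flutings + 1)
        )
    ]

if __name__ == "__main__":
    samples = build_manifest()
    print(f"Total flutings expected: {len(samples)}")  # prints 1728
```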
The collected images of the flutings served as input for training two widely used image-recognition models. These neural networks were tasked with learning subtle visual patterns that might correlate with the sex of the maker. Crucially, the team sought to avoid overfitting, a common pitfall in which a model performs well on training data but fails to generalize to new, unseen samples. Rigorous validation metrics were used to assess whether the AI was genuinely discerning intrinsic features of the flutings or merely memorizing artifacts unique to the experimental setup.
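As a concrete illustration of that workflow, the sketch below trains a single image classifier on fluting photographs while tracking accuracy on a held-out validation split, the standard way to watch for overfitting. The choice of ResNet-18, the folder layout, the 80/20 split, and all hyperparameters are assumptions for illustration and are not drawn from the published study.

```python
# Hedged sketch: training one candidate image-recognition model on fluting
# photographs while tracking held-out validation accuracy to watch for
# overfitting. Architecture, paths, splits, and hyperparameters are assumptions.
import torch
import torch.nn as nn
from torch.utils.data import DataLoader, random_split
from torchvision import datasets, models, transforms

# Images are assumed to sit in class-labelled folders, e.g. flutings_tactile/female, flutings_tactile/male.
preprocess = transforms.Compose([transforms.Resize((224, 224)), transforms.ToTensor()])
dataset = datasets.ImageFolder("flutings_tactile", transform=preprocess)

# Hold out 20% of images so validation measures generalisation, not memorisation.
n_val = len(dataset) // 5
train_set, val_set = random_split(dataset, [len(dataset) - n_val, n_val])
train_loader = DataLoader(train_set, batch_size=16, shuffle=True)
val_loader = DataLoader(val_set, batch_size=16)

# A small, widely used CNN with its final layer replaced for binary classification.
model = models.resnet18(weights=None)
model.fc = nn.Linear(model.fc.in_features, 2)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
loss_fn = nn.CrossEntropyLoss()

for epoch in range(10):
    model.train()
    for images, labels in train_loader:
        optimizer.zero_grad()
        loss = loss_fn(model(images), labels)
        loss.backward()
        optimizer.step()

    # Validation pass: a widening gap between training and validation accuracy
    # is the classic signature of overfitting.
    model.eval()
    correct = 0
    with torch.no_grad():
        for images, labels in val_loader:
            correct += (model(images).argmax(dim=1) == labels).sum().item()
    print(f"epoch {epoch}: validation accuracy = {correct / len(val_set):.3f}")
```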
Results demonstrated a notable contrast between the VR and tactile datasets. Models trained on flutings made in the physical moonmilk substitute achieved approximately 84 percent accuracy in sex classification, suggesting that tangible interaction with the substrate captured meaningful discriminative features. One of the models also achieved a strong discrimination score, indicating reliable pattern recognition. By contrast, models trained on the VR-generated flutings failed to deliver consistent or balanced performance, underlining current technological limitations in simulating the nuanced physicality of ancient finger marks within virtual environments.
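For readers unfamiliar with how figures such as "84 percent accuracy" or a "discrimination score" are typically computed, the snippet below shows the standard scikit-learn metrics: plain accuracy, balanced accuracy (which penalizes models that favor one class, relevant to the "balanced performance" the VR models lacked), and ROC AUC, a common measure of discrimination. The toy labels and scores are invented, and whether the study used exactly these metrics is an assumption.

```python
# Hedged sketch of common evaluation metrics for a binary classifier:
# accuracy, balanced accuracy, and ROC AUC (a widely used "discrimination"
# score). The toy labels, predictions, and scores below are invented.
from sklearn.metrics import accuracy_score, balanced_accuracy_score, roc_auc_score

y_true = [0, 1, 1, 0, 1, 0, 1, 1]                     # ground-truth sex labels (0/1)
y_pred = [0, 1, 1, 0, 1, 1, 1, 0]                     # hard predictions from the model
y_score = [0.2, 0.9, 0.8, 0.3, 0.7, 0.6, 0.9, 0.4]    # predicted probability of class 1

print("accuracy:         ", accuracy_score(y_true, y_pred))
print("balanced accuracy:", balanced_accuracy_score(y_true, y_pred))
print("ROC AUC:          ", roc_auc_score(y_true, y_score))
```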
Nevertheless, the tactile models revealed some reliance on dataset-specific characteristics, such as minor artifacts introduced during the creation or image capture processes, rather than universal features of finger flutings themselves. This finding underscores the necessity for continued refinement, including diversification of datasets and enhanced feature extraction methods, to develop robust tools applicable across varied archaeological contexts. The researchers emphasize the iterative nature of this endeavor, positioning their framework as a foundational proof of concept that can evolve with further contributions from the interdisciplinary scientific community.
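One standard way to probe this kind of reliance is participant-grouped cross-validation, in which every fluting made by a given person is kept in the same fold, so the model is always evaluated on individuals it has never seen. The sketch below illustrates the idea with scikit-learn's GroupKFold on synthetic stand-in features; the feature-extraction step, array shapes, and classifier choice are assumptions rather than the authors' protocol.

```python
# Hedged sketch: participant-grouped cross-validation. If performance drops
# sharply when the model must classify flutings from people it has never seen,
# it is likely leaning on participant- or session-specific artifacts rather
# than general properties of the flutings. Shapes and features are stand-ins.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import GroupKFold, cross_val_score

rng = np.random.default_rng(0)
n_samples, n_features = 864, 128                 # e.g. 96 participants x 9 tactile flutings
X = rng.normal(size=(n_samples, n_features))     # stand-in for extracted image features
y = rng.integers(0, 2, size=n_samples)           # stand-in sex labels
groups = np.repeat(np.arange(96), 9)             # all flutings by one participant share a group

# Each fold holds out whole participants, so scores reflect generalisation
# to unseen individuals, not memorised per-person artifacts.
scores = cross_val_score(LogisticRegression(max_iter=1000), X, y,
                         groups=groups, cv=GroupKFold(n_splits=5))
print("per-fold accuracy:", np.round(scores, 3))
```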
The study exemplifies the fusion of experimental archaeology with state-of-the-art machine learning workflows to investigate prehistoric cultural materials. By providing detailed protocols, open-source code, and comprehensive datasets, the team encourages replication, scrutiny, and expansion of their work. Such openness not only fosters transparency but also accelerates methodological advances essential for transforming digital archaeology into a rigorous and reproducible science, capable of addressing questions previously relegated to speculative interpretation.
Beyond archaeology, the implications of this research reverberate across fields such as forensic science, cognitive psychology, and human-computer interaction. Understanding how ancient humans engaged with their environments and expressed identity through subtle manual gestures enriches our comprehension of cognitive evolution and social dynamics. Moreover, the technical advances in tactile simulation and machine vision may inspire novel applications ranging from crime scene analysis to enhanced virtual training systems, highlighting the broad potential of this interdisciplinary approach.
Dr. Andrea Jalandoni, the study’s lead digital archaeologist from the Griffith Centre for Social and Cultural Research, articulated the cultural significance of identifying fluting creators. “Determining whether men or women made these marks can have profound real-world effects,” she noted, citing scenarios such as regulating site access based on cultural traditions. Previously employed biometric methods proved insufficient for such delicate distinctions, making the digital archaeology path not only innovative but necessary to address deeply rooted anthropological inquiries.
Contributing computational expertise, Dr. Gervase Tuxworth from the School of Information and Communication Technology explained that while the results are preliminary, they open an intriguing window into ancient human signatures preserved through millennia-old gestures. “Our models highlight promising avenues but also reveal challenges intrinsic to interpreting prehistoric art through modern technology,” he reflected, emphasizing the nuanced interplay between physical interaction data and digital representation.
Co-author Dr. Robert Haubt from the Australian Research Centre for Human Evolution underscored the study’s broader impact and future trajectory: “We’ve laid a scalable digital foundation that others can build upon, critique, and apply in diverse contexts. This democratization is key to transforming initial proof of concept into reliable archaeological tools that deepen our understanding of human history.” The published findings in Scientific Reports mark an important milestone, demonstrating the feasibility of integrating tactile archaeology, VR experimentation, and AI-driven analysis to unravel the identities hidden within ancient cave walls.
As machine learning and archaeological methodologies become increasingly intertwined, this research marks a shift in how scientists probe the behavior and cultural expressions of early humans. While challenges remain, especially in refining data generalizability and simulation fidelity, the digital archaeology framework developed by the Griffith team charts a promising path toward uncovering narratives embedded in prehistoric art.
Subject of Research: People
Article Title: Training AI to identify ancient artists
News Publication Date: 16-Oct-2025
Web References: http://dx.doi.org/10.1038/s41598-025-18098-4
References: Using digital archaeology and machine learning to determine sex in finger flutings, Scientific Reports
Image Credits: Andrea Jalandoni
Keywords: Archaeology, Artificial intelligence