It all started with the idea of blending the visible (the artwork in front of you) with the invisible (the same artwork viewed through different analytical tools, and its connections with other artworks).
-scope: denoting an instrument for observing, viewing, or examining: microscope | telescope.
Artscope is an augmented reality interface prototype for museums.
The interface is crafted to show more about the artwork in front of you without distracting you or interfering with your behavior.
Artscope recognizes and tracks the piece in front of you and connects you with the invisible layers of information available.
Artscope was coded from the ground up in 24 hours during the HACK4DK2014 hackathon.
Artscope is a prototype, read more about this at the bottom of the page.
Invisible layers reveal nothing new about the artwork: all the information is already there, just invisible to the naked eye. Analytical tools such as X-ray, infrared (IR) and ultraviolet (UV) imaging reveal the history of the piece. Image analysis, on the other hand, reveals more about colors, their distribution and their dominance.
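As a rough illustration of the kind of color analysis involved (a minimal sketch, not the actual Artscope pipeline), a dominant-color pass can be done by quantizing pixels into coarse buckets and counting how often each bucket occurs:

```python
from collections import Counter

def dominant_colors(pixels, levels=4, top=3):
    """Quantize RGB pixels into a coarse grid and rank buckets by frequency.

    `pixels` is a list of (r, g, b) tuples; returns (color, share) pairs,
    where `color` is the bucket's representative RGB value.
    """
    step = 256 // levels
    buckets = Counter((r // step, g // step, b // step) for r, g, b in pixels)
    total = len(pixels)
    return [
        # Map each bucket back to the color at its center, with its share.
        ((br * step + step // 2, bg * step + step // 2, bb * step + step // 2),
         count / total)
        for (br, bg, bb), count in buckets.most_common(top)
    ]

# Toy "painting": mostly dark red with some blue.
pixels = [(200, 30, 30)] * 7 + [(40, 40, 200)] * 3
for color, share in dominant_colors(pixels):
    print(color, share)
```

Real image analysis would of course work on actual pixel data (and likely use clustering rather than a fixed grid), but the output is the same kind of information the text describes: which colors dominate and by how much.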
Under X-ray illumination, damage and alterations that are invisible to the naked eye are suddenly revealed; X-rays can show wormholes in a painting, for example. Beyond such details, the broader history of a piece emerges: the way a change was made hints at when the modification happened, and vice versa.
A thorough interpretation of X-ray images needs to be performed by a heritage conservator.
Artscope can provide insight to conservators but also stimulate the curiosity of those simply enjoying the piece.
When UV light shines on a painting, certain materials are highlighted. Different types of varnish fluoresce with different shades; an old varnish, for example, has a stronger greenish fluorescence than a newer one. Conservators can also get information about the different pigments. They can see, for instance, whether a repair dates from the time the piece was originally painted: if damage was rectified long after the painting was created, the repair materials age differently from the originals and fluoresce with a different color.
Infrared light acts like your eyesight, on steroids. Infrared rays penetrate below the varnish layer and reveal the underlying drawings and sketches. Using infrared light, conservators can search for a hidden signature that helps identify who created the painting. Infrared pictures also show whether the original drawing was changed over the years, revealing fun little details that are otherwise hidden from the viewer.
Artscope follows you, not vice versa. As soon as you step back and start to turn away from the piece, similar artworks are presented for you to check out next, helping you trace your personal path through the museum collection.
Ever wanted to find an artwork created in the same region, or with the same technique, but in a different time? Artscope knows which artworks are available at the museum and will show you the ten most similar ones. By breaking similarity down into its components, Artscope lets you explore the museum along your own path.
Similarity is learned with a machine learning algorithm that takes into account the semantic similarity between mediums and techniques, the time elapsed between the creation of two artworks, and whether or not the two were created in the same country.
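The hackathon model itself is not reproduced here, but the scoring idea can be sketched as a weighted combination of the three signals just described. The field names, the Jaccard stand-in for semantic similarity, and the weights below are illustrative assumptions, not the actual learned parameters:

```python
def similarity(a, b, weights=(0.4, 0.4, 0.2)):
    """Toy similarity score combining technique, time and place.

    `a` and `b` are dicts with keys 'techniques' (a set of strings),
    'year' (int) and 'country' (str); the weights are made up.
    """
    w_tech, w_time, w_place = weights
    # Jaccard overlap as a stand-in for semantic similarity of techniques.
    tech = len(a['techniques'] & b['techniques']) / len(a['techniques'] | b['techniques'])
    # Time similarity decays as the gap between creation years grows.
    time = 1 / (1 + abs(a['year'] - b['year']) / 50)
    # Same-country indicator.
    place = 1.0 if a['country'] == b['country'] else 0.0
    return w_tech * tech + w_time * time + w_place * place

a = {'techniques': {'oil', 'canvas'}, 'year': 1880, 'country': 'Denmark'}
b = {'techniques': {'oil', 'panel'}, 'year': 1860, 'country': 'Denmark'}
print(round(similarity(a, b), 2))
```

Ranking the whole collection against the piece in front of you and keeping the top ten, e.g. `sorted(collection, key=lambda x: similarity(query, x), reverse=True)[:10]`, gives the "ten most similar artworks" behavior described above.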
The team came together spontaneously during the first afternoon of #HACK4DK, thanks to a shared interest in art, data, and modern technology. And, yes, quite a few beers.
Carl Emil Carlsen (CEC) is an artist, designer and teacher in the field of interactive media, specialized in interactive graphics.
Henri Suominen (HS) is a PhD fellow in physics at the Niels Bohr Institute, working on hardware implementations of quantum computing.
Giulio Ungaretti (GU) is a data scientist at eTilbudsavis with a particular interest in data visualization techniques and open source.
Siri Vilbøl (SV) is working on her master's thesis in conservation. She has a background in theater, working with set design and costumes, and has been doing interactive art installations with the group Presens.
As much as we would love to say: go and get it on the app store, we can't. Artscope is strictly a prototype. The code base is full of dirty hacks, and it works for just one artwork, with an API that SMK set up ad hoc during the hackathon.
Artscope was an exploration of what's possible in hacking our cultural heritage in a weekend.
Since you made it this far, you deserve more details. The iOS application is made with Unity, whereas all the data processing and image analysis is done in Python. The data analysis is done beforehand and the results are hardcoded into the Unity application. It would not be unreasonable to scale this setup with a client-server approach. Artwork recognition and tracking is set up for just one artwork.
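To make that trade-off concrete, here is a minimal sketch of the precomputed-lookup idea (the artwork ids and table contents are made up). The prototype bakes a table like this into the app; a client-server version would serve the same payload per request instead:

```python
import json

# Hypothetical precomputed similarity table, produced offline in Python.
# Keys are artwork ids; values list the most similar artworks, best first.
PRECOMPUTED = {
    "artwork-001": ["artwork-042", "artwork-007", "artwork-013"],
}

def neighbours(artwork_id, table=PRECOMPUTED, limit=10):
    """Look up an artwork's precomputed neighbours, best match first."""
    return table.get(artwork_id, [])[:limit]

# Hardcoded into the app today; a server would instead return
# json.dumps(...) of the same lookup for each recognized artwork.
print(json.dumps(neighbours("artwork-001")))
```

Because the expensive analysis happens offline either way, moving from the hardcoded table to a server mainly changes where this lookup runs, which is why scaling the setup is plausible.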