
Sherlock Holmes & the Internet of Things

Sherlock Holmes & the Internet of Things is an ongoing experiment in creating an immersive storytelling experience enhanced by emerging technology such as IoT and AI (IBM Watson), in collaboration with the Columbia Digital Storytelling Lab.

Inspiration

"Any sufficiently advanced technology is indistinguishable from magic" - Arthur C. Clarke

What if by examining the works of Sherlock Holmes we could gain a better understanding of the emergent technology that surrounds us today?

Sherlock Holmes & the Internet of Things is a project led by the Columbia Digital Storytelling Lab, directed by Lance Weiler and Nick Fortugno. While the project was started in 2014, my involvement began in the later part of 2016, when I helped with the game design, storytelling, engineering, and user testing.

Our design question: 

How could we harness technology to evoke empathy and emotion while enabling an intuitive, invisible, and fluid storytelling experience, one that empowers the audience to become fellow collaborators?

How it works

Here is a simplified version of how the game works: 

This is a physical game experience: a team of five enters a "crime" scene where a "dead" body lies with a handful of objects placed around it. A rotary phone rings! Whoever picks it up is told by the voice on the other end that the objects around the body are a set of clues. The team's goal is to "solve" the crime scene: through phone conversations they receive a clue about each object, which helps them piece together the narrative of what happened.

The interesting challenge for us is that the phone is actually powered by a Raspberry Pi, and there is no person on the other end of the line: the entire conversation is handled by IBM's Watson APIs.
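To give a sense of how the pieces fit together, here is a minimal sketch of a Raspberry Pi phone loop that routes a player's question through Watson Speech to Text and answers through Text to Speech. This is not the project's actual code: the hook-switch pin, the arecord/aplay audio helpers, the clue table, and the API keys are placeholder assumptions, and it uses IBM's current ibm_watson Python SDK rather than the 2016-era libraries we worked with.

```python
# Minimal sketch (not the project's code): a rotary phone handset wired to a
# Raspberry Pi, with the "voice on the line" handled by Watson Speech to Text
# and Text to Speech. Pin number, clue text, and API keys are placeholders.
import subprocess
import time

import RPi.GPIO as GPIO
from ibm_watson import SpeechToTextV1, TextToSpeechV1
from ibm_cloud_sdk_core.authenticators import IAMAuthenticator

HOOK_PIN = 17  # assumed GPIO pin wired to the handset's hook switch

stt = SpeechToTextV1(authenticator=IAMAuthenticator("STT_API_KEY"))
tts = TextToSpeechV1(authenticator=IAMAuthenticator("TTS_API_KEY"))

CLUES = {  # illustrative mapping from object keywords to spoken clues
    "pipe": "The pipe was still warm when the body was found.",
    "letter": "The letter is addressed to someone who was never in this room.",
}
FALLBACK = "Look more closely at the objects around the body."


def record_from_handset(path="/tmp/question.wav", seconds=5):
    """Record the player's question from the handset mic via ALSA's arecord."""
    subprocess.run(["arecord", "-d", str(seconds), "-r", "16000",
                    "-f", "S16_LE", "-c", "1", path], check=True)
    return path


def transcribe(path):
    """Send the recorded audio to Watson Speech to Text."""
    with open(path, "rb") as audio:
        result = stt.recognize(audio=audio, content_type="audio/wav").get_result()
    results = result["results"]
    return results[0]["alternatives"][0]["transcript"] if results else ""


def speak(text, path="/tmp/reply.wav"):
    """Synthesize the clue with Text to Speech and play it through the handset."""
    audio = tts.synthesize(text, accept="audio/wav",
                           voice="en-US_MichaelV3Voice").get_result().content
    with open(path, "wb") as out:
        out.write(audio)
    subprocess.run(["aplay", path], check=True)


def main():
    GPIO.setmode(GPIO.BCM)
    GPIO.setup(HOOK_PIN, GPIO.IN, pull_up_down=GPIO.PUD_DOWN)
    while True:
        if GPIO.input(HOOK_PIN):  # handset lifted off the hook
            question = transcribe(record_from_handset()).lower()
            clue = next((c for k, c in CLUES.items() if k in question), FALLBACK)
            speak(clue)
        time.sleep(0.1)


if __name__ == "__main__":
    main()
```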

We wanted to see if we could seamlessly integrate IoT and AI to create a more immersive and delightful game experience. There were, however, many challenges: we had trouble getting the beacons to work and struggled with Watson's Retrieve and Rank and Tone Analyzer APIs. Fortunately, we had help from IBM, who were very supportive.
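For context, a basic Tone Analyzer call with the Python SDK looks roughly like the sketch below. It is illustrative only, not the code from the installation: the API key and the player's line are placeholders, and IBM has since retired the Tone Analyzer service.

```python
# Sketch: gauging the emotional tone of a player's transcribed response with
# Watson Tone Analyzer (service now retired; shown here for context only).
from ibm_watson import ToneAnalyzerV3
from ibm_cloud_sdk_core.authenticators import IAMAuthenticator

tone_analyzer = ToneAnalyzerV3(
    version="2017-09-21",
    authenticator=IAMAuthenticator("TONE_API_KEY"),  # placeholder key
)

player_text = "I think the letter proves she was never in the room."
analysis = tone_analyzer.tone(
    tone_input={"text": player_text},
    content_type="application/json",
).get_result()

# Each detected tone comes back with a score between 0 and 1.
for tone in analysis["document_tone"]["tones"]:
    print(tone["tone_name"], tone["score"])
```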

 

Reflections

I was very inspired by this project, as it was the first time I had designed an experience with direct feedback between the digital and physical worlds. I couldn't stop thinking about other use cases for AI and IoT, and it made me wonder about other combinations of emerging technology: AI and VR? AI and AR?

This project culminated at Lincoln Center during the New York Film Festival. Living in New York, I frequently go to Lincoln Center for concerts, but this was the first time I had the chance to actually work on a project with them. The New York Film Festival was also an eye-opening experience: the festival's main theme, "Convergence," framed the blend of immersive theater, game design, and virtual and augmented reality as the future of interactive storytelling. Through this project I was driven to learn more about VR as a storytelling medium.

Ask me about

· Physical game experience design

· Integration of Watson APIs with a Raspberry Pi

· Interactive storytelling