21 December 2020, Department of Computer Science, Faculty, Feature, Media, Social Media & Digital Business

In April 2018, Hyeongcheol Kim flew to Montreal for work. The young PhD student was excited — it was his first time in the Canadian city, and the conference he was about to attend was one of the biggest in his field of computer science. What’s more, Montreal was only a three-hour journey from Quebec City, a place he had glimpsed many times on the small screen.

Kim had become familiar with the picturesque city, thanks to the television drama Goblin that was immensely popular in South Korea, where he grew up. Parts of the fantasy series, watched by more than 250 million viewers, were filmed in Quebec City. As Kim visited various sites that appeared in the show — such as the antique mailbox in the historic Château Frontenac hotel and the unassuming red door at Theatre Petit Champlain that served as a magic portal — he snapped countless pictures to share with friends and family back home.

“The trouble was, I was quite busy with the conference. I wanted to share the photos and many memorable moments I had, but was just too tired,” recalls Kim, who is currently completing his PhD at NUS Computing.

“When I started to write my travel blog after the trip, I realised I could not remember many things,” he says.

Frustrated, Kim wondered if there was an easy way to capture his thoughts and feelings in real-time, rather than relying on memory after the experience was over. “I wanted to create a new approach for experience writing, to allow people to capture precious information which can only be captured in that moment,” he explains.

Kim returned to Singapore and began discussing the problem with his supervisor Shengdong Zhao, associate professor and head of the NUS Human-Computer Interaction lab. The pair, together with their collaborators Can Liu and Kotaro Hara, invented LiveSnippets, an Android-based app that allows for what they describe as “voice-based live authoring of multimedia articles.”

A moment in time

The premise is wonderfully simple. A user — say someone seeing the Eiffel Tower for the first time — pulls out his smartphone and starts to take pictures or videos. At the same time, he begins to describe what he sees, how he feels, and any other thoughts that come to mind. The app captures all of this, along with contextual information such as time, date, and location, with a single press of a button. It also transcribes the user’s narration into text.

A photo of A/P Zhao Shengdong and Hyeongcheol Kim. LiveSnippets is a voice-based, multimedia journaling app that allows the user to verbally capture their thoughts, along with images and videos. Hyeongcheol Kim (right) developed the Android app with his supervisor A/P Shengdong Zhao (left), and two other collaborators, after wondering if there could be an easier way to capture a user’s thoughts and feelings in real-time while travelling.

A photo of PhD student Hyeongcheol Kim taking a picture of a bowl of bibimbap.

“Everything is captured as a single package, what we call a snippet,” says Kim. “And a single experience can be made up of multiple snippets.” Separate snippets might be created, for example, as the person walks around the base of the Eiffel Tower, rides to the top in a glass-walled lift, and later enjoys champagne at the bar.

The result is a “stack of snippets” that can be easily edited or reordered in the app itself. “When you can see a rough draft that already has your thoughts somewhat organised, you feel more motivated to publish it because all you have to do is touch it up a bit,” says Kim.
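The snippet-and-stack model described above can be sketched in code. This is an illustrative Python sketch only — the actual app runs on Android, and the class and field names here (`Snippet`, `SnippetStack`, `transcript`, and so on) are assumptions, not the authors’ implementation:

```python
from dataclasses import dataclass, field
from datetime import datetime

@dataclass
class Snippet:
    """One captured moment: narration transcript, media, and context."""
    transcript: str                             # speech-to-text of the narration
    media: list = field(default_factory=list)   # photo/video file paths
    location: str = ""                          # e.g. a GPS-derived place name
    timestamp: datetime = field(default_factory=datetime.now)

class SnippetStack:
    """An experience as an ordered stack of snippets."""
    def __init__(self):
        self.snippets = []

    def capture(self, snippet):
        # A new snippet is appended as the experience unfolds.
        self.snippets.append(snippet)

    def reorder(self, old_index, new_index):
        # Move a snippet, e.g. when editing the rough draft later.
        self.snippets.insert(new_index, self.snippets.pop(old_index))

    def edit(self, index, new_transcript):
        # Touch up the transcribed narration in place.
        self.snippets[index].transcript = new_transcript
```

The key design idea is that each press of the capture button bundles narration, media, and context into one unit, so the draft is already roughly organised by moment rather than needing to be assembled from scratch afterwards.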

“So you feel the writing and publishing is much easier,” he says.

The user can then use the app to generate an HTML file of the travel blog or social media post — complete with pictures and captions, along with other details — that can be published with the click of a button. “It’s really quick,” says Kim.
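A rough idea of how a stack of snippets could be rendered into an HTML article is sketched below. This is an assumption-laden illustration in Python, not the app’s actual export code; the function name and the snippet fields (`transcript`, `media`, `location`, `time`) are invented for the example:

```python
import html

def export_html(title, snippets):
    """Render a list of snippet dicts as a simple HTML article."""
    parts = [f"<h1>{html.escape(title)}</h1>"]
    for s in snippets:
        parts.append("<section>")
        # The transcribed narration becomes the body text.
        parts.append(f"<p>{html.escape(s['transcript'])}</p>")
        # Captured photos/videos are embedded alongside it.
        for img in s.get("media", []):
            parts.append(f'<img src="{html.escape(img)}" alt="">')
        # Contextual details (place, time) become a small footer.
        meta = ", ".join(v for v in (s.get("location"), s.get("time")) if v)
        if meta:
            parts.append(f"<footer>{html.escape(meta)}</footer>")
        parts.append("</section>")
    return "\n".join(parts)
```

Because each snippet already pairs narration with its media and context, the export step is largely mechanical — which is what makes one-click publishing plausible.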

A screenshot of LiveSnippets. The app transcribes the user’s narration into text, and contextual information such as time, date, and location are also captured. The “stack of snippets” generated can then be easily edited or reordered in the app itself.

Write as you go
Apart from travel writing, LiveSnippets can be used for other types of experiential writing. “For example, it allows you to use your voice to create content while reviewing a product or demonstrating a cooking recipe,” says Kim. “These are all things that you can’t do in a traditional setting where you need to stop what you’re doing and sit in front of a computer to write.”

“LiveSnippets lets the users do more, which is why it’s so exciting,” he says.

“It’s completely different from the existing approach where you typically write after the event,” adds his supervisor Zhao. “With our approach, you can write a lot during the event itself. We give you the advantage of not needing to remember everything in a post-hoc fashion.”

To test the new app, the team gathered a small group with different writing backgrounds (expert versus novice writers). Each participant was briefed on how the app worked, then asked to generate three snippets and use them to write a short article based on one of three assigned scenarios: taking a brief tour of a nearby scenic spot, cooking one of their favourite dishes, or reviewing one of their favourite products.

“We were very satisfied with the results,” says Kim, “because we could see the possibility and feasibility of LiveSnippets as an alternative way to write articles.” Although some participants initially felt awkward narrating their thoughts and feelings out loud, they quickly got used to it and said that writing with the app was both “simpler and more engaging” compared to the traditional writing process, he says. Their findings were published in a recent paper.

However, Kim admits that LiveSnippets may not be able to completely replace the traditional writing process. “It really depends on what each writer’s personality is,” he says. “Some prefer to create content and share their thoughts only after they’ve had enough time to reflect on a situation.”

A lifestyle change
In the future, the team hopes to explore if LiveSnippets can be integrated into smart glasses. “Currently, if you use a mobile phone, you still have to carry it about and it isn’t always convenient,” explains Zhao. “With smart glasses, it will be even easier for people to record their life moments.”

They also believe that this new approach can be used beyond experiential writing, helping people create content where and when they are inspired, for example while they are travelling on the train or strolling in the park.

“People think much more creatively when they’re outside and when they’re moving,” says Zhao. “So we hope we can liberate people from their offices, from their desks, and let people work, write, and create knowledge wherever they want to. And hopefully that will help them lead healthier lives too.”

He says: “Our goal is to fundamentally change how people create information.”

 

Paper:
LiveSnippets: Voice-based Live Authoring of Multimedia Articles About Experiences