
BBC: Letter to my future self: Personalised skills prompt

Original brief
Personalised Skills Prompt.
Aim - users' reflection on their own skills/experiences gives them confidence and prompts them in new situations.
Functionality - An app which invites users to upload and tag around 20? photos of themselves in emotionally charged situations. The user tags these
photos with the skills shown, such as resilience, commitment, achievement, fun loving. These photos can be summoned as needed later on, i.e. the user taps a tag word and a photo appears.
This should be mobile only (the vision is for a smart watch) and can be restricted to Android, although iOS is nice to have. It doesn't need to be robust on older versions.
The tool may go out on the BBC's Taster platform so will need to be technically compliant, secure and have a beautiful UX.

Background
How will an app help reinforce self-confidence, and help prompt with good ideas in new or challenging situations?

Investigation
In setting out the requirements for this app, we have made a number of assumptions. The most significant has been to model the experience of
dealing with a 'stressful' situation and how we might fit an app companion into this process.
First off, we're assuming that a stressful situation is such because we've never seen it before, or because a previous similar situation has ended with a bad experience. However, there are positively stressful situations, too, which we experience as exhilarating, or even hilarious.
For the app, all these sorts of situations are ones which we need to help the user assimilate. We start with modelling a feedback loop something like
this:

We are involved in an event; we react to the event; we record the event - and how it went - in our memories; we assimilate these memories into our
own experience. Sometimes, if we're aware of self-improvement, we'll want to make some sort of resolution about how we want to behave in a similar
situation in the future.
We want the app to be able to assist us at all points in this loop; especially when the event occurs the next time.
Here's how we see the app helping out:

Event / React / Record


The event is something which happens around us; once we are aware of its significance, it's really important to help our memory capture it. We want to
start to record our reaction to it as soon after it has happened as possible. Maybe - if we know the event is likely to be problematic in advance - we
can record it even as it is happening, so long as recording it doesn't get in the way. The most important thing is to be able to record useful information
as near to the event happening as possible. We're going to define this as prima facie data.

Reflect
Once the event has happened and we have finished reacting to it and its immediate consequences, we are into a much more reflective phase, where
we are trying to make some sense of the experiences. In our memory, we can revisit them, but they are often altered by how we feel. Our memories
can become corrupted even as we try to make sense of them. We want to be able to revisit the memory as often as we like; mull it over; change our
opinion of it, without changing the facts of what happened. We want to be able to start to understand why we acted as we did - good or bad - so we
can either change for the better in future, or reinforce something which worked well.

Resolve
Obviously, our period of introspection can't last for too long. Once we've figured out all we need to, we need to distill what we have learned; capture it
and signpost it. The resolution to the event is very important; it's what we will use to re-assure ourselves when we come this way again in the future. A
short encouraging message is needed, along with a reminder of positive attributes of our character we have used, or could use.

Event / React / Record / Revisit / Resolve


We're running around the feedback loop a second time: an event has occurred, and we have been here before. What have we learned about ourselves
from the last time the event happened? Did we react in the same way? Did we want to? What do we think about it this time?
We're making a very big assumption here; that the app was useful enough for us to stay using it long enough for an event to re-occur. That caveat
applied, we'll want to link together similar events to help us with our own journey; to reinforce the resolutions we made the last time: to practice.

Prepare / Event / React / Record / Revisit / Resolve


Now we're in a very mature phase of using the app; we know that an event is going to happen, which we may be feeling anxious about. It's happened
before and we want to be able to revisit our resolutions and our experiences to prepare ourselves for what is about to happen. Maybe get a boost from
the encouragement that our previous self has left us. Where possible we want to record the event and our reaction again, so that we can compare with
previous events and add it to the archive. Again, linking it with previous events will help us get a sense of our own progression. Leaving a message for
ourselves can be more and more constructive as we build on the knowledge we have amassed.

Browse. Be Proud.
Long-term use of the app could yield many benefits, but perhaps the most compelling is being able to move through a narrative of personal development, visiting the most important times in our lives. It adds a great deal of value to an activity we already find ourselves doing in unguarded moments: browsing. Instead of a photo album, though, we have a self-curated story with a journey and an ending. Care would need to be taken to ensure that the resolution of the story is always framed positively, so that revisiting it is always encouraging. Ultimately, the message given by the app is that all experience is good experience.

A note on privacy
This app is one which deals with the most intimate details of people's lives. All information is to be kept privately within the app, although its data may
be backed up, for transfer to another device. Data from the device is not to be shared.

Conclusions
We have ascertained the following important functions of the app:
Capturing an important event. Tagging and archiving.
Revisiting the event. Browsing and searching.
Capturing reflection. Annotating the event.
Building on experience. Linking similar events.
Providing an encouraging, satisfying resolution.
Finding wisdom: delving into a library of experience.
Further we have identified the following drivers:
Interaction: the more the app is used, the better.
Content: the more content types supported, the more expressive the experience.
Convenience: should be as easy as browsing a photo album.
Longevity: the more events captured, the more experience is available to draw on.

App Description
From the above investigation, here's a description of how a mobile app might work. We will use this to give an idea of UI design, components and an
estimate to get to a minimum viable product.

Common UI Themes
We will uphold the following themes right throughout the app:
Minimal UI: a UI which gets out of the way when it isn't needed
Default Settings: Reduce fiddling at crucial points
Minimal Typing: Concentrate on choosing and browsing, rather than slow input of information. Use dictation / transcribing where possible for
text entry.
Use of tags for expression and search.
Use of Audio transcription to enable audio diaries to be searchable. Free Text as a search criterion.
Reuse of components in different contexts to get the maximum functionality from minimal development effort.

Event Capture
The nature of any event worth capturing is that we are not going to want to spend time or undue attention on capturing it. If it does occur to us to
capture it, we want to spend as little time as possible doing that and then get back to the important stuff. We need the device to offer a set of capture
methods, which are practically automatic, like a set of buttons on the homescreen, which capture quickly, and then get out of the way. Wherever
possible, we want to use persistent settings on the device to hold default values, so that we don't have to fiddle at the crucial point of capture.

Picture
Button on device homescreen. Shows camera + view finder, set up for stills, with preferred settings. Takes picture, disappears.

Video
Button on device homescreen. Shows camera + view finder, set up for video, with preferred settings. Takes video. On 'stop', disappears.

Audio
Button on device homescreen. Shows custom audio capture, with preferred settings. Takes audio. On 'stop', disappears.

Audio Transcribe
Button on device homescreen. Shows custom audio capture, with preferred settings. Takes audio. On 'stop', disappears. Background processing occurs on audio. An unadorned text document is generated. Transcription is very useful, but limited and inaccurate. Could be used for things like searching and tagging, if not actual prima facie data.
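To illustrate, here is a minimal sketch of how the background transcription step might be queued using Android's WorkManager; the transcribeToText() function is a hypothetical placeholder for whichever speech-to-text engine is eventually chosen, and the "audio_path" key is illustrative.

```kotlin
import android.content.Context
import androidx.work.OneTimeWorkRequestBuilder
import androidx.work.WorkManager
import androidx.work.Worker
import androidx.work.WorkerParameters
import androidx.work.workDataOf
import java.io.File

class TranscribeWorker(ctx: Context, params: WorkerParameters) : Worker(ctx, params) {
    override fun doWork(): Result {
        val audioPath = inputData.getString("audio_path") ?: return Result.failure()
        // Hypothetical transcription step; the result is written out as an
        // unadorned text document next to the audio file.
        val text = transcribeToText(audioPath)
        File("$audioPath.txt").writeText(text)
        return Result.success()
    }

    private fun transcribeToText(path: String): String {
        // Placeholder: call the chosen on-device or cloud speech-to-text engine here.
        return ""
    }
}

// Enqueue once the audio capture has finished.
fun enqueueTranscription(context: Context, audioPath: String) {
    val request = OneTimeWorkRequestBuilder<TranscribeWorker>()
        .setInputData(workDataOf("audio_path" to audioPath))
        .build()
    WorkManager.getInstance(context).enqueue(request)
}
```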

Buffered Audio
Button on device homescreen. Shows custom audio capture. On start, moves to the background, and is available through the 'notifications' bar. While
in the background, records the last x minutes / seconds of audio, where x can be set in the settings.
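A minimal sketch of the rolling buffer, assuming raw PCM capture with android.media.AudioRecord (which requires the RECORD_AUDIO permission); the sample rate, one-second chunk size and class name are illustrative only.

```kotlin
import android.media.AudioFormat
import android.media.AudioRecord
import android.media.MediaRecorder
import java.util.ArrayDeque

class BufferedAudioRecorder(private val windowSeconds: Int) {
    private val sampleRate = 44100
    private val chunkSize = sampleRate // roughly one second of 16-bit mono samples
    private val chunks = ArrayDeque<ShortArray>()
    @Volatile private var running = false

    fun start() {
        running = true
        val minBuf = AudioRecord.getMinBufferSize(
            sampleRate, AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT)
        val recorder = AudioRecord(
            MediaRecorder.AudioSource.MIC, sampleRate,
            AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT,
            maxOf(minBuf, chunkSize * 2))
        recorder.startRecording()
        Thread {
            while (running) {
                val chunk = ShortArray(chunkSize)
                recorder.read(chunk, 0, chunk.size)
                synchronized(chunks) {
                    chunks.addLast(chunk)
                    // Keep only the last 'windowSeconds' worth of audio.
                    while (chunks.size > windowSeconds) chunks.removeFirst()
                }
            }
            recorder.stop()
            recorder.release()
        }.start()
    }

    // Called on 'stop': returns the retained window for saving as prima facie data.
    fun stopAndSnapshot(): List<ShortArray> {
        running = false
        synchronized(chunks) { return chunks.toList() }
    }
}
```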

Location
We can set the option to tag event data with the location the device was in at the time, although primary data is not explicitly tagged. For instance, the
EXIF data in a picture file will never be written to. Instead the location of newly captured event data will be associated with the data.
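A minimal sketch of storing location as a sidecar record rather than writing it into the media file, using the platform LocationManager; the CaptureLocation type is a hypothetical illustration of how the association might be kept in the app's own database.

```kotlin
import android.annotation.SuppressLint
import android.content.Context
import android.location.LocationManager

data class CaptureLocation(val captureId: String, val lat: Double, val lon: Double, val timestamp: Long)

@SuppressLint("MissingPermission") // assumes ACCESS_FINE_LOCATION has been granted
fun locationFor(context: Context, captureId: String): CaptureLocation? {
    val lm = context.getSystemService(Context.LOCATION_SERVICE) as LocationManager
    val loc = lm.getLastKnownLocation(LocationManager.GPS_PROVIDER)
        ?: lm.getLastKnownLocation(LocationManager.NETWORK_PROVIDER)
        ?: return null
    // Stored alongside the capture in the app's own records; the media file
    // itself (and its EXIF) is never modified.
    return CaptureLocation(captureId, loc.latitude, loc.longitude, loc.time)
}
```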

The Event
To the App, an event is made up of many pieces of prima facie data. For example, we could take several photos, one after the other, then some
audio, describing how we feel. It's important that these bits of data are associated with the same event, so we need a persistent setting that provides
some control over this. For example:

New Event Setting


New Event After
The time period (in hours, minutes and seconds) after the last capture beyond which a new capture gets associated with a new event.
Ask if new event
True or False: after the New Event After period, ask the user if this capture is for a new event, or associate the capture with the most recent event.
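A minimal sketch of how these two settings might drive the event-association rule; the Capture and Event types, and the askUser callback, are hypothetical illustrations.

```kotlin
data class Capture(val takenAt: Long /* epoch millis */)
data class Event(val captures: MutableList<Capture>)

class EventAssociator(
    private val newEventAfterMillis: Long,   // "New Event After"
    private val askIfNewEvent: Boolean       // "Ask if new event"
) {
    fun assign(capture: Capture, mostRecent: Event?, askUser: () -> Boolean): Event {
        val lastCapture = mostRecent?.captures?.lastOrNull()
        val withinWindow = lastCapture != null &&
            capture.takenAt - lastCapture.takenAt <= newEventAfterMillis
        return when {
            // Within the window: always attach to the most recent event.
            withinWindow -> mostRecent!!.also { it.captures.add(capture) }
            // Outside the window: optionally ask; if the user says it is not
            // a new event, attach to the most recent one anyway.
            askIfNewEvent && mostRecent != null && !askUser() ->
                mostRecent.also { it.captures.add(capture) }
            // Otherwise start a fresh event.
            else -> Event(mutableListOf(capture))
        }
    }
}
```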

Tagging and Archiving


The Inbox
When a piece of prima facie data has been captured, it's attached to the most recent Event. It's important that we can defer any other organisational
things we need to do with the event, until there is time to do it. So, each event is stored in the InBox. We'll call the event 'inchoate' - formless; it won't
mean anything until it's properly archived. Inchoate events in the InBox are simply shown by their date and time, with a thumbnail of the most
representative piece of data in it (like the first picture). We can do the following things to an item in the InBox:
View: browse all the captured media, playing video and audio
Delete: remove the event from the inbox - never to be seen again
Curate: opens the item for Curation

The Curation Wizard


Within the curation wizard, we can:
View: Browse through all the captured items, playing video and audio
Name the event: think up a more descriptive name: for example "Making Jam with Rob"
Tag the event: associate words with the event.
Confirm our changes: and commit the event to the archive, or simply cancel.
Tagging
Curation tags are adjectives. We associate them with an event to remind us how it made us feel at the time. Later, we'll explore searching for events, but for now it suffices to let the user come up with the most expressive tags. Tagging is the most important part of Curation, and really should
encourage us to be expressive.
Again, we want to make this part of the process - the event's induction, if you like - as easy as possible: we want the user to give us this basic information for their own good; otherwise, what's the point? We need to make it almost enjoyable, even though the user might be dealing with something awful.
The Curation wizard should offer an innovative method of defining tags.
For example:
An input text box, which is active for dictation: say the word "happy". The input text box fills with the word 'happy', but drops down a list of thesaurus entries synonymous with it. Long pressing an entry pops up a dictionary definition. Choose: "Contented" instead.
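A minimal sketch of that suggestion flow, assuming the thesaurus lookup sits behind a simple synonymsOf() function (the actual online service is still to be chosen); dictation would simply populate spokenWord via the platform speech-input mechanism.

```kotlin
class TagSuggester(private val synonymsOf: suspend (String) -> List<String>) {

    // Returns the spoken word plus its synonyms, for display in a drop-down.
    suspend fun suggestionsFor(spokenWord: String): List<String> {
        val word = spokenWord.trim().lowercase()
        if (word.isEmpty()) return emptyList()
        return listOf(word) + synonymsOf(word).filter { it != word }
    }
}
```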
Finishing and confirming completes capture of the prima facie data. The event is archived, and ready to be reflected on.

Revisiting the event


Any events which haven't yet been dealt with to a resolution are placed back in the InBox. (Resolved events are placed in the Archive, separating the notion of a 'To-Do' list from a Library of useful information.) We can filter the InBox in a variety of ways, for example by 'New' (Inchoate) and 'Developing' (Archived) Events. We can also filter on tags: a bar at the top of the InBox will display our search options, where we can provide search criteria such as tags and free text. Free text is useful for searching inchoate items which have no tags. Tag search allows us to choose from a selection of tags which are already in use by the app.
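A minimal sketch of how the combined tag and free-text filter could be expressed as a query, assuming a simple 'events' table plus an 'event_tags' join table (illustrative names only).

```kotlin
// Builds a parameterised SQL query for the InBox filter.
fun buildInboxQuery(tags: List<String>, freeText: String?): Pair<String, List<String>> {
    val args = mutableListOf<String>()
    val clauses = mutableListOf<String>()
    if (tags.isNotEmpty()) {
        clauses += "id IN (SELECT event_id FROM event_tags WHERE tag IN (" +
            tags.joinToString(",") { "?" } + "))"
        args += tags
    }
    if (!freeText.isNullOrBlank()) {
        clauses += "(name LIKE ? OR transcript LIKE ?)"
        args += "%$freeText%"
        args += "%$freeText%"
    }
    val where = if (clauses.isEmpty()) "" else " WHERE " + clauses.joinToString(" AND ")
    return "SELECT * FROM events$where ORDER BY created_at DESC" to args
}
```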
In the same way as Inchoate Events, when we find a Developing Event in the Inbox, we can do the following:
View: browse all the captured media, playing video and audio
Delete: remove the event from the inbox - never to be seen again
Curate: opens the event for Curation
In the case of Developing Events, Viewing the item has two sections to its gallery: prima facie - the captures which are taken at the time of the event,
and 'supporting' items. As before, each gallery can be played, or browsed.

Capturing Reflection
We want to revisit an event, so that we can re-live it, and understand it. The App should give us the tools to do this, so that it's both easy to add the
annotations that we want to, AND easy to consume them, when we come back to the Event and want to refresh our understanding. It should be as
browsable as a scrap-book.

The Curation Wizard

When curating a Developing Event we can:


Add, Remove and Re-order annotations.
Archive the Event to the Library. (Resolution)
Add and remove Developing tags (but not remove prima facie tags)
We can add the following annotations (these are just like prima facie captures):
Pictures from the device (as a reference) - via a 'share' action.
Video from the device (as a reference) - via a 'share' action.
Audio from the device (as a reference) - via a 'share' action.
Links from the browser - via a 'share' action.
Live Audio from the App - from the capture facility.
Live Video from the App - from the capture facility.
Live Pictures from the App - from the capture facility.
Text
Events from the App.
If settings allow, all media is audio transcribed. Text is a last resort - the experience isn't very expressive on a phone.
We cannot remove prima facie captures.
The paste bin
Sending items to the App via sharing from another app requires the user to be outside it. For this reason, we need to use a 'Paste Bin'. This is an area
of the Curation Wizard where items get placed, pending their attachment to an event. The PasteBin re-uses all gallery components, and can be
browsed in exactly the same way except that it is globally available. Items from the paste bin can be selected and used in the Event that is currently
shown in the Curator. We can send all supported media to the paste bin from within the app, too, including references to Events.
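A minimal sketch of how shared items might land in the Paste Bin, using Android's standard ACTION_SEND sharing (the receiving activity would declare the matching intent filter in the manifest); the PasteBin object here is a hypothetical stand-in for the real store.

```kotlin
import android.content.Intent
import android.net.Uri

object PasteBin {
    val items = mutableListOf<Uri>()   // shared media references
    val links = mutableListOf<String>() // shared links / text
}

fun handleIncomingShare(intent: Intent) {
    if (intent.action != Intent.ACTION_SEND) return
    when {
        // Links and plain text arrive as EXTRA_TEXT.
        intent.type?.startsWith("text/") == true ->
            intent.getStringExtra(Intent.EXTRA_TEXT)?.let { PasteBin.links.add(it) }
        // Pictures, video and audio arrive as a content Uri in EXTRA_STREAM.
        else ->
            intent.getParcelableExtra<Uri>(Intent.EXTRA_STREAM)?.let { PasteBin.items.add(it) }
    }
}
```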

Resolution
Now we've done our thinking, we have made our comments and understood our experience. It's time to resolve our thoughts.
From the Curation wizard, we can choose the option to Archive the event to the Library. This will start the Archive Wizard.

Archive Wizard
The Archive Wizard will allow us to view the Event in the same ways as the Curation Wizard; browsing both the prima facie and Developing items, so
we can use them as inspiration. The Resolution Wizard allows us to:
Write a short message to our future self, who will be needing our wisdom
Build a set of positive statements about ourselves, in the light of this experience.
Add the best bits of 'treasure' from our Developing and prima facie galleries.
Link to similar events from the Library
Archive the Event to the Library
The statement builder is perhaps the most important part of the Wizard: it guides us to express 4 'I am...' statements, in the context of the Event. We can choose only from a list of adjectives curated for positivity. For an example, see here.
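As an illustration, a minimal sketch of how the statement builder might filter a curated adjective list and assemble the four statements; the adjective list shown is illustrative, not editorially curated.

```kotlin
class StatementBuilder(
    private val positiveAdjectives: List<String> =
        listOf("resilient", "committed", "creative", "patient", "brave", "generous")
) {
    // Prefix-filter the curated list as the user types or dictates.
    fun suggestions(prefix: String): List<String> =
        positiveAdjectives.filter { it.startsWith(prefix.trim().lowercase()) }

    // Exactly four statements, each built only from curated adjectives.
    fun build(chosen: List<String>): List<String> {
        require(chosen.size == 4) { "Four 'I am...' statements are expected" }
        require(chosen.all { it in positiveAdjectives }) { "Only curated adjectives allowed" }
        return chosen.map { "I am $it." }
    }
}
```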
Once an Event is archived, it is listed in the 'Library'.
The Library has exactly the same format as the InBox - we can easily switch between the two of them.

The Library
The Library is searchable by tag and free text in exactly the same way as the InBox. It helps us to find experiences with a resolution; ones which we have worked on to a conclusion and which we value. Items are listed by the name given to them during curation, and have a thumbnail showing the first piece of treasure.
On finding an item, we can do the following:
View
Move to the Inbox
Viewing the item allows us to View the Resolution only. If we want to go into the gory details, we must move the item back to the inbox!

Recent Events
Recent Events is a list of events which have recently been accessed, sorted by time of access, most recent first. This area is intended to be an easy
way to access the Event we are currently curating, after leaving the app, or looking at another Event, for reference. It should be conveniently accessible from the main area of the app, complementary to the InBox and the Library.

Proposal
We propose an Android app, utilising the following UI components, and configuration:

Home Screen
The device home screen will be populated with several icons:
* Main App
* Capture Audio
* Capture Still
* Capture Video
* Capture Buffered Audio
* Settings

Main Screen
UI: Tabbed UI, featuring InBox, Library and Recent allowing the user to move between them easily. When working, the Recent list allows the user to
come out of an event, use the Paste Bin from another event, and come back.

Capture Apps
UI + Action: Each capture app will be distributed with the application, but available globally, and launchable from the Homescreen.

Picture
Shows camera + view finder, set up for stills, with preferred settings. Takes picture, disappears.

Video
Shows camera + view finder, set up for video, with preferred settings. Takes video. On 'stop', disappears.

Audio
Button on device homescreen. Shows custom audio capture, with preferred settings. Takes audio. On 'stop', disappears.

Audio Transcribe
Button on device homescreen. Shows custom audio capture, with preferred settings. Takes audio. On 'stop', disappears. Background processing occurs on audio. An unadorned text document is generated.

Buffered Audio
Button on device homescreen. Shows custom audio capture. On start, moves to the background, and is available through the 'notifications' bar. While
in the background, records the last x minutes / seconds of audio, where x can be set in the settings.

Settings
UI: Addition to the standard Android Settings UI, providing settings for capture and app behaviour.

InBox, Library, Recent


UI: for aggregating Events. InBox shows inchoate and developing Events. Library shows Resolved, Archived Events. Both have a Search Component.
Both show context action menus.

Search Component
UI: A bar-type component which allows the user to define Tags and Free Text as search criteria. Provides a 'Search' button to start the search. Free Text is a simple text input view.

Tag Search
UI: Tags are defined using a simple text view with a set of drop-down suggestions. Suggestions are compiled from the tags already used in the App. Tag suggestions can be queried from the UI for a definition from an online dictionary, supplied in the Word Definition View.

Word Definition view


UI: A pop-up view, showing a word definition from an online dictionary.

Inchoate Event Curation Wizard


A 3 part wizard + gallery, showing prima facie captured media.

Event Name component


UI: Text input

Tag Definition component


UI: Tags are defined using a simple text view with a set of drop-down suggestions. Suggestions are compiled using an online thesaurus. Tag suggestions can be queried from the UI for a definition from an online dictionary, supplied in the Word Definition View.

Tag Cloud Component


UI: Associated with the Tag Definition. Tags may be added using the Tag Definition, and removed.

Confirmation Component
UI: Last part of the wizard, showing name and tag cloud.

Gallery
A UI component, able to show the following media types, browsable by swipe:
Audio
Audio Transcribed
Video
Still
Event
Internet Link
Local Link
Text
Where possible, the gallery will show a representative image, or text indicating the media available from each item. On selection, the gallery will play the item, using a suitable player. Gallery may show UI to enable selection and deletion of an item, and copying a reference to the item to the Paste Bin (see later).

Audio Player
Audio Transcribed Player
Video Player
Still Player
Event Player (Inbox Curator, or Library Viewer)
Link Player (no external linkage, content only)
Text Player

Developing Event Curation Wizard


A 6 part swipeable wizard + gallery, showing prima facie captured media and Developing Media. Only Developing Media is editable.

Event Name component


UI: Text input

Tag Definition component


Reused: As above.

Gallery
Reused: As above.

Paste Bin component


Reused: As Gallery, above. Displays links which have been shared. Tapping on an item causes it to be placed in the current Event.

Media Capture Component


UI: Action buttons to allow invocation of Capture Apps above, which capture media and share to the App, placing the result in the Paste Bin.

Confirmation Component
UI: Last part of the wizard, showing name and tag cloud. With a 'finish' or 'cancel' option.

Event Resolution Wizard


A 6 part swipeable wizard + gallery, showing prima facie captured media, Developing Media, and Treasure Media. Only Treasure Media is editable.

Event Name component


UI: Text input

Resolution Message component


UI: Text Input

Event Statement component


UI: 4 x 'I am...' statements, with Positive Adjective Chooser

Positive Adjective Chooser


UI: Text Entry (or dictation) box. As word is typed, suggestions appear. Suggestions are curated positive adjectives.

Paste Bin component


Reused: As Gallery, above. Displays links which have been shared. Tapping on an item causes it to be placed in the Treasure media collection and shown in the gallery.

Gallery
Reused: As above.

Media Capture Component


Reused: As above.

Confirmation Component
UI: Last part of the wizard, showing name and tag cloud. With a 'finish' or 'cancel' option.

Event Resolution Viewer


Shows the following UI components (all reused, all read-only):
Event Title
Personal Message
4 x Personal Event Statements
Treasure gallery
Menu Option to 'Unarchive' (transfers to inbox)

Data Components
The App will do a great deal of data archiving, searching and indexing. For this reason, it will need to utilise an SQL DB, within an Android Service, providing a queuing mechanism, and thread safety. Marshalling and unmarshalling of data will be done using JSON, to a schema.
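One possible shape for this data layer, sketched with SQLiteOpenHelper, a single-threaded executor for queuing, and org.json for marshalling; the table layout, field names and class names are illustrative only, not a committed schema.

```kotlin
import android.content.ContentValues
import android.content.Context
import android.database.sqlite.SQLiteDatabase
import android.database.sqlite.SQLiteOpenHelper
import org.json.JSONArray
import org.json.JSONObject
import java.util.concurrent.Executors

class EventDb(context: Context) : SQLiteOpenHelper(context, "events.db", null, 1) {
    override fun onCreate(db: SQLiteDatabase) {
        db.execSQL("CREATE TABLE events (id INTEGER PRIMARY KEY, created_at INTEGER, body TEXT)")
        db.execSQL("CREATE TABLE event_tags (event_id INTEGER, tag TEXT)")
    }
    override fun onUpgrade(db: SQLiteDatabase, oldVersion: Int, newVersion: Int) { /* migrations */ }
}

class EventStore(context: Context) {
    private val db = EventDb(context)
    // A single-threaded executor gives us a simple queue and thread safety;
    // in the app this would live inside the Android Service.
    private val executor = Executors.newSingleThreadExecutor()

    fun saveEvent(name: String, tags: List<String>, createdAt: Long) {
        executor.execute {
            // Marshal the event into JSON before storing.
            val body = JSONObject()
                .put("name", name)
                .put("tags", JSONArray(tags))
                .toString()
            val values = ContentValues().apply {
                put("created_at", createdAt)
                put("body", body)
            }
            val id = db.writableDatabase.insert("events", null, values)
            tags.forEach { tag ->
                db.writableDatabase.insert("event_tags", null, ContentValues().apply {
                    put("event_id", id)
                    put("tag", tag)
                })
            }
        }
    }
}
```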

Location Components
The App will need access to Location Based Services, and will use open source options.
.... Estimate Removed....
