Design Process

At Mycroft we are advocates of a User-Centered Design, or Design Thinking, approach.

The basic Design Thinking process is:

  1. Empathize

  2. Define

  3. Ideate

  4. Prototype

  5. Test

Then rinse and repeat until your product adequately meets the user's needs.

You can learn more about the Design Thinking process at interaction-design.org. Don't be intimidated by Prototyping and Testing. The great thing about voice interactions is that prototypes can be very low-fidelity, with no programming necessary. All you need to do is find a few willing participants to act out your Skill's interaction. You can read more about prototype testing later in this guide.

Once you have decided what problem you want to address with your Skill, it's best to start thinking about the user's Jobs to be Done. Job Stories are similar to User Stories, but we find them a little more streamlined. The basic problem is defined in three parts: the Situation, the Motivation and Goal, and the Intended Outcome. They can be written like this:

When ____, I want to ____, so I can ____.

Throughout the Voice User Interface Design Guidelines we will be taking a look at example work from a Moon Phase Skill that is designed to give the user information about the Moon phase. Below is an example Job Story from the Moon Phase Skill.

When I'm thinking about moon photography, I want to know what day the next full moon will be, so I can plan on taking photos of the moon that night.

The great part about Job Stories is that they do not dictate a solution. For example, this Job Story could be resolved with a traditional mobile app; however, using a voice interaction is probably quicker than launching an app. Thinking about your user's needs in terms of Job Stories helps you determine whether or not a voice interaction is the best solution.
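Since Mycroft Skills are written in Python, you might find it handy to capture Job Stories as a small data structure while planning. The sketch below is purely illustrative — the `JobStory` class and its fields are our own invention, not part of any Mycroft API:

```python
from dataclasses import dataclass


@dataclass
class JobStory:
    """One Job Story in its three parts."""
    situation: str   # "When ____"
    motivation: str  # "I want to ____"
    outcome: str     # "so I can ____"

    def render(self) -> str:
        # Fill the standard Job Story template with the three parts.
        return (f"When {self.situation}, I want to {self.motivation}, "
                f"so I can {self.outcome}.")


# The Moon Phase Skill example from above:
story = JobStory(
    situation="I'm thinking about moon photography",
    motivation="know what day the next full moon will be",
    outcome="plan on taking photos of the moon that night",
)
print(story.render())
```

Keeping the three parts separate like this makes it easy to review a list of stories and check that each one states a situation, a motivation, and an outcome without prescribing a particular solution.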
