Responsive Conversational Design as a Road to Web Sustainability

I’ve been learning more about conversational design: the design of user interfaces that work like a chatbot or a texting session. Instead of a complex visual layout, you model interaction as a dialogue between the user and a machine intelligence.

What’s the value for Sustainable Virtual Design? Three things. First, Conversational Design could replace complex visual interfaces loaded with gobs of JavaScript with a few lines of text. This is a win for Sustainable Virtual Design, since the UI is simpler and fewer files are transferred over the Internet.

Second, on the energy side, we need to determine whether the power needed to run a chatbot AI is greater or less than the power needed to send several megabytes of data across multiple web pages. The same AI-based computing that runs chatbots can also make server farms more efficient, so there is a good chance that a Conversational UI is energy efficient.

Finally, on the human side, testing is needed to compare conversational interfaces with more visual UI-based interfaces. Clumsy or difficult-to-use interfaces that keep people working online longer generally count against sustainability.

Using a text-driven interface automatically increases universal access, a.k.a. Universal Design:

  1. Chatbots should work well with text readers, supporting users without vision. If we start with a conversational interface and work up to mobile, we’ve included everyone at the low end of the visual UI spectrum: mobile and text-reader users.
  2. Voice-driven commands also make the “Immobile Web” (digital TVs with remotes instead of a keyboard) easier to use.
  3. A conversational UI can also make it easier to navigate a complex Virtual Reality or Augmented Reality interface. The controllers used in VR and AR are difficult to work with; they are like having a couple of clumsy erasers instead of hands. Voice control of VR space makes a great deal of sense.

However, the rise of Conversational Design doesn’t mean we abandon old (mobile) or new (VR, AR) interfaces. Instead, we define a new theory – Responsive Conversational Design.

Responsive Conversational Design and Content

The “Content First” strategy is basically writing out our content as a time-ordered interaction in which the user and the website “take turns” as actors in a scripted encounter. This is essentially the same as a spoken dialogue between two individuals. Your “design” is pretty much the story you tell.

Leaving visual UI out of a “content first” model converts our initial work into a screenplay-like dialogue between the user and the site, with the site now taking on a “site persona” role.

Now, let just a bit of UI back in: visual language, in the rare (rarer than many think) cases where “one picture is worth a bunch of chatbot words.”

[Image: conversational design UI]

Congratulations, you’ve created a Conversational Design Interface!
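As a rough sketch (the structures and names here are hypothetical, not from any chatbot framework), the turn-taking script can be modeled as an ordered list of turns, each belonging to either the user or the Site Persona, with an optional visual “prop” for the rare picture-worthy moment:

```python
# Hypothetical sketch: the "content first" script as data.
# Each turn belongs to one actor; a turn may carry an optional
# visual "prop" for the rare case where a picture beats words.
from dataclasses import dataclass
from typing import Optional

@dataclass
class Turn:
    actor: str                  # "user" or "persona"
    text: str                   # the spoken/typed line
    prop: Optional[str] = None  # e.g. an image or widget id

script = [
    Turn("persona", "Hi! What brings you here today?"),
    Turn("user", "I'm looking for hiking boots."),
    Turn("persona", "Here are three popular models.",
         prop="boot-comparison-image"),
]

# The "design" is literally the story: play it back in order.
for turn in script:
    line = f"{turn.actor}: {turn.text}"
    if turn.prop:
        line += f" [shows {turn.prop}]"
    print(line)
```

The point of the sketch is that the script, not a layout grid, is the primary artifact; visuals attach to turns instead of the other way around.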

So, let’s consider how we might work Conversational Design into a Progressive Enhancement UX workflow. This makes the design and development process more sustainable, going beyond user experience alone.

In Responsive Conversational Design, Early-Stage UI Becomes a Conversation

Converting web interaction into a partly improvised screenplay isn’t hard, because early-stage UX deliverables already look a lot like a chatbot script. Consider storyboards, scenarios, Customer Journeys, and the like. They are basically written as a sort of screenplay, with actor roles for the user and the Site Persona discussed by Aarron Walter (a website or app presenting itself as a person). You write a story for these actors.

The similarity of a chatbot to a Site Persona (a.k.a. Design Persona) is huge. It means that these early-stage UX deliverables become our UI!


By treating the website and the user as a set of communicating agents, we focus on problems and solutions, instead of just messing around with a “feature list” for the app.


Responsive Conversational UI Ensures Universal Access

Conversational Design also ties into another important area: accessibility. Text readers for the blind try to convert existing web pages into a conversation. If we base our design on a Conversational Design model, then we naturally support interfaces for those who can’t see visual layouts. Instead of ARIA roles directing text readers to content, we provide the correct, time-ordered sequence of interaction directly. The list of users is larger than one might expect:

  • Blind people
  • People speaking to an Internet of Things device, like Amazon Alexa
  • Search engines, bots, and Artificial intelligences which parse text for a living.

Note that several of these users are machines. In the interest of not angering the future overlords, I suggest including them in our user base now!
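Since the conversational script is already a time-ordered sequence, serving all of these users, human or machine, is largely a matter of rendering the same turns as linear text. A minimal sketch, assuming a simple (actor, text) turn format invented for illustration:

```python
# Hypothetical sketch: render a conversational script as the
# linear, time-ordered text a screen reader or bot would consume.
def render_linear(turns):
    """Each turn is an (actor, text) pair; props and layout vanish,
    leaving only the correctly ordered dialogue."""
    return "\n".join(f"{actor}: {text}" for actor, text in turns)

turns = [
    ("persona", "Welcome back. Ready to check out?"),
    ("user", "Yes, ship to my saved address."),
    ("persona", "Done. Your order arrives Friday."),
]

print(render_linear(turns))
```

No ARIA retrofitting is needed in this model: the accessible reading order is the script itself.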

“Content First” Replaces “Mobile First” in Our Design

The third aspect of conversation is that it suggests an update to our UX models, making them more inclusive and responsive all the way down to the level of Content Strategy.

Initially, “mobile first” was introduced to ensure a good experience for all users, mobile and desktop. If you design for desktop and port to mobile, you tend to get a reduced, compressed, “second-class” experience that violates the principles of Universal Design. Mobile First also encourages a stripped-down approach to site assets, reducing page complexity and, indirectly, carbon footprint. It is probably the best way to ensure that, downstream, you apply Responsive and Adaptive design strategies.

[Image: mobile-first design]

But the move to complex animated tools and widgets on mobile pushed page bloat up again, since a dynamic mobile interface with lots of funny animated “micro-interactions” beloved of the Sketch crowd often requires a massive JavaScript download to make things work.

But, in many formulations, “mobile first” is really “content first” – a recognition that the information we convey is the most important part of what users get.

The “pure conversation” phase is essentially a screenplay or scripted dialogue. It maps to the interface experienced by a machine (e.g. a search engine) and also to what a blind person using a text reader would experience.

Once the conversation is in place, the typical web design UI becomes a “prop” in the script, used by both the Design Persona and the user.

Responsive across VR and AR

Our design also needs to be responsive across the whole space from screenless IoT devices to VR/AR 3D virtual worlds. Can this be done? Yup. Arturo Paracuellos has already done a great job of defining how to incorporate WebXR (VR and AR via JavaScript) into an existing web experience.


Maslow’s hierarchy of a Progressive Enhancement project

What we need to do is to add a bottom layer of conversation at both ends…

  1. At the level before a standard “Mobile First” Ui
  2. At the VR/AR level, where voice-driven interfaces are easier to use

This suggests that the design can be rolled up into a circular system, like so. The connection to conversational interfaces is actually stronger for 3D VR/AR than it is for advanced 2D web interfaces.
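As an illustration, these “design breakpoints” can be pictured as a simple capability check that picks the interaction mode for a device. The capability names and tiers here are invented for the sketch, not from any standard:

```python
# Hypothetical sketch: choose a "design breakpoint" from device
# capabilities, from screenless IoT up through VR/AR headsets.
def pick_breakpoint(caps):
    if "3d" in caps:        # VR/AR headset: voice plus a 3D scene
        return "conversation+vr"
    if "screen" in caps:    # phone/desktop: mobile-first visual UI
        return "conversation+visual"
    return "conversation"   # smart speakers, TV remotes, bots

# Conversation is the floor every device shares.
assert pick_breakpoint({"mic"}) == "conversation"
assert pick_breakpoint({"mic", "screen"}) == "conversation+visual"
assert pick_breakpoint({"mic", "screen", "3d"}) == "conversation+vr"
```

Note that conversation is never removed as capabilities grow; richer tiers layer on top of it, which is the circular system described above.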

Conversational Design in relation to UI elements

A Few Caveats

Conversational Design works best for pulling in people who are confused by standard web apps, as well as for cases where there is no screen, e.g. the Internet of Things (IoT) or a VR/AR 3D world.

In practice, pure conversation may not be the best solution. Studies show that pure talking imposes a greater mental load, and in fact may be less effective than fill-out forms in many cases. So our Conversational Design might not be pure voice; it might instead be an agent which speaks, and presents visual and other interface elements as needed. It is a bit like someone showing us around a car dealership, compared to talking with someone from the same dealership on the phone. In other words, we “deliver our form within a conversation.”

Example of Operator delivering a conversational form

But that’s OK. Pure chat is just a “design breakpoint” in a Responsive Conversational Design system.

What we really need is a Site Persona not just talking but showing us stuff. The images above include an example: after a “discussion” with the user, the Site Persona chatbot elects to show them a standard fill-out form.
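A minimal sketch of that decision, with an invented trigger rule: when structured data is still needed, the persona presents a form “prop” rather than asking one question at a time:

```python
# Hypothetical sketch: "deliver the form within a conversation".
# After chatting, the persona swaps from free text to a form prop
# when the remaining questions are structured (name, address, ...).
FORM_FIELDS = ["name", "street", "city", "zip"]

def persona_reply(user_message, pending_fields):
    # Structured data still needed? Show one form instead of
    # asking four questions one by one.
    if pending_fields:
        return {"say": "Great! Please fill in your shipping details.",
                "show_form": pending_fields}
    return {"say": f"Thanks! We'll process: {user_message}"}

reply = persona_reply("I'd like to order the boots.", FORM_FIELDS)
print(reply["say"])
print("form fields:", reply["show_form"])
```

The form is still a turn in the dialogue; it just happens to be a visual one, which keeps the mental-load advantage of forms without abandoning the conversational frame.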

The Storyboard Becomes the Primary UX Deliverable

You start by writing a dialogue between your actors. Then you add comic-book-style visual panels to complement the dialogue.

[Image: storyboard from an animated movie]

The above is for an animated film. The one below is for a UX project.

[Image: UX storyboard (Wes Anderson)]

Note the huge similarity between the storyboards.

As said before, this is essentially our UI. Our chatbot agent presents things, which may include having the user choose between stage props, or embedding the user in a “scene,” a.k.a. a virtual world.

A New Era of Design

The proponents of Universal Design have long wanted to support anyone, anytime, across all forms of devices, visual and other elements, and levels of user understanding. Responsive Conversational Design may help us make Universal Design more sustainable.



