Sketches, Doodles, UI Patterns or Poodles? Conditioned Responses, User Interfaces: Part 2

By Damon Sanchez

In Part 1 of this series we talked about the essential facets of User Experience, how they work in unison to express brand, and how that unity connects to a positive experience. We then took a deeper dive into the facet of Usability by explaining UI Patterns and how repetition plays a key role in how users perceive good UI.

Part 1 also briefly uncovered the concept of Pavlovian conditioned responses and their relationship to User Interface Design, which can be summed up as: if a user has been conditioned to respond to a UI Pattern, going against that conditioning isn't inherently bad, but it introduces extra cognitive processing because the user has to figure out how the new UI Pattern works.

Cognitive Processing was mentioned as well but wasn't explained in depth. In Part 2 of this series we are going to get geeky and talk about UI and Cognitive Processing at greater length, and how they can be expressed and worked through with communal Sketching and Whiteboarding.

So… What The Heck Is Cognitive Processing, and How Does It Relate to UI and UX?

There are loads of studies written on Cognitive Processing, but an easy way to visualize how it works inside User Interface Design is to hold up two different UI Patterns next to each other.



Notice that the UI Pattern on the left has only four points of interaction, while the UI Pattern on the right has almost 100. The concept of Cognitive Processing starts to make sense when looking at UI in this way.
Essentially, the more information contained within the UI, the more the user has to process, and the more they must rely on conditioning in order to interact with the interface.
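One rough way to make "points of interaction" concrete is to simply count the interactive elements in a piece of markup. The sketch below is not from the article and the tag list is an assumption, a crude proxy at best, but it illustrates how a dense UI racks up interaction points far faster than a minimal one.

```python
from html.parser import HTMLParser

# Assumed set of tags that typically represent a point of interaction.
INTERACTIVE_TAGS = {"a", "button", "input", "select", "textarea"}

class InteractionCounter(HTMLParser):
    """Counts interactive elements in an HTML snippet."""
    def __init__(self):
        super().__init__()
        self.count = 0

    def handle_starttag(self, tag, attrs):
        if tag in INTERACTIVE_TAGS:
            self.count += 1

def interaction_points(html: str) -> int:
    """Return a rough count of interaction points in the given HTML."""
    parser = InteractionCounter()
    parser.feed(html)
    return parser.count

# A minimal UI: one search box and a submit button.
simple_ui = "<form><input type='text'><button>Go</button></form>"

# A denser UI: a row of twenty links plus a small form.
dense_ui = (
    "<nav>" + "".join(f"<a href='#{i}'>Item {i}</a>" for i in range(20)) + "</nav>"
    "<form><input><input><select></select><button>Save</button></form>"
)

print(interaction_points(simple_ui))  # 2
print(interaction_points(dense_ui))   # 24
```

A real audit would also weigh things like visual grouping and labeling, but even this naive count shows how quickly an interface accumulates things the user has to process.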

Okay… So How Do We Combine UI Patterns Into Our Applications While Keeping Cognitive Processing to a Minimum?

Now we finally get to the good stuff. Based on the concepts we covered in Part 1 dealing with repetition in UI, and now with an understanding of how that information has to be processed by the user, we can pull out the pencils and stinky markers...

User Interfaces are essentially a build-up of UI Patterns that, once combined, fulfill user tasks or needs. The more UI Patterns are combined, the more complex an application ultimately becomes, so the goal is to accomplish tasks with the least amount of UI necessary while keeping the interface functional.

We do this by rapidly sketching out what the interface, workflow and nuances could look like. An even better situation is to do this within earshot of the expected user base so that their input can be rendered into the sketch as well.


The very act of drawing an object, however badly, swiftly takes the drawer from a woolly sense of what the object looks like to a precise awareness of its components, parts and particularities.
Alain de Botton

What makes Alain de Botton's quote so awesome is that sketching and whiteboarding sessions are no longer bound to paper and pencils, or stinky markers. Tools like Skype for Business and OneNote are reimagining the way UI and UX Engineers collaborate with teams and clients, making it even easier to stay within earshot of their input. Here are some examples of the kind of whiteboard sketches created in a typical Concurrency UI UX workshop.

Notice that in some cases the Sketched Interface or Low Fidelity Wireframes are just expressing content areas, whereas the middle sketch is expressing the UI Elements of a data table and grid with products in it. We can quickly see that the UI Patterns encapsulating the product grids are starting to get pretty complex, which means that Cognitive Processing, if not kept in check, will start to degrade the perception of usability.

The sketches presented above were created for clients and teams remotely through Skype for Business and OneNote.


In this next series of Sketched Interfaces or Low Fidelity Wireframes there are a fair amount of UI Elements being depicted. Notice at the bottom of the sketches the attempt to work out how the global navigation UI Pattern will function.

We come full circle yet again in this last set of Sketches, not only in the theory of Cognitive Processing and Conditioned Responses but in the simple notion of needing to know who our users are and getting their feedback. By having a good idea of the User Profiles in this sketch, we can accurately predict or document which UI Patterns the users have been conditioned to expect, giving us a higher probability that our users will perceive good UI... by the way, they chose the last one :)

We've covered UI Patterns and Low Fidelity Sketches, taking into account Cognitive Processing and Conditioned Responses, and how these facets of Usability give us a glimpse into how a user perceives UI.

In my next blog post we will be switching gears from UI and UX theory and moving into the realm of Axure and High Fidelity Wireframes.  I’ll be showing how Concurrency uses Axure to collaborate with clients and teams to capture questions, comments and generate group consensus.