The Effect of Iconography on a User’s Experience – Part 1: The Psychology of Symbolism in UI


My family and I visited the Crossroads of Civilization exhibit at the Milwaukee Museum the other day. If you haven’t had a chance to visit the exhibit, you should, because it really speaks volumes about our centuries-long attempt to comprehend complex ideas.

If we think about how we interpret the material world around us, it quickly becomes apparent that almost everything represents some form of information. Once we grasp this notion, it isn’t a far stretch to consider that, in order to better understand the world we live in, we’ve come up with compact ways to represent complex ideas.

 


Obviously a lot has changed since the pyramid builders of old, but by all accounts the same visual mechanics are at work. Whether it is letters constructed into words on a page that phonetically sound out the idea of a BIRD, or a pictographic representation of a physical BIRD, the same synaptic and ocular phenomenon is taking place.

One doesn’t have to put on a fedora and leather jacket to see hieroglyphics; in fact, we see their ghosts at work when we type emojis in our messages. Why write “happy” when you can send someone an icon?
Happy, feliz, счастлив, 开心, 快乐, Gelukkig, srećan, Heureux

[Image: smiley-face emoji]

This decoded pictogram transcends ethnicities and languages and represents a multitude of complex ideas, all while our minds are simply registering a smiling face. What’s even more amazing is that our minds register an emotional response to the iconography.

“Iconography, good iconography, strives to convey invisible reality in a visible form.” 
― Peter Pearson

In Part 1 of this blog post we are going to discuss how the use of icons can positively affect a user’s experience through the concepts of metaphorical affordance and conditioned response.

Like we’ve talked about in previous blog posts, building a log of macro user personas that represent groups of individuals helps us not only understand who we are talking about and which generalized behaviors are shared, but also what makes particular groups unique and special.

Questions like:

  • If a user gets confused who will they contact?
  • What is something that would make a person angry or upset if the interface didn’t do it well?
  • How long will a typical user be interacting with the application?

Documenting human behavior inside a workshop focused on the current state then helps dictate how an interface will ultimately fulfill needs in a future state. This is in part because, from a UI Practitioner’s perspective, an application has less to do with colors, themes, and branding and more to do with fulfilling needs through usability.

One of the principles that drives good usability is called Affordance.
Usability based on affordance is fairly easy to see working incorrectly in the wild when we find nicely designed doors in an office building.

Here is an example of a “Norman Door”:

[Image: a Norman Door]

These types of doors are found everywhere, and as you can see this one has pull handles indicating that the interaction is pull to open, while the actual interaction is push to open. The basic affordance of the door is broken, requiring the word PUSH to inform the user what they must do.

In much the same way the broken affordance of a Norman Door functions counterintuitively, the icons, symbols, and pictograms of a User Interface carry a similar psychology when used incorrectly.

This is because while icons and pictograms hold aesthetic value, they also carry an inherent usability because of the affordance they represent.

You can imagine that in the case of a user persona, with each behavioral question captured there is the possibility of representing symbolic characteristics that bridge communication barriers.

[Image: mail envelope icon]

It’s fairly safe to say what complex ideas or information this icon represents, as we’ve all been classically conditioned to respond to it. So we can say that the metaphorical affordance of this icon is high, in that upon clicking, the user can assume they will have needs fulfilled that pertain to mail.

But what about this icon?

[Image: rocket ship icon]

Aside from representing the concept of climbing into a rocket ship and blasting off to the moon for outer space adventures… you could imagine, much like the Norman Door example, that in order for this icon to express meaning in a User Interface it would need an explanation or title.
The big takeaway here is that the aesthetic value of the rocket icon is really nice, but it becomes just a pretty picture because it lacks metaphorical affordance in the current context of use.

But what if we only know generalized information about a group of users…? What if our Q/A sessions haven’t produced enough behavioral information for selecting usable icons in our UI?

To understand how much liberty is taken, and how much risk, when designing a UI without user persona data, let’s take a look at a simple interface.

Spend about 5 seconds looking over the UI below…

[Image: a simple interface composed of icons]

If you are someone who deciphered meaning from the icons inside the interface without an explanation, you’ve just experienced a small positive emotional response. But if you could not, imagine the assumptions I just made in thinking any reader would be able to understand the interface above…

The idea that the perceived usability of any interface is associated with how quickly we ascertain meaning puts weight on deciphering pictographic representation.

OK, cool… So we pick icons that everyone has some preconceived notion of, and we’re good to go… Right?

In theory, yes… The big takeaway here is that as soon as the complexity of an interface increases without behavioral understanding, the metaphorical affordance degrades and the UI starts to function like a Norman Door.
Let’s take a look at a more complex interface.
There are 24 visible icons surfaced in the UI layer.

[Image: a dashboard UI with 24 visible icons, problem areas flagged in yellow]

There are some pictographic or icon problems to look for, flagged in yellow:

  • Is the Home tab navigation option the same as the Home Button with icon?
  • If the icon next to the Lists title section indicates that there is a list below, what does the icon to the right of the Lists do?
  • If the right-hand List icon in the Lists section is for sorting, why doesn’t the Table section have the same List icon for sorting?
  • Is that a table icon or a calendar icon?
  • What is the difference between Reports and Forms, as they share the same icon?
  • What is the difference between the book Forms icon and the Forms icon in the top navigation area?

After looking over the dashboard above, we come full circle and have the grounds to effectively understand the subtle psychological effects of icon degradation.
While it is true that it is possible to interact with the dashboard above, it is also true that the UI contains subtle pictographic inconsistencies that cause friction in its usability.

Think about the subtle psychological effects of a Norman Door: it still functions to open and close, but on some level it has a negative impact on the overall user’s experience.

If the ultimate goal of a UI Practitioner is the least amount of friction possible, then rendering an experience with minor flaws hinders User Adoption on a measurable level.

“Simple is hard. Easy is harder. Invisible is hardest.” – Jean-Louis Gassée

In Part 2 of this blog series we will discuss custom icons and a few extremely handy tools that aid in the creation process. We will also cover creating WebFonts and Symbol Contact Sheets, which can be used to help identify what character codes to use in your CSS. We will then show the created icons in use in a Responsive Design SharePoint Wiki project.

 


ReBranding The Harris Tweed Digital Experience – Bringing an Old Brand Into The New World

I love everything that has to do with Harris Tweed, so it was with great pleasure and old-school inspiration that I tackled this project. The main design goal was to make something as traditional as the Harris Tweed brand feel right in a modern User Experience. I went with a pretty basic UI layout using big blocks of single-strip, 4-up, and 2-column blocks. The look is very minimal, so the focus is less on the page design and more on letting the grittiness and textures of the tweed seep into view.

The bright colors of the buttons are meant to be overpowering, attracting the eye throughout the UX almost like climbing spikes helping you navigate the long scrolling page. The type has characteristics of a digital magazine layout, with large titles covering classic Harris Tweed script, mixed with Helvetica Condensed in only about 3 variations of smaller point sizes.

The page has persistent navigation that scrolls with the user, providing a quick means to navigate up and down the long page.

DESKTOP + TABLET + DIGITAL MAGAZINE:
[Image: HarrisTweed_Interface_Desktop_Final]

PERSISTENT NAV
[Image: HarrisTweed_Interface-Persistent_Nav]

MOBILE
[Image: HarrisTweed_InterfaceMobile]

UPDATE: Voice Command Running Jacket Prototype – Phase 2 – Adding the Voice Command Module

I have an update to my Voice Command Running Jacket Prototype.

The Arduino UNO finally has the Phase 2 parts added. The last time I posted about the project on WordPress, I had the two 8×8 matrix displays set up to receive scrolling directions based on text input commands sent through the Serial Monitor.

[Images: WP_20140419_009, WP_20140419_007, WP_20140419_012, WP_20140419_011]

Building upon that code, the functions that were being called on text input are now set up to act inside a switch-case block inside the event for voice capture and recognition. The great part about starting the project off with text input through the Serial Monitor is that the Voice Command Module also operates through the serial port, making the original code for this project very scalable.
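As a rough sketch of that structure (not the project’s actual code: the pins, baud rate, and the 0x11/0x12 return bytes are assumptions, and the two scroll functions stand in for the scrolling routines from the text-input phase), the pattern looks something like this:

    #include <SoftwareSerial.h>

    // Hypothetical pins and command bytes, for illustration only.
    SoftwareSerial voiceSerial(10, 11);   // RX, TX for the Voice Command Module

    void scrollLeft()  { /* existing scrolling function from the text-input phase */ }
    void scrollRight() { /* existing scrolling function from the text-input phase */ }

    void setup() {
      voiceSerial.begin(9600);            // module communicates over a serial link
    }

    void loop() {
      if (voiceSerial.available()) {
        byte cmd = voiceSerial.read();    // one byte per recognized voice command

        switch (cmd) {                    // same functions, new trigger
          case 0x11: scrollLeft();  break;    // e.g. trained phrase "LEFT"
          case 0x12: scrollRight(); break;    // e.g. trained phrase "RIGHT"
          default:   break;                   // ignore anything unrecognized
        }
      }
    }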


CODE BREAK DOWN AND PITFALLS:
There are a couple of really good Classes/libraries that I am using in this project. The first is the Parola Library:
http://parola.codeplex.com/

This library is a pretty robust scrolling library. The ability to scroll text isn’t too hard to do from scratch, but the creators of this library have added easing, inverted text, and support for fonts, to name a few features.

* If you are going to use this library, there are a couple of pitfalls that you should keep in mind. The library was originally designed to work with a specific 8×8 LED matrix display; it has been updated to work with any generic 8×8, but you have to update the .h file.
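For reference, a bare-bones Parola sketch looks roughly like the following. This is written against the current MD_Parola/MD_MAX72xx API, where the matrix hardware type is passed to the constructor (FC16_HW here is an assumption, yours may be GENERIC_HW or PAROLA_HW); in older releases that choice lived in the .h file mentioned above. The pin and device counts are placeholders.

    #include <MD_Parola.h>
    #include <MD_MAX72xx.h>
    #include <SPI.h>

    #define HARDWARE_TYPE MD_MAX72XX::FC16_HW  // assumption: set to match your matrix modules
    #define MAX_DEVICES   2                    // two 8x8 displays, as in this project
    #define CS_PIN        10                   // chip-select pin (placeholder)

    MD_Parola display = MD_Parola(HARDWARE_TYPE, CS_PIN, MAX_DEVICES);

    void setup() {
      display.begin();
      // Scroll "LEFT" across the matrices, entering from the right, at speed 50
      display.displayScroll("LEFT", PA_LEFT, PA_SCROLL_LEFT, 50);
    }

    void loop() {
      if (display.displayAnimate())  // returns true when the animation finishes
        display.displayReset();      // restart it so the text keeps scrolling
    }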

The next library in the list is SoftwareSerial, which is actually a built-in Arduino class that you import. Using this class lets you assign RX and TX to other pins on the board.
*This is another pitfall area that I want to call out. All Arduino boards have dedicated TX and RX pins, but the catch is that they don’t allow interrupts back and forth. This was for the most part undocumented until I found an Arduino forum page online that explains in detail which pins on which boards support interrupts.
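Here is a minimal sketch of that setup, assuming the module is wired to pins 10 and 11 (any pins that support the required pin-change interrupts on your board will do):

    #include <SoftwareSerial.h>

    // Keep the dedicated hardware RX/TX (pins 0 and 1) free for the USB
    // Serial Monitor and give the module its own software serial port.
    SoftwareSerial voiceSerial(10, 11);  // RX, TX (placeholder pins)

    void setup() {
      Serial.begin(9600);       // debug output to the PC
      voiceSerial.begin(9600);  // link to the Voice Command Module
    }

    void loop() {
      // Echo anything the module sends so it shows up in the Serial Monitor.
      if (voiceSerial.available())
        Serial.write(voiceSerial.read());
    }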

HARDWARE BREAK DOWN AND PITFALLS:

The Voice Command Module is really easy to use. It has 4 connection points (GND, VCC, TX, RX) which fit as they should on the Arduino board, and you can get up and running within 45 minutes… if you have the information in the next two paragraphs before you start…

* The catch here is that in order to interact with the Module you have to use a specific serial monitor app that ships with the device, which isn’t the best piece of software, and the Manual lacks a specific explanation of the exact format of the HEX CODES the Voice Command Module needs in order to set it up. The Manual has a detailed table showing all the Commands (e.g. 0x36), but it doesn’t explain that the format you actually send isn’t what is in the table; it looks like this (AAx36) instead…
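In other words, each command byte from the manual’s table gets a 0xAA prefix on the wire. A hypothetical helper for that is sketched below; the function name and pins are mine, 0x36 is the command value quoted above, and any other command bytes should be looked up in your own module’s manual.

    #include <SoftwareSerial.h>

    SoftwareSerial voiceSerial(10, 11);  // RX, TX to the Voice Command Module

    // Send one command in the AA-prefixed format the manual never spells out.
    void sendVoiceCommand(byte cmd) {
      voiceSerial.write((byte)0xAA);  // prefix byte
      voiceSerial.write(cmd);         // command byte as listed in the table, e.g. 0x36
    }

    void setup() {
      voiceSerial.begin(9600);
      sendVoiceCommand(0x36);  // example using the command value from the table above
    }

    void loop() {}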

*Also, another pitfall that was undocumented was the act of NOT flipping the TX and RX connection points on the USB TTL adapter. In almost every project where I have used the Arduino you flip RX and TX, but for recording and saving voice commands from your PC to the Voice Command Module, you don’t… It took me 2 hours of trial and error to find this bug, and it was really hard to find because the software is so bad that the issue looks as though it is your RX display settings.

All in all this project has been really fun, and it shows the power of prototyping with Arduino and its plethora of compatible modules.

The previous video and older Blog post can be found here:
https://darkriderdesign.wordpress.com/2014/04/20/voice-command-running-jacket-prototype-arduino-mixed-with-voice-commands-and-scrolling-led-matrix/

 

HARDWARE PARTS LIST:

Arduino UNO:
https://www.sparkfun.com/products/11021

8×8 Matrix Displays:
http://www.ebay.com/itm/like/181221282262?lpid=82

Voice Command Module:
http://www.elechouse.com/elechouse/index.php?main_page=product_info&cPath=168_170&products_id=2151

 

 

If you like this, comment and let me know what you think.

 

cheers

Voice Command Running Jacket Prototype – Arduino Mixed With Voice Commands and Scrolling LED Matrix

Here is a fun project that I have been working on over the past month.

I have blogged about wearable technology a few times now, and I am happy to say that this Prototype is coming along nicely and would work very comfortably with a NIKE+ FUEL BAND.

The basic idea is that a runner could wear this “smart jacket” and it could be trained to respond to simple voice commands.  The runner could say “RIGHT” or “LEFT” or “EAT MY DUST”, and the jacket’s on board LED Matrix displays would act accordingly.

A possible design flaw could be the act of building this into a jacket vs. building an apparatus that could be attached to any jacket… but for now… let’s just say it is a jacket…

The overall parts list for building this prototype is fairly inexpensive and could probably be covered for under $100. If you wanted a super cool Nike running jacket, that could add to the cost 🙂

The Arduino-compatible voice command shield is here:

http://www.ebay.com/itm/Voice-Recognition-Module-Arduino-Compatible-/261291298198?pt=LH_DefaultDomain_0&hash=item3cd62ccd96

 

[Image: WP_20140419_007]

HERE ARE MY MOLESKINE SKETCHES OF THE PROJECT:
[Images: WP_20140419_012, WP_20140419_011, WP_20140419_009]

 

Let me know what you think, Comment, Respond.

 

Thanks,

Damon