Do you want to deliver the value that your audience needs to receive?

(Note: the following is reprinted with permission from our colleague Adam Groves; you can read more of his thoughts on travel, design, and life at http://www.aseasonofwonderlust.blogspot.com)

“Discover the real problems.” – Don Norman, The Design of Everyday Things

Guiding principles of user research:
– You are not the user
– Keep an open mind
(University of Minnesota, User Research and Design course, Brent Hecht)

Do you want to deliver the value that your audience needs to receive? It is a design thinking question, and I pose it to anyone (myself included) who wants to contribute to the world. “Delivering value” applies to many of our situations. You might want to manufacture a product, give a speech, provide a humanitarian service, or participate in a relationship. Your “audience” might be a user base, a room full of business associates, a stranger on the street, or a life partner. In all cases, the basic concerns remain the same.

Originally I intended to preach about designing your deliverable to meet the goals and understanding of the people who use it. However, I am relatively new to the world of design and don’t have the experience or authority to start handing out road maps just yet. What I have is a growing awareness of, and excitement about, the landscape of this discipline. Its insights have broadened my perspective and creative output, and so I can speak with enthusiasm about the topic given in the title, as a migrant to this land inviting others to join me.

Our starting point is not how to design, but why we should care about design. Our audience benefits – we benefit – if we embrace the idea that our audience’s goals may be different from ours, and that delivering something of value to them takes a special attitude and special effort.

This line of thought emerged from a get-together with friends to discuss active projects and areas of study. My friends are scientist-artists, teacher-students, innovators and experimenters. I learn from them about geology, environmental data science, clay-working, glass-blowing, Lego modeling, programming, and whatever else captivates one or more of us. I am a discipline-hopper as well, and one thing I am passionate about is learning how people identify, describe, and address human needs. That day, I became involved in a conversation about the user-friendliness of a data-gathering tool, and ways that users might employ or abuse it. I agreed to give this further thought, and to write about the question from a design perspective.

This essay is the result. The direct connection between that conversation and this prose delights me. It is a simple example of how much of life is entangled with other parts of life, and how the act of communication enriches that entanglement and influences what we are capable of offering. Information and energy move through us. The world around us literally inspires us, shapes us and our perspectives. Communication connects us, constantly, with people and objects and ideas.

To the point, communication is fundamental to design thinking, and to the challenge of caring about your audience and whether they are getting what they need from you. Communication has provided me, in writing this, with a direct connection to that conversation with friends, whose needs this essay hopefully, eventually, serves.

I say eventually because delivering value to meet a need is not a one-and-done arrangement. It usually takes more than one try. Perhaps some of my terminology is too fancy, or else too vague. Do I need to add visuals? Remove some digressions? The best case scenario: I understand their need but I am not quite providing the tool they need to address it. The worst case scenario: I have misunderstood their need completely. If I care about meeting that need, then either case is okay.

Good design works through iteration and feedback. Anne Lamott writes in Bird by Bird about “shitty first drafts,” Adam Grant urges us in Originals to reject perfectionism and share our incomplete ideas, and Mr. Norman, along with many, many designers, offers the “failing fast” model, which is not really about failure at all, but rather about progressing step by step with your audience/user/partner toward the goal.

A regular exchange is the least that I am prepared to offer, since I want to produce something of actual value. Communication demonstrates that I care, and it even reinforces my ability to care by granting me additional insight and connection to my audience.

Design thinking is one way to increase our awareness of, and sensitivity to, our audience’s needs. Design thinking provides us with tools to gather information and act wisely on that information to create the value that they need. As I already said, I won’t try to educate you about those tools right now. I will, however, talk about a couple of common pitfalls in the process, because they interfere with our ability to care about our audience. Remember, these principles are broadly relevant, not just to manufacturing (or programming) as we typically think of design.

First, we must be careful not to become protective of a specific idea or draft or model at the expense of our users. This bad habit damages our ability to deliver the value that our audience needs. It damages our desire to care! Feedback becomes a threat when our ego is involved.

I am sure that everyone can remember a company initiative, or school project, or friendly engagement, that devolved into a power struggle or a misguided conviction that someone “knew” what was best for everyone and that everyone would eventually be grateful for their stubbornness (or at least admit they were wrong). I have been guilty of just this sort of stubbornness. But was any need satisfied in the end, other than my need to feel right when I looked in the mirror?

Ironically, as we shrink away from audience response, we destroy the value that we meant to protect. It is the successful fulfillment of their needs that mattered in the first place. Without that, what is the thing we made actually worth? Even worse, insensitivity to the needs of the audience can lead to dysfunction and outright pathology.

In Things That Make Us Smart, Donald Norman quotes Grudin’s Law: “When those who benefit are not those who do the work, then the technology is likely to fail or, at least, be subverted” (Norman 113). “Those who benefit” means those who decided how the product is designed – for their own gain, not the audience’s – and “those who do the work” stands for whoever that audience is – whoever must make use of the product. I suggest that we increasingly witness, in modern life, not just products that fail or are subverted, but populations that fail or are subverted because of the products, systems, and ideas they are given to work with.

Second, we must remember that our understanding is not our audience’s understanding. Even if you properly understand their need and work to serve it, you might not really understand how they will engage your product to meet it. Again in Things That Make Us Smart, Mr. Norman refers to “technology that is imposed upon us on its terms instead of ours” (Norman 103). Don’t put your audience in your shoes – put yourself in their shoes. Bring the tool to them, instead of the other way around.

Once more, communication is key. It builds awareness as well as empathy, and helps us deliver something that our audience can comprehend and use. What works? What doesn’t? How do they interpret a particular instruction or feature? Design experts offer a wide variety of strategies to participate in and understand our audience’s experience. Seeking diversity is a very valuable one.

In Originals, Mr. Grant references research that connects creativity with multicultural exposure. The longer the exposure and the more distinct the cultures, in fact, the better. Mr. Norman reflects on the very different meanings that similar design features may have in different cultures. A very basic observation, then, is to know your audience (and it might not hurt to know some people who are not your audience, for comparison and to inspire out-of-the-box thinking). What languages do your users speak? What age groups are they in? What economic classes? What geographies do they live in? What sorts of jobs do they have? Questions about just how you begin assembling that data are for another time. I only mean to underline our main concerns today: caring and communicating.

Part of caring and communicating is being sensitive to the usability question. If your audience only understands Dutch and you speak to them in Spanish, have you delivered any value to them? If you make a retirement planning website but its information is presented in small, faint type, how long will your aging customers be able to use it? Have you designed a new type of scissors? Will left-handed people be able to cut with them? In one respect, it is up to you to decide who your audience is, but from another point of view, your audience decides itself, and as you can see the tools themselves sometimes shape the audience, for good or ill.

Do you want to deliver the value that your audience needs to receive? If we want what we offer to matter, we must first care about the needs of the people to whom we are offering. If we wish for them to succeed in meeting their needs, we must care about how they experience the world and, specifically, how they experience what we are offering.

How do we proceed from this point? For my part, I hope to expand my awareness of human experience and needs. I want to participate in fruitful discussion on design, communication, and many other intersecting fields. The proper foundation comes first and then, I hope, inspiration regarding specific ways to engage and provide help to the world.

It excites me to think of all the ways we might nurture attentiveness to our audiences. Lately I have explored design, user experience, information architecture, theories on effecting change, coaching and leadership advice, sociology, mathematics, religion, international participation, and so on. What interests you? What would you add to this list? For each of us, some realms are more suitable than others, and that is okay, too: it is all part of the communication texture.

All realms of knowledge allow us to trace paths between people and experiences, between disciplines, and as the interconnected landscape unfolds before us, we also unfold to one another and ourselves. Design is one way of discovering, and working to enhance, a communication that constantly occurs between all areas of life.

This essay began as a conversation with a few friends. I hope it continues and grows, and perhaps even expands into a larger conversation with more people, more points of view.

Works Cited
Grant, Adam. Originals. Viking, 2016.

Hecht, Brent, Joseph A. Konstan, Loren Terveen, Lana Yarosh, and Haiyi Zhu. User Experience: Research and Prototyping. University of Minnesota, 2016, https://www.coursera.org/learn/design-research. Accessed 12 Dec. 2016.

Lamott, Anne. Bird by Bird, 1994. Anchor Books, 1995.

Norman, Donald. The Design of Everyday Things, 1988. Basic Books, 2013.

–. Things That Make Us Smart. Addison-Wesley, 1993.

If you wish to learn more about these concepts, here are a few recommendations:

The Design of Everyday Things, by Donald Norman, a sort of bible of design and user-experience fundamentals from which you can learn first-hand many of the insights and best practices that others have built upon.

Things That Make Us Smart, also by Donald Norman. This book is perhaps less well-known than Design…, but I find it valuable for its attention to ways we misuse tools or mis-design to the detriment of our audiences.

Originals, by Adam Grant. The book is about nurturing creativity. It overlaps significantly with the design perspective but works on its own as well.

The Information, by James Gleick, which surveys the data-saturated landscape in which we live and think and speak and manufacture.

Switch: How to Change Things When Change is Hard, by Chip Heath and Dan Heath. This book concerns how to effect worthwhile change, in your private life, a society, a business, or elsewhere, and with varying levels of formal authority at your disposal.

Why Things Bite Back, by Edward Tenner. Here is an interesting exploration of the unintended consequences of technological developments. This is a sort of sideways entry point into the ethics of design.

Big Data in the Geosciences: 4

We are inundated with environmental data – Earth observing satellites stream terabytes of data back to us daily; ground-based sensor networks track weather, water quality and air pollution, taking readings every few minutes; and community scientists log hundreds of thousands of observations every day, recording everything from bird sightings to road closures and accidents. But this very richness of data has created a new set of problems.
This last post in our four-part series gives a brief summary of the data skills that geoscientists will need to develop to work effectively with data in a data-rich, connected, open-source world. It is loosely based on the town halls and open-source sessions, as well as the more formal Earth and Space Science Informatics sessions, at the AGU fall meeting in Dec 2016.

Data Skills
Twenty-first century science is marked by the availability of huge environmental datasets, unprecedented access to computing power, and an urgent need to understand – and mitigate – the increasing impact of human society on the environment. What skills do geoscientists need to face the challenges and opportunities of 21st century science? We describe three areas that have the potential to leverage today’s data and computing power to meet our current environmental challenges.

Continue reading

Big Data in the Geosciences: 3

We are inundated with environmental data – Earth observing satellites stream terabytes of data back to us daily; ground-based sensor networks track weather, water quality and air pollution, taking readings every few minutes; and community scientists log hundreds of thousands of observations every day, recording everything from bird sightings to road closures and accidents. But this very richness of data has created a new set of problems.
This third post in our four-part series gives a brief summary of how deep learning is being used in the geosciences today – loosely based on the Earth and Space Science Informatics sessions and town halls at the AGU fall meeting in Dec 2016.

Deep learning
Artificial Neural Networks (ANNs) are already being widely used in domains ranging from stock price predictions to image recognition; from genetic sequencing to targeted marketing. Deep learning neural networks – that is, networks that have multiple layers of neurons between the input and output neurons – are also beginning to be used in the Geosciences to address a range of problems.
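
To make the phrase “multiple layers of neurons” concrete, here is a toy sketch in R using the neuralnet package (our own illustration, not one of the AGU examples; the synthetic data, the variable names, and the two-hidden-layer choice are arbitrary):

library(neuralnet)

# Synthetic regression problem standing in for a real geoscience dataset
set.seed(42)
df <- data.frame(x1 = runif(200), x2 = runif(200))
df$y <- 2 * df$x1 + df$x2^2 + rnorm(200, sd = 0.05)

# Two hidden layers (5 and 3 neurons) between the input and output neurons
nn <- neuralnet(y ~ x1 + x2, data = df,
                hidden = c(5, 3), linear.output = TRUE)

# Predictions for the training inputs
pred <- compute(nn, df[, c("x1", "x2")])$net.result
head(pred)

Real geoscience applications differ only in scale: the same layered structure is fit to far larger networks and datasets.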

Continue reading

Cloudiness Trends for the 2017 Solar Eclipse

Planning an excursion to see the upcoming solar eclipse? NASA can help with that! They provide two sets of data which can point you to good viewing:

A little bit of R scripting lets us combine these and put them onto a Leaflet map of the US.
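
As a preview of the end product, here is a minimal sketch of that scripting (our own simplification, not the post’s actual script; eclipse_path and cloud_stats are hypothetical data frames standing in for the two NASA downloads – path-of-totality coordinates and historical cloudiness at point locations):

library(leaflet)

make_eclipse_map <- function(eclipse_path, cloud_stats) {
  # color ramp for the historical cloud fraction at each location
  pal <- colorNumeric("Blues", domain = cloud_stats$cloud_frac)
  leaflet() %>%
    addTiles() %>%                                   # OpenStreetMap base layer
    addPolygons(lng = eclipse_path$lon,              # shaded path of totality
                lat = eclipse_path$lat,
                color = "black", weight = 1, fillOpacity = 0.2) %>%
    addCircleMarkers(lng = cloud_stats$lon,          # cloudiness at each location
                     lat = cloud_stats$lat,
                     color = pal(cloud_stats$cloud_frac),
                     radius = 5, stroke = FALSE, fillOpacity = 0.8) %>%
    setView(lng = -98, lat = 39, zoom = 4)           # center on the continental US
}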

Downloads

Before coding, there are two things to download:

Displaying weather data from the Global Historical Climatology Network (GHCN)

In our previous blog posts, we downloaded and analyzed the GHCN weather data. That leads us to the next step: displaying the data!

Our goal is to display the climate change data so that the regional trends are clearly visible as well as the local weather history underpinning those trends. In order to do this we decided on the following graphics components:

  • displaying data on a map, both in the form of point locations (the weather stations) and bitmap overlays (regional weather trends). The Leaflet library of JavaScript routines handles this part.
  • displaying graphs of the trends in local weather data. The D3.js library handles this.
  • and displaying the data history for a given weather station as a heatmap. We draw directly to
    Continue reading

R: Analyzing weather data from the Global Historical Climatology Network (GHCN)

1. Introduction
The R code below can be used to extract weather metrics – such as maximum daily temperature, minimum daily temperature, average or total daily rainfall, and other annual metrics – from the GHCN weather data set. This code assumes that you have already created a dataframe of the GHCN stations of interest to you. For example, the set of GHCN stations of interest in this exercise consists of the 520 stations within the US that have data for the 80 years from 1936–2015, with less than 2% missing data (see “R: Reading & Filtering GHCN weather data” for how this set was created). This dataframe of stations of interest (stn80 in our case) should include, at a minimum, the station ID, LAT, and LON (the LAT and LON are useful for mapping the metrics).

The GHCN weather data has one data file for each station, with the following format:
Note: the GHCN station datafiles were converted from a fixed-width format to a comma-separated format.

head(USC00010252)
  X.1  X          ID year month element Val1 Val2 Val3 Val4 Val5 Val6 Val7 Val8 Val9 Val10 Val11 Val12 Val13 Val14 Val15 Val16 Val17 Val18 Val19 Val20 Val21 Val22 Val23 Val24 Val25 Val26 Val27 Val28 Val29 Val30 Val31
1  42 42 USC00010252 1938     1    TMAX   NA   NA   NA   NA   NA   NA   NA   NA   NA    NA   244   256   239   233   222   194   189   233   228   239   239   239   250   250   200    67    67    94   178   233   183
2  43 43 USC00010252 1938     1    TMIN   NA   NA   NA   NA   NA   NA   NA   NA   NA    NA    67    94   111   111   117   144   144   167   183   183   139   122   150   139    44   -72   -67   -17   -17    78    56
3  45 45 USC00010252 1938     1    PRCP    0   64   25    0    0   89  191    0    0     0     0    15     0     5     0     0     0    23     0     0     0     0     0   203     0     0     0     0     0     0    41
4  48 48 USC00010252 1938     2    TMAX  172  161  211  233  261  256  256  256  250   261   256   239   256   233   261   233   233   239   233   128   261   261   178   128   117   211   233   206    NA    NA    NA
5  49 49 USC00010252 1938     2    TMIN  -28   33   83   50   78  156   89   94  106    83    94   100    83   122    89   133    78   128    56     6    78   133   111    33     6     0    44    78    NA    NA    NA
6  51 51 USC00010252 1938     2    PRCP    0    0    0    0    0    0    0    0    0     0     0     0     0     0     0     0     0     3   686     0     0     0   114     0     0     0     0     0    NA    NA    NA

Note: TMAX and TMIN are in tenths of a degree Celsius, so 172 is 17.2 °C.
We will manipulate these station data files in R to create several different metrics and write them to their own output files.
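The full script sits behind the link; as a flavor of the approach, here is a minimal sketch (our own simplification, not the post’s exact code) that computes one such metric, mean annual TMAX in degrees Celsius, from a single station data frame in the layout shown above:

annual_mean_tmax <- function(stn_df) {
  tmax     <- stn_df[stn_df$element == "TMAX", ]
  day_cols <- grep("^Val", names(tmax))
  vals     <- as.matrix(tmax[, day_cols]) / 10   # tenths of a degree C -> degrees C
  # flatten the 31 day columns and average all non-missing days within each year
  tapply(as.vector(vals),
         rep(tmax$year, times = length(day_cols)),
         mean, na.rm = TRUE)
}

annual_mean_tmax(USC00010252)   # named vector: one mean TMAX per year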
Continue reading

R: Reading & Filtering weather data from the Global Historical Climatology Network (GHCN)

1. Introduction
The GHCN weather data set is a great starting point for exploring trends in global or regional weather patterns. It is a publicly available observational dataset that has daily and monthly summaries of weather variables, including high temperature, low temperature, and precipitation. The weather data is available for thousands of stations worldwide, and for many of these stations the weather records stretch back over a century. In this blog post, we describe how to:

  • read this fixed-width dataset into R (a minimal reader is sketched after this list);
  • use the metadata to create a subset of weather stations in the US with data from 1936–2015;
  • determine the percentage of missing data for each station;

thus creating a list of weather stations in the US with 98% coverage of the weather variables TMAX, TMIN, and PRCP for the 80-year period from 1936 to 2015.
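
The full walk-through is behind the link; as a sketch of the first step, here is one way to read a single station file (the column widths follow the GHCN-Daily readme; the helper name and the TMAX/TMIN/PRCP filter are our own choices, not the post’s exact code):

read_ghcn_dly <- function(path) {
  # Each record: station ID, year, month, element, then 31 day groups of
  # value (5 chars) plus measurement, quality, and source flags (1 char each)
  widths <- c(11, 4, 2, 4, rep(c(5, 1, 1, 1), 31))
  cols   <- c("ID", "year", "month", "element",
              paste0(rep(c("Val", "MFlag", "QFlag", "SFlag"), 31),
                     rep(1:31, each = 4)))
  stn <- read.fwf(path, widths = widths, col.names = cols,
                  na.strings = "-9999", stringsAsFactors = FALSE)
  # keep only the daily values for the variables of interest
  stn[stn$element %in% c("TMAX", "TMIN", "PRCP"),
      c("ID", "year", "month", "element", paste0("Val", 1:31))]
}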

Continue reading

Big Data in the Geosciences: 2

We are inundated with environmental data – Earth observing satellites stream terabytes of data back to us daily; ground-based sensor networks track weather, water quality, and air pollution, taking readings every few minutes; and community scientists log hundreds of thousands of observations every day, recording everything from bird sightings to road closures and accidents. But this very richness of data has created a new set of problems.
This second post in our four-part series gives a high-level view of the challenges of portraying and communicating big data in the geosciences – and how these challenges are being addressed – loosely based on the Earth and Space Science Informatics sessions and town halls at the AGU fall meeting in Dec 2016.

Data Visualization
One of the challenges facing geoscientists is simply how to wrangle meaning from big data and effectively communicate their findings to other interested scientists, communities, students, planners or policy-makers. Big data is challenging as it can have a large number of variables with complex, non-linear relationships among them. Scientists are turning to data visualization – which leverages the incredible pattern-recognition power of the human eye – to design graphics that effectively convey complex information.

Continue reading

Big Data in the Geosciences: 1

We are inundated with environmental data – Earth observing satellites stream terabytes of data back to us daily; ground-based sensor networks track weather, water quality, and air pollution, taking readings every few minutes; and community scientists log hundreds of thousands of observations every day, recording everything from bird sightings to road closures and accidents. But this very richness of data has created a new set of problems.
This first post in our four-part series gives a high-level view of some of the challenges of big data in the geosciences – and how they might be solved – loosely based on the Earth and Space Science Informatics sessions and town halls at the AGU fall meeting in Dec 2016.

Data discovery
With so much environmental data, looking for a specific dataset for a research project can sometimes feel like looking for a needle in a haystack. How can data discovery, that is, finding the right dataset or sharing one’s own dataset with the larger research community, be made more efficient?

Continue reading

R: An introductory regression exercise

In this example, we will complete a linear regression in R using mtcars, one of the built-in R datasets.

1. help or ?
First, let us get acquainted with the mtcars dataset.
To do so, we look at the dataset’s help page. We can do this by using either ‘?mtcars’ or ‘help(mtcars)’.
Note: this will print in the Help window; it has been pasted into the Notebook for completeness.

help(mtcars)

mtcars {datasets}
R Documentation
Motor Trend Car Road Tests
Description

The data was extracted from the 1974 Motor Trend US magazine, and comprises fuel consumption and 10 aspects of automobile design and performance for 32 automobiles (1973–74 models).

Usage

mtcars
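
The exercise continues behind the link; as a preview (our own sketch, so the post’s exact steps and variables may differ), the classic regression of fuel efficiency on weight looks like this:

# Fit a simple linear model: mpg as a function of car weight (wt)
fit <- lm(mpg ~ wt, data = mtcars)
summary(fit)      # coefficients, standard errors, R-squared

# Visual check: scatter plot with the fitted line overlaid
plot(mtcars$wt, mtcars$mpg,
     xlab = "Weight (1000 lbs)", ylab = "Miles per gallon")
abline(fit, col = "red")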
Continue reading