Week 4 – Principles & Design

Required & recommended readings: http://www.columbia.edu/~ih2240/dataviz/dataviz_schedule.htm#topic4

6 thoughts on “Week 4 – Principles & Design”

  1. arvi1000 02/11/2013 at 3:36 PM Reply

    In Chapter 4 of the Visual Display of Quantitative Information, Tufte sets out on a project not unlike Shneiderman’s in “The Eyes Have It” (last week’s reading). Both articles present a design principle and set of guidelines for statistical graphics, with the idea being not just to account for what distinguishes good and bad design, but to provide a practical reference for information designers.

    Last week (http://bit.ly/14OgmgD), I complained that the Shneiderman article lacked force of argument. While his “1. overview, 2. zoom & filter, 3. details” paradigm seems like a reasonable suggestion (and 2048 people have cited it, according to Google Scholar), he doesn’t do much to say *why* that ordering is better than some other, or to engage arguments for competing principles. The closest he comes is the following excerpt:

    “There are many visual design guidelines but the basic principle might be summarized as the Visual Information Seeking Mantra:
    Overview first, zoom and filter, then details-on-demand
    Overview first, zoom and filter, then details-on-demand
    Overview first, zoom and filter, then details-on-demand
    Overview first, zoom and filter, then details-on-demand
    Overview first, zoom and filter, then details-on-demand
    Overview first, zoom and filter, then details-on-demand
    Overview first, zoom and filter, then details-on-demand
    Overview first, zoom and filter, then details-on-demand
    Overview first, zoom and filter, then details-on-demand
    Overview first, zoom and filter, then details-on-demand

    Each line represents one project in which I found myself rediscovering this principle and therefore wrote it down as a reminder.” (p. 337).

    This is (maybe) a cute visual gimmick, but if there are so many examples where his mantra is vindicated by experience, why doesn’t he talk about any in the paper? I rehash the Shneiderman issue because I think this week’s Tufte chapter is a great example of how you can propose a visual principle and set of guidelines, and then make a compelling argument by example.
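    To make the mantra itself concrete before turning to Tufte, here is a minimal sketch of what the three stages might look like in code. This is my own illustration, not Shneiderman’s (his paper describes interactive systems, not scripts), and the dataset and column names are invented:

      # A sketch of the Visual Information Seeking Mantra:
      # overview first, zoom and filter, then details-on-demand.
      # The DataFrame and its columns ("year", "value", "label")
      # are hypothetical stand-ins for the data being explored.
      import pandas as pd
      import matplotlib.pyplot as plt

      df = pd.DataFrame({
          "year":  [2000, 2001, 2002, 2003, 2004, 2005],
          "value": [3.1, 4.7, 2.2, 8.9, 6.4, 5.0],
          "label": ["a", "b", "c", "d", "e", "f"],
      })

      # 1. Overview first: the whole dataset at once.
      df.plot.scatter(x="year", y="value")
      plt.show()

      # 2. Zoom and filter: restrict to the region of interest.
      subset = df[(df["year"] >= 2002) & (df["value"] > 5)]
      subset.plot.scatter(x="year", y="value")
      plt.show()

      # 3. Details-on-demand: the full record, only when asked for.
      print(subset.loc[subset["value"].idxmax()])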

    Tufte’s principle is “above all else show the data”, which gives rise to a set of working guidelines: Maximize the data-ink ratio, Erase non-data-ink, Erase redundant data-ink, Revise and edit. He takes several example visualizations and shows iterative modifications based on these guidelines, and the reader can plainly observe the improvement. Furthermore, to his credit, he shows some examples where exceptions to his guidelines apply (like the map of ocean currents, where the redundancy of showing more than one whole globe is worthwhile because it allows easier unbroken comparisons of adjacent areas).
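    As a rough illustration of those guidelines (my own matplotlib sketch with invented data, not one of Tufte’s hand-redrawn examples), here is the same chart before and after erasing non-data-ink:

      # Tufte's "erase non-data-ink" guideline in matplotlib:
      # the same line chart with its grid, box spines, and tick
      # marks (ink that carries no data) removed. Data invented.
      import matplotlib.pyplot as plt

      x = [1, 2, 3, 4, 5]
      y = [2.0, 3.5, 3.1, 4.8, 4.2]

      fig, (before, after) = plt.subplots(1, 2, figsize=(8, 3))

      # Before: heavy grid and a full box of spines.
      before.plot(x, y, marker="o")
      before.grid(True)
      before.set_title("default")

      # After: above all else, show the data.
      after.plot(x, y, marker="o")
      after.spines["top"].set_visible(False)
      after.spines["right"].set_visible(False)
      after.tick_params(length=0)
      after.set_title("non-data-ink erased")

      plt.tight_layout()
      plt.show()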

    • arvi1000 02/12/2013 at 10:34 PM Reply

      Following up on this, here’s a good example of where someone should have applied the Tufte rules: http://graphics8.nytimes.com/images/2011/05/15/magazine/15-Leonhardt/15-Leonhardt-popup-v4.jpg

      We have not just an excess of gridlines, but some garish rainbow wallpaper to reinforce them!

      I think it’s a bad design because the colors introduce arbitrary distinctions and create groupings that aren’t there. For example, the colors imply Orthodox Christians and Buddhists (same color block) are more of a kind than Unitarians and Buddhists (one yellow, one green), but in terms of proximity those pairs are about the same. Since all the information is contained in the position of the points, I think it would be better to leave it to the eye to notice proximity, without the colors suggesting arbitrary clusters.

  2. Melissa Berry 02/11/2013 at 9:59 PM Reply

    The article for this week, “How NOT to Lie with Visualization” by Rogowitz and Treinish, has a nice premise: keeping visualizations honest. They discuss how changes in color mapping, brightness, hue, etc. do not map linearly onto perceived changes in those qualities. The reading also discussed the type of mapping one would choose for high- and low-density variation in the data. I thought the first example, with the MRI scan, was fairly demonstrative of the point, but sadly I disagreed with the color mapping choices they made for almost every other example. I don’t know if the examples they chose weren’t particularly clear or if I just wasn’t looking at the right aspects of them…

    Nonetheless, I feel the point is well taken: a well-designed visualization needs a color or grayscale mapping that people perceive as changing at the same rate the data changes.
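    For what it’s worth, plotting libraries now ship perceptually uniform colormaps that address exactly this. A quick comparison (my own example, not one from the paper) makes the difference visible: “jet” has uneven perceived-lightness steps, so equal data steps look unequal, while “viridis” was designed so they do not:

      # Rainbow vs. perceptually uniform color mapping of the
      # same smooth field; the data are invented for illustration.
      import numpy as np
      import matplotlib.pyplot as plt

      x, y = np.meshgrid(np.linspace(0, 1, 200), np.linspace(0, 1, 200))
      z = np.sin(6 * x) * np.cos(4 * y)

      fig, axes = plt.subplots(1, 2, figsize=(9, 4))
      for ax, cmap in zip(axes, ["jet", "viridis"]):
          im = ax.imshow(z, cmap=cmap, origin="lower")
          ax.set_title(cmap)
          fig.colorbar(im, ax=ax)
      plt.show()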

  3. mey2111 02/17/2013 at 9:17 PM Reply

    The article “How Not to Lie with Visualization” calls attention to important issues in faithfully representing the data with which one is working. Beyond the problem of giving a false sense of ordering to nominal data, and beyond issues of color, there is the surprisingly frequent problem of researchers presenting several graphs of the same data on inconsistent scales, giving the impression of more marked variation than may in fact exist.
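    The fix is mechanical once you notice the problem: force the panels onto a shared scale. A minimal matplotlib sketch with invented data:

      # Inconsistent y-scales exaggerate variation; sharey=True
      # forces both panels onto one scale so comparison is honest.
      import matplotlib.pyplot as plt

      series_a = [10.0, 10.2, 10.1, 10.4, 10.3]
      series_b = [10.0, 14.0, 9.0, 15.5, 12.0]

      fig, (left, right) = plt.subplots(1, 2, sharey=True, figsize=(8, 3))
      left.plot(series_a, marker="o")
      left.set_title("series A")
      right.plot(series_b, marker="o")
      right.set_title("series B")
      plt.show()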

  4. Siwei 02/18/2013 at 8:44 PM Reply

    My understanding of data visualization used to be narrow: put data from an Excel sheet into some program and use some code to plot a graph. The article “How NOT to Lie with Visualization”, however, shows many kinds of visualization I hadn’t even considered to be visualization. The paper discusses how we can better represent the magnitude of a variable at every spatial position, as well as the importance of color choice, which we covered in the second lecture. The varied uses of data visualization require us to be very careful about design, so that people can best understand our purpose.

  5. hsssajx 02/19/2013 at 6:19 PM Reply

    Reviews and comments on: Evaluating Information Visualizations by S. Carpendale, 2008
    The paper discusses the challenges of evaluating information visualization. The quality of an information visualization is bound up with human-computer interaction; research prototypes do not truly resemble real-world systems, and samples of users are usually limited to scholars or students. Beyond usability, perception, and comprehensibility, the more important question for a given visualization is whether it enhances understanding of, or insight into, the data. However, information-processing tasks frequently involve contextual, social, and political factors. From the standpoint of adaptive systems, if that is how human perception is structured, factors like non-linearity, holarchy, and internal causality also imply that empirical research on human cognition of visualizations could be very difficult.
    Carpendale summarizes the methodologies of the social sciences by simplifying a previous methodologist’s work. The seemingly unpleasant fact is that of all the methods current researchers employ (laboratory experiment, judgment study, sample survey, formal theory, computer simulation, field study, field experiment, and experimental simulation), none meets all three desired standards of research: precision, realism, and generalizability. Quantitative methods, which have the advantage of precision, still suffer from defects such as Type I and Type II errors and problems of validity and reliability. But by improving sensitivity and skill in the data-gathering process, for instance by nesting quantitative methods and following heuristics and guidelines for the quantitative research process, researchers can improve the quality of the data and thus the robustness of their results. Qualitative methods, which have the advantage of realism, suffer instead from problems of sample size (representativeness), subjectivity, and inaccuracy; here, contextual interviews, participant observation, and related techniques can raise the quality of the research.
    The author does not discuss methodological integration or her own view on choosing methods and techniques for evaluating the quality of a visualization. Since this kind of study is highly interdisciplinary, I propose that researchers do problem-based research. We don’t have to sample evenly across all kinds of people, and we don’t necessarily need to employ so many different methods. What I think is feasible is to start with small samples and simply designed studies, and to give more time and budget to collecting feedback over relatively long stretches of human-machine interaction, using it to improve the visualization tools. We don’t have to invoke social psychology or even neuroscience to justify a specific way of visualizing before implementing it; adding functions and user-friendly features will generally do more to improve a visualization tool than supplying a theoretical reason for each one in advance.
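    One concrete guard against the Type II errors mentioned above is to size a study before running it. A quick power calculation (my sketch using statsmodels; the effect size and power target are illustrative assumptions, not values from Carpendale’s paper) shows why tiny convenience samples rarely settle anything:

      # Participants needed per group to compare two visualization
      # designs in a between-subjects study. Cohen's d = 0.5 and
      # 80% power are assumed, not taken from the paper.
      from statsmodels.stats.power import TTestIndPower

      n_per_group = TTestIndPower().solve_power(
          effect_size=0.5,   # assumed medium effect on task performance
          alpha=0.05,        # Type I error rate
          power=0.8,         # 1 - Type II error rate
      )
      print(f"participants needed per group: {n_per_group:.0f}")  # ~64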

    by gt2298~
