DOI: 10.1145/2993901.2993917

Research Article

Using Concrete and Realistic Data in Evaluating Initial Visualization Designs

Published: 24 October 2016

ABSTRACT

We explore means of designing and evaluating initial visualization ideas with concrete and realistic data in cases where data is not readily available. Our approach is useful for exploring new domains and avenues for visualization, and contrasts with other visualization work, which typically operates under the assumption that data has already been collected and is ready to be visualized. We argue that it is sensible to understand data requirements and to evaluate the potential value of a visualization before devising means of automatic data collection. We base our exploration on three cases selected to span a range of factors, such as the role of the person doing the data collection and the type of instrumentation used. The three cases concern visualizing data from the sports, construction, and cooking domains, and primarily use time-domain data and visualizations. For each case, we briefly describe the design problem, the manner in which we collected data, and the findings obtained from evaluations. Afterwards, we describe four factors of our data collection approach and discuss its potential outcomes.

Published in

BELIV '16: Proceedings of the Sixth Workshop on Beyond Time and Errors on Novel Evaluation Methods for Visualization
October 2016, 177 pages
ISBN: 9781450348188
DOI: 10.1145/2993901
      Copyright © 2016 ACM

      Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than the author(s) must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected].

Publisher

Association for Computing Machinery, New York, NY, United States


      Qualifiers

      • research-article
      • Research
      • Refereed limited

      Acceptance Rates

Overall acceptance rate: 45 of 64 submissions, 70%
