ABSTRACT
We explore means of designing and evaluating initial visualization ideas with concrete and realistic data in cases where data is not readily available. Our approach is useful for exploring new domains and avenues for visualization, and contrasts with other visualization work, which typically operates under the assumption that data has already been collected and is ready to be visualized. We argue that it is sensible to understand data requirements and evaluate the potential value of visualization before devising means of automatic data collection. We base our exploration on three cases selected to span a range of factors, such as the role of the person doing the data collection and the type of instrumentation used. The three cases concern visualizing data from the sports, construction, and cooking domains, and rely primarily on time-domain data and visualizations. For each case, we briefly describe the design problem, the manner in which we collected data, and the findings obtained from evaluations. We then identify four factors of our data collection approach and discuss its potential outcomes.
- Using Concrete and Realistic Data in Evaluating Initial Visualization Designs