Designing CSCW systems that support the widely varying needs of targeted users is difficult. There is no silver-bullet technology that enables users to collaborate effectively with one another across different contexts. We propose a method of collaborative systems evaluation that enables novice evaluators to make insightful observations about the systems they evaluate, in certain situations at a level comparable to that of experts. These observations take the form of a categorical breakdown analysis of a laboratory study. The quantity and type of breakdowns can then be connected to recommended CSCW tools and features developed and described in the related literature. We conducted a study to explore the results generated when the method was applied by both experts and novices in the field of CSCW. We observed that experts found the method usable, and that novices capitalized on the knowledge embodied in the breakdown categories to make categorizations similar to those of experts.
Will Humphries, D. Scott McCrickard, Dennis C. Nea