Michael gave me the OK to change my research focus after my internship at Disney Research Pittsburgh. Before, the focus was on embodied autonomous agents and the authoring tools that could support them, with a minor in craft authoring support tools. Now, I think the focus will be craft tools with a minor in the work I’ve done with ABL and assessing agent AI authoring tools. The title he came up with was Authoring and Debugging Tools for Domain-Specific Expressive Languages. Now, I’d probably rephrase it to be something like “Creativity Support Tools for Domain-Specific Expressive Languages” or “Creativity Support Tools: Case Studies in Two Domains.” Maybe if I find some grand lesson from the two experiences, it’d be like “Creativity Support Tools: <insert lesson here>.”
In order to address my new focus, I’ve started a whole new literature review. It’s helped me come at this problem from a whole new (broader) perspective. I’ve got about 15 new books on the way, as well as a big stack of papers I’ve been tackling. I have been really surprised by A) how much overlap there is between good tool design and good game design (especially of open-world or simulation-based games), B) how closely the general needs of these tools match the ABL-specific needs we came up with previously, and, as a bonus, C) how much good tool design overlaps with experimental teaching/learning approaches. That shouldn’t be surprising, given how games often use tutorials, and how creativity (and its supporting tools) involves both domain-specific information, encoding best practices, and extra support mechanisms for enabling domain-specific artifact creation/dissemination.
The first major reading I’ve finished going through is the Creativity Support Tools workshop report from the NSF conducted in June 2005. There is a LOT of rich summary information of past research and proposals of open problems, which still sound relevant even 10 years later.
What is Creativity? How do you ‘Support’ it?
“Basically, creativity can be considered to be the development of a novel product that has some value to the individual and to a social group” [NSF page 10, Creativity Support Tool Evaluation Methods and Metrics]. In this and many other definitions, the concepts of novelty and value consistently reappear: the created thing must be ‘new’ in some way (not formally known/explored) and useful/important/worthwhile by some metric. Boden draws a distinction between P-creative (personally novel; relatively common, since we do not know everything about our world) and H-creative (novel to the human race or culture as a whole; relatively rare). Measuring value, however, is extremely muddy, as the concept of value (usefulness/importance/worth) differs between those synonyms, as well as between individuals, groups, and cultures.
Csikszentmihalyi’s work on Creativity defines the key components as:
Domain: “consists of a set of symbols, rules, and procedures” (like math or biology)
Field: “the individuals who act as gatekeepers to the domain… decide whether a new idea, performance, or product should be included” (peer-review and validation)
Individual: “when a person… has a new idea or sees a new pattern, and when this novelty is selected by the appropriate field for inclusion in the relevant domain” (like paper publication! or publishing a pattern book! or just having other people use your tool)
“The psychological literature provides no clear, unequivocal answer to whether or not creativity can be enhanced. There are many different variables that have been proposed as having a role, including individual abilities, interests, attitudes, motivation, intelligence, knowledge, skills, beliefs, values, and cognitive styles. Thus it seems that individual, social, societal, and cultural differences and factors may all matter, at some time or another and under some circumstance or another” [NSF page 13, Creativity Support Tool Evaluation Methods and Metrics]. Basically, there are too many interconnected variables involved in the squishy process of creativity for us to isolate/examine it as a phenomenon. Thus, we can’t really test to see if creativity is “happening” or to what degree, nor can we really measure how creative someone is (other than by their potentially creative output, which can be anything from ideas to artifacts).
However, people sometimes have a feel for the ‘flow of creative juices’ or when their creative process is interrupted. Some people have a particular mindset or environment in which they best ‘work their magic’. Still, at other times inspiration ‘strikes without warning’ or ‘can come from anywhere’. These are all phrases I’ve colloquially heard when people discuss their creative processes. Creativity is undeniably a force that can be helped or hindered, whether the creator is aware of those forces and their state of mind or not.
Without being able to measure or predict creativity, we can still attempt to improve it. Nickerson lists factors involved in teaching creativity, such as: “Build basic skills; Encourage acquisition of domain-specific knowledge; Stimulate and reward curiosity and exploration; Build motivation; Encourage confidence and risk-taking; Focus on mastery and self-competition; Provide opportunities for choice and discovery; and Develop self-management (meta-cognitive) skills” (my emphasis) [NSF page 14, Creativity Support Tool Evaluation Methods and Metrics]. We need tools that support these tasks: improving the personal experience of the person creating, improving the outcomes and artifacts, and helping with domain-specific process challenges.
A quick definition given of Creativity Support Tools (CSTs): “…tools that enable people to express themselves creatively and to develop as creative thinkers… software and user interfaces that empower users to be not only more productive, but more innovative… These advanced interfaces should also provide potent support in hypothesis formation, speedier evaluation of alternatives, improved understanding through visualization, and better dissemination of results” [NSF page 25, Design Principles for Tools to Support Creative Thinking].
One of the most fundamental (and repeated) examples of a creativity support tool is the pencil and paper (or whiteboard). It’s easy to use, fast to sketch ideas, and easily visualized (especially if you draw pictures rather than words). Other common examples are the telescope and sewing machine (likely due to Ben Shneiderman, as he’s used the examples previously).
Design Considerations/Principles/Criteria for Creativity Support
Consensus from the workshop gathered around the concepts of “low thresholds (easy entry to usage for novices), high ceilings (powerful facilities for sophisticated users), and wide walls (a small, well-chosen set of features that support a wide range of possibilities)” (my emphasis) [NSF page 2]. If this isn’t a summary for good open-world/simulation game design, I don’t know what is! Just think of Minecraft: the fundamental features (harvesting/creating/placing voxels), easy entry (WASD/space/mouse controls, easy-to-understand metaphors and operations), and limitless possibilities. Honestly, easy entry (figuring out all the block/item combinations) is the criterion I find Minecraft weakest on, but that hasn’t stopped little kids from figuring it out. In tool-creation, though, this is a troublesome set of requirements. Tools restricted enough for the novice are often too restricted (or toy-ish) for experts. Everyone also learns at their own pace… just look at the utter failure that was Mario Maker‘s tool unlock plan (where it used to be mandatory to play the game over 9 days to unlock all the level editing tools, until Nintendo released a day-1 patch offering an alternative unlock plan).
Trying not to get side-tracked… Candy, Edmonds, and Hewett have tried to understand the functional requirements and design criteria. In the report, they state: “…any Creativity Support Tool should allow the user: to take an holistic view of the source data or raw material with which they work; to suspend judgement on any matter at any time and be able to return to that suspended state easily; to be able to make unplanned deviations; return to old ideas and goals; formulate, as well as solve, problems; and to re-formulate the problem space as their understanding of the domain or state of the problem changes” [NSF page 14, Creativity Support Tool Evaluation Methods and Metrics].
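To make that requirement concrete, here’s a toy sketch (all names hypothetical, not from the report) of the “suspend judgement and return to that suspended state” idea: the tool keeps named snapshots of the whole working state, so any unplanned deviation can be abandoned and an old idea resumed without penalty.

```python
import copy

class Workspace:
    """Hypothetical design-tool state with named, resumable snapshots."""

    def __init__(self):
        self.state = {"shapes": []}   # the user's raw working material
        self._snapshots = {}          # name -> deep copy of a past state

    def suspend(self, name):
        """Park the current line of work under a name, judgement-free."""
        self._snapshots[name] = copy.deepcopy(self.state)

    def resume(self, name):
        """Return to a previously suspended state, discarding the detour."""
        self.state = copy.deepcopy(self._snapshots[name])

# Toy usage: wander off on a deviation, then return to the old idea.
ws = Workspace()
ws.state["shapes"].append("circle")
ws.suspend("first idea")              # set this direction aside
ws.state["shapes"].append("square")   # unplanned deviation
ws.resume("first idea")               # back to the suspended state
print(ws.state["shapes"])             # ['circle']
```

The deep copies matter: a snapshot that shares mutable structure with the live state would silently change as the user keeps working, which is exactly the kind of untrustworthiness these criteria warn against.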
All of the NSF Workshop article Design Principles for Tools to Support Creative Thinking (p. 25-38) is relevant to this section. A quick summary of their major points (other than the low thresholds, high ceilings, and wide walls, which was quoted from this article too):
1. Support Exploration: make it easy to change all aspects of the design. A tool must be trustworthy, so that users are comfortable trying new things and know progress will not be lost (UNDO!!!). A tool should also be “self-revealing” so that it is clear to users what can be done. Elements of the tool that are hard to use will not be used. The tool should not obstruct exploration, and should allow the user to exert partial effort to get a partial result quickly. FAIL FAST!
2. Low Threshold, High Ceiling, and Wide walls
3. Support Many Paths and Many Styles: some people have ADD and jump between all kinds of projects before circling back to a solution. Others focus and dive deep on a single task. The latter has typically been the primary style supported, but the former is a perfectly viable creative process as well.
4. Support Collaboration: for both a single author getting feedback, or multiple authors working in tandem, communication and distribution of creations, tricks, techniques, and examples is extremely important for usability.
5. Support Open Interchange: seamlessly interoperate with other tools, both your own and others’. Working with common file formats, or enabling users to create “plug-ins” or “mods”, supports this point.
6. Make it as Simple as Possible – and Maybe Even Simpler: even though simpler and better-designed tools may be regarded as ‘toys’, the real trick is offering the simplest ways to do the most complex things. Think of the Minecraft example above. Don’t succumb to feature creep if your audience doesn’t want/need those features!
7. Choose Black Boxes Carefully: your black box is the lowest level of abstraction (primitives) your user can work with. If you’re not trying to teach physics, let a physics engine be a potential black box.
8. Invent Things that You Would Want to Use Yourself: your tool should be enjoyable to use. If you don’t enjoy using it, why would anyone else? External validation and community recognition are important.
9. Balance User Suggestions with Observation and Participatory Processes: users don’t know what they want, what is feasible, or what tool designs will result in a certain behavior. Designs with a few well-chosen parameters are often more successful than designs with fully-adjustable parameters, if full adjustability is not actually needed by the user. Infer what users want/don’t want from their actions.
10. Iterate, Iterate – then Iterate Again: Just as we want users to work with rapid prototyping of ideas, so should your tool be a rapid prototype that responds directly and rapidly to user behaviors.
11. Design for Designers: design tools that enable others to design, create, and invent things. Writing software is a creative activity, and we can creatively write software for making creative software.
12. Evaluation of Tools: ….
“…[N]o single [evaluation] method or measure will be appropriate for all situations or all aspects of the complex phenomenon of creativity” [NSF page 15, Creativity Support Tool Evaluation Methods and Metrics]. The article lists a bunch of sample qualitative evaluation questions to be asked about the robustness, generalizability, effectiveness, and comparative strengths/weaknesses of the evaluated tool, as well as possible quantitative metrics to gather during studies. I am too lazy to write them all out here, since they’re like a page long. The article also compares pros and cons of long-term and short-term studies, surveys, and ethnographies. The short rule of thumb is: ethnographies give you the best coverage of usability and feedback, but take so much time/resources that you should use short studies early on to iterate toward a prototype worthy of the longer ethnographic study.
The main take-aways of that article’s examples show how to extract why a user takes their actions, what the user’s goal is, and how to help them reach it faster/easier/simpler. This approach applies at multiple levels of granularity. For example, at a low level, users may want to click-and-drag or click twice to move an object; if you don’t support the one they reach for, knowing their intent explains why their interaction fails. At a high level, the user may be moving the object in order to see a different configuration in comparison to one they already have. Maybe a comparative view, examining snapshots side-by-side, would help them compare configurations more easily. These examples also illustrate why both qualitative AND quantitative data are crucial to understanding, for example, why the user tried the same operation twenty times in a row.
Current evaluation metrics (at the article’s writing, anyhow) on performance and efficiency may be important, but they are not the only measures of a creative support tool’s effectiveness. How a tool influences a user’s problem-solving process or creative exploration needs to be better explored.
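Circling back to principle 1’s “trustworthy” exploration: the mechanism behind UNDO!!! is small enough to sketch. Below is a minimal, hypothetical undo/redo stack (none of these names come from the report); each edit carries its own reversal, so nothing the user tries is ever permanent.

```python
class UndoStack:
    """Minimal command-pattern undo/redo for a hypothetical design tool."""

    def __init__(self):
        self._done = []    # commands already applied
        self._undone = []  # commands undone, available for redo

    def apply(self, do, undo):
        """Run `do` and remember `undo` so the edit can be reversed."""
        do()
        self._done.append((do, undo))
        self._undone.clear()  # a fresh edit invalidates the redo history

    def undo(self):
        if self._done:
            do, undo = self._done.pop()
            undo()
            self._undone.append((do, undo))

    def redo(self):
        if self._undone:
            do, undo = self._undone.pop()
            do()
            self._done.append((do, undo))

# Toy usage: editing a "canvas" that is just a list of shapes.
canvas, stack = [], UndoStack()
stack.apply(lambda: canvas.append("circle"), lambda: canvas.remove("circle"))
stack.apply(lambda: canvas.append("square"), lambda: canvas.remove("square"))
stack.undo()       # the square disappears
print(canvas)      # ['circle']
stack.redo()       # and comes back
print(canvas)      # ['circle', 'square']
```

The design choice worth noting is that every operation is paired with its inverse up front; that is what lets the user “exert partial effort to get a partial result” and then walk it back without fear of losing progress.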
Other Cool Quotes
“work smarter, not harder” Beyond Productivity: Information, Technology, Innovation, and Creativity (2003)
“Creative ideas emerge from novel juxtaposition of concepts in working memory in the context of a creative task” NSF Workshop page 21, Creativity Support Tool Evaluation Methods and Metrics
“Almost by definition, creative work means that the final design is not necessarily known at the outset, so users must be encouraged to explore the space” From Fischer 1994, NSF Workshop page 26, Design Principles for Tools to Support Creative Thinking
“By creating you become more creative” NSF Workshop page 34, Design Principles for Tools to Support Creative Thinking