Introduction

Over the last several years, schools have explored ways to minimize instructional disruptions caused by school closures due to adverse weather, widespread illness, or in-service teacher professional development (U.S. Department of Education, 2012). Due to enhanced technology and connectedness in schools (U.S. Department of Education, 2012) and in homes (Gray & Lewis, 2020; Hussar et al., 2020), schools are increasingly turning to eLearning as the primary mode for continuing instruction when traditional, face-to-face instruction is not feasible. As evidenced by the global COVID-19 pandemic, eLearning and online learning options are not merely convenient interventions to mitigate the impact of minor instructional disruptions but necessary strategies for maintaining student learning, engagement, and academic services over extended periods of time. Additionally, some proponents of eLearning argue that its continued adoption can help students develop new skills needed in a technologically advanced workforce (Indiana Department of Education, 2020).

Before COVID-19 forced the shutdown of schools and the subsequent widespread adoption of online learning, schools were already exploring the potential benefits of eLearning. At least 12 states had formal state-level policy allowing eLearning to replace traditional instructional time in PK-12 settings, while four additional states had at least one district using eLearning for the same purpose (Digital Learning Collaborative, 2019). Among states with formalized eLearning programs, schools typically had to meet certain requirements (e.g., technology access, student support services, communication with parents/guardians). However, very few states required teacher training or professional development prior to implementing eLearning as a substitute for regular instruction (Digital Learning Collaborative, 2019). In the wake of COVID-19, it is unclear how eLearning will be integrated permanently into existing state statutes and policies.

The purpose of this design case is to describe our experience in designing and developing a series of professional development modules for inservice PK-12 teachers in Indiana. The case focuses on our design experience through the exercise of design judgments. We believe that articulating our design judgments can attune designers to the complexities of design and the role of individual judgment in an authentic design experience.

Design Judgment

Design involves making good judgments in pursuit of desirable design outcomes. As a type of knowledge (Dunne, 1999), design judgment guides “wise action” (Nelson & Stolterman, 2014, p. 139) and gives shape and form to a design and the design process. As researchers have suggested, instructional designers’ authentic design practices do not always follow the rigid, linear design process found in common instructional design models and frameworks (Ertmer et al., 2009; Gray et al., 2015; Kirschner et al., 2002). Likewise, design judgments do not follow a formulaic, rule-based decision-making process but emerge idiosyncratically from a designer’s lived experience and the particularities of the design situation (Dunne, 1999; Nelson & Stolterman, 2014). Therefore, exploring design judgments in a specific design context can provide significant insight into how designers think and act in an authentic design context. By foregrounding the thinking and behavior of designers within a design experience, the foundational instructional design knowledge base may begin to shift from an overreliance on design models and frameworks towards more accurate and useful descriptions of design methods and practice.

Despite its potential contributions, the concept of design judgments is still largely unexplored in instructional design (Gray et al., 2015; Korkmaz & Boling, 2014; Smith & Boling, 2009). Nelson and Stolterman (2014) laid out a useful framework of design judgments (see Table 1), but also noted that this list of judgments is not comprehensive or exhaustive. Nevertheless, this framework is a useful starting place for a discussion about design judgments in instructional design.

Table 1 Types of design judgment (Nelson & Stolterman, 2014)

Design Context

The College of Education at Purdue University provides professional development opportunities for practicing PK-12 teachers. As schools and districts throughout the state of Indiana have increasingly used “eLearning Days” to mitigate disruptions to traditional classrooms, we were asked by the College of Education to design an online professional development course for facilitating these eLearning experiences. The course was designed to be hosted within the university’s learning management system (LMS; Brightspace D2L) through the college’s professional development portal. Upon successful completion of the modules, PK-12 teachers would receive continuing education credits for relicensure.

The design of our professional development modules began in February 2020 with a focus on the limited use of eLearning to minimize instructional disruptions. Shortly after this project began, COVID-19 began rapidly spreading across the nation, forcing schools to halt face-to-face instruction and shift towards online, remote instruction for the remainder of the 2019–2020 school year. While the initial and continued purpose of the modules was eLearning for intermittent instruction rather than a response to COVID-19, it was difficult to view eLearning apart from the remote instruction teachers were engaged in during the COVID-19 school closures. We acknowledge that short-term eLearning and long-term remote teaching through the COVID-19 pandemic share pedagogies and practices; however, we also believe that significant distinctions exist between these two practices. We also acknowledge the potential conscious and unconscious influence COVID-19 had on us (the designers), the participants who informed our designs, the resulting learning modules, and this reflective case description.

Design Team

Our design team consisted of an advanced PhD instructional design graduate student and a faculty member in the same program. Both of us had previously taught in PK-12 settings and designed online courses. After collaboratively designing the module objectives, our roles were differentiated as described below.

Faculty Member (Project Manager)

The faculty member assumed the role of project manager but was also highly involved in content development. She reviewed and modified instructional content, created case studies, and recorded introduction videos. When needed, she solicited further guidance from the client (i.e., College of Education).

Graduate Student (Lead Designer)

Under the guidance and direction of the project manager, the graduate student took the lead on designing and developing the module content (e.g., learning activities, assignments, rubrics, interactives). When content was nearing completion, he transferred the content from the development space (i.e., Google Docs) to the implementation space (i.e., LMS).

Module Description

These online modules were designed as a self-paced, asynchronous course facilitated through Brightspace (D2L), the university’s learning management system (LMS). The course comprises five modules: an introduction module, three instructional modules, and a conclusion module. The three instructional modules (see Fig. 1) follow a rough approximation of a teacher’s experience with eLearning, beginning with preparing the instructional environment, moving to designing effective eLearning activities, and concluding with facilitating the eLearning experience.

Fig. 1 Overview of content modules

Summary of Module 1: Preparing for eLearning

Module 1 was designed to emphasize the significance of the days and weeks preceding an eLearning experience. Through instructions and examples, this module focused on the creation and communication of a “Digital Learning Plan” (DLP), a document that outlines the policies, procedures, and expectations for eLearning (Koehler & Farmer, 2020). Through a case study and instructional text, teachers become aware of common eLearning challenges and reflect on the possible challenges in their own school context. After this reflection, teachers are guided in creating their own DLP aligned to the challenges they identified. Once completed, teachers develop a plan to communicate their DLP with students and parents/guardians. Finally, teachers engage in a peer review process with a colleague, administrator, or instructional coach on how to improve their DLP and communication strategy. Learning management systems, survey tools, and assessment tools are briefly addressed as supporting tools in preparing for eLearning.

Summary of Module 2: Designing eLearning

Module 2 shifted from preparing the structure of eLearning to designing specific eLearning experiences. With established eLearning procedures and expectations, a teacher’s focus can gravitate towards specific instructional objectives and aligned content and activities. Prior to designing a specific lesson, teachers are given the opportunity to explore the strengths, weaknesses, and recommendations of the different instructional modalities (e.g., face-to-face, blended, online). Next, teachers are asked to consider the relationship between the content of their anticipated eLearning lesson and the curriculum currently being addressed in the classroom (i.e., concurrent or non-concurrent). Finally, teachers are asked to create an eLearning lesson that they could teach in their classroom. To simplify this process, Merrill’s (2002) 5 Principles of Instruction are explained, and teachers are encouraged to use the associated template to structure their eLearning lesson according to these five principles (i.e., problem, activation, demonstration, application, integration). Again, teachers are asked to solicit feedback from a colleague on their eLearning lesson and upload a revised copy of their lesson plan.
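
For readers unfamiliar with this kind of template, the sketch below shows one way a lesson organized around the five principles could be represented in schematic form. It is a minimal illustration under our own assumptions: the class name, field names, and sample entries are hypothetical and are not drawn from the actual template distributed in Module 2.

```python
# Illustrative sketch only (Python 3.9+): a schematic representation of an
# eLearning lesson organized around Merrill's (2002) five principles. The
# field names and sample entries are hypothetical, not the module's template.
from dataclasses import dataclass, field


@dataclass
class MerrillLesson:
    objective: str
    problem: str        # real-world problem or task anchoring the lesson
    activation: str     # how prior knowledge will be recalled
    demonstration: str  # how the new skill or concept will be shown
    application: str    # how learners will practice with feedback
    integration: str    # how learners transfer the skill to their own context
    materials: list[str] = field(default_factory=list)


lesson = MerrillLesson(
    objective="Students summarize a short informational text.",
    problem="Summarize a news article for a classmate who has not read it.",
    activation="Quick poll recalling what makes a summary useful.",
    demonstration="Teacher-recorded video modeling a think-aloud summary.",
    application="Students draft a summary and exchange peer feedback online.",
    integration="Students summarize a reading from another subject area.",
    materials=["article link", "summary rubric"],
)
print(lesson.application)
```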

Summary of Module 3: Facilitating eLearning

With a structured Digital Learning Plan in place and a lesson appropriately designed for an eLearning environment, the last module is designed to prepare teachers to effectively facilitate eLearning activities. This module includes an expanded conversation regarding eLearning challenges with certain student populations (e.g., students without technology access, students with special needs, English language learners) and helpful strategies to assist teachers in resolving these challenges. The focus of this module is on developing teachers’ knowledge and skills related to facilitative eLearning technology, including synchronous and asynchronous tools, collaboration tools, and tools designed to assist special populations of learners. The module concludes with an opportunity for teachers to explore tools for their own classroom context.

Design Experience

In this section, we describe our experience designing and developing a series of “eLearning day” modules through the lens of design judgments as operationalized by Nelson and Stolterman (2014). In doing so, we present our major design judgments in rough chronological order as a narrative of our design experience (see Fig. 2). We acknowledge that our judgments were not made linearly, nor were these judgments independent from one another. Rather, these design judgments were made recursively throughout our design experience, with several design judgments acting in concert with each other to eventually materialize in our completed design. With our focus on design judgment, we wish to provide a clearer and more transparent view of our authentic design experience.

Fig. 2 Design judgment journey map

Framing Judgment

In framing judgments, designers must determine the “edges” and space of the eventual design (Nelson & Stolterman, 2014). Our given instructional scope of “eLearning Days” was too general, and we felt the need to further refine this problem space by collecting additional data. The second author shared an open-ended survey on her personal Facebook profile and received 104 responses from parents (n = 67), preschool and elementary teachers (n = 17), middle school and high school teachers (n = 15), school administrators (n = 2), and curriculum coaches (n = 3) from multiple school districts across the state of Indiana. Survey questions were differentiated according to respondent roles (e.g., parent, teacher, administrator) and explored the nature, successes, and challenges of eLearning. As we analyzed the survey data, our design frame began to form around eLearning expectations and procedures. Using these data, we created instructional objectives and outlined potential content; through this process, we created a rough plan for three instructional modules: preparing for eLearning, designing eLearning, and facilitating eLearning. Although we created instructional objectives for the third module, the scope of this module and its associated content and activities was still unclear.

We further explored our design space by interviewing 10 teachers using a semi-structured interview protocol. During these interviews, teachers were asked about their experiences with eLearning and were also given the opportunity to provide feedback on the objectives and module outlines we created earlier. These interviews were fundamental in traversing the problem space and establishing our design frame for the project. As a result of this work, we confirmed our adopted frame of preparing, designing for, and facilitating eLearning day experiences, although elements of each of these topics would be continually framed and reframed through our design process.

As previously noted, our data collection and design process coincided with the COVID-19 pandemic, and we were forced to revisit our scope and determine if it was still appropriate for the needs of our teachers. During the pandemic, nearly all teachers gained remote teaching experience, and the relevance of eLearning for limited, temporary instruction was not immediately clear. Ultimately, we decided to maintain our original frame on eLearning based on the assumption that our target participants were likely more interested in how eLearning could support and enhance traditional face-to-face learning rather than adopting online learning as a dedicated instructional practice. We also felt that the limited frame on eLearning would be more accessible to teachers who may be overwhelmed by a comprehensive professional development course on online teaching.

Appreciative Judgment

With the frame established around preparing, designing for, and facilitating eLearning experiences, we also needed to determine how the ideas and concepts within this frame would be prioritized. Nelson and Stolterman (2014) called this prioritization appreciative judgment, or the act of foregrounding principles of greatest importance. Each of the three modules required appreciative judgment to determine which principles would be emphasized and which principles would remain in the background or even discarded.

In our survey and interview data, eLearning expectations and procedures emerged as significant concepts in preparation for eLearning experiences (module 1). Teachers described numerous challenges associated with eLearning (e.g., internet access, workload, students with disabilities), and we categorized these challenges into broader themes (e.g., technology access, eLearning administration, content access) rather than addressing each one equally. Interestingly, many of these challenges were also influenced by parents and guardians, who become central figures in the absence of a teacher. As these challenges became clear, our first module focused on preparing for eLearning by establishing expectations and effectively communicating those expectations to students and parents.
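
As a concrete illustration of this grouping, the sketch below records how challenges named above might be mapped to broader themes. The particular assignments are our simplified illustration under assumed pairings, not the study’s actual coding scheme.

```python
# Illustrative sketch only: one way to record how specific challenges named in
# the data could be grouped into broader themes. The assignments shown here
# are a simplified illustration, not the full coding scheme.
from collections import defaultdict

theme_by_challenge = {
    "internet access": "technology access",
    "workload": "eLearning administration",
    "students with disabilities": "content access",
}

# Group challenges by theme so the most heavily represented themes can be
# foregrounded in Module 1 (an appreciative judgment).
challenges_by_theme: dict[str, list[str]] = defaultdict(list)
for challenge, theme in theme_by_challenge.items():
    challenges_by_theme[theme].append(challenge)

for theme, challenges in challenges_by_theme.items():
    print(f"{theme}: {challenges}")
```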

The second module within our frame focused on designing eLearning experiences. Our data suggested that eLearning was challenging when face-to-face instruction was merely replicated in the online environment without consideration of the unique limitations and affordances of the new instructional setting. Furthermore, eLearning often occurs with little notice due to emergency circumstances (e.g., weather, illness). Therefore, module two was designed to focus on illuminating the differences between instructional environments (e.g., face-to-face, blended, online) and simplifying the eLearning design process by providing a template teachers could use to efficiently “plug in” their content. Our template was based on Merrill’s (2002) 5 Principles of Instruction, which we believed provided a basic structure for effective instruction and was adaptable to different instructional contexts.

Our survey and interview data also informed the design of a technology-centric final module. We initially understood that technology would be a significant part of our instruction, but the precise role of technology in our modules was continually defined and redefined. While most of the teachers we interviewed were comfortable and familiar with technology, we assumed that our future learners would likely vary in their technology knowledge, skills, and attitudes. As a result, we decided to frame technology as tools “in the service of” teachers performing typical teaching tasks (e.g., planning lessons, presenting instruction, communicating with students and parents). In contrast to technology-first approaches that prioritize specific technology tools and their notable features and functions, we decided to focus our final instructional module on common teaching responsibilities (e.g., communication with parents and students, supporting non-English learners) and technologies that can assist teachers in carrying out those responsibilities. This technology integration approach also led us to include several “technology spotlights” scattered throughout the modules to highlight technology’s role in supporting typical teaching tasks. This subtle appreciative judgment foregrounded the teacher’s instructional responsibilities and relegated technology to a supporting role.

Throughout the modules, we used appreciative judgment to determine which topics were less significant. Informed by our conversations with teachers, we ultimately excluded from the modules topics and ideas mentioned infrequently (e.g., academic integrity, teacher work/life balance, and single sign-on tools for multiple technologies, apps, and websites).

Instrumental Judgment

Vital to the design and development process was our selection and utilization of tools. This design judgment, called instrumental judgment (Nelson & Stolterman, 2014), also includes determining when and for how long different tools are used during the design process (Gray & Boling, 2018). While the final modules were required to be housed in our institution’s LMS, we determined that we would use Google Docs as a collaborative tool to design and develop the module content. The ability to track changes, work collaboratively on the same document, and communicate via a chat feature was critical to our asynchronous and remote design process. Additionally, our university had just transitioned to a new LMS, and our lack of experience with this new platform and the scarcity of support materials prompted us towards more familiar and usable tools. Our approach was to develop as much content as possible in Google Docs prior to uploading it to the new LMS platform (see Fig. 3).

Fig. 3 Screenshot of collaborative work in Google Docs

As previously mentioned, instrumental judgment includes deciding when and for how long tools should be used. While adopting Google Docs for design was a helpful initial strategy, we learned that our lack of familiarity with the final platform ultimately inhibited our design process by limiting what we could imagine and ultimately create. As we migrated our content from Google Docs to our LMS during later stages in the project, we also realized that the unique limitations and affordances of the final platform required us to modify the instructional activities and content. For example, we had originally planned for learners to create regular blog posts to reflect on their experiences and the new instruction. However, this blog feature was not available within our LMS, and our design was modified to fit this limitation. Because we maintained the intentional use of Google Docs for most of the design and development process, we ultimately lost valuable time trying to reconfigure our design to fit the LMS.

Default Judgment

Default judgments occur in or near the borders of unconscious thought. Default judgments are made instinctually in response to a design situation without conscious deliberation or application (Nelson & Stolterman, 2014). As we began our design process, we instinctively began generating and discussing the course objectives, our desired outcomes for the professional development modules. These objectives were discussed, defined, and redefined repeatedly over the course of the project, but our default to objective-making as an initial design activity was instinctual. This default tendency was likely due to our formal training in education and instructional design and the influence of design models prevalent in these fields (e.g., ADDIE, backward design). Unlike design models and processes from other fields which may delay the formalization of instructional objectives (e.g., rapid prototyping, design thinking), our default judgments followed the guidance of familiar design models which foreground the construction of instructional objectives as an early design activity.

Additionally, we exercised default judgment as we instinctually viewed our LMS as the primary tool for content delivery. Our learning modules were required to be hosted in our LMS, but we were not required to use the LMS to present the content or conduct instructional activities. Once our objectives were created, we instinctively began envisioning and designing LMS modules and pages with content, images, videos, and interactives. We never considered or discussed other instructional options that would use the LMS as more of a hosting platform rather than the primary mode of content delivery. Default judgments occur without conscious awareness although options and alternatives to those decisions exist. For example, we could have developed content primarily through Articulate Storyline, Captivate, or another authoring tool and merely hosted the content in our LMS. However, these discussions never occurred, and we never questioned the role of the LMS as a primary tool in delivering content.

Connective Judgments

Connective judgments work together with compositional judgments to bring elements of a design into a unified whole. The difference between these two judgments, however, is that connective judgments bind together different elements that are specific to a design situation (Gray et al., 2015; Nelson & Stolterman, 2014). We exercised our connective judgments as we aligned our module objectives and activities with the concerns and challenges described by the teachers we interviewed, and as we subsequently aligned these objectives to the content and instructional activities of our learning modules. As we considered content and activities, our connective judgments provided the inclusion criteria for our modules; content and activities were included only if they closely aligned with our objectives and the identified needs of teachers.

Compositional Judgments

Compositional judgments involve understanding and acting with reference to the “relational whole” (Nelson & Stolterman, 2014, p. 153). Compositional judgments are made as designers acknowledge the system of elements involved in a design and strive to harmonize all elements into a unified assembly. In this design situation, we established a consistent and predictable structure for our modules and submodules. Each module begins with a short introductory video and a concise list of instructional objectives. To help illustrate the principles in context, a case study is provided along with several reflection questions. Each module concludes with an application assignment and a brief review of the module’s objectives.
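
To make this consistent structure explicit, the sketch below expresses the repeated module skeleton as a simple outline. The component labels are our own shorthand for the elements described above, not the actual Brightspace (D2L) configuration.

```python
# Illustrative sketch only: the consistent skeleton each instructional module
# followed, expressed as a simple outline. Component labels are shorthand for
# the elements described in the text, not the actual LMS configuration.
MODULE_SKELETON = (
    "short introductory video",
    "instructional objectives",
    "instructional content",
    "case study with reflection questions",
    "application assignment",
    "review of module objectives",
)

COURSE_OUTLINE = {
    "Module 1: Preparing for eLearning": MODULE_SKELETON,
    "Module 2: Designing eLearning": MODULE_SKELETON,
    "Module 3: Facilitating eLearning": MODULE_SKELETON,
}

for module_title, components in COURSE_OUTLINE.items():
    print(module_title)
    for component in components:
        print(f"  - {component}")
```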

Compositional judgments can also include recognizing the role of a singular learner experience (i.e., the modules) within a larger assembly of teaching and learning attitudes, beliefs, and practices. While the frame of our learning modules was eLearning days, we also wanted to encourage teachers to consider their general technology integration beliefs and practices. As we noted in our interviews, teachers who integrated technology consistently in their classrooms felt more comfortable and confident about eLearning (during COVID-19 or otherwise) than teachers who had limited technological experience. Through these modules, we hoped to inform this broader technology integration conversation and viewed our modules as a single element of a larger composition of technology integration practices and beliefs. While we included a technology integration conclusion in our modules, we ultimately decided not to stray too far from our original plan for eLearning instruction.

Navigational Judgments

Designers use navigational judgment as they seek a forward course through unpredictable situations (Nelson & Stolterman, 2014). Because design is highly dependent on context, which can shift in a given moment, navigational judgment is adaptive and assists designers in adjusting their approach to fluctuating situational realities. As we finalized the content of the first two modules in Google Docs, we began the process of migrating our content to our LMS. Initially, an external instructional designer was given this task, but he was limited by his lack of familiarity with the new LMS and by the scarcity of available support materials. Additionally, with the impact of COVID-19 continually rippling across campus, particularly among instructional designers, his available time to assist in this content migration process was limited. As a team, we had to adapt our work to include the substantial task of content migration given this new reality. Additionally, the time needed to move the content to our LMS required us to modify some of the instructional activities we had hoped to include. For example, some of the Articulate Storyline interactives we had designed were either simplified or removed and replaced with static content. These situational factors required us to make careful navigational judgments so that we could complete development by our given project deadline.

Quality Judgments

As we began the process of developing the modules in our final instructional environment (i.e., Brightspace D2L), our attention shifted more towards the quality and appearance of our modules. Quality judgments are associated with craftsmanship and are made through the interplay of a designer’s personal preferences, external standards, and the uniqueness of the design context (Nelson & Stolterman, 2014). As we continued to develop the content in the final LMS platform, content was broken down and organized into sub-modules and individual pages to prevent learners from becoming overwhelmed and to establish an appropriate pathway through the content. Interactivity, an element of instructional quality that goes beyond passive learning, was improved by adding both internal resources (e.g., single-user discussions, images, videos) and external elements (e.g., Articulate Storyline) to better engage learners (see Fig. 4). Additionally, quality judgments were continually made, leading to new or adjusted content.

Fig. 4 Example of interactive content

Appearance Judgments

Appearance judgments concern the stylistic feel of a designed artifact, including such ideas as “form, occurrence, essence, and excellence” (Nelson & Stolterman, 2014, p. 151). When we first migrated instructional content into our institution’s LMS, the content appeared as an overwhelming, monolithic block of text. We immediately recognized the need to improve the aesthetic of this instruction (appearance judgment) and decided to use LMS templates, which formalized our style (e.g., font, font size, headings, font color, banner image) and provided a consistent aesthetic throughout the modules and sub-modules. Navigational headings with established heading styles were also added to break apart text and make it more accessible. As we finalized our content in the LMS environment, appearance micro-judgments were continuously made as fonts, font sizes, and font colors were modified and as images and graphics were added and adjusted. In our design experience, elements of aesthetics became important as we sought to create a predictable and consistent appearance throughout the various learning modules and activities (see Fig. 5).

Fig. 5 Screenshot of content page without and with templates

Deliberated Offhand Judgments

Similar to default judgments that are made automatically and often without conscious awareness, deliberated offhand judgments occur when nearly automatic judgments resurface into a designer’s conscious awareness. Now conscious of these design thoughts, designers review and potentially modify these default judgments. Once deliberated, these offhand judgments may again recede into a designer’s unconscious thought (Nelson & Stolterman, 2014).

In the design of our modules, we developed the content to guide learners through a linear eLearning experience, from preparing, to planning, to facilitating eLearning activities. We further supported this linear progression through the modules by establishing content release conditions, which deployed new content to the learner when prerequisite instructional activities were completed. Initially, setting these release conditions and establishing this linear pathway through the learning experience was done without conscious deliberation. Instinctively, we designed a preferred way for our learners to progress through the content. However, as we began testing the modules, we found that the release conditions failed to operate predictably and consistently, and we feared that learners would be hindered from completing the modules. These early tests required us to revisit our near-automatic judgments on learner control, and we decided to allow learners to maintain control of both the pace and progress through the modules by removing all restrictions to the content. Allowing for greater learner control avoided potential technical difficulties surrounding our release conditions, provided learners with a greater sense of control and autonomy over their learning, and allowed learners to have full access to the content from the beginning.
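
To make the two release approaches concrete, the sketch below contrasts the prerequisite-gated rule we originally intended with the open-access model we ultimately adopted. It is a hypothetical illustration of the logic only, not the Brightspace D2L release-condition engine, and the function names are our own.

```python
# Hypothetical illustration of the two content-release approaches we weighed.
# This is not the Brightspace D2L release-condition engine; it only expresses
# the gating rule originally intended and the open-access alternative adopted.

def visible_when_gated(page_index: int, completed: set[int]) -> bool:
    """Original intent: a page is released only after all earlier pages are completed."""
    return all(i in completed for i in range(page_index))


def visible_when_open(page_index: int, completed: set[int]) -> bool:
    """Adopted approach: every page is available from the start; learners set their own pace."""
    return True


completed_pages = {0, 1}                       # learner has finished the first two pages
print(visible_when_gated(3, completed_pages))  # False: page 2 not yet completed
print(visible_when_open(3, completed_pages))   # True: unrestricted access
```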

Core Judgment

The often-indiscernible values and meanings that unconsciously drive designers’ actions during a design experience are called core judgments (Nelson & Stolterman, 2014). These core judgments are derived from a designer’s lived experience and are rooted in one’s values. As former teachers and current educational designers, we found our design judgments influenced by our shared values of meaningfulness and essentialism. Designing for meaningfulness required us to acknowledge our limited perspective and become familiar with the experiences, concerns, challenges, and successes of teachers. As we more fully understood these experiences, their perspectives became our curriculum and the filter through which all design elements were considered.

Valuing meaningfulness encouraged us to consider potential participants in the context of their authentic, everyday lives. Meaningfulness, described here as the state of having relevance and salience to one’s life circumstance, encouraged us to prioritize instructional activities that facilitated conversations and actions within teachers’ schools rather than activities contained within the LMS. For example, we opted against a cohort instructional model in favor of a self-paced model of instruction. In doing so, the collaborative peer reviews required in our modules were designed to be completed by colleagues within a teacher’s real-world school environment. Our commitment to meaningfulness encouraged us to design activities that would strengthen teachers’ existing relationships and practices rather than establish temporary learning environments detached from authentic practice.

Valuing essentialism as a core judgment encouraged us to include only those instructional activities required to help our learners achieve our instructional purposes. Again, our experience as former teachers suggested that the multiple roles and responsibilities of teachers require an effective learning experience to be streamlined. For example, our original plan was to ask teachers to complete regular blog posts in response to several reflective questions. However, when we realized that this functionality did not exist within our LMS, we decided to convert most of these questions into self-reflection questions (not requiring a written response) at the end of a page of instruction. When a reflection was necessary for learners to be adequately prepared for upcoming content, a single-user discussion post was included to ensure that teachers completed the reflection activity. We felt that these reflection questions were worthwhile, but in considering our core values, we did not want to dilute the value of our instruction by including elements that could be perceived as “busy work.”

Judgment Making

We have described a narrative of our design experience through our design judgments. In doing so, we have presented these design judgments in rough chronological order to provide a description of this experience. However, we acknowledge that our experience was much more complex and iterative, with “clustered and layered” (Gray et al., 2015, p. 40) design judgments made interdependently with one another. Parsons et al. (2020) also described this layering, suggesting that a single judgment is often made with several other judgments operating in the background. Additionally, we acknowledge other judgments made consciously and unconsciously throughout this design experience. We agree with Gray et al. (2015), who concluded that design judgments occur constantly throughout the design experience. Rather than identify and detail a comprehensive list of all design judgments, our purpose was to describe significant design judgments that had a particularly strong influence on our final design.

Conclusion

Through this design case, we have described our experience in designing a series of online instructional modules for practicing PK-12 teachers. In addition to describing these modules, we have detailed our experience through the lens of design judgments, highlighting the numerous conscious and unconscious decisions individuals and design teams make during the design process. We have shown that these judgments are constant throughout the design process and often occur simultaneously with each other in interesting and sometimes unpredictable ways. We acknowledge that our design experience was not a linear process as suggested by some design models and frameworks but was complex and iterative through constant cycles of analysis, design, and development.

In a field characterized by a heavy reliance on design process models and frameworks (e.g., ADDIE, 4C-ID; Gagne & Briggs; Göksu et al., 2017), designers need a clearer concept of design practice. Our reflection on these judgments has highlighted the impactful thought processes involved repeatedly throughout every aspect of this design experience. We suggest that closer investigations into design judgments can give an authentic view of design practice and can sensitize designers to the subtle nuances of design activity.