Problem Solving Knowledge Transfer: An Expert's Perspective


Article and Author Information

DeAnna Myers wrote this article in December 2012 for the Capstone 3 Research Analysis and Interpretation course. This executive summary assignment is the culmination of a nine-month capstone research project. DeAnna will graduate from the MSLOC program in 2013 and is the Training and Development Manager at Sargent & Lundy.

Twitter: @MyersDeAnna

Abstract

Engineering firms in the power generation industry, like many knowledge organizations, commonly attempt to sustain their intellectual capital by using in-house experts to train novice staff. An expert’s ability to predict what is necessary to transfer knowledge to novice learners, however, can be compromised by biases associated with the expert’s superior level of expertise (Hinds, 1999). This study surveyed how one organization’s experts perceive the task of novice-level knowledge transfer and compared these perceptions to feedback from novices who participated in their classes. Findings suggest that experts misdiagnose novice learning needs and that, although they usually attempt to adjust content to accommodate the novice learner, those adjustments are less successful than the experts perceive.

The Growing Focus on Knowledge Transfer

By the year 2015, nearly half of all engineers now working in the power generation industry will be eligible to retire, taking with them a significant slice of the industry’s knowledge and expertise (Bogdanowicz, 2010). Part of the answer to this predicament lies in a company’s ability to leverage factors that promote knowledge transfer while mitigating the factors that inhibit it (Argote & Miron-Spektor, 2011). Leveraging this resource requires experts to transfer expertise “from those who have it to those who need to know” (Hinds, Patterson, & Pfeffer, 2001, p. 1232). There are, however, obstacles to this process. Biases typical of an expert level of expertise can actually make it more difficult to communicate or understand the knowledge being transferred (Hinds et al., 2001; Nueckles, Winter, Wittwer, Herbert, & Huebner, 2006). Novice feedback from more than 100 classes taught by one organization’s experts describes skipped steps, misdiagnosed levels of complexity, and an inability to apply class content (McFadden, Myers, & Zavala, 2011).

How an expert perceives the task of knowledge transfer remains a virtual void in the current literature. This study focused on experts’ perspectives on what would be necessary to transfer knowledge to a novice audience and sought to answer the question,

“How well do experts diagnose what will be necessary to transfer problem-solving knowledge to novice learners in an understandable way?”

This study also asked novice learners to provide feedback on how well they learned from courses designed by experts and whether aspects of the instruction, such as level of complexity, relevance, and topical order, were appropriate for their level of expertise. The secondary question therefore explores the following:

“Do expert-initiated adjustments affect the novice learner’s ability to learn from the instruction?”

Findings from this study advance the topic of novice learning within knowledge organizations that frequently utilize in-house experts to train novice staff. Results of this study also provide learning and development (L&D) teams, instructional designers and expert staff with a more informed foundation from which to create responsive learning for novice employees. Such advances improve an organization’s ability to preserve its intellectual capital by utilizing the organization’s most knowledgeable, in-house resources more effectively.

Research Methods: A Tale of Two Surveys

The Method

To answer the research questions, two surveys were designed to collect analogous data: one tailored to the expert perspective and one to the novice perspective. Both surveys assessed the same variables through custom questions using 5-point Likert-scale, multiple-option, or open-ended question formats.

Key variables in this study were selected after an extensive literature review of empirical studies that examined biases affecting knowledge transfer between expert and novice levels of expertise. Independent variables included:

  • Oversimplification or skipped steps resulting in the inability to apply the learning
  • Inappropriate level of complexity (either too complex or too simple) for the learner
  • Illogical order in the way topics were introduced
  • Failure to define technical terms
  • Irrelevancy to current work tasks

(Byram, 1997; Hinds, 1999; Kirschner, Sweller, & Clark, 2006; Langer & Imber, 1979)

The expert survey consisted of 17 questions: three open-response and 14 closed-response. Seven closed-ended questions addressed the five bias variables listed above; three questions requested demographic and experience data; two closed questions addressed the need for and use of adjustments; and two closed questions assessed how well the expert believed participants learned from the instruction. The three open-ended questions addressed factors that the expert believed helped or hindered training and best-practice adjustments to accommodate the novice learner.

The novice survey consisted of 18 questions: four open-response and 14 closed-response. Eight closed-ended questions addressed the five bias variables listed above; two questions requested demographic data; two closed questions addressed the need for adjustments; and two closed questions assessed how well the novice learned from the instruction. The four open-ended questions addressed factors that the novice believed helped or hindered novice learning and best-practice adjustments to accommodate the novice learner.

About the Participants

The sample included expert and novice engineers from one engineering firm. Expert participants (n=42) had at least 10 years of practice in the associated industry and had developed or taught a class designed specifically for in-house novice staff. Novice participants (n=96) had no more than four years of industry experience and had completed an in-house class designed specifically for novice staff.

Qualifying expert responses (n=41) represented 87% of the eligible population. Qualifying novice responses (n=94) represented 39% of the eligible population. The difference in responding percentages was primarily a result of a limited number of courses being offered for various groups of novices during the data collection period. Three responses were rejected for this study as they did not meet the required experience criteria. Respondents represented every business group and engineering discipline within the organization and had participated in novice classes on site (instead of online or through a video broadcast of the class).

Volunteer sampling was used to recruit novice participants from communities of practice and classes designed specifically for novice staff. Experts were personally invited to participate based on company reports of them having taught novice-level classes.

Analyzing the Data

Adjusting Learning to the Novice

Figure 1

The first level of analysis measured whether experts believed they were able to predict novice learning needs and whether they felt it necessary to adjust their instruction for a novice audience. Expert feedback showed that 75% of the experts surveyed (n=31) felt predicting novice needs was not difficult. In fact, 73.5% (n=30) of expert respondents managed novice staff, indicating that experts had direct contact with novices prior to class design. Additionally, 95% (n=40) of the experts confirmed that they adjusted their instruction to accommodate the novice learner. Exhibit 1.0 summarizes the nature of instructional adjustments made and reported by the experts.

Experts reported that in 55% (n=22) of the cases, they would have applied additional adjustments or attempted to better accommodate the novice learner if they had had more time to develop the course. Experts also reported that they were able to fully apply perceived best practices to their instruction only 20% (n=8) of the time. Experts cited a lack of time/budget (n=12) or an unawareness of audience demographics (n=7) as the primary reasons for their inability to apply their best practices to their instruction. Finally, 82.5% (n=33) of experts felt that novices in their courses learned the material and had enough detail to apply it. Appendix A provides a detailed summary of these results.

Learning Needs Through Two Perspectives

Figure 2

The second level of analysis assessed whether expert perceptions were accurate based on actual learner (novice) feedback. Parallel questions from both surveys were evaluated to ensure that questions could be paired for comparison. Each pair of questions proved acceptable (p ≥ .05) for comparison using robust tests of homogeneity of variance. ANOVA tests then assessed the significance of mean differences in perception between the two groups for the five variables. As shown in Figure 2.0, each category showed a statistically significant difference (p ≤ .05) except for whether class complexity was appropriate for a novice learner.
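For readers unfamiliar with this kind of comparison, the sketch below shows its general shape: one expert-survey question is paired with its novice counterpart, Levene's test checks homogeneity of variance, and a one-way ANOVA then tests the difference in means. This is a minimal illustration only; the ratings are invented placeholders, not data from this study, and the study's actual analysis tooling is not documented here.

```python
# A minimal sketch (not the study's actual analysis code) of comparing paired
# expert/novice Likert responses: Levene's test checks homogeneity of variance,
# then a one-way ANOVA tests the mean difference between the two groups.
# The ratings below are hypothetical placeholders on a 5-point scale.
from scipy import stats

expert_ratings = [4, 5, 4, 4, 5, 4, 3, 5]   # e.g., "participants learned the material"
novice_ratings = [3, 3, 4, 2, 3, 3, 4, 2]   # paired novice question on the same construct

# Robust homogeneity-of-variance check (p >= .05 suggests the pair is comparable)
levene_stat, levene_p = stats.levene(expert_ratings, novice_ratings, center="median")

# One-way ANOVA on the two groups (p <= .05 indicates a significant mean difference)
anova_stat, anova_p = stats.f_oneway(expert_ratings, novice_ratings)

print(f"Levene p = {levene_p:.3f}, ANOVA p = {anova_p:.3f}")
```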

Perceptions on Learning

The most significant differences concerned how well each group felt the participants had learned and whether they could apply the learning to their work environment. Novices rated both variables significantly lower than experts did. These results are consistent with empirical studies finding that experts routinely overestimate the learning performance of novice learners (Dane, 2010), and they support the notion that experts significantly misdiagnose novice learner needs.

Figure 3

Qualitative data were coded to characterize each group's perspective on the instructional elements that helped or hindered novice learning. Qualitative feedback also provided more specific detail about how expert adjustments affected a novice's ability to learn, addressing the second research question. Finally, qualitative feedback was assessed for frequently used words. Figure 3.0 summarizes the word counts.
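The word counts amount to a simple frequency tally over the open-ended responses. The sketch below shows one way such a tally could be produced; the example comments and stopword list are invented placeholders, not the study's actual respondent data or coding tool.

```python
# A minimal sketch of a word-frequency count over open-ended survey comments.
# The comments and stopword list are hypothetical placeholders for illustration.
import re
from collections import Counter

comments = [
    "Slides were just text that the instructor read from",
    "More practice scenarios and discussion would help",
    "Concrete examples based on actual projects helped the most",
]

stopwords = {"the", "that", "and", "from", "were", "just", "would", "on", "of"}

words = []
for comment in comments:
    # Lowercase, split into word tokens, and drop common filler words
    words.extend(w for w in re.findall(r"[a-z']+", comment.lower()) if w not in stopwords)

# Most frequently used words across all qualitative responses
print(Counter(words).most_common(5))
```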

Results - A Marked Difference of Opinion

Qualitative feedback supported the quantitative analysis of the two groups, confirming significant perceptual differences in key learning components and their effect on a novice’s ability to learn. Exhibit 4.0 summarizes the top practices each group perceived as helping or hindering learning, based on qualitative feedback. Appendices C and D provide detailed summaries of practices that encourage or hinder learning as cited by each population.

Figure 4
  • Knowledge Level and Complexity – Though nearly 30% of the expert group reported that they had overestimated the novices’ knowledge, only 8% of novice respondents felt that course content was too complex for their current knowledge level.
  • The Need for Engagement – Novices cited unengaged instructors as the primary factor in not learning well, describing “slides that were just text that the instructor read from,” “completely monotone presentation,” and “no opportunity to interact, just a lecture.” Experts cited this as a hindrance in less than 9% of their responses.
  • Data Overload – 18% of novices reported “packing too much detail into the time allowed” as a hindrance to learning well in the classes considered. Only 6.5% of experts blamed too much data for poor learning.
  • The Need to Understand the Audience – Experts recognized the role that better knowledge of the novice audience could play in knowledge transfer, citing the need for better audience education/experience information in 14% of their responses.
  • Practice, Practice, Practice – Novices cited the need for homework, practice scenarios, tests, and discussions in 13% of their responses as elements that helped them learn well. Experts cited practice in only 4% of their corresponding responses.
  • Attention Please – Seventeen percent of expert responses attributed poor novice learning to novices not paying attention in class.
  • The Need for Concrete/Situated Examples – 33% of the novice group said they benefited from concrete examples (physical artifacts, visuals) and examples based on actual scenarios, supporting empirical findings that the more novice the learner, the more concrete and situated the example must be (Hinds et al., 2001). Experts also recognized concrete and situated examples as the most important learning aid for novices, at 21%.

Limitations

Respondents in this study represent one firm’s engineering staff; trends at other firms may vary. To preserve anonymity, the survey was not designed for a one-to-one comparison of learner feedback for a specific class, though by spanning all available groups and disciplines, the feedback likely covered most classes. This study examines only classroom-based training and workshops; individual coaching, mentoring, and virtual participation are excluded from its scope. Because several respondents were non-native speakers, language could have limited the communication of qualitative data in a small percentage of cases. The study also assumes that the responses of both sample groups reflect applied, rather than merely espoused, values. Finally, novice feedback may have been biased by a desire to protect or criticize instructors who were also the respondent’s supervisor.

Interpretations & Recommendations: Progress through Awareness

The ability to transfer knowledge is a topic of increasing significance as organizations try to bridge the gap between retiring experts and novice successors (Bogdanowicz, 2010; Hinds, Patterson, & Pfeffer, 2001; Santosus, 2003). Results reported here suggest that experts are, in fact, challenged in diagnosing the learning needs of novice learners. Not only were experts significantly misaligned with how well novices learned from their instruction, but they were also unable to adjust the content of their courses in ways that mitigated typical expert biases, such as skipped steps, data overload, and a lack of examples. An expert’s inability to gauge novice learning needs has several possible explanations. The “expert blind spot” is one cognitive context in which experts overestimate a non-expert’s ability to absorb and integrate information (Nathan & Koedinger, 2000). This concept amply describes both the experts’ difficulty diagnosing appropriate content and their inclusion of more information than the novice could process.

Chi, Feltovich, and Glaser’s (1981) widely cited study of cognitive problem-solving differences across levels of expertise found that novices rely heavily on surface details and concrete examples to build their problem-solving scenarios. Even though the experts in this study were aware of these novice needs, their tendency to actually adjust training content to incorporate these elements was low. Future research could take the shape of a longitudinal study exploring whether integrating such adjustments narrows the gap between an expert’s diagnosis of learning needs and novices’ actual learning.

Though experts claimed that predicting novice learning needs was not a difficult task, they frequently cited misdiagnosing a novice’s knowledge level as a key reason novices did not learn well. Experts also reported that knowing the audience’s knowledge and experience level during class design would help them create more responsive courses. These results suggest that pre-tests, or a more robust needs analysis that provides richer audience demographics, might help instructors tailor class content.

Based on word counts and best-practice feedback from experts, such as “hand out [the] presentation with a blank space for notes,” experts exhibited a heavy tendency to approach knowledge transfer as if it were a business presentation. Presenters, however, generally deliver “in a one-way stream to an audience,” with skill sets built around use of voice and the design of PowerPoint decks. Trainers, on the other hand, strive to prompt thought and encourage incorporation of the knowledge into the learner’s memory, a very different communication purpose (Myerson, 2012). This disconnect could account for the lack of discussion and practice opportunities, and for some of the subsequent inattentiveness of learners.

Novices aligned strongly with empirical studies, recommending discussion, practice, concrete examples and project relevancy to make the link between instruction and application (Argote & Miron-Spektor, 2011; Bilalić, McLeod, & Gobet, 2008; Byram, 1997; Chi, Feltovich, & Glaser, 1981). In areas where the two populations aligned more closely, such as level of complexity and comprehension of technical terms, qualitative novice feedback was virtually silent. The cause of this discrepancy presents an opportunity for future research.

The two groups often characterized the same practice differently, as in the case of relevance. Experts characterized relevance as the need to link learning to current work tasks, “why the material is relevant to their job,” exhibiting an awareness of the need for relevance and learner engagement. Novices, however, characterized relevance as a topical issue, stating, “Too much non-relevant…details that don’t mean anything,” and “too much of a focus on other (engineering) disciplines.”

One population excluded from the study was the intermediate population. Vygotsky, as cited in Driscoll (2005), discussed the benefit of learning from those who are slightly more advanced in their knowledge or expertise, calling the gap between present and future knowledge, the zone of proximal development, an ideal space in which to initiate skill development. He recommended that the instructor be of a slightly higher level of expertise, acting as scaffolding for the learner. Future research could explore the benefits that an intermediate instructor, much closer to the novice’s experience, could bring to the learning experience.

Practice Implications

The knowledge resulting from this study serves to advance organizational awareness of novice learning needs and the obstacles that arise in an effort to meet those needs by utilizing the superior knowledge of in-house experts. This work has been primarily concerned with confirming the presence of bias and the challenges to an expert’s ability to diagnose novice learning needs. The results of this study add clarity to the increasingly common business practice of utilizing experts to train novice staff in knowledge organizations. Additionally, the novice response to expert misdiagnoses provides organizations with several practical considerations to study as they attempt to close the gap through more effective course adjustments.

Given the confirmed presence of expert bias, organizations will naturally want to explore ways to mitigate the effects of these cognitive characteristics associated with higher levels of expertise. Extensive research has been conducted, with only minor success, on methods of mitigating the expertise-based biases described in this study (Byram, 1997; Dane, 2010; Hinds et al., 2001). Though stubborn in nature, expert bias would benefit from a clearer definition of how it appears in organizations and from future research addressing possible solutions.

A logical next step for this effort would be to make experts aware of the opportunities for adjustment and help them apply the knowledge to their existing courses. Subsequently, the study would be repeated at a later date to test for more closely aligned novice learning feedback.

Most importantly, this research provides unique insight into the schema of an expert as he or she approaches the task of transferring knowledge to less-experienced staff. The significance of this study lies in its potential to make knowledge organizations, L&D teams, instructional designers, and experts more aware of obstacles that could impede their ability to reliably sustain organizational knowledge capital into the future.

References

Argote, L., & Miron-Spektor, E. (2011). Organizational learning: From experience to knowledge. Organization Science, 22(5), 1123-1137. doi: 10.1287/orsc.1100.0621

Bilalić, M., McLeod, P., & Gobet, F. (2008). Expert and “novice” problem solving strategies in chess: sixty years of citing de Groot (1946). Thinking & Reasoning, 14(4), 395-408. doi: 10.1080/13546780802265547

Bogdanowicz, A. (2010). Help wanted: Power engineers: Working to increase the dwindling supply of power professionals. The Institute. Retrieved from http://theinstitute.ieee.org/briefings/business/help-wanted-power-engineers543

Byram, S. J. (1997). Cognitive and motivational factors influencing time prediction. Journal of Experimental Psychology: Applied, 3(3), 216-239. doi: 10.1037/1076-898x.3.3.216

Chi, M. T. H., Feltovich, P. J., & Glaser, R. (1981). Categorization and representation of physics problems by experts and novices. Cognitive Science, 5(2), 121-152. doi: 10.1207/s15516709cog0502_2

Dane, E. (2010). Reconsidering the trade-off between expertise and flexibility: a cognitive entrenchment perspective. Academy of Management Review, 35(4), 579-603. doi: 10.5465/amr.2010.53502832

Driscoll, M. (2005). Psychology of learning for instruction (pp. 223-263). New York: Pearson Education.

Hinds, P. J. (1999). The curse of expertise: The effects of expertise and debiasing methods on prediction of novice performance. Journal of Experimental Psychology: Applied, 5(2), 205-221. doi: 10.1037/1076-898x.5.2.205

Hinds, P. J., Patterson, M., & Pfeffer, J. (2001). Bothered by abstraction: The effect of expertise on knowledge transfer and subsequent novice performance. Journal of Applied Psychology, 86(6), 1232-1243. doi: 10.1037/0021-9010.86.6.1232

Kirschner, P. A., Sweller, J., & Clark, R. E. (2006). Why minimal guidance during instruction does not work: An analysis of the failure of constructivist, discovery, problem-based, experiential, and inquiry-based teaching. Educational Psychologist, 41(2), 75-86. doi: 10.1207/s15326985ep4102_1

Langer, E. J., & Imber, L. G. (1979). When practice makes imperfect: Debilitating effects of overlearning. Journal of Personality and Social Psychology, 37(11), 2014-2024. doi: 10.1037/0022-3514.37.11.2014

McFadden, T., Myers, D., & Zavala, J. (2011). [30 Day Learning Reports: 2011 Course Review]. Unpublished raw data. Chicago.

Myerson, D. (2012). The key differences between presenting, training and facilitating. Retrieved from http://www.mci.edu.au/article.php?article_id=56

Nathan, M. J., & Koedinger, K. R. (2000). An investigation of teachers' beliefs of students' algebra development. Cognition and Instruction, 18(2), 209-237.

Nueckles, M., Winter, A., Wittwer, J., Herbert, M., & Huebner, S. (2006). How do experts adapt their explanations to a layperson's knowledge in asynchronous communication? An experimental study. User Modeling and User-Adapted Interaction, 16(2), 87-127. doi: 10.1007/s11257-006-9000-y

Santosus, M. (2003). When (or if) the boomers say bye-bye. CIO, 17(4). Retrieved from http://books.google.com/books?id=Jg0AAAAAMBAJ&pg=RA1-PA122&lpg=RA1-PA122&dq

Appendix C - Practices that Encourage Learning Well


Appendix D - Practices that Hinder Learning Well
