Facilitating Instructor Adoption of Inquiry-Based Learning in College Mathematics
Int. J. Res. Undergrad. Math. Ed.
Charles N. Hayward · Sandra L. Laursen · Marina Kogan
Ethnography & Evaluation Research, University of Colorado Boulder, 580 UCB, Boulder, CO 80309-0580, USA
Faculty development workshops are one strategy for increasing instructor use of evidence-based teaching practices that are known to improve student outcomes in mathematics and other STEM (science, technology, engineering and mathematics) disciplines. Yet relatively little is known about the impact of professional development on teaching practice in higher education. We report findings on participant outcomes from a series of annual, weeklong professional development workshops for college mathematics instructors about Inquiry-Based Learning (IBL) in undergraduate mathematics. We gathered data from surveys with the 139 workshop participants and interviews with a subset of 16 participants. These workshops were found to be effective in encouraging instructors to try this student-centered approach to teaching mathematics, as 58 % of participants reported implementing IBL strategies in the year following the workshop they attended. Analysis suggested that certain features of the workshops supported participants' adoption of IBL strategies. The findings pointed to the importance of (1) sharing broad, inclusive definitions of IBL, (2) representing viewpoints and experiences from diverse institutional contexts, (3) allowing sufficient time within the workshop to explore and revisit topics, (4) addressing common concerns such as content coverage, student resistance, and skills to implement IBL, and (5) providing ongoing follow-up support and inclusion in the community of IBL practitioners. We also make connections with studies of the impact of instructional development in other STEM disciplines and share implications for effective professional development in general.
Keywords: Mathematics; Inquiry-based learning; Student-centered; Pedagogy; Professional development; Workshops
Success in mathematics courses is essential for all science, technology, engineering, and mathematics (STEM) fields (Seymour and Hewitt 1997), and indeed mathematics serves as a gateway to completing any college degree (Stigler et al. 2010). Studies have linked students' persistence in STEM majors to differences in instructional practice (Freeman et al. 2014; Seymour and Hewitt 1997). For example, in calculus courses, there is a positive association between students' persistence in STEM majors and their report of the frequency with which their calculus instructor uses student-centered techniques (Ellis et al. 2014).
Inquiry-based learning (IBL) is a form of active, student-centered instruction in mathematics that helps students develop critical thinking through exploring loosely structured problems and by constructing and evaluating mathematical arguments (Prince and Felder 2007; Savin-Baden and Major 2004). In the particular tradition of IBL that we have described elsewhere (Laursen et al. 2014), IBL is based on the teaching practices of mathematician R.L. Moore (1882–1974). Mahavier (1999) argues that the important features of the "Moore Method" are "regular interaction with the students so that the instructor knows how well the students understand the material" (p. 340) and the use of challenging problems. In this way IBL practice is based not on strict procedures for students, but on the commonalities of the learning environment and the student outcomes instructors aim to create (Coppin et al.): setting up a classroom environment where students become creators rather than recipients of knowledge through discovering, presenting, and debating mathematics.
Yoshinobu and Jones (2013) explain that the term inquiry-based learning encompasses the "Moore Method" and other approaches that share the spirit of student inquiry through two core features: (1) deep engagement with mathematics and (2) collaboration with peers. Consistently across these definitions, there are no rigid prescriptions for content or pedagogy, but rather guiding principles that shape instructors' curricular and pedagogical choices. Together, these features support students' deep learning of mathematical concepts (McCann et al. 2004; Moon 2004) and their development of positive attitudes, beliefs and capacities that support learning and problem-solving in mathematics (Hassi and Laursen 2015; Kogan and Laursen 2014; Laursen et al. 2014).
Even though content and pedagogy are not strictly prescribed, IBL classrooms share similarities with each other and are markedly different from non-IBL classrooms. Whereas non-IBL classrooms are characterized by students frequently listening to instructors talk, in IBL classrooms much of class time is spent on student-centered activities such as working in small groups, student presentations, and discussions (Kogan and Laursen 2014; Laursen 2013; Laursen et al. 2014). Interaction among students and with the mathematical ideas is enhanced either through students individually presenting problems or proofs which they have worked on before class, followed by full-group discussion of those presentations, or through small, collaborative group work on deep mathematical activities along with full- or small-group discussions. Yoshinobu and Jones (2012, 2013) provide rich examples, and explain that the "core idea is that students are engaged in an apprenticeship into the practice of mathematics. Students actively participate in contributing their mathematical ideas to solve problems, rather than applying teacher-demonstrated techniques to similar exercises. …Students do mathematics like research mathematicians do mathematics" (2012, p. 307).
Despite variation in the extent and types of instructional activities employed, the use of IBL strategies in college mathematics courses is associated with affective gains and greater persistence in mathematics majors for women students, and with improved grades for low-performing students (Kogan and Laursen 2014; Laursen et al. 2014). While IBL in mathematics is associated with positive student outcomes, a major challenge for educational reform lies in getting large numbers of faculty to use research-supported teaching methods that yield such outcomes (Fairweather 2008; Henderson and Dancy 2007, 2008, 2011). As individual instructors begin to teach with IBL or similar approaches, many must make a transition from traditional lecture methods to more student-centered teaching approaches.
Making such instructional changes can be difficult, but professional development workshops are one way to support instructors who aim to do so. Workshops are the preferred method of National Science Foundation (NSF) program directors for encouraging faculty uptake of student-centered teaching methods, especially when workshops are "in-depth, multi-day, immersive experiences with follow-up interaction with the PI as participants implement the new strategy in their own institutional circumstances" (Khatri et al. 2013, p. 1). There is evidence to support this view, as Lattuca et al. (2014) found that among six different types of professional development activities for engineering faculty, the use of student-centered pedagogies was most strongly correlated with attending a workshop.
In this paper, we report on instructor uptake of inquiry-based learning instructional
methods following intensive, week-long IBL-focused workshops. In our role as
evaluators for these workshops, we also conducted research to explore factors that affected
uptake of IBL methods by the workshop participants. Here, we focus on broadly
applicable findings related to central concerns for instructors who are making a
transition to IBL teaching, and on workshop features that help to make this process
smoother. We frame these findings with a three-stage model of instructor change from Paulsen and Feldman (1995), based on Lewin's theory of change in human systems.
In their instructor change model, Paulsen and Feldman (1995) describe the stages of (1) unfreezing, (2) changing, and (3) refreezing. During unfreezing, instructors gain motivation to change through experiencing incongruence between their goals and the outcomes of their teaching practices. Key to this initial stage is "psychological safety," achieved through "envisioning ways to change that will produce results that reestablish his or her positive self-image without feeling any loss of integrity or identity" (Paulsen and Feldman 1995, p. 12). In the next stage, changing, instructors learn, apply, and reflect on new teaching strategies to help align their behaviors with desired outcomes. While teaching strategies may be fluid and changing during this stage, in the final stage, refreezing, either these new strategies are confirmed through positive feedback and solidified, or the instructor returns to his or her original strategies.
While all three stages are important, stage two, changing, is perhaps the most studied (Connolly and Millar 2006). For example, the literature on K-12 professional development offers extensive evidence on features of workshops that help instructors to gain knowledge and skills, and to make changes in their classrooms (e.g., Garet et al. 2001; Wei et al. 2010). There is not much literature on features of effective workshops within higher education (Connolly and Millar 2006; Council of Scientific Society Presidents 2012). Most of what does exist consists of descriptions of faculty development workshops that are associated with high rates of uptake, and most of these reports do not measure outcomes beyond participant satisfaction with the workshops (Council of Scientific Society Presidents 2012). In this paper, we discuss the quality of the
workshops and the elements of successful professional development that facilitate
changing. However, our main focus is on how these workshops supported instructors
through the less-studied stages of unfreezing and refreezing, and the implications for
broadening the adoption of IBL and similar, student-centered strategies in college
mathematics and other disciplines. We present survey data that show participants have
gone through the change process, and use interview and open-ended survey responses
to help explain how the workshops have supported instructors through each of the three
stages of the process.
Context for the Study
Data were collected from three workshops held in the summers of 2010, 2011, and
2012, each 4 or 5 days long. The workshops were part of a jointly funded project at
several ‘IBL Centers’
(Laursen et al. 2014)
around the country. Each workshop was
organized and run independently by experienced IBL instructors at one of the centers.
As a result, the workshops shared some common elements, but each engaged
participants in its own blend of activities, such as watching and analyzing videos of IBL
classes, reading and discussing research articles, listening to plenary talks, participating
in panel discussions with experienced IBL instructors, and developing IBL materials in
small groups. In general, the first and third workshops featured more hands-on, active
sessions, while the second, slightly larger workshop was organized in more of a
conference style with formal talks. All three workshops exemplified characteristics of
effective research-based professional development that have been identified in previous
literature on K-12 teacher development
(Cormas and Barufaldi 2011; Garet et al. 2001).
These included such features as active participation, engaging participants in
discussions of their students’ learning, and promoting participant self-reflection. While these
findings from the K-12 setting also recommend that teacher professional development
interweave a strong focus on important disciplinary ideas with effective teaching
strategies to help students engage with those ideas, the workshops we studied focused
more strongly on pedagogy than on specific content. In her synthesis of the literature, Austin (2011) argues that research suggests this approach may be appropriate for college-level instructors, who have studied deeply in their discipline and are often eager to share this deep content knowledge, but may be unsure how to do so effectively.
These workshops focused on helping instructors to plan and teach courses that
successfully engage students with mathematical content in an inquiry-based style. More
detailed descriptions of the workshops are available in a report on the workshop evaluation (Hayward and Laursen 2014).
Participants in each workshop were invited to complete online pre-workshop surveys
and in-person post-workshop surveys on paper. We conducted online follow-up surveys
approximately 15 months after the workshop, so that participants could report on their
teaching in the intervening academic year. Survey instruments are available as an
appendix to the workshop evaluation report
(Hayward and Laursen 2014)
. From the
139 attendees at the three workshops, we received 124 pre-workshop surveys (89 %)
and 125 post-workshop surveys (90 %). For 1-year follow-up surveys, 96 individuals
(69 %) responded. We use the words "participants" or "attendees" to refer to all 139 individuals, whereas "respondents" refers only to those who completed the surveys. All
surveys included unique, anonymous identifiers that allowed us to match pre-, post-,
and follow-up survey responses for each individual. Because we could match surveys
but not identify individuals, one person who attended two workshops may have been
included twice. Due to incomplete responses on the identifier questions, not all surveys
were successfully matched, but overall rates of response matching were high, as
presented in Table 1. This high response rate means that responses can be fairly generalized to the workshop population, and thus the findings are not strongly biased by sub-groups such as adopters versus non-adopters.
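The matching of survey waves on anonymous identifiers described above can be sketched as a simple join. This is an illustrative sketch only; the column names, identifiers, and values below are hypothetical, not the study's data.

```python
# Sketch of matching pre- and follow-up surveys on an anonymous identifier.
# All names and values are hypothetical illustrations.
import pandas as pd

pre = pd.DataFrame({"anon_id": ["a1", "a2", "a3"],
                    "pre_knowledge": [2, 3, 1]})
followup = pd.DataFrame({"anon_id": ["a1", "a3", "a4"],
                         "fu_knowledge": [3, 2, 4]})

# An inner join keeps only respondents present in both surveys; responses
# with incomplete identifier questions simply fail to match and drop out.
matched = pre.merge(followup, on="anon_id", how="inner")
print(len(matched))  # 2 matched pre/follow-up pairs
```

An inner join mirrors the analysis constraint here: only individuals who answered the identifier questions consistently on both surveys contribute to pre/follow-up comparisons.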
The surveys included quantitative items and open-ended questions aimed at both
evaluating workshop delivery and understanding the impact the workshops had on the
participants’ teaching. Items were developed to monitor participants’ self-reported
knowledge, skills, and beliefs about inquiry-based learning, as well as their motivation
to use inquiry methods and their perceptions of the overall quality of the workshop. For
example, on all three surveys, participants assessed their current knowledge of IBL on a
scale of 1 to 4 (1=None, 2=A little, 3=Some, and 4=A lot).
To measure impact of the workshops on their subsequent teaching, we asked
participants to report both directly and indirectly whether or not they had implemented
IBL. We measured implementation directly through a multiple-choice question on the
follow-up survey asking participants if they had implemented no IBL methods, some
IBL methods, one full-IBL class, or more than one full-IBL class. We measured IBL
implementation indirectly through comparing changes in instructors’ reported
frequencies of use of specific teaching practices that were probed on both pre-workshop and
follow-up surveys. Available research indicates that self-report is most accurate when it
is retrospective over a clearly defined time frame, when it is confidential, and when it is
behavioral rather than evaluative. Therefore, we designed these
indirect measures of teaching practice to ask participants to anonymously report their
frequency of use of eleven behaviors in a class that they had taught recently.
Participants were asked to select a response for each of the eleven practices to indicate if they
use the practice ‘never,’ ‘about once a month,’ ‘about twice a month,’ ‘weekly,’ or
‘every class.’ The eleven behaviors included some that are consistent with
inquiry-based learning as presented at the workshops, other behaviors that are characteristic of
other forms of active learning but not necessarily IBL, and some that are characteristic
of lecture-based instruction. The behaviors and the expected changes in these behaviors
upon implementing IBL are detailed in Table 2.
Some practices were considered ‘core IBL’ practices because they should
characterize all variations of IBL communicated in these workshops. ‘Core IBL’ includes
decreased use of instructor activities like lecture and solving problems on the board,
and increased use of student activities like presentations and student discussion.
Other items were classified as ‘preference IBL’ practices, which were consistent with
the set of IBL approaches presented in the workshops, but different IBL instructors
might emphasize them to varying degrees. For example, instructors may vary in how
active a role they take in leading discussions; some instructors use in-class group work
with group presentations, while others have students individually present problems or
proofs worked on outside of class.
We also asked participants about their use of other forms of active learning that are
not necessarily characteristic of IBL as presented at the workshops. In this study, these
items functioned as "distractors," reflecting teaching practices that may be incorporated in student-centered classrooms but were not specifically addressed in any of the workshops.
Open-ended questions addressed the perceived costs and benefits of using inquiry
strategies and participants’ impressions and learning from the workshop, which helped
to provide more detail and deeper understanding of the factors that affected their use of
IBL practices. Additionally, participants reported personal and professional
demographic information such as career stage, institution type, gender, race, and ethnicity,
so that we could test for possible differences in results among groups.
The 139 participants were all college mathematics instructors who voluntarily
attended a workshop and who came from at least 117 separate institutions around the
United States and Canada. Full demographics are presented in Table 3. Overall, the
participants were diverse in career status and teaching experience, and moderately
diverse by gender and by ethnicity relative to the mathematics faculty overall (National Science Foundation 2008a, b). Only 13 % of participants reported teaching at a
minority-serving institution. Interestingly, 24 % of participants had taken an IBL-style class as a student, and 45 % said they had previously incorporated IBL techniques
in their teaching strategies. However, 46 % of participants reported no prior experience
with IBL as either an instructor or a student.
In addition to the survey data, 16 interviews were conducted with a small subset of
participants after they had completed the follow-up survey. On follow-up surveys, 16
individuals from the first workshop and 11 individuals from the second workshop
expressed interest in taking part in a telephone interview. They were all invited to
participate, and ultimately seven participants from the first workshop were interviewed
by the second author and nine from the second workshop were interviewed by the first
author. All interviews lasted approximately one hour and were conducted by telephone.
Available demographic information for interviewees is included in Table 3. In general,
the interviewees were representative of the larger group of workshop participants. The
interview participants self-identified as implementing more than one full-IBL course
(25 %), one full-IBL course (19 %), some IBL methods (50 %), or no IBL (6 %).
During these interviews, we asked questions to gain a deeper understanding of
participants’ development as instructors, their views on teaching and learning, and
more detail about their classroom activities and the factors that affected whether or not
they implemented IBL. Interviews were semi-structured so that participants could
reveal their own perspectives instead of fitting their responses into categories
introduced by researchers. Thus we did not ask questions in the same order or with the same
wording in every interview. Some topics arose spontaneously and thus were not
represented in every interview. Prior to data collection, all survey instruments and
interview protocols were reviewed and approved by the Institutional Review Board at
the University of Colorado Boulder.
Survey data were entered into and analyzed with SPSS v. 21 (IBM Corp 2012). We calculated descriptive statistics for all variables, and inferential statistics as appropriate. Open-ended responses were entered into Microsoft Excel and coded for common themes.
[Table 3 appears here. Percentages by category may not add to 100 % because some respondents chose not to answer some questions. Some demographic details are omitted for the small interview sample to protect respondent confidentiality.]
We audio-recorded interviews, transcribed them verbatim, and entered them into
NVivo v. 9
(QSR International Pty Ltd 2010)
. We carefully read through and identified
segments of the transcripts that related to specific topics and assigned them a code to
identify that topic. If an individual passage covered multiple topics, we assigned
multiple codes. We coded topics each time participants discussed them, so we
sometimes used a code multiple times over the course of an interview. We organized groups
of codes that shared similar themes into domains. For example, the
domain of "implementation barriers" included nine codes representing specific barriers.
Analysis was an iterative process, as each transcript brought new detail that might
warrant the development of more specific codes which we then reapplied to earlier
transcripts. The frequencies of use of specific codes and domains give an
approximation of the relative importance of the topics to the respondents. We counted codes and
domains both in terms of the number of interview participants who mentioned a
specific topic and the number of separate comments they made.
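The two tallies described above, the number of interviewees who mentioned a topic and the total number of coded comments about it, can be sketched as follows. The coded segments are hypothetical stand-ins, not actual interview data.

```python
# Sketch of tallying qualitative codes two ways: total comments per code,
# and number of distinct interviewees who mentioned each code.
# The (interviewee, code) pairs below are hypothetical.
from collections import Counter, defaultdict

segments = [  # one pair per coded passage; a code may recur within an interview
    ("P1", "student_resistance"), ("P1", "student_resistance"),
    ("P2", "student_resistance"), ("P2", "coverage"),
    ("P3", "coverage"),
]

comments_per_code = Counter(code for _, code in segments)
interviewees_per_code = defaultdict(set)
for person, code in segments:
    interviewees_per_code[code].add(person)

print(comments_per_code["student_resistance"])           # 3 comments
print(len(interviewees_per_code["student_resistance"]))  # 2 interviewees
```

Counting both ways separates breadth (how many people raised an issue) from intensity (how often it came up), which is how the code frequencies are reported in the Results.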
While an interview participant might revisit an idea multiple times throughout an
interview, responses to the open-ended survey items were short and less detailed.
Therefore, in open-ended responses, we coded each theme only once as either present
or absent. This is different from interviews, in which we counted the total number of
separate comments made about a particular theme. The first author coded and the third
author reviewed both interview and open-ended responses.
We first report results that indicate the nature and quality of participant learning and
other immediate outcomes from the workshop. Then, we report results on the impact of
the workshop on participants’ subsequent teaching activities. These two sections
provide evidence that participants felt the workshops were well designed and delivered,
and that participants’ teaching practices changed after the workshop. We then report
qualitative results from open-ended survey items and interviews that describe issues for
implementation. In the Discussion section, we use the qualitative results to explain the
quantitative results in terms of Paulsen and Feldman's (1995) three-stage model of change.
Indicators of Workshop Quality and Participant Learning
As a way to monitor the quality of the workshops, we asked participants to rate their
motivation to use IBL, their knowledge of IBL, their level of IBL skill, and their belief
in the effectiveness of IBL on all three surveys using 4-point scales, as reported in
Table 4. We tested change over time by comparing responses on each pair of surveys
(pre to post, post to follow-up, and pre to follow-up) with Wilcoxon Signed Rank tests.
Respondents reported high motivation to use IBL even before the workshops: 29 % felt
‘somewhat motivated’ and 68 % felt ‘highly motivated.’ On average, their motivation
to use IBL rose significantly post-workshop, but then returned to pre-workshop levels
by the 1-year follow-up.
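The paired pre/post comparisons described above can be sketched with a Wilcoxon signed-rank test in SciPy. The matched 4-point ratings below are hypothetical illustrations, not the study's data.

```python
# Sketch of a paired pre/post comparison on matched 4-point ratings
# (1=None ... 4=A lot) using the Wilcoxon Signed Rank test.
# The ratings below are hypothetical, not the study's data.
from scipy.stats import wilcoxon

pre  = [2, 2, 3, 1, 2, 3, 2, 2, 1, 3, 2, 2]  # pre-workshop self-ratings
post = [3, 3, 3, 2, 3, 4, 3, 2, 2, 4, 3, 3]  # matched post-workshop ratings

# Zero-difference pairs are discarded by default; the statistic is the
# smaller sum of signed ranks of the remaining differences.
stat, p = wilcoxon(pre, post)
print(f"W = {stat}, p = {p:.4f}")
```

A nonparametric paired test is appropriate here because the ratings are ordinal and each respondent serves as their own control across survey waves.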
On average, respondents’ knowledge rose significantly following the workshop, and
then remained the same at the 1-year follow-up. Their average rating of their own skill
in inquiry-based teaching rose significantly following the workshop, and rose again by
the 1-year follow-up.
As with motivation, respondents started with strong beliefs in the effectiveness of
IBL. These rose significantly following the workshop, and then decreased by the 1-year
follow-up, though they were still higher than pre-workshop levels. Overall, we interpret
these results to mean that the workshops offered participants a high-quality professional
development experience that yielded cognitive and attitudinal changes of the types the
workshop facilitators sought.
Impact on Teaching Practice
To assess the impact of the workshops on teaching practice, we asked participants to
report their implementation of IBL methods in two different ways. First, we measured
IBL implementation through one direct item on the follow-up survey. In total, 58 % of
the 139 workshop participants reported implementing at least some IBL methods in the
year following the workshop they had attended, with 29 % reporting implementing
"some IBL methods," 14 % reporting "one full-IBL course," and 15 % reporting "more than one full-IBL course." Only 8 % of participants reported implementing no IBL
methods, while the remaining 34 % did not respond to this question.
Differences in implementation rates between workshops were significant (χ2
(6, 91) = 13.87, p < 0.05). Pairwise comparisons revealed that implementation rates
for the third workshop were significantly higher than the first and second
workshops, which did not themselves differ significantly. Figure 1 shows
implementation levels for only those who responded to this question, not for all
participants (response rates were similar for each of the three workshops, as noted
in the figure). We found no differences in implementation by any other demographic characteristic.
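The workshop-by-implementation comparison reported above is a chi-square test of independence on a contingency table. The counts below are illustrative only, not the study's data.

```python
# Sketch of a chi-square test of independence for implementation level
# (none / some IBL / one full course / more than one full course)
# across the three workshops. Counts are hypothetical.
from scipy.stats import chi2_contingency

counts = [
    [5, 12, 5, 4],   # workshop 1 (hypothetical)
    [4, 14, 6, 5],   # workshop 2 (hypothetical)
    [1,  9, 8, 12],  # workshop 3 (hypothetical)
]
chi2, p, dof, expected = chi2_contingency(counts)
print(f"chi2({dof}) = {chi2:.2f}, p = {p:.3f}")  # dof = (3-1)*(4-1) = 6
```

With a significant omnibus result, pairwise workshop comparisons like those reported would typically rerun the test on two-row subtables, ideally with a multiple-comparison correction.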
As a check on respondents' direct, self-reported "IBL" teaching, we also measured implementation indirectly by comparing pre-workshop teaching practices with teaching practices 1 year after the workshop. As noted in the Methods section, the design of these items was consistent with best practice for self-reporting behaviors. Participants reported the frequency with which they used
certain teaching practices on both pre-workshop and 1-year follow-up surveys.
By comparing matched surveys, we were able to assess changes in teaching
practices that were consistent with the inquiry-based practices presented in the
workshops, and compare these to reports of other forms of active learning that
could be considered controls. Of the 139 participants, 96 (69 %) responded to the
follow-up survey and we matched 69 of those responses (72 %) to pre-workshop
survey responses. This resulted in a net response rate of 50 %.
In Fig. 2, we compare reports of pre-workshop and 1-year follow-up teaching
practices for the 69 respondents with matched surveys. Asterisks indicate
significant changes in these frequencies. Upward arrows indicate increased frequency of
the practice and downward arrows indicate decreased frequency. We tested for
differences in the change in individuals’ teaching practices using Wilcoxon Signed
Ranks tests, and significant changes are detailed in Table 5. The use of other practices did not differ significantly from pre-workshop to 1-year follow-up; they are shown in Fig. 2 but not in Table 5.
[Table 5: Significant changes in frequencies of reported teaching practices from pre-workshop to 1-year follow-up, showing median frequencies and the numbers of increased, decreased, and unchanged ratings.]
Issues for Implementation
We asked participants on pre- and post-workshop surveys to comment on their
concerns about implementing IBL. These comments help to reveal whether workshops
met participants’ needs and also what post-workshop concerns may have influenced
instructors’ decisions to implement IBL or not. In comparing individuals’ matched
pre- and post-workshop concerns, we categorized each participant's concern as 'dispelled' if
it was mentioned pre-workshop but not mentioned post-workshop, ‘lingering’ if
mentioned on both, or ‘raised’ if mentioned after but not prior to the workshop. Results
are presented in Table 6.
Prior to the workshops, coverage of material when using IBL methods was the most
common concern (46 participants). Coverage is an issue with IBL since instructors
spend more class time involving students in deep inquiry, and typically cover less
breadth of content than they are accustomed to doing in lecture courses. Participants mentioned feeling pressure to cover certain topics based on
collegial expectations, standardized tests, and subsequent course requirements. They
also felt pressure to expose science and engineering students to a large set of
computational techniques, rather than seeing the learning goal for these students as conceptual
understanding. A large number of coverage concerns were dispelled (31 participants),
while smaller numbers were raised (14 participants) or lingering (15 participants).
Organizers and participants often discussed coverage at the workshops, yet it remained
a concern for many participants. Concerns about student resistance were also common
on both surveys (29 raised, 21 dispelled, 10 lingering). The third most common concern
was participants' own lack of skill to implement IBL (23 raised, 8 dispelled, 7 lingering).
The interview data echoed these themes but also provided more detail about the
factors that affected participants’ actual implementation of IBL, rather than just
difficulties they worried they might face. We organized these interview comments into three
groups: supports, barriers, and areas of alertness. We differentiated ‘barriers’ and ‘areas
of alertness’ by whether the factors actively discouraged instructors as they tried to
implement IBL (barriers) or were merely issues about which they wanted to be
conscientious while implementing (areas of alertness); some factors were included in
both groups based on how participants discussed the issue.
Interview participants made 96 comments about barriers to implementing IBL in
their classroom. The most common were student resistance (12 interviewees, 22
comments), instructors’ fears (9 interviewees, 18 comments), and tenure/evaluation
concerns (8 interviewees, 11 comments). Some instructors commented that student
resistance was lower when they made an effort at the beginning of the term to ‘market’
the course. This meant informing students what an inquiry-based course would be like
and why they chose to use teaching methods that, for many students, were different
from the more familiar lecture-based courses. Instructors shared their own fears such as
"IBL is hard" or being scared of "relinquishing control" of their classroom.
Implementation supports (16 interviewees, 126 comments) were mentioned slightly
more often than barriers. The most commonly discussed was departmental support (16
interviewees, 45 comments), but additional professional development (15 interviewees,
38 comments) and IBL mentors or colleagues (12 interviewees, 34 comments) were
also viewed as helpful by instructors for implementing IBL. Participants perceived
departmental support in different ways. Some felt a general freedom or openness to
innovation within their departments, while others described specific supports, such as
other instructors using IBL at their institution. This helped because students were
already used to, and expected, IBL techniques in their mathematics classes, so they
were less likely to resist. On the follow-up surveys, 85 to 90 % of participants reported
at least moderate support from each of three distinct but important groups: departmental
colleagues, department chairs, and provosts or deans.
The most common factors cited by interviewees as shaping their implementation
were not necessarily barriers or supports, but simply things they had learned to be alert
to when implementing IBL. All 16 interviewees commented on these factors, a total of 417 times. These included topics such as finding or creating IBL-appropriate
materials (16 interviewees, 62 comments) and the different role of the instructor in an
IBL class as compared to that in a lecture class (9 interviewees, 43 comments).
Situational considerations included deciding how to implement IBL depending on the
level (15 interviewees, 39 comments) or size of the class (15 interviewees, 28
comments). These factors highlight aspects of IBL teaching that differ from lecture-based
teaching and thus require extra thought and consideration.
Although workshop organizers had proposed to follow up with participants, only the
organizers of the third workshop actively did so, formally engaging participants
through an email listserv for 1 year following this workshop. Participants and
organizers both contributed to discussion on the group list, sending a total of 191 messages.
Of these messages, 19 were sent by the workshop organizers specifically to prompt
participants to contribute. These prompts were often followed by flurries of listserv
activity, as participants used the list to check in and cheer each other on, share ideas,
and pose and respond to difficulties individual instructors were facing with
implementing IBL in their own classrooms. By the end of 1 year of formal email
follow-up, 62 % of workshop attendees had sent at least one message to the list.
Workshop organizers also provided individual consultation and feedback to a number
of participants, but we did not track this activity.
Overall, almost half (48 %) of respondents from this third cohort reported that the
email list was either a ‘great help’ (32 %) or ‘much help’ (16 %). Only 12 % of
respondents felt the email list was not helpful. Many respondents (74 %) from this
workshop also said they kept in touch with other participants, a proportion that was
slightly, though not significantly, higher than the same proportions for the first two cohorts, 62 and 61 % of respondents, respectively. We also compared directly reported
implementation levels across all three cohorts (none, some methods, one course, more
than one course) with survey items related to follow-up activities and support from
colleagues, but found no significant differences among workshop cohorts.
To help frame and interpret the results, we use Paulsen and Feldman’s (1995) three-stage theory of instructor change. It helps to explain the process by which instructors made a transition from more traditional, lecture-based approaches to a more inquiry-based approach through the stages of (1) unfreezing, (2) changing, and (3) refreezing.
Broad Definitions and Impact on Teaching Practice Overall, these workshops were
effective in helping instructors through this transition as they adopted IBL teaching
practices. Among workshop participants, there was a high rate of uptake of IBL
approaches, reported directly (at least 58 % of attendees), and indirectly through
changes in teaching practices from pre-workshop surveys to 1-year follow-up surveys.
While IBL may involve both pedagogical and curricular changes, we focused on
pedagogical changes for two reasons. First, pedagogical changes were the focus of
the workshops, and second, curricular changes may develop more slowly as instructors
select and reframe topics to highlight in students’ inquiry work, so they cannot be
measured as effectively in a short timeframe.
‘Core IBL’ practices are found in all variations of IBL that were communicated in
these workshops, and indeed these showed significant changes in instructor use. These
included decreased use of instructor-led activities of lecturing and solving problems on
the board, and increased use of student-led activities including whole-class discussions,
small group discussions, and student presentations of problems or proofs. The
frequencies of use of ‘preference IBL’ practices, including instructor-led discussions and
students working in groups, showed non-significant increases, suggesting that a
minority of instructors made use of these in implementing IBL.
Instructor-reported frequencies of other forms of active learning that are not
necessarily characteristic of IBL remained quite consistent from pre-workshop to 1-year
follow-up. These included instructors asking conceptual questions leading to
generalizations, students solving problems alone, students writing in class, and students using
computers. The lack of change in instructors’ use of these methods suggests that
respondents are not making general or broad claims about their use of
student-centered learning approaches that they may perceive as socially desirable, but are
instead selectively reporting specific practices they actually used.
The distinctions between types of teaching practices are important in light of Paulsen
and Feldman’s theory of instructor change. Their theory suggests that during the
unfreezing stage, instructors gain motivation to change when certain criteria are met,
notably, psychological “safety.” This occurs when instructors can envision ways to change that achieve their desired outcomes in a manner consistent with their self-image (Paulsen and Feldman 1995). While changes in ‘core IBL’ practices were common for
most participants, the freedom to choose whether and how to incorporate ‘preference
IBL’ practices may be important to meeting this safety criterion. Comments from the
interviews supported the importance of choice. For example, one participant was struck
by “how enthusiastic everyone [at the workshop] was about teaching and helping other people learn what IBL is about and how to integrate it into your classroom.” However, he “tuned out” one presenter who he found “aggressive” in communicating that “this is the only way to go, and that if you don’t do this, then it somehow diminishes your classroom.” Another participant explained that seeing IBL as a spectrum of related practices “was kind of a big moment for me because it made it seem less scary. … Feeling like I can pick and choose aspects of it, and find something on the spectrum that I feel comfortable with, was empowering.”
These findings suggest that portraying IBL as a broad, inclusive set of practices,
rather than a rigid and prescriptive method, may be essential for helping new instructors
during the unfreezing stage, as it helped them to envision a way to change their teaching
that was consistent with their own self-image and thus felt safe. This also gave participants the freedom to use a “hybrid” style whereby they incorporated some IBL strategies into a more traditional class, offering a more feasible and less daunting entry into IBL that could then lead to “full IBL.” Biology education researchers have called this process “phased inquiry” and suggest that it is “an important step toward expanding adoption of inquiry practices in college science courses” (Yarnall and Fusco 2014, p. 56). However, further longitudinal research is needed to explore how teaching practices change after instructors take these initial steps to incorporate “hybrid” IBL.
Diverse Viewpoints and Context of Implementation In addition to portraying IBL as
a broad, inclusive set of practices, it was also important for workshop attendees to see
IBL being used in a variety of settings. Interview participants described a number of
situational factors that led them to vary the IBL strategies they used, depending on the
level (first-year, sophomore, etc.), size, or the audience (mathematics majors,
pre-service teachers, etc.) of their class. As one interview participant explained, seeing a diversity of IBL practices, practitioners, and situations was important because it was “frustrating” when one presenter “had so many resources at their disposal that the rest of us didn’t have, …how many graders and TAs they have and how they keep the class size small. These were things that just don’t apply to most universities.” Other
participants made positive comments about the variety of opinions and viewpoints
shared in the workshop, such as one who identified the best aspect of the workshop as
“a good diversity of ideas and approaches which I feel that I can adapt to my own teaching. As an inexperienced IBL user, I was very interested in learning from experts, but I was also interested in meeting people in my situation who I can identify with and hearing how they have worked through the same problems that [I face].”
Another participant commented that the workshop “gave me more ways and more tools to introduce IBL into [lower-level and pre-service teacher courses].” As a result,
he was able to incorporate IBL methods into classes he had previously thought could
not be taught with IBL.
From their studies of physics education reform, Henderson and Dancy (2008)
recommend providing instructors with easily modifiable curricular materials, so that
individual instructors may use their expertise to adapt the materials to their own local
environments. While their recommendation applies to reforms focused on curricular
materials, our findings suggest that this feature of easy portability may also be
important for sharing primarily pedagogical strategies such as IBL. Showing diverse
examples of IBL may have helped participants to see how to customize IBL for their
individual context and thus made implementation more likely.
The workshop leaders’ choice to present IBL as a variety of related approaches was
inviting for participants, but does raise the question of how well their implementations
maintained fidelity of IBL. Studies in physics (Dancy and Henderson 2010) and biology (Yarnall and Fusco 2014) have reported that instructors often adapt and modify
research-based teaching strategies, usually in ways that align more with traditional
methods and reduce the amount of student inquiry. However, IBL in mathematics may
be somewhat robust to variation, as student outcomes are improved over traditional
courses despite notable variations in how IBL is implemented (Laursen et al. 2014). Portraying IBL as a spectrum of related practices may have helped participants by
outlining ways in which they could modify IBL methods to fit their context while still
maintaining its core features, including high levels of student inquiry.
Measures of fidelity of implementation may examine either ‘fidelity of structure,’
meaning adherence and duration of use, or ‘fidelity of process,’ including quality of
delivery, and program differentiation – “whether critical features that distinguish the program from the comparison condition are present” (O’Donnell 2008, p. 34). For pedagogical innovations like IBL, fidelity of process may be more important than
fidelity of structure. We suggest that professional development that communicates
broad, inclusive definitions of IBL in mathematics not only increases instructor uptake,
but may also help to maintain fidelity through outlining allowable modifications that
preserve the core principles of the approach and protect fidelity of process. By
explicitly recognizing acceptable variation in practice, the workshops may have
reduced the likelihood that instructors would make modifications that would reduce
the amount of student inquiry and veer back toward traditional methods.
Scholarship on the diffusion of innovations argues that this process of “reinvention” is an asset in spreading innovations, as it helps
to reduce mistakes, fits the innovation to local contexts, and makes the innovation more
responsive to changing conditions. Communicating broader and more inclusive
definitions of student-centered strategies in other STEM disciplines may help instructors to
adapt these methods to their classes while maintaining high levels of student inquiry.
Changing: Features of the Workshops and Participant Learning
The workshops supported instructors through the changing stage by providing
participants with information about IBL, opportunities to view IBL in action, discussions
with experienced practitioners, and, in some cases, collaborative work time for
participants to develop their own IBL courses. The week-long duration of these workshops,
as well as thoughtful scheduling of breaks and free time, were important features that
supported participant learning by allowing ample time for participants to process and
reflect, as well as to revisit topics throughout the week. Indeed, participants did report
increased knowledge and skills following these workshops.
The three most common concerns participants mentioned in open-ended comments
(Table 6) are areas where instructors new to IBL struggle and where they may need the
most help during the changing stage. These concerns included lack of skill to
implement IBL, content coverage, and student resistance. Efforts to encourage mathematics
instructors to adopt IBL or similar strategies will need to address these three concerns
and provide strategies to help manage them.
The large number of concerns about instructor skill that were raised (23) following
the workshop may indicate learning rather than unmet needs. Some comments on the
pre-survey support this interpretation, such as “[I’m] not familiar enough with it to have concerns.” But, as participants gained more familiarity with IBL during a workshop,
they also learned more about the particular challenges that come along with its use.
In fact, for all but two of the topics mentioned by respondents, the number of new
concerns raised was greater than or equal to the number of concerns dispelled or lingering. Moreover, of the concerns shared on the pre-workshop surveys, 72 % were dispelled, and only 28 % lingered. While many new concerns were raised on the post-workshop surveys, relatively few of the concerns already on instructors’ minds before the workshop persisted. Again, this suggests that instructors were learning rather
than expressing needs not addressed by the workshop. The high rate of IBL
implementation indicates that participants did not perceive the remaining concerns as great
enough to deter them from using IBL.
Refreezing: Ongoing Support
Ongoing support through the refreezing stage was especially challenging since these
workshops served instructors from geographically diverse institutions, who could not
easily reconnect in person. However, as one observer notes, “external networks of like-minded colleagues… can be important forces in promoting instructional reform” as they help instructors to find supportive colleagues (p. 27). Workshop
organizers for the third cohort were able to effectively provide participants with
ongoing support through a group email list. Participants from this workshop reported
higher implementation rates than those from the other two workshops, and many
reported that the email list was helpful. (Ongoing research with other workshops that
incorporate structured follow-up support is testing whether this pattern continues and
how participation may relate to high or low implementation.) Diversity in participants’
institutional origins may have served as a benefit, if participants felt more secure asking
questions or sharing difficulties with outside colleagues than with departmental
colleagues involved in their tenure or promotion process. This may have been especially
relevant given the high proportion of pre-tenure faculty who attended these workshops
(35 %) and the number of concerns shared about evaluation and tenure decisions.
Support and encouragement from fellow workshop attendees may also have supplied
the positive feedback essential to the refreezing stage.
Organizers also helped connect participants with other IBL colleagues through
invitations to an annual IBL-focused conference. Participants supported each other
directly, as some reported staying in contact after the workshops through email, or in
one case, by forming a small, regional IBL group. While such face-to-face connections
were not widespread, they do represent possible ways to support more individuals with
refreezing after workshops are over and participants have returned to their home
institutions. Other options may include hosting occasional online, themed discussions
or involving teaching and learning centers at participants’ home institutions. The New
Faculty Workshops for chemistry professors include these approaches as follow-up
support thought to increase the impact of their workshops (Stains et al. 2015).
While these findings are encouraging, they do come from self-reported data. Previous
studies have questioned workshop participants’ ability to self-assess their levels of skill
with and knowledge of the practices the workshops aimed to teach (D’Eon et al. 2008).
One study found that most biology instructors reported using more student-centered techniques after attending a professional development workshop, but observations using an inquiry-focused protocol revealed that their classrooms were still largely teacher-centered (Ebert-May et al. 2011). However, these researchers found that the extent
of student-centered teaching was inversely related to both teaching experience and class
size. In our sample, about half (47 %) of the teachers had 5 years or less of teaching experience, and 94 % of the classes they reported on had 35 students or fewer, making them similar to the groups with higher implementation rates in the Ebert-May et al. study.
Our sample may also be comparable to a sample of instructors from workshops designed for physics and astronomy faculty members in their first few years of teaching. Following these workshops, only 1 % rated their teaching as “highly traditional,” while the majority (roughly 60 %) reported that their teaching was “mostly traditional with some alternative features.” These participants’ self-reported ratings were corroborated by ratings from their department chairs.
A recent high-level report (AAAS 2012) acknowledged that measuring
undergraduate teaching practices remains a difficult endeavor, but also provided examples of how to avoid some common biases in self-reported data. Our survey design is
consistent with these examples; for example, we selected a variety of teaching practices, i.e.,
behaviors, and assessed frequency of use, rather than asking for self-rating of skill or
expertise with the practice. In addition, we used three approaches in order to triangulate
participants’ reported implementation: direct report, indirect report, and interview; and
we found high agreement among them. These three methods all rely on self-report by
the same instructors; their reports are self-consistent but may nonetheless be limited in
objectivity. Clearly further research is needed to address the validity of self-reported
data about teaching strategies by comparison with observation or other external sources
of data. Further research is also needed to identify factors that influence the level of
implementation for workshop audiences of varied types.
Regardless of the method of data collection, measuring the effect of faculty
development workshops on participants’ teaching practices is difficult. In fact, there are few
observational studies in higher education, and many self-report-based evaluations do
not measure anything beyond participants’ immediate satisfaction with the programs
(Felder et al. 2011; Council of Scientific Society Presidents 2012). This study does not seek to capture the full complexity of teaching (Hora and Ferrare 2013), nor does it
intend to assess participants’ degree of skill in implementing IBL approaches. Rather,
we argue, while the process of becoming skilled with a new teaching style like IBL
may take years, the first steps may be shifts in instructors’ choice of instructional
strategies. [For examples of this process, see Gonzalez (2013) and Retsek (2013).] Specifically, if the workshops were effective in shifting instructors’ practice, we expected to
see decreased use of instructor-centered activities and increased use of student-centered
activities—whether or not these were yet implemented with high skill. Therefore,
capturing participants’ initial efforts to incorporate IBL practices into their teaching
offers a measure of the first type of change to instructors’ practice that may be
anticipated as a result of professional development.
Moreover, changes in instructors’ choice of specific teaching strategies can be detected
in the short term, after their initial efforts to implement. More nuanced measurement of
skill levels and the effects on student learning may not be observable until participants
have had more time to practice and hone their craft—well beyond the typical time frame
of studies intended to evaluate impact of professional development. Future research
should examine instructors’ adoption of inquiry-based learning over longer time frames
and the role played by ongoing support from workshop leaders and colleagues.
Certainly one influence on the high implementation rate seen here is that these participants were volunteers who were already motivated to use IBL, and indeed, some
already had tried it. Moreover, while, on average, participant motivation levels did not
change in the long term, motivation did spike immediately following the workshop.
This motivational spike may be instrumental in getting participants to start
implementing new student-centered strategies in their own classrooms and to continue
learning more on their own. Generating motivation may be a bigger
challenge among instructors who are compelled to participate, for instance in K-12
settings where professional development is often required in order to comply with state
and federal standards (Wei et al. 2010).
In addition to internal motivation, external resistance is often cited as a barrier to
implementation of student-centered approaches. While some participants in this study
worried about resistance to IBL within their departments, on follow-up surveys most
reported they had supportive colleagues. Moreover, interview participants made more
comments about supports for implementation than about barriers. The fact that all
participants could commit a week to attend an IBL workshop suggests that most likely
had some explicit or implicit support from their colleagues and departments, or at least
worked in an environment open to innovation. Participants may have experienced
resistance more indirectly through tenure processes that dissuade innovation in teaching
(Brownell and Tanner 2012), or through norms related to course content and those implicit in shared course syllabi (Hora and Anderson 2012).
The participants at these workshops were already motivated to implement IBL and
worked in supportive, open environments; that is not true for all mathematics
instructors. Yet getting interested instructors to apply IBL teaching methods is an important
first step toward wider uptake. Even at the K-12 level, there is little research on teacher
professional development involving non-volunteers (Bobrowsky et al. 2001), so learning from the experiences of motivated, supported participants may provide valuable
lessons that can then be leveraged to meet the challenge of expanding the use of IBL
methods among other instructors who are initially less familiar, supported, or motivated
to use IBL. Indeed, motivation is likely an even bigger challenge with non-volunteer
participants, and the findings on unfreezing presented here may be especially relevant
for these groups.
Due to the central role of mathematics in many college majors, improving mathematics
instruction by fostering broader uptake of IBL and similar evidence-based teaching
strategies will have positive ramifications for a very large number of students.
Measuring instructor adoption of these teaching strategies is challenging, as self-report
measures may be biased and observation is expensive and difficult. However, self-report measures can be carefully developed to reduce bias by focusing on retrospective,
anonymous, and non-evaluative reporting of behaviors. Self-report measures are also
well suited to allow researchers and evaluators to measure initial changes in behavior
within the short timeframes and small budgets of many grant-funded projects when
observation may not be feasible.
Our findings imply that future efforts to spread IBL must prepare instructors for
dealing with three main concerns: lack of instructor skill to implement IBL, student
resistance, and content coverage. Although these recommendations derive from
workshops espousing IBL methods in mathematics and taking a particular form, they are
applicable to other inquiry-based teaching strategies as well. For example, student
resistance is a concern with any teaching strategy that diverges from the traditional
lecture-based courses to which students are accustomed (Seymour 2005; Welch 2012), and content coverage has been identified as a major concern with inquiry-based
instruction in biology (Yarnall and Fusco 2014).
Across disciplines, workshops are seen as an effective professional development
strategy. In fact, NSF program directors interviewed by Khatri and colleagues (Khatri et al. 2013) regard “multi-day, immersive experiences with follow-up interaction with the PI as participants implement the new strategy” as the most effective propagation strategy for educational innovations, and a recent report on improving engineering education likewise lists faculty development as a critical strategy. Our findings support this view with evidence that multi-day,
immersive workshops contributed to high rates of implementation, especially when
paired with strong and collegial follow-up support. Communicating broad, inclusive
definitions from a diverse group of workshop facilitators is also important and clearly
related to the workshops’ impact on participants’ adoption of IBL teaching approaches.
Defining student-centered teaching in any discipline by its core features and desired
outcomes rather than through rigid use of specific techniques may lead to increased
adoption as well as help to maintain fidelity of implementation by providing options for
how instructors can adapt the strategies to fit their own classes while maintaining high
levels of student inquiry.
Acknowledgements The evaluation and research were supported by the National Science Foundation (DUE-0920126) and a grant from the Educational Advancement Foundation. Any opinions, findings, and conclusions or recommendations expressed in this material are those of the author(s) and do not necessarily reflect the views of the National Science Foundation or the Educational Advancement Foundation. The authors wish to thank Tim Whittemore for his help with data collection, as well as the workshop organizers and all of the workshop participants.
Compliance with Ethical Standards
Conflict of Interest
The authors report no conflict of interest.
American Association for the Advancement of Science . ( 2012 ). Describing & measuring undergraduate STEM teaching practices: A report from a national meeting on the measurement of undergraduate science, technology, engineering and mathematics (STEM) teaching . 2013 : AAAS.
Austin , A. E. ( 2011 ). Promoting evidence-based change in undergraduate science education . Paper commissioned by the National Academies' Board on Science Education. Retrieved December 15 , 2011 , from: http://www7.nationalacademies.org/bose/DBER_Austin_March_Paper.pdf
Bobrowsky , W. , Marx , R. , & Fishman , B. ( 2001 ). The empirical base for professional development in science education: Moving beyond volunteers . In Annual Meeting of the National Association of Research in Science Teaching , St. Louis, Missouri.
Brownell , S. E. , & Tanner , K. D. ( 2012 ). Barriers to faculty pedagogical change: lack of training, time, incentives, and… tensions with professional identity? CBE-Life Sciences Education , 11 ( 4 ), 339 - 346 .
Connolly , M. R. , & Millar , S. B. ( 2006 ). Using workshops to improve instruction in STEM courses . Metropolitan Universities , 17 ( 4 ), 53 - 65 .
Coppin , C. A. , Mahavier , W. T., May , E. L. , & Parker , E. ( 2009 ). The Moore Method: A pathway to learnercentered instruction (MAA Notes # 75 ). Washington, DC: Mathematical Association of America.
Cormas , P. C. , & Barufaldi , J. P. ( 2011 ). The effective research-based characteristics of professional development of the national science foundation's GK-12 program . Journal of Science Teacher Education , 22 ( 3 ), 255 - 272 .
Council of Scientific Society Presidents . ( 2012 ). The role of scientific societies in STEM faculty workshops: A report of the May 3, 2012 meeting . Washington, DC: American Chemical Society.
Dancy , M. , & Henderson , C. ( 2010 ). Pedagogical practices and instructional change of physics faculty . American Journal of Physics , 78 ( 10 ), 1056 - 1063 .
D'Eon , M. , Sadownik , L. , Harrison , A. , & Nation , J. ( 2008 ). Using self-assessments to detect workshop success: do they work? American Journal of Evaluation , 29 ( 1 ), 92 - 98 .
Desimone , L. ( 2009 ). Improving impact studies of teachers' professional development: toward better conceptualizations and measures . Educational Researcher , 38 ( 3 ), 181 - 199 .
Ebert-May , D. , Derting , T. L. , Hodder , J. , Momsen , J. L. , Long , T. M. , & Jardeleza , S. E. ( 2011 ). What we say is not what we do: effective evaluation of faculty professional development . BioScience , 61 ( 7 ), 550 - 558 .
Ellis , J. , Kelton , M. L. , & Rasmussen , C. ( 2014 ). Student perceptions of pedagogy and associated persistence in calculus . ZDM: The International Journal on Mathematics Education , 46 ( 4 ), 661 - 673 .
Fairweather , J. ( 2008 ). Linking evidence and promising practices in science, technology, engineering, and mathematics (STEM) undergraduate education: A status report for the National Academies Research Council Board of Science Education . Retrieved April 23 , 2011 , from http://www7.nationalacademies.org/ bose/Fairweather_CommissionedPaper.pdf.
Felder , R. M. , Brent , R. , & Prince , M. B. ( 2011 ). Engineering instructional development: programs, best practices, and recommendations . Journal of Engineering Education , 100 ( 1 ), 89 - 122 .
Garet , M. S. , Porter , A. C. , Desimone , L. , Birman , B. F. , & Yoon , K. S. ( 2001 ). What makes professional development effective? Results from a national sample of teachers . American Educational Research Journal , 38 ( 4 ), 915 - 945 .
Gonzalez , J. J. ( 2013 ). My journey with inquiry-based learning . Journal on Excellence in College Teaching , 24 ( 2 ), 33 - 50 .
Freeman , S. , Eddy , S. L. , McDonough , M. , Smith , M. K. , Okoroafor , N. , Jordt , H. , et al. ( 2014 ). Active learning increases student performance in science, engineering, and mathematics . Proceedings of the National Academy of Sciences , 201319030 .
Hassi , M. L. , & Laursen , S. L. ( 2015 ). Transformative learning personal empowerment in learning mathematics . Journal of Transformative Education. doi:10 .1177/1541344615587111.
Hayward , C. N. , & Laursen , S. ( 2014 ). Collaborative research: Research, dissemination, and faculty development of inquiry-based learning (IBL) methods in the teaching and learning of mathematics , Cumulative evaluation report: 2010-2013. Retrieved October 01 , 2014 , from Ethnography & Evaluation Research: http://www.colorado.edu/eer/research/profdev.html.
Henderson , C. ( 2008 ). Promoting instructional change in new faculty: an evaluation of the physics and astronomy new faculty workshop . American Journal of Physics , 76 ( 2 ), 179 - 187 .
Henderson , C. , & Dancy , M. H. ( 2007 ). Barriers to the use of research-based instructional strategies: the influence of both individual and situational characteristics . Physical Review Special Topics - Physics Education Reform , 3 ( 2 ), 0201102 .
Henderson , C. , & Dancy , M. H. ( 2008 ). Physics faculty and educational researchers: divergent expectations as barriers to the diffusion of innovations . American Journal of Physics , 76 ( 1 ), 79 - 91 .
Henderson, C., & Dancy, M. H. (2011). Increasing the impact and diffusion of STEM education innovations. Retrieved July 1, 2013, from commissioned paper for Forum on Characterizing the Impact and Diffusion of Engineering Education Innovations: http://www.nae.edu/File.aspx?id=36304.
Hora, M. T., & Anderson, C. (2012). Perceived norms for interactive teaching and their relationship to instructional decision-making: A mixed methods study. Higher Education, 64(4), 573-592.
Hora, M. T., & Ferrare, J. J. (2013). Instructional systems of practice: A multidimensional analysis of math and science undergraduate course planning and classroom teaching. Journal of the Learning Sciences, 22(2), 212-257.
IBM Corp. (2012). SPSS statistics, version 21. Armonk: IBM Corp.
Jamieson, L. H., & Lohmann, J. R. (2012). Innovation with impact: Creating a culture for scholarly and systematic innovation in engineering education. Washington, DC: American Society for Engineering Education.
Khatri, R., Henderson, C., Cole, R., & Froyd, J. (2013). Successful propagation of educational innovations: Viewpoints from principal investigators and program directors. Proceedings of the 2012 Physics Education Research Conference (Vol. 1513, pp. 218-221). American Institute of Physics.
Kogan, M., & Laursen, S. L. (2014). Assessing long-term effects of inquiry-based learning: A case study from college mathematics. Innovative Higher Education, 39(3), 183-199.
Lattuca, L. R., Bergom, I., & Knight, D. B. (2014). Professional development, departmental contexts, and use of instructional strategies. Journal of Engineering Education, 103(4), 549-572.
Laursen, S. L. (2013). From innovation to implementation: Multi-institution pedagogical reform in undergraduate mathematics. In D. King, B. Loch, & L. Rylands (Eds.), Proceedings of the 9th DELTA conference on the teaching and learning of undergraduate mathematics and statistics, Kiama, New South Wales, Australia, 24-29 November 2013. Sydney: University of Western Sydney, School of Computing, Engineering and Mathematics, on behalf of the International Delta Steering Committee.
Laursen, S. L., Hassi, M. L., Kogan, M., & Weston, T. J. (2014). Benefits for women and men of inquiry-based learning in college mathematics: A multi-institution study. Journal for Research in Mathematics Education, 45(4), 406-418.
Lewin, K. (1947). Group decision and social change. Readings in Social Psychology, 3, 197-211.
Mahavier, W. S. (1999). What is the Moore method? Problems, Resources, and Issues in Mathematics Undergraduate Studies, 9(4), 339-354.
McCann, T. M., Johannessen, L. R., Kahn, R., & Smagorinsky, P. (Eds.). (2004). Reflective teaching, reflective learning: How to develop critically engaged readers, writers, and speakers. Portsmouth: Heinemann.
Microsoft. (2011). Microsoft Excel for Mac. Redmond, WA.
Moon, J. A. (2004). A handbook of reflective and experiential learning: Theory and practice. New York: RoutledgeFalmer.
National Science Foundation. (2008a). Table 3. Employed doctoral scientists and engineers in 4-year educational institutions, by broad field of doctorate, sex, faculty rank, and years since doctorate: 2008. Retrieved July 29, 2013, from Characteristics of Doctoral Scientists and Engineers in the United States: 2008: http://www.nsf.gov/statistics/nsf13302/pdf/tab3.pdf.
National Science Foundation. (2008b). Table 4. Employed doctoral scientists and engineers, by selected demographic characteristics and broad field of doctorate: 2008. Retrieved July 29, 2013, from Characteristics of Doctoral Scientists and Engineers in the United States: 2008: http://www.nsf.gov/statistics/nsf13302/pdf/tab4.pdf.
O'Donnell, C. L. (2008). Defining, conceptualizing, and measuring fidelity of implementation and its relationship to outcomes in K-12 curriculum intervention research. Review of Educational Research, 78(1), 33-84.
Paulsen, M. B., & Feldman, K. A. (1995). Taking teaching seriously: Meeting the challenge of instructional improvement (ASHE-ERIC Higher Education Report No. 2, 1995). Washington, DC: ERIC Clearinghouse on Higher Education.
Prince, M., & Felder, R. (2007). The many facets of inductive teaching and learning. Journal of College Science Teaching, 36(5), 14-20.
QSR International Pty Ltd. (2010). NVivo qualitative data analysis software, version 9.
Retsek, D. Q. (2013). Chop wood, carry water, use definitions: Survival lessons of an IBL rookie. Problems, Resources, and Issues in Mathematics Undergraduate Studies, 23(2), 173-192.
Rogers, E. M. (2003). Diffusion of innovations (5th ed.). New York: Simon and Schuster.
Savin-Baden, M., & Major, C. H. (2004). Foundations of problem-based learning. Maidenhead: Open University Press.
Seymour, E. (2005). Partners in innovation: Teaching assistants in college science courses. Lanham: Rowman & Littlefield Publishers, Inc.
Seymour, E., & Hewitt, N. M. (1997). Talking about leaving: Why undergraduates leave the sciences. Boulder: Westview Press.
Spradley, J. (1980). Participant observation. New York: Holt, Rinehart, and Winston.
Stains, M., Pilarz, M., & Chakraverty, D. (2015). Short- and long-term impacts of the Cottrell Scholars Collaborative New Faculty Workshop. Journal of Chemical Education, 92(9), 1466-1476.
Stigler, J. W., Givvin, K. B., & Thompson, B. J. (2010). What community college developmental mathematics students understand about mathematics. MathAMATYC Educator, 1(3), 4-16.
Wei, R. C., Darling-Hammond, L., & Adamson, F. (2010). Professional development in the United States: Trends and challenges. Dallas: National Staff Development Council.
Welch, A. J. (2012). Exploring undergraduates' perceptions of the use of active learning techniques in science lectures. Journal of College Science Teaching, 42(2), 80-87.
Yarnall, L., & Fusco, J. (2014). Applying the brakes: How practical classroom decisions affect the adoption of inquiry instruction. Journal of College Science Teaching, 43(6), 52-57.
Yoshinobu, S., & Jones, M. (2012). The coverage issue. Problems, Resources, and Issues in Mathematics Undergraduate Studies, 22(4), 303-316.
Yoshinobu, S., & Jones, M. (2013). An overview of inquiry-based learning in mathematics. Wiley Encyclopedia of Operations Research and Management Science, 1-11.