5 Strategies to Effectively Monitor and Evaluate Program Implementation

Many of our studies involve monitoring and evaluating program implementation to learn more about how programs are delivered and the degree to which they are delivered as intended. Examining program implementation helps program developers, stakeholders, and evaluators better understand how certain factors (e.g., adherence, dosage, quality of delivery, participant engagement, modifications) might influence a program’s intended outcomes.

At Magnolia Consulting, we find the following five strategies help us to effectively monitor and evaluate implementation:

    • Understand the theory behind the program. We recommend reviewing the program’s logic model to ensure a clear understanding of the theory behind the program. A logic model provides a visual description of how critical program components are expected to result in short-, intermediate-, and long-term outcomes. This can help evaluators understand which program components should be implemented and monitored during the study.
    • Attend program training. When feasible, we recommend evaluators attend program training(s). Attending program training(s) can help evaluators learn about different program components, better understand how each component should be delivered, and become familiar with which program resources are available to support implementation (e.g., English language supports). This experience can help evaluators understand and identify when implementation is or is not going as intended.
    • Align evaluation efforts to implementation guidelines. When possible, we recommend aligning evaluation efforts to program implementation guidelines. Implementation guidelines provide detailed guidance on the expected critical program features and the extent to which each component should be implemented. Along with providing guidance to those delivering programs, they can also help evaluators determine if the program is actually implemented with fidelity to the guidelines.
    • Use multiple measures. We recommend selecting or creating multiple measures to properly monitor and assess various aspects of program implementation. For example, we use a variety of measures to monitor program implementation based on the study’s goals and budget, such as implementation logs, classroom observations, surveys, interviews, and program usage data. The use of multiple measures decreases bias and allows for response validation through triangulation (i.e., cross-verification from two or more sources), which helps ensure an accurate assessment of program implementation.
    • Keep track of response rates and missing data. We recommend tracking implementation data regularly to avoid missing data. For example, if a study uses weekly implementation logs, response rates and missing data should be monitored on a weekly basis to ensure that measures are completed in a timely manner. Complete data sets give evaluators more accurate and valid information about program implementation than data sets with missing data points (see the sketch below).
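For studies that rely on weekly implementation logs, even a small script can automate this kind of monitoring. The following is a minimal sketch in Python (using pandas), offered only as an illustration; the file names, column names, and 80% follow-up threshold are hypothetical rather than part of Magnolia's actual workflow.

    import pandas as pd

    # Hypothetical inputs: one row per teacher per week in the log export,
    # plus a roster of every teacher expected to submit a log.
    logs = pd.read_csv("implementation_logs.csv")   # columns: teacher_id, week, ...
    roster = pd.read_csv("teacher_roster.csv")      # column: teacher_id

    expected = roster["teacher_id"].nunique()

    # Weekly response rate = unique submitters that week / expected submitters
    weekly = logs.groupby("week")["teacher_id"].nunique().to_frame("submitted")
    weekly["response_rate"] = weekly["submitted"] / expected

    # Flag weeks that fall below the follow-up threshold so reminders go out promptly.
    print(weekly[weekly["response_rate"] < 0.80])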

Cultivating a Common Language: The 2019 Magnolia Staff Retreat


At Magnolia, one of our core values is cultivation. This means we create an environment where new ideas, opportunities, and innovation thrive, which benefits both our team and the clients we serve. This value inspired the theme of our May 2019 staff retreat: intentional cultivation. During our four-day retreat, the team unexpectedly cultivated a second theme—using a common language. It emerged most prominently in two of our sessions.  

Using a common language to improve our processes

On the first day of our retreat, Lisa Cooper Ellison, a Charlottesville writer and editor, provided a half-day workshop on writing best practices. Initially, we thought we would learn tips for writing more clearly and succinctly, but the workshop ended up being about so much more. 

Lisa shared the biggest takeaway within the first few minutes of the workshop:

In order to work more efficiently as a team of writers, we needed to develop a common language for approaching the writing process.

She introduced us to three phases in the writing process: the child, the adult, and the parent. The phases come from Dinty Moore’s book The Story Cure, and while this book is geared toward creative nonfiction writers, the phases are applicable to a variety of settings. In the child phase, we brainstorm ideas without judgment. Then the adult shapes those ideas into a clear message. Finally, the parent polishes the work for publication.


As a team, we review each other’s writing in either the adult or parent phase. In the adult phase, we ensure the writing is clear and meets its intended purpose. In the parent phase, we examine the writing for logic flaws, jargon, and accuracy. During the workshop, we discovered we weren’t leaving enough room for the child phase, which is where many of our best ideas come to life. We also realized the importance of explicitly communicating our needs when sending our work to others for review. For example, if the writer is still exploring new ideas and creative avenues (the child phase), that information must be shared with the reviewer. Otherwise, the reviewer might not keep her parent’s eye (which tends to line edit) shut.

Using a common language to solve problems

On the third day of our retreat, we went to Triple C Camp, a local camp that provides team-building workshops. We played several fun and challenging games that helped us learn more about ourselves as individuals and how we work together. During one of these games, we discovered that the most effective way to address our challenges was to develop a common language. We separated into two teams and competed to see which team could re-create a Lego sculpture most accurately. Only one person on each team could see the model sculpture. That person relayed instructions about the size, color, and orientation of each Lego piece in the sculpture to the next person. These instructions were passed down the line telephone-style, and the last person had to put the Lego piece in the correct spot. Because we developed a common language to describe each piece and its orientation, we came closer to replicating an identical sculpture than any other team in Triple C Camp history!

Using a common language whenever possible benefits our team. Moreover, there is a clear connection between communication and intentional cultivation. When we develop a common language around our activities, problems, and processes, we are more intentional about how we work as an organization.

Every year we leave the retreat grateful for the opportunities to connect, have fun, and cultivate learning together, and this year was no different!

Sexual Abuse by School Employees: A Blog on Insufficient State Laws and Policies

All too regularly, we see a news headline that a school employee has engaged in sexual abuse or misconduct with students in a K–12 school. Often, an accused teacher will be transferred to three different schools before they are reported to the police (GAO, 2010). Sexual offenders can end up in schools due to absent, limited, or inconsistent background checks during the hiring process and the absence of a national database of disciplinary actions against licenses or certifications. In addition, because of potential stigma, reputation concerns, and possible legal consequences, school officials sometimes fail to report school employee sexual misconduct to law enforcement despite mandatory reporting laws (Grant, Wilkerson, Pelton, Cosby, & Henschel, 2017). Some school officials avoid these consequences by making confidentiality agreements with offenders. These private settlements, known as “passing the trash” or “the lemon dance,” allow known sexual offenders to discreetly leave a school district and pursue employment in other K–12 school systems.

In 2015, the Every Student Succeeds Act (ESSA) introduced a provision (“Prohibition on Aiding and Abetting Sexual Abuse”) that legally prevents school districts from entering into confidentiality agreements with school personnel who engage in sexual misconduct with students. From October 2016 to January 2017, Magnolia Consulting sought to determine how many states had laws or policies that included this provision to ban passing the trash.1 We found that over half of the state representatives cited laws and codes of ethics that included information about prohibiting sexual abuse and misconduct, but only four states (Washington, Oregon, Pennsylvania, and Connecticut) had laws that directly addressed the ESSA provision. At the time of the study, an additional seven states were drafting language or had requested model language for drafting bills, and 39 states lacked any legislation for “Prohibition on Aiding and Abetting Sexual Abuse.”

In the states with no legislation, 24 state representatives noted state laws or codes of ethics they believed demonstrated how their state already adhered to the federal policy. For instance, state representatives referred to laws on state background checks, mandatory reporting of child abuse, and license or certification revocation and other disciplinary actions. While these policies help identify cases of abuse and prevent offenders from getting licensed, they do not prohibit school systems from assisting an offender in obtaining a new job. Since ESSA state plans were still being drafted at the time of the study, additional research is needed to monitor how states’ compliance with the aiding and abetting ESSA provision has developed and been implemented since 2017.

Having state legislation include this ESSA provision is a first step toward preventing continued sexual abuse and misconduct of K–12 students by school employees. To learn more about school employee sexual abuse and misconduct and to review studies that we conducted in this topic area, please download a copy of the full article, visit our Publications and Reporting page, and visit Stop Educator Sexual Abuse, Misconduct, and Exploitation (S.E.S.A.M.E.net).

1This project was supported by Award No. 2015-CK-BX-0009 awarded by the National Institute of Justice, Office of Justice Programs, U.S. Department of Justice.

References:
Government Accountability Office. (2010). K–12 education: Selected cases of public and private schools that hired or retained individuals with histories of sexual misconduct. United States Government Accountability Office, GAO-11-200. Retrieved from http://www.gao.gov/products/GAO-11-200
 
Grant, B. E., Wilkerson, S. B., Pelton, deK., Cosby, A. C., & Henschel, M. M. (2017). A case study of K–12 school employee sexual misconduct: Lessons learned from Title IX policy implementation. Charlottesville, VA: Magnolia Consulting, LLC.

 

Increase Online Survey Response Rates with These Seven Tips

At Magnolia Consulting, we often use online surveys as a data collection tool in our evaluation studies. Online surveys are an efficient and effective way to gather information from participants to answer evaluation questions. While survey response rates can be affected by many different factors, it’s important to aim for a high response rate. A high survey response rate gives credibility to survey data, whereas a low response rate may pose several challenges, such as (a) reducing the statistical power of the analysis, (b) undermining the ability to generalize the results to a larger population, and (c) increasing the risk of nonresponse bias, meaning that survey respondents may differ in meaningful ways from nonrespondents (Nulty, 2008). With this in mind, we make every effort to secure a high response rate when we use a survey for data collection. Here are seven tips we recommend for increasing online survey response rates based on our experience:

    • Add sender email address(es) to the approved senders list. A common problem when sending an online survey is that the invitation gets incorrectly flagged as unwanted spam. Emails sent to survey participants may go directly to their spam folder or may “bounce” if the sender’s email address is not on an approved list of senders. To head this off, contact participants prior to survey administration to request that they add any necessary addresses to their approved list and to let them know the survey invitation is coming. If issues persist, send the invitation from a different email address that has fewer restrictions.
    • Send a pre-survey letter or notification. Prior to sending the actual survey, have a respected authority figure (such as a school principal or district-level administrator) send a survey notification letter to participants. This letter should explain why the survey is important to the school or district, which helps connect the survey to participants’ local information needs. Additionally, it lets participants know that a survey is coming from an independent evaluator, so they can keep an eye out for the survey invitation.
    • Use an engaging and informative survey invitation. When sending a survey via email, the initial survey invitation is your first chance to grab the attention of your participants. First, use a short, engaging subject line to ensure your invitation is not buried in their inbox. Next, the body of the email should provide all relevant information participants need to know about the survey, including the following: (a) a clear survey purpose and why participant feedback is important, (b) an accurate estimate of the time needed to complete the survey, (c) a date by which the survey should be completed, (d) information on incentives for completing the survey (if applicable), (e) a statement about survey confidentiality and/or anonymity, and (f) a contact name and email address for any questions about the survey.
    • Ensure your survey is easy to complete. As you create your survey, keep certain readability factors in mind: Is the survey clear and easy to read? Is it free of jargon? Is it too long and time-consuming? You don’t want participants to lose interest because the survey is difficult to read or too lengthy. One way to improve survey quality is to keep questions as straightforward and simple as possible, because complex questions carry a high cognitive load. Another is to spread questions across several pages rather than presenting one long page, grouping related questions together on each page. Finally, if possible, ensure your survey is optimized for both computers and mobile devices such as cell phones and tablets.
    • Ensure participants have protected time to complete the survey. When we send surveys to teachers, we often ask principals to give teachers time during scheduled school events, such as meetings, to complete the survey. This way, teachers aren’t left to complete the survey on their own time outside of the school day.
    • Send reminders and follow up with nonresponders. Track survey responses closely and monitor response rates on a daily or weekly basis. This is especially important if survey responses are time sensitive, such as when the survey measures change before and after an intervention. It is essential to follow up with nonresponders by sending at least two reminder emails. In some cases, you can contact participants by phone or mail if you have that information. As a last effort, you may also enlist the help of the respected authority figure mentioned in tip two. (A minimal tracking sketch follows this list.)
    • Show appreciation for time and effort. With everyone’s busy schedules, it is important to remind participants how valuable their feedback is to the overall goals of the evaluation. To show appreciation for your participants’ time and effort, offer any assistance (e.g., emailing a paper copy) to make survey completion less burdensome for them. Incentives can be used to show appreciation at both the teacher level (for example, a chance to win an Amazon gift card or a donorschoose.org contribution to fund classroom supplies) and the school level (for example, a gift card to put toward a pizza party or a contribution to a school fundraiser). Be sure to check with the school or district to learn whether there are any restrictions on teachers or administrators receiving incentives. Finally, always thank your survey participants for their time and effort in completing the survey.
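A simple way to act on the reminder tip above is to reconcile the invitation list against completed responses each time you check response rates. The sketch below is a hypothetical Python illustration; the file and column names are assumptions about how a survey platform might export data, not a prescribed tool.

    import pandas as pd

    invited = pd.read_csv("survey_invitations.csv")    # columns: email, name, school
    completed = pd.read_csv("survey_responses.csv")    # column: email

    # Overall response rate for monitoring on a daily or weekly basis
    response_rate = completed["email"].nunique() / invited["email"].nunique()
    print(f"Current response rate: {response_rate:.0%}")

    # Nonresponders receive the next reminder email (or a phone call as a last effort).
    nonresponders = invited[~invited["email"].isin(completed["email"])]
    nonresponders.to_csv("reminder_list.csv", index=False)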

References:

Nulty, D. (2008). The adequacy of response rates to online and paper surveys: What can be done? Assessment & Evaluation in Higher Education, 33(3), 301–314.

An Evaluator’s Guide to Performing Successful Site Visit Observations

For external evaluators, site visits (e.g., visiting schools) provide an opportunity to experience the activity, program, or product firsthand while also offering a chance to connect in person with study participants. Because these observations are an important addition to report findings, Magnolia Consulting follows several steps for successfully navigating site visit observations. Based on our experience, here are five key guidelines:

    • Conduct an in-person study orientation. Conducting an in-person study orientation before site visits offers the opportunity to meet and interact with participants, which helps to establish a trusting relationship. Having a positive rapport with participants is important as it allows for open communication and can help reduce any participant concerns associated with site visit observations. During the study orientation, inform participants of details about the purpose of the site visits, expectations for participation, and how site visits will be scheduled to minimize disruption in the classroom. During the orientation, explain how the data will be used and how evaluators will maintain participant confidentiality. If the budget does not allow for an in-person orientation, webinars with video access are also a helpful tool to introduce yourself to the participants.
    • Create an observation protocol. Using an observation checklist or protocol helps reduce bias associated with observations and ensures that preestablished guidelines are followed. These protocols typically focus on the quality and extent to which an activity, program, or product is implemented and/or aligned to best instructional practices in the field (e.g., reading instruction, STEM). If multiple evaluators perform observations across participants or sites, check the measure across observers for agreement and accuracy (see the agreement sketch after this list).
    • Enlist a site coordinator. If budget allows, plan for having an on-site coordinator who knows the site’s participants and inner workings. This individual can communicate with the evaluation team’s program coordinator on details throughout the study such as scheduling observations across multiple participants at the site. Furthermore, the site coordinator can communicate details with participants before the observation and answer any questions that arise. In addition to coordinating site visit observations, the site coordinator might be responsible for other helpful tasks such as managing consent forms and ensuring assessments are distributed and returned in an organized manner.
    • Be flexible with scheduling. Allowing flexibility and accommodating a site’s scheduling needs shows understanding of “real world” complications. If possible, create a schedule that follows the site’s established routines. This may require conducting observations across several days rather than consolidating multiple sessions into a shorter timeframe. If the evaluation budget is limited and requires observations to be completed within a brief period, work with the site coordinator to find a schedule that is respectful of the site yet still workable for the evaluation. If appropriate, send participants a direct email one week prior to the visit restating the purpose of the observation and confirming the schedule, so that no observation comes as a surprise.
    • Follow up with sites. Email site coordinators and participants a day or two after conducting site visit observations to express gratitude for their time and willingness to be observed. This follow-up also gives them a chance to ask any remaining questions about the site visit or study.
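When two evaluators rate the same session with the same protocol, a quick check of percent agreement (optionally paired with a chance-corrected statistic such as Cohen’s kappa) helps confirm the protocol is being applied consistently, as noted in the observation protocol guideline above. The sketch below is a hypothetical Python example with made-up item-level ratings; it is not drawn from an actual Magnolia protocol.

    import pandas as pd
    from sklearn.metrics import cohen_kappa_score

    # Hypothetical paired ratings for one observed session, one value per protocol item
    # (1 = practice observed, 0 = not observed).
    ratings = pd.DataFrame({
        "observer_a": [1, 1, 0, 1, 0, 1, 1, 0],
        "observer_b": [1, 1, 0, 1, 1, 1, 1, 0],
    })

    percent_agreement = (ratings["observer_a"] == ratings["observer_b"]).mean()
    kappa = cohen_kappa_score(ratings["observer_a"], ratings["observer_b"])

    print(f"Percent agreement: {percent_agreement:.0%}")
    print(f"Cohen's kappa: {kappa:.2f}")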

Essential Tools for Recruiting Sites for Studies: Finding the Needle in a Haystack


At Magnolia Consulting, one of our specialties is designing and implementing curriculum efficacy or effectiveness studies, and we are often tasked with recruiting sites (i.e., schools or school districts) to participate. Finding potential sites that both fit the requirements for a given study and are able to participate can feel like finding a needle in a haystack. Based on our experience, it is possible to find these sites, but it can be challenging and requires a well-thought-out plan of action. We have found that an organized, collaborative, and personal approach to recruitment is fundamental to success. The following list includes several key elements for effectively navigating the recruitment process:

  • Start early! Allow ample time for the recruitment process, as it can be quite time-consuming to identify sites and to fully bring them on board. In terms of study implementation, it is easier to confirm sites early rather than at the last minute. If possible, start recruitment in early spring, before testing and before summer break, when contacts may be out of the office.
  • Create clear study documents. As each study is unique in terms of site selection criteria and benefits to participating sites, it is important to develop clear study descriptions for potential sites. Consider visually appealing ways to present information (e.g., a one-page infographic or handout about the study), as well as various methods of dissemination to a wide audience (e.g., website links or mailing lists). Ensure team members within your company review these documents and are able to clearly explain the study details to potential sites.
  • Develop a list of potential sites. Before contacting sites, consider developing a tool to track potential sites, which may include details on site demographics and student enrollment information. Sources for this list may include information from a national database (e.g., National Center for Education Statistics) or a curriculum provider’s mailing list of users, and the list may be narrowed by specific requirements of the study, such as certain areas of the country, size or locale (urban, suburban, rural) of the site, use of specific programs, or access to technology.
  • Create email and phone protocols. Utilizing email and phone communication protocols for initial contacts and any follow-up communications provides an outline for professional, consistent messaging across multiple interactions and staff members. Being approachable, positive, and grateful in all correspondence sets the stage for a potential longer-term connection.
  • Track all efforts. Tracking every interaction with potential sites is essential. For example, in an Excel spreadsheet or Google Sheet, you can track all dates and methods of contact; the contact’s name, phone, and email; site selection criteria met; and key points from the communication, such as next steps (see the sketch after this list). Tracking efforts streamlines the process, promotes greater understanding of recruitment efforts among team members, and supports the study team in making the final decision on which sites to include in the study.
  • Follow up! Consistent, timely follow-up with contacts in a way that balances persistence with consideration of busy school and district schedules is key. It is also generally good practice to communicate whether a site is selected to participate in the study and to show appreciation for time invested.
  • Confirm final sites. As study sites are selected, continue to communicate regularly with them regarding next steps. For example, ensure that all district and school approval processes are followed and request that sites sign a Memorandum of Understanding (MOU), which outlines the roles and responsibilities of all study parties. Once MOUs are signed, move forward with next steps regarding various study start-up tasks.
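If a shared spreadsheet starts to feel unwieldy, the same tracking information can live in a simple structured log that is easy to filter. The sketch below is a hypothetical Python illustration of the tracking idea described above; the sites, dates, and column names are invented placeholders.

    import pandas as pd

    # One row per contact attempt; columns mirror a typical recruitment tracking sheet.
    contacts = pd.DataFrame([
        {"site": "District A", "date": "2019-03-01", "method": "email",
         "contact": "Dr. Lee", "criteria_met": True, "next_step": "schedule call"},
        {"site": "District B", "date": "2019-03-04", "method": "phone",
         "contact": "Ms. Ortiz", "criteria_met": False, "next_step": None},
    ])

    # Most recent touch point per site, useful for planning follow-up.
    latest = contacts.sort_values("date").groupby("site").tail(1)

    # Sites that meet selection criteria and still have an open next step.
    open_followups = latest[latest["criteria_met"] & latest["next_step"].notna()]
    print(open_followups[["site", "date", "next_step"]])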

5 Tips to Streamline Your Data Cleaning Process

Data cleaning is possibly the most critical step in running statistical analyses. A general rule of thumb is to spend 80% of your time cleaning data and the remaining 20% on data analyses. It is important to carefully clean your data because it takes only one error to impact the results of your data analyses. At Magnolia Consulting, driven by our values of integrity, excellence, and utilization of results, we have developed processes to ensure that we provide our clients with valid and reliable findings.

Based on our experience, here are some key tips for effective and consistent data cleaning:

1. Create a checklist. We recommend creating a data cleaning checklist for two reasons. First, creating and following a checklist ensures that you have taken all necessary steps in the process. Without a checklist, it can be easy to accidentally skip a step or overlook an error. Second, different people may have alternative approaches to data cleaning. A checklist can streamline the data cleaning process and ensure consistency across different team members. At Magnolia Consulting, having a checklist has helped us to align data cleaning approaches, making it easier to address issues and to check each other’s work.

2. Check your data early. Give yourself plenty of time to explore the data and identify questions. By checking the data early, you improve your chances of obtaining any missing data or clarifying inconsistencies. Sometimes you cannot avoid receiving data late, but checking it as soon as it arrives gives you the best chance of identifying potential issues before crucial ones go unnoticed.

3. Take your time. Take time to fully understand the context of your data. This includes knowing what to expect before you receive the data. For example, will you be looking at student assessment data, student demographic data, pre- and post-test data, or something else? Understanding the context will make it easier for you to understand and identify any inconsistencies, such as duplicate cases.

4. Consult with others. While cleaning data, you are often forced into “playing detective.” Before making a judgment call, gather all the information you know so that you can have a fruitful conversation with your team. These conversations also help acquaint other team members with the data should they be involved in the analysis or reporting phases.

5. Keep a thorough record. Data cleaning can involve a significant number of changes and decisions, making it difficult to remember everything you did. Even the smallest action might matter later when you need to answer questions or revisit previous decisions. Ultimately, creating a detailed data record will save you time and spare you frustration. It will also allow you to replicate data cleaning processes in the future. Versioning is also useful—saving every version of your database makes the process more efficient. If you make a mistake, instead of starting over, you can easily return to a previous version.
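To make these tips concrete, here is a minimal, hypothetical Python sketch of a few common checklist items (duplicate cases, missing values, out-of-range scores) with a running log of decisions and a versioned output file. The file names, column names, and score range are assumptions for illustration only, not part of an actual Magnolia workflow.

    import pandas as pd

    df = pd.read_csv("student_scores_raw.csv")   # hypothetical raw data export
    log = []                                     # running record of cleaning decisions

    # 1. Duplicate cases (e.g., the same student appearing twice)
    dupes = df[df.duplicated(subset="student_id", keep=False)]
    log.append(f"Flagged {len(dupes)} duplicate rows for review")

    # 2. Missing values on a key variable
    missing = df["posttest_score"].isna().sum()
    log.append(f"{missing} cases missing posttest scores")

    # 3. Out-of-range values (assumes a 0-100 score scale)
    out_of_range = df[(df["posttest_score"] < 0) | (df["posttest_score"] > 100)]
    log.append(f"Flagged {len(out_of_range)} out-of-range posttest scores")

    # Save a versioned copy and the decision log rather than overwriting the raw file.
    df.to_csv("student_scores_v2.csv", index=False)
    pd.Series(log).to_csv("cleaning_log_v2.csv", index=False)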

Magnolia 2018 Staff Retreat

For this year’s annual staff retreat, the Magnolia Consulting team gathered at a beautiful lake house nestled along Lake Anna in Central Virginia. The theme of this year’s retreat was “Work Smarter, Not Harder,” with the intention of coming together to refine Magnolia’s mission, values, and goals as we continue to move forward in cultivating learning and positive change.

Our retreat involved a great deal of strategic and tactical planning. An external consultant helped guide our team through a series of sessions that involved revisiting our company mission and vision, identifying our personal and company core values, and, from this, developing clear goals to work toward. Through stimulating discussion among team members, we worked our way through “SOAR,” which stands for Strengths, Opportunities, Aspirations, and Results. Our team found this strengths-based strategic planning approach to be extremely insightful and productive. To think about how we can work in alignment with our strengths and aspirations, we discussed current and possible new lines of work. It was an engaging dialogue that concluded with a collective agreement to build our expertise, connections, and focus in community college and workforce development, girls’ and women’s education, infographic design for evaluation and research, and capacity building for our clients.

Through our core values of Abundance, Cultivation, Excellence, Integrity, Utilization-Focused and Results-Oriented, Heart-Centered, and Service, we look forward to increasing our own capacity to continue our mission of providing innovative and customized evaluation and research services that improve individual and organizational capacity for positive change.

 

What Adult Learners Need: In Support of Community Colleges

I recently came across an NPR article entitled “What Adult Learners Really Need (Hint: It’s Not Just Job Skills)” by Anya Kamenetz (2018). In the article, the author interviews David Scobey, PhD, who argues against the idea of 2-year degrees, stackable credits, and short-term workforce credential programs. Instead, he suggests that more than 70% of community college students want to earn a bachelor’s degree; highlights the importance of a broad, liberal arts education; and argues that many of the jobs accepting students with third-party credentials will be gone in the next decade.

In reflecting on this article, and drawing on our work with community colleges on Department of Labor grants, including multiple focus groups with a wide variety of community college staff, students, and employers, we believe that several other points should be considered:

  • Community college programs can be responsive to regional workforce needs by having honest conversations with area employers, who can be partners in training and education. When community colleges work with local workforce advisory boards on curriculum development, regional needs, and program implementation, students and employers both benefit. Employers have shared that they want students with basic workforce training and soft skills, and community colleges have partnered with employers to provide students with a foundation for success. Once students receive their credentials or certifications and are employed, the partner employers have stated that they will work with their new employees to advance their training and education. As a result, completing a short-duration program oftentimes does not signify the end of a student’s education or training, but could be just the beginning.
  • Community colleges can provide additional “vertical supports” such as training in soft skills, Microsoft Office, and career readiness, as well as tutoring and coaching. Within these supports, students take courses together and create peer learning communities and cohorts that encourage students and motivate them to persevere (similar to “horizontal supports”). In focus groups, students regularly spoke to the benefits of these supports, commenting on staff and instructors who would do anything to see students succeed and a motivating and supportive “family” of peers in their programs.
  • Remedial education experiences differ, with remediation acting as a potential barrier to continued education. Consider that 91% of 2-year institutions have an open admissions policy, compared to 27% of 4-year institutions (U.S. Department of Education, 2017). As a result, the needs of remedial education students at the community college level are likely very different from the needs of those at the university level. In community colleges, adult learners who pursue additional education after years away can face anxiety as they find themselves in need of remedial math and English courses, and they may be at greater risk of dropping out (Pruett & Absher, 2015). Community college students may also be older and have poorer college adjustment, more difficulty with finances and with accessing college services, and more difficulty with transportation (David et al., 2015; Simmons, 1995). Because of these differences, these students may not have the same opportunities (or desires) to pursue a 4-year degree.
  • Remedial courses can be effective in community colleges if they are contextualized and supportive. We have seen that remedial education at the community college level can be very effective at increasing students’ knowledge, self-confidence, and career self-efficacy if such instruction is included within an overall program model of contextualized support. At capstone presentations at the end of their 6-month programs, we have heard community college students speak to the positive impact that these short-duration programs and remedial courses had on their lives. Many students did not realize that they were “good students” until they participated in these courses.
  • A broad liberal arts curriculum is not for everyone. Many community college students we have talked with mentioned the benefits of completing an applied program in their chosen field. These students specifically noted that they were not interested in a broader, more expansive liberal arts curriculum.
  • Adult learners should have the option of a shorter-duration program. Students enter these programs for several reasons, such as (a) a belief that their bachelor’s or master’s degree is not helping them to find a job; (b) an eagerness to get back into the workforce after being laid off; or (c) a determination to limit any personal or family financial struggles associated with taking time off from work.
  • Community colleges can support students in going farther than they thought possible. We also heard from community college students who were so motivated and engaged by their supportive, contextualized experiences in 6-month programs that they continued on to other 6-month programs, and some went on to attain an associate degree. Several students who furthered their education shared that they were initially not expecting to seek anything beyond a 6-month certificate, noting that they previously lacked the academic self-efficacy to consider anything else.
  • Short-duration programs can be successful in training students and preparing them for the future, and more models should be shared. It may be that more needs to be shared around successful program models at the community college level. We intend to be part of that discussion. If you are interested in learning more, please see our presentation on a successful cohort-based program model for recruiting, retaining, and employing advanced manufacturing students at the High Impact Technology Exchange Conference (HI-TEC) in July 2018 or contact us for more information.

 

References:

David, K. M., Lee, M. E., Bruce, J. D., Coppedge, B. R., Dickens, D., Friske, J., . . . Thorman, J. (2015). Barriers to success predict fall-to-fall persistence and overall GPA among community college students. Journal of Applied Research in the Community College, 21(1), 5–13.

Pruett, P. S., & Absher, B. (2015). Factors influencing retention of developmental education students in community colleges. Delta Kappa Gamma Bulletin, 81(4), 32–40.

Simmons, D. L. (1995). Retraining dislocated workers in the community college: Identifying factors for persistence. Community College Review, 23(2), 47–58.

U.S. Department of Education, National Center for Education Statistics, Integrated Postsecondary Education Data System (IPEDS), Fall 2000 and Fall 2010 Institutional Characteristics component and Winter 2015–16 and 2016–17 Admissions component. Table retrieved from https://nces.ed.gov/programs/digest/d17/tables/dt17_305.30.asp?current=yes

 

Eight Steps to Conducting an Evidence Search

The Every Student Succeeds Act (ESSA) promotes the use of federal funds to purchase programs that have evidence of effectiveness in increasing student success. But how can state and local education leaders find programs and practices that meet ESSA evidence standards? As part of a partnership between Regional Educational Laboratory (REL) Central at Marzano Research Associates and the Nebraska Department of Education, we provided technical support in developing a systematic process for conducting evidence searches. If you are on a quest for rigorous evidence to support a program or practice, here are eight steps you can follow to conduct a search.

  • Step 1: Define constructs. To start, it is important to fully understand what you are looking for! Therefore, the first step of the process is to thoroughly define the program or practice that is the subject of your search, including its expected outcomes.
  •  Step 2: Identify search databases. The Internet is bursting with search engines! However, some will be more relevant to the search than others. In this step, take some time to determine which search engines and databases will be most appropriate to this specific search. If the search is in the realm of education, websites and search databases such as What Works Clearinghouse (WWC), REL publications, and Evidence for ESSA are good places to start because the studies on these sites have already been reviewed against rigorous standards.
  • Step 3: Determine inclusion criteria. There is so much literature out there! Therefore, it is necessary to set criteria for which studies to include in your review. Good criteria will help you weed out studies that aren’t relevant to your search and focus your attention on those with the highest potential to meet evidence standards. Criteria might include study design characteristics, such as a randomized controlled trial or quasi-experimental design, or program characteristics, such as relevant outcomes.
  • Step 4: Determine search terms. Before beginning the search, determine and record the search terms to be used in the search process. Be sure to consider synonyms and alternative terms for the constructs you defined in Step 1.
  • Step 5: Prepare the search database. Before you begin, you’ll need to prepare a database to document your search results and describe studies that meet inclusion criteria. We use an Excel spreadsheet with column headers to capture study design characteristics that speak to the study’s rigor and program characteristics that help define the program and explain why it’s effective and how it can be replicated. Include additional column headers for any information relevant to your specific search purposes.
  • Step 6: Conduct the search. Now you are ready to conduct the search! Enter the search terms identified in Step 4 into the search engines and databases identified in Step 2 and review the results. If a study seems relevant, review the abstract or the entire study more carefully to determine if it meets your inclusion criteria. As you conduct the search, document the number of initial results, the number of abstracts reviewed, and the number of studies identified for further consideration in the spreadsheet or other tool prepared in Step 5. Tracking these numbers creates a record of your search and represents the breadth and depth of your search process (see the sketch after this list).
  • Step 7: Document studies. Yay, a study meets your inclusion criteria! Document the studies that fit your criteria in the spreadsheet you prepared in Step 5. This creates a record of these studies so they can be further examined and reviewed.
  • Step 8: Review the search process. Throughout the search process, it is important to periodically evaluate the findings and the process in general to ensure the review is on track and identifying the most relevant studies. Review the studies you’ve captured. Are you finding studies that align with your search goals? If not, revisit the search design (Steps 1–5) and revise as needed. Also review the results of the search and determine whether the studies you’ve documented provide strong supporting evidence for your program or practice; gauging this regularly keeps the search on track.
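To illustrate Steps 5 through 7, the sketch below sets up a minimal search database and search log in Python. The column headers, databases, and counts are hypothetical placeholders, not REL Central’s or Nebraska’s actual template.

    import pandas as pd

    # Step 5: a search database with columns for study rigor and program characteristics.
    columns = ["citation", "database", "search_terms", "study_design", "sample",
               "outcomes_measured", "findings", "meets_criteria", "notes"]
    studies = pd.DataFrame(columns=columns)

    # Step 6: running tally of the breadth of the search (placeholder counts).
    search_log = pd.DataFrame([
        {"database": "WWC", "initial_results": 42, "abstracts_reviewed": 18, "kept_for_review": 5},
        {"database": "Evidence for ESSA", "initial_results": 17, "abstracts_reviewed": 9, "kept_for_review": 3},
    ])

    # Step 7: append each study that meets the inclusion criteria.
    studies.loc[len(studies)] = ["Doe (2020)", "WWC", "elementary math intervention",
                                 "RCT", "1,200 students", "math achievement",
                                 "positive and significant", True, ""]

    print(search_log)
    print(studies)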

These eight steps will help you organize any evidence search so you find the most relevant studies to support your program or practice in a systematic and efficient manner. For more information on meeting ESSA requirements for evidence-based programs and practices, check out WestEd’s resources for states.