Electrifying the Adolescent Pregnancy Prevention Program Evaluation

Name of Organization: Women's Health Branch, NC Division of Public Health
 
Project Title: Electrifying the Adolescent Pregnancy Prevention Program Evaluation
 
Project Team Leads & Contact Information

Audrey Loper and Cynthia Seale-Rivera
Teen Pregnancy Prevention Initiatives (TPPI) Consultants
NC Women's Health Branch
5601 Six Forks Road
Raleigh, NC 27609
Phone: 919-707-5700

Project Summary

In order to improve the effectiveness of the Adolescent Pregnancy Prevention Program (APPP) evaluation, the Teen Pregnancy Prevention Initiatives (TPPI) QI Team completed the following PDSAs:
  • Created APPP Manual to clarify the evaluation process and other APPP requirements
  • Developed annual APPP Orientation
  • Modified Survey Submission Form for paper/pencil surveys
  • Piloted electronic survey submission options
  • Required electronic survey submission in new Request for Applications
  • Proposed new timeline for the evaluation process and evaluation report format

Our proposed future state will:
  • Reduce the number of overall process steps from 36 to 13, a 63.9% improvement
  • Reduce overall process time from 24 months to 18 months, a 25% improvement, enabling agencies to use reports for improvements
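For reference, both figures are standard percentage reductions measured against their baselines (a sketch of the arithmetic, rounded to one decimal place):

\[
\frac{36 - 13}{36} \approx 63.9\% \qquad\qquad \frac{24 - 18}{24} = 25\%
\]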

Background Information on the Area for Improvement

Adolescent Pregnancy Prevention Program (AP3) agencies need to receive evaluation reports sooner to support their improvement efforts. When the project began, agencies received the reports 7 to 12 months after survey data were received by the Women's Health Branch (WHB). The majority of surveys were submitted as hard copies that had to be sent out for data entry. This manual process created the following wastes:
  • Rework from defects
  • Transportation
  • Waiting
  • Inventory
  • Non-value-added (NVA) steps

Need for the QI Initiative

How was the need for the QI Initiative determined?
The TPPI Team chose this project in order to improve the usefulness of the evaluation reports, which agencies received too late to use for program improvement. Of the proposed projects, the TPPI Team felt this would be a "quick win" because it was mostly an internal process and there would be a significant benefit for APPP agencies.

Project Aim:

We aim to improve the effectiveness of the AP3's evaluation process by 1/31/14. Currently, 27 agencies are funded to implement AP3 across NC. Effectiveness entails:
  1. Increased satisfaction with the survey administration process.
  2. Reduced waste/hand-offs, decreased turnaround time and increased agency satisfaction with the data management and reporting process.
  3. Increased utilization of and satisfaction with the final evaluation report.

Using Lean and Model for Improvement methodologies, we will aim to improve the survey administration process and the final evaluation report through increased collaboration with one AP3 pilot site, and to improve data management and reporting through a review of DPH processes, including purchasing and internal review.

Our specific goals include:

  • Increase satisfaction with the survey administration process by 20% by 7/31/13
  • Reduce data submission errors on manual surveys by 50% by 9/30/13
  • Reduce number of process steps by 50% by 12/30/13
  • Decrease the turnaround time to <=6 months for 90% of first draft evaluation reports by 12/31/13
  • Increase agency satisfaction with the turnaround time of evaluation reports by 23.5% by 1/31/14
  • Increase pilot site utilization of the final evaluation report by 33% by 10/31/13.

Project Dates

Initiative Begin Date: November 19, 2012
Initiative End Date: October 21, 2013
 

QI Tools/Methods Used

  • Value Stream Map
  • PDSA Worksheet
  • Run Charts
  • Brainstorming
  • Pareto Chart
  • Surveys
  • Standard Work
  • Impact Matrix

Root Causes

  • APPP Agencies are not receiving evaluation reports soon enough for them to be useful
  • Manual (pencil/paper) participant surveys are submitted with errors
  • Agencies do not fully understand evaluation process
  • Data entry activities take a lot of time and are expensive
  • Aggregate evaluation report is cumbersome and time consuming

Implementation of the QI Initiative

The TPPI QI Team wanted to improve the effectiveness of the APPP evaluation process. Effectiveness was defined as:

  • Increase agency satisfaction with survey administration process
  • Reduce waste and turnaround time
  • Increase agency utilization of final evaluation report

The TPPI QI Team, located in the WHB of the NC Division of Public Health (DPH) in Raleigh, NC, began the QI process in November 2012 and continues to work on improving the APPP evaluation process. The core team included five NC DPH employees and one APPP agency employee, while the ad hoc team included additional DPH and local agency employees. The team met weekly between November 2012 and September 2013 and attended workshops hosted by the NC Center for Public Health Quality to learn QI tools. A Kaizen event was held in March 2013. The change ideas implemented through the process included:

  • Revision of the evaluation timeline & report format
  • Revision of the Survey Submission Form
  • Creation of APPP Manual & orientation
  • Encouragement of electronic survey submission

In addition to the core and ad hoc team members, all WHB staff members (~50) were exposed to the initiative through a bulletin board posted in the WHB office and through report-outs during WHB staff meetings. Local agency staff members were informed of the project on an as-needed basis through the PDSA cycles and will learn more about the results of the project during the upcoming Request for Applications process in fall 2013. NC DPH staff members were invited to attend a poster session highlighting the TPPI QI Team along with other DPH teams in September 2013.

Measurable QI Outcomes

  • Increase satisfaction with the survey administration process by 20% by 7/31/13: Baseline value was 4 (somewhat satisfied); final value was 5 (extremely satisfied) = 20% increase in satisfaction
  • Reduce data submission errors on manual surveys by 50% by 9/30/13: Baseline error rate was 80%; error rate on 9/30/13 was 45% = 43.8% reduction in errors
  • Reduce number of process steps by 50% by 12/30/13. Measurement still in process.
  • Decrease the turnaround time to <=6 months for 90% of first draft evaluation reports by 12/31/13. Measurement still in process.
  • Increase agency satisfaction with the turnaround time of evaluation reports by 23.5% by 1/31/14. Measurement still in process.
  • Increase pilot site utilization of the final evaluation report by 33% by 12/31/13. Measurement still in process.
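For clarity on the two completed measures above: the error-rate reduction is computed relative to the baseline rate, and the satisfaction gain appears to be expressed relative to the top of the 5-point scale (an assumption on our part; measured against the baseline score of 4 it would be 25%):

\[
\frac{80\% - 45\%}{80\%} \approx 43.8\% \qquad\qquad \frac{5 - 4}{5} = 20\%
\]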

Intangible Benefits

Electronic – Agency:
  • Process steps decreased from 15 to 9, a 40% improvement
  • Eliminated survey storage in secure cabinet
  • Eliminated motion waste (to and from copier and printer)

Electronic – WHB:

  • Process steps decreased from 21 to 4, an 81% improvement
  • Eliminated Purchasing and vendor involvement and the associated waiting time
  • Eliminated numerous handoffs and waiting waste
  • Eliminated survey submission process & form (reduced submission errors causing rework)
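As a consistency check, the agency and WHB step counts sum to the overall totals reported in the Project Summary (15 + 21 = 36 steps before; 9 + 4 = 13 after), and each reduction is computed against its own baseline:

\[
\frac{15 - 9}{15} = 40\% \qquad\qquad \frac{21 - 4}{21} \approx 81.0\%
\]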

Overall:

  • Reduced the number of process steps for agencies and WHB from 36 to 13 (electronic), a 63.9% improvement
  • Electronic surveys reduced overall process time from 24 months to 18 months, a 25% improvement, enabling agencies to apply report results to improve their programs
  • APPP Orientation and Manual provide standard work to clarify program process and requirements
  • The pilot site changed participant consent forms to cover a 3-year period instead of a 1-year period
  • Increased agency satisfaction with survey administration process from 4 (somewhat satisfied) to 5 (extremely satisfied) per the satisfaction survey
  • WHB staff members verbally reported increased satisfaction with the data processing steps (decreased data errors, eliminated purchase requests, aggregate report changes, etc.)
  • Anticipate that a larger pool of agencies will apply for funding due to improved technology within the AP3 program
  • The electronic tools (laptops, tablets, smart phones) used for survey administration can also be used in teaching program material and for other administrative tasks like populating the TPPI database, which will improve program implementation
  • Anticipate improved program implementation due to receiving evaluation reports early enough to make program changes (e.g., improved fidelity, teaching methods, and staff interaction with teens), which may improve adolescent sexual health knowledge, attitudes, and behavior
  • State government's reputation is improved by saving time and adopting current electronic technology
  • Not having to throw out surveys with errors improves overall survey results and the strength of findings, and eliminates waste. Agencies that had missing data in the past will now have complete evaluation reports
  • Eliminated warehouse storage of surveys for WHB
  • Eliminated the need for purchasing office involvement for WHB
  • Revised survey submission form – anticipate reduced errors and wasted time
  • Quick Reference Guides – anticipate reduced errors and wasted time
  • Less rework by agencies cuts down on rework by WHB staff
  • Less postage to return agency surveys with errors for rework
  • No rework by agencies when errors are prevented through improved TA, orientation, the manual, the survey submission form, etc.
  • Scanning copies instead of producing paper copies
  • No repeated postage costs for rework
  • Improved communication and teamwork
  • More efficient use of meeting time

Areas for Improvement and Change Ideas Implemented

Improvement 1
We created an APPP Manual to clarify the evaluation process and other APPP requirements. We wanted to learn whether the availability of the AP3 manual would reduce rework, submission errors, and wait time. There were 3 cycles. An existing program manual was used as a template, and team members divided the manual into sections so that no one person was responsible for the entire document. Cycle 1 – The manual was emailed to our QI coach to read and provide feedback using tracked changes. We accepted the recommendations, made revisions, and moved to Cycle 2. Cycle 2 – The manual was emailed to the pilot site and ad hoc members (currently funded programs) to read and provide feedback. Feedback was positive. Cycle 3 – The manual was emailed to additional program sites to review prior to attending the program orientation. Program staff reported that it was clear and concise, gave good guidance, and was user friendly.

Improvement 2
The team developed an annual APPP Orientation. We wanted to learn whether the availability of an orientation would reduce rework and provide clarity on program requirements and protocols. There were 2 cycles proposed. Cycle 1 – Three programs attended: the pilot site (currently funded and providing programming for at least 3 funding cycles), a newly funded program, and an agency that had worked with the team on another program but was new to APPP. They all agreed that the orientation was clear, concise, and helpful, and that they learned more about TPPI requirements and protocols. There was a recommendation to add break-out sessions at orientation focused on the curricula being used by programs. Cycle 2 was scheduled for October 8th and 9th but was cancelled due to the government shutdown.

Improvement 3
The Survey Submission Form for paper/pencil surveys was modified. We wanted to know whether revising the survey submission form would reduce survey submission errors. There were 3 cycles. Cycle 1 – We revised the original form with 5 updates and tested it with the pilot site. In that cycle we saw fewer errors (1 out of 6 submissions) and made 3 additional updates. Cycle 2 – The form was tested with the pilot site and 2 ad hoc members. There were no errors, and 4 new updates were made. Cycle 3 – The form was sent to all programs for survey submission.

Improvement 4
Electronic survey submission – There were multiple PDSAs implemented for this change idea. We tested different types of electronic survey submission options: a computer lab; one laptop for all students to use; shared smart phones; coordinator data entry from paper/pencil surveys; and student-issued laptops. Based on the feedback from students using the different options, we created an Electronic Options Guide that highlighted positives and things to think about to help each program choose the best option or options for its circumstances. This was used in conjunction with the Survey Administration Guide. Both guides were tested in 2 cycles. Cycle 1 – We tested them with the pilot and ad hoc sites and updated the forms based on their recommendations. Cycle 2 – We tested them with a new program coordinator, who stated the forms were clear and easy to read.

Lessons Learned

In this process, we learned that we had to be open to where the QI tools led. Having a diverse team with a clearly defined aim and scope, dedicated to using the QI tools effectively, aided this process. When we first started this initiative, we thought we had a definite answer to our problem: for all agencies to submit surveys electronically. However, by working with team members representing different views along the program spectrum and not taking shortcuts around the QI tools, we found that our initial ideas were only superficial solutions. We discovered missing components that played a more pivotal role in successful evaluation of the programs. First, we realized that there was no standard work available for agencies, even for the seemingly smallest tasks, to promote accuracy and consistency. We also found that, counter to our assumption, not everyone was ready and able to submit surveys electronically. Most surprisingly, without asking the 5 Whys, we would never have learned that there were no actual protocols or guidance around the evaluation reporting timeline, which seemed to be the driving force behind when final reporting was completed.
