Circular graph describing the ADDIE process

According to our Annual Guidelines for Extension Faculty Promotion (PDF), an extension program is:

A coordinated set of learning experiences designed to achieve predetermined outcomes. Program development follows a continuum starting with initial environmental scanning and identification of a need or gap; determining programming priorities; and the development of learning experiences and application activities that culminate in changes in knowledge, behavior, skills and attitudes. These changes manifest as measurable program outcomes.

Historically, extension program development has taken many forms and used a multitude of terms to describe a systematic and structured process to ensure effective use of organizational resources to meet identified needs. Adopting and implementing our extension program development model will provide you with a pathway to consistently develop tools, resources and learning experiences that will support behavior change and impact.


MU Extension’s ADDIE model of program development (Analyze, Design, Develop, Implement and Evaluate) is not new. We present it as a more focused, streamlined approach to target identified program priorities, to increase efficient use of organization resources, and to increase program impacts on Missourians.

Download this quick reference sheet to guide you through the five steps of program development.

ADDIE Quick Reference (PDF)

Graphic summarizing each step of the ADDIE model of program development


Exploration and Conceptualization

The first step in the program development process uses exploration and conceptualization processes to identify the economic, environmental and societal issues that extension programs can and should address.

Program analysis uses a variety of tools and techniques, including:

  • Needs assessments with engaged stakeholders:
    • Extension councils and other advisory boards.
    • City, county, state and national legislative bodies.
    • Community groups, commodity groups, partners (within MU and external), funding agencies, etc.
  • Situational analysis and environmental scanning using secondary state and national databases and literature reviews.
  • Market analysis to identify:
    • Potential audience(s) that will participate in the program.
    • Potential revenue streams, as well as the likelihood that the audience or another funding source will be able to financially support the program.
    • Potential new partnerships and resources within the university, our external community partners, investors and stakeholders.
    • Potential competitors.
  • Resource analysis to identify:
    • The research base or evidence base for the content that will be delivered through the educational program.
    • Current curriculum/publications/resource material/educational experiences available in-house, through partnerships or through purchase to support this program’s development or implementation.
    • The financial implications related to designing, developing, implementing and evaluating this program.
      • Total costs including faculty and support unit time, infrastructure, education resource development, ISE training costs, evaluation strategies, etc.
      • Revenue streams (grants, contracts, partnerships, core/cost funding, fees, etc.) to fund the program.
    • The intellectual property rights, i.e., who owns the content.
    • Where the program fits in the organization's priority ranking.

Extension administration, directors, faculty, staff and councils must work through an objective process to clearly define program priorities at all levels of the organization. Everyone involved must continually evaluate each phase of the process and make adjustments to ensure optimal effectiveness and program impact.

The use of a standard program development process should result in a clear situation statement that defines the target audience; identifies the economic, environmental and/or societal issues; and describes the desired changes and outcomes for stakeholders and the audience, all clearly aligned with MU Extension's mission, vision and priorities. Once these items are established, those involved must determine whether the need is great enough to justify committing the necessary financial resources.

For more information and resources for the analysis phase, see Program Analysis.


The second step in the program development process is to design the educational resources, marketing support and evaluation tools. Design activities include:

  • Engaging stakeholders and audience in the design process.
  • Defining clear learner objectives.
  • Identifying evaluation strategies to ensure learning experiences meet the learner objectives and the overall program outcomes.
  • Identifying appropriate delivery methods, which may include:
    • Face-to-face workshops, demonstrations and field days;
    • Articles containing learning content;
    • Guides and publications;
    • Videos (Learn Now or how-to);
    • Courses (synchronous/asynchronous/blended online);
    • Webinars;
    • Audio and podcasts; and
    • Tools and apps.
  • Ensuring delivery methods will engage target audiences.
  • Working with the communications team to design a marketing plan for the identified audience(s).
  • Creating a financial plan detailing how the program will be supported financially throughout its lifecycle.
    • Ensure that the financial plan is reviewed and approved by the extension fiscal office and those responsible for providing funding.
    • Include the commitment of faculty and staff resources.


Development of the content and educator and learner resources builds the educational foundation for the program.

These resources build on the components defined in the design stage.

  • Engage potential partners and stakeholders in the content development process.
  • Inventory available resources to avoid duplication.
    • Items to include in an inventory: publications, manuals, online resources, apps, how-to videos, curricula, marketing materials, and evaluation instruments.
  • Determine additional resources that will need to be developed or purchased.
  • Develop an evaluation plan to determine program outcomes and how to improve participants’ learning experience.
  • Identify program registration and client management processes.
  • Pilot prototype processes to be used. Evaluate and modify as needed.
  • Have the modified pilot programs peer reviewed.
  • Have program content and delivery reviewed periodically by peers.


Implementation may include a wide variety of educational strategies identified or developed for the program.

  • Engage key partners and stakeholders to facilitate successful implementation.
  • Ensure that faculty and staff are trained to meet defined outcomes and engage the target audience during implementation of the program.
  • Train faculty and staff to use evaluation tools and report program outcomes for each program.
  • Recognize and reward faculty and staff for their efforts.


Practice Continuous Improvement

Excellence in education isn’t an end-state, but rather evolves through iterations of implementation and improvement over time. This means we continually monitor, reflect, adapt and innovate to better serve our learners.

Program Evaluation

Program evaluation should be an ongoing process throughout all phases of the ADDIE model. Program evaluation can include instructor reflections, learner feedback, peer review and impact data. Ultimately, you need to ask, "Did we accomplish what we set out to do?"

Program Outcomes and Learning Experiences

All steps in the program development process should be continually monitored and evaluated to improve effectiveness and efficiencies. Evaluation strategies should be implemented at the designated steps in the process to determine the effectiveness of the learning experiences and measure the program outcomes — learning (short-term), action (intermediate) and condition (long-term) outcomes.

Program-outcome evaluation strategies may vary based on the program. Ensure that evaluation strategies measure whether participants acquired specific learning and to what extent they applied those skills or changed their behavior. Evaluation strategies could include:

  • Pre- and post-surveys.
  • Student response systems (collect data during a presentation).
  • Follow-up surveys (electronic or hard copy).
  • Observations of practices (trends, data if available).
  • Testimonials.

The faculty involved in program development must determine how evaluation data will be reported to faculty, staff, participants, councils and administrators. The communication plan may include a wide variety of communication and marketing strategies as well as traditional reporting methodologies.