by Jill Stefaniak

“Human performance technology is the study and ethical practice of improving productivity in organizations by designing and developing effective interventions that are results-oriented, comprehensive, and systemic” (Pershing, 2006, p. 6).

According to the International Society for Performance Improvement (ISPI), Human Performance Technology (HPT) is a systematic approach to improving productivity and competence, using a set of methods and procedures — and a strategy for solving problems — for realizing opportunities related to the performance of people. More specifically, it is a process of selection, analysis, design, development, implementation, and evaluation of programs to most cost-effectively influence human behavior and accomplishment. It is a systematic combination of three fundamental processes: performance analysis, cause analysis, and intervention selection, and can be applied to individuals, small groups, and large organizations.

Instructional design and human performance technology are similar in that both are rooted in general systems theory and behavioral psychology. Specifically, ISPI has established 10 performance standards for effective performance improvement design.

Standard 1: Focus on Results or Outcomes
Standard 2: Take a Systemic View
Standard 3: Add Value
Standard 4: Work in Partnership with Clients and Stakeholders
Standard 5: Determine Need or Opportunity
Standard 6: Determine the Cause
Standard 7: Design Solutions Including Implementation and Evaluation
Standard 8: Ensure Solutions’ Conformity and Feasibility
Standard 9: Implement Solutions
Standard 10: Evaluate Results and Impact
(ISPI, 2018)

[Figure: Performance Improvement/HPT Model]

When differentiating between human performance technology (HPT) and instructional design, HPT focuses on applying systematic and systemic processes throughout a system to improve performance. Emphasis is placed on analyzing performance at multiple levels within an organization and understanding what processes are needed for the organization to work most effectively. Systemic solutions take into account how the various functions of an organization interact and align with one another. Through organizational analyses, performance technologists are able to identify gaps in performance and create systematic solutions (Burner, 2010).

While instruction may be one of the strategies created as a result of performance analysis, it is often coupled with other non-instructional strategies. Depending on an instructional designer’s role in a project or organization, they may not be heavily involved in conducting performance assessments. When given the opportunity, it is good practice to understand how performance is being assessed within the organization in order to align the instructional solutions with other solutions and strategies.

While human performance technology and instructional design have two different emphases, they share four commonalities: (1) evidence-based practices; (2) goals, standards, and codes of ethics; (3) systemic and systematic processes; and (4) formative, summative, and confirmative evaluations (Foshay, Villachica, & Stepich, 2014). Table 1 provides an overview of how these four commonalities are applied in human performance technology and instructional design.

Table 1

Four commonalities shared across human performance technology and instructional design

| Commonalities | Human Performance Technology | Instructional Design |
|---|---|---|
| Evidence-based practices | Organizational analyses are conducted to collect data from multiple sources to evaluate workplace performance. | Emphasis is placed on learner assessment to ensure instruction has been successful. |
| Goals, standards, and codes of ethics | ISPI and ATD are two professional organizations that have created workplace standards and professional certification programs. | AECT and ATD are two professional organizations that have created standards for learning and performance. |
| Systematic and systemic processes | Systematic frameworks have been designed to conduct needs assessments and other performance analyses throughout various levels of an organization. | Systematic instructional design models have been designed to guide the design of instruction for a variety of contexts. |
| Formative, summative, and confirmative evaluations | Multiple evaluation methods are utilized to measure workplace performance throughout the organization. | Multiple assessments are conducted throughout the design phase of instruction as well as afterward to ensure the instructional solutions have been successful. |

Performance Analysis

Regardless of context or industry, all instructional design projects fulfill one of three needs within organizations: (1) addressing a problem; (2) embracing quality improvement initiatives; and (3) developing new opportunities for growth (Pershing, 2006). The instructional designer must be able to validate project needs by effectively completing a performance analysis to understand the contextual factors contributing to performance problems. This allows the instructional designer to appropriately identify and design solutions that will address the need of the organization—what is often called the performance gap or opportunity.

The purpose of performance analysis is to assess the desired performance state of an organization and compare it to the actual performance state (Burner, 2010; Rummler, 2006). If any differences exist, it is important for the performance improvement consultant (who may sometimes serve as the instructional designer as well) to identify the necessary interventions to remove the gap between the desired and actual states of performance.  
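To make the gap logic concrete, here is a minimal Python sketch that compares a desired performance state with an actual one. The metric names, values, and the `performance_gaps` function are hypothetical, invented for illustration rather than drawn from any HPT model.

```python
# A minimal sketch of performance gap analysis: compare a desired
# performance state with the actual state and report differences.
# All metric names and values are hypothetical.

def performance_gaps(desired: dict, actual: dict) -> dict:
    """Return desired-minus-actual for each metric that differs."""
    return {
        metric: desired[metric] - actual[metric]
        for metric in desired
        if metric in actual and desired[metric] != actual[metric]
    }

desired_state = {"orders_processed_per_hour": 50, "error_rate_pct": 2.0}
actual_state = {"orders_processed_per_hour": 38, "error_rate_pct": 5.5}

# A nonzero entry signals a gap the consultant would investigate further.
for metric, gap in performance_gaps(desired_state, actual_state).items():
    print(f"{metric}: gap of {gap:+}")
```

In practice, of course, many dimensions of performance resist simple quantification; the point of the sketch is only the desired-versus-actual comparison at the core of the analysis.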

Performance analysis can occur in multiple ways, focusing on the organization as a whole or one specific unit or function. The organizational analysis consists of “an examination of the components that strategic plans are made of. This phase analyzes the organization’s vision, mission, values, goals, strategies, and critical business issues” (Van Tiem et al., 2012, p. 133).  Items that are examined in close detail when conducting an organizational analysis include organizational structure, centrally controlled systems, corporate strategies, key policies, business values, and corporate culture (Tosti & Jackson, 1997). All of these can impact the sustainability of instructional design projects either positively or negatively.

An environmental analysis not only dissects individual performance and organizational performance, but it also expands to assess the impact that performance may have outside the system. Rothwell (2005) proposed a tiered environmental analysis that explores performance through four lenses: workers, work, workplace, and world. The worker level dissects the knowledge, skills, and attitudes required of the employee (or performer) to complete the tasks. It assesses the skillsets that an organization’s workforce possesses. The work lens examines the workflow and procedures; how the work moves through the organizational system. The workplace lens takes into account the organizational infrastructure that is in place to support the work and workers. Examples of items taken into consideration at this phase include checking to see if an organization’s strategic plan informs the daily work practices, the resources provided to support work functions throughout the organization, and the tools that employees are equipped with to complete their work (Van Tiem et al., 2012). World analysis expands even further to consider performance outside of the organization, in the marketplace, or in society. For example, an organization might consider the societal benefits of its products or services.
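One way to picture how Rothwell's four lenses organize an environmental analysis is as a simple data structure that groups findings by lens. In the sketch below, only the four lens names come from Rothwell (2005); the findings themselves are entirely hypothetical examples.

```python
# A sketch of Rothwell's (2005) tiered environmental analysis,
# recorded as findings grouped by lens. The findings are hypothetical.

environmental_analysis = {
    "worker": [
        "Most technicians lack training on the new diagnostic software.",
    ],
    "work": [
        "Repair orders pass through three approval steps before work begins.",
    ],
    "workplace": [
        "The strategic plan is not reflected in daily scheduling practices.",
        "Technicians share a single diagnostic workstation per shift.",
    ],
    "world": [
        "Customers increasingly expect same-day service from competitors.",
    ],
}

for lens, findings in environmental_analysis.items():
    print(f"{lens.upper()}:")
    for finding in findings:
        print(f"  - {finding}")
```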

While instructional designers do not have to be experts in organizational design and performance analysis, they should be fluent in these practices to understand how various types of performance analyses may influence their work. Whether analysis is limited to individual performance, organizational performance, or environmental performance, they all seek to understand the degree to which elements within the system are interacting with one another. These analyses vary in terms of scalability and goals.  Interactions may involve elements of one subsystem of an organization or multiple subsystems (layers) within an organization. For example, an instructional design program would be considered a subsystem of a department with multiple programs or majors. The department would be another system that would fall under a college, and a university would be comprised of multiple colleges, each representing a subsystem within a larger system.

Cause Analysis

A large part of human performance technology is analyzing organizational systems and work environments to improve performance. While performance analysis helps to identify performance gaps occurring in an organization, it is important to identify the causes that are contributing to those performance gaps. The goal of cause analysis is to identify the root causes of performance gaps and identify appropriate sustainable solutions.

While conducting a cause analysis, a performance technologist will consider the severity of the problems or performance gaps and examine the types of environmental supports currently in place (e.g., training and resources for employees) as well as the skillsets of employees (Gilbert, 1978). The performance technologist engages in troubleshooting by examining the problem from multiple viewpoints to determine what is contributing to the performance deficiencies (Chevalier, 2003).
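To illustrate how Gilbert-style categories can structure this troubleshooting, the sketch below sorts suspected causes into the six cells of Gilbert's (1978) Behavioral Engineering Model (three environmental supports and three individual factors). The cell labels paraphrase Gilbert; the example causes and the sorting code are hypothetical.

```python
# A sketch of sorting suspected causes into the six cells of
# Gilbert's (1978) Behavioral Engineering Model: environmental
# supports and individual (repertory) factors. Example causes
# are hypothetical.

BEM_CELLS = {
    "environment": ("information", "resources", "incentives"),
    "individual": ("knowledge", "capacity", "motives"),
}

suspected_causes = [
    ("Performance expectations are not communicated", "environment", "information"),
    ("Outdated equipment on the production line", "environment", "resources"),
    ("No recognition for meeting quality targets", "environment", "incentives"),
    ("New hires have not been trained on the procedure", "individual", "knowledge"),
]

for cause, level, cell in suspected_causes:
    assert cell in BEM_CELLS[level]  # keep each cause in a valid BEM cell
    print(f"[{level}/{cell}] {cause}")
```

A classification like this makes it easier to see when a gap stems from environmental supports rather than individual skill, which in turn points toward non-instructional rather than instructional interventions.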

Non-instructional Interventions

Once a performance technologist has identified the performance gaps and opportunities, they create interventions to improve performance.

“Interventions are deliberate, conscious acts that facilitate change in performance” (Van Tiem, et al., 2012, p. 195).

Interventions can be classified as either instructional or non-instructional. Table 2 provides an overview of the various types of interventions common to instructional design practice.

Table 2

Instructional and Non-instructional Interventions

| Instructional Interventions | Non-Instructional Interventions |
|---|---|
| E-learning | Electronic Performance Support Systems |
| Classroom Training | Workplace Design |
| Web-based Tutorials | Knowledge Management |
| On-the-Job Training | Just-in-Time Support |
| Games and Simulations | Communities of Practice |
|  | Corporate Culture Changes |
|  | Process Re-engineering |
|  | Job Aids |


As mentioned in the discussion of general systems theory, it is imperative that instructional designers be aware of how they interact with the various elements of their system. To maintain positive interactions among these organizational elements, non-instructional interventions are often needed to create a supportive infrastructure. Examples of such infrastructural supports include attending to politics within an organization and promoting an organizational culture that is valued by all departments and individuals within the system and carried out in its processes and services. While a variety of strategies may be used to promote stability within an organization, the non-instructional strategies instructional designers encounter most often include job analysis, organizational design, communication planning, feedback systems, and knowledge management. Table 3 provides examples of how non-instructional strategies may benefit the instructional design process.

Table 3

Non-instructional strategies and their benefits to the instructional design process

| Non-Instructional Strategies | Benefit to the Instructional Design Process |
|---|---|
| Job analysis | Up-to-date job descriptions with complete task analyses provide a detailed account of the tasks conveyed in training. |
| Organizational design | A plan that outlines the organizational infrastructure of a company. Details demonstrate how different units interact and function with one another in the organization. |
| Communication planning | Plans that detail how new initiatives or information are communicated to employees. Examples may include listservs, company newsletters, training announcements, performance reviews, and employee feedback. |
| Feedback systems | Detailed plans for providing employees feedback on their work performance. This information may be used to identify individual training needs and opportunities for promotion. |
| Knowledge management | Installation of learning management systems to track learning initiatives throughout the organization. Electronic performance support systems are used to provide just-in-time resources to employees. |

Organizational design and job analysis are two non-instructional interventions that instructional designers should be especially familiar with, particularly if they are involved in projects that will result in large-scale changes within an organization. They should have a solid understanding of the various functions and departments within the organization and the interactions that take place among them. Organizational design involves the process of identifying the organizational structure necessary to support workflow processes and procedures (Burton, Obel, & Håkonsson, 2015). Examples include distinguishing the roles and responsibilities to be carried out by individual departments or work units, determining whether an organization will have multiple levels of management or a more decentralized approach to leadership, and establishing how these departments work together in the larger system.

Job analysis is another practice that can affect the long-term success of instructional interventions. A job analysis is a process of dissecting the knowledge, skills, and abilities required to carry out the functions listed under a job description (Fine & Getkate, 2014). Oftentimes, a task analysis is conducted to gain a better understanding of the minute details of the job in order to identify what needs to be conveyed through training (Jonassen et al., 1999). If job analyses are outdated or have never been conducted, there is a very good chance that the instructional materials will be misaligned with performance expectations, thus defeating the purpose of training.

Feedback systems are often put in place by organizations to provide employees with a frame of reference regarding how they are performing in their respective roles (Schartel, 2012). Feedback, when given properly, can “invoke performance improvement by providing performers the necessary information to modify performance accordingly” (Ross & Stefaniak, 2018, p. 8). Gilbert’s (1978) Behavioral Engineering Model is a commonly referenced feedback analysis tool used by practitioners to assess performance and provide feedback, as it captures data not only at the performer level but also at the organizational level. This helps managers and supervisors determine the degree of alignment among the various elements in the organization that impact performance (Marker, 2007).

The most recognizable non-instructional interventions may be electronic performance support systems (EPSSs) and knowledge management systems. These are structures put in place to support the training and performance functions of an organization. Oftentimes, EPSSs are used as a hub to house training and support for an employee. Examples extend beyond e-learning modules to include job aids, policies and procedures, informative tools or applications, and other just-in-time supports that an employee may need to complete a task. Knowledge management systems serve as a repository to provide task-structuring support as well as guidance and tracking of learning activities assigned or provided to employees (Van Tiem et al., 2012).
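As a rough illustration of the just-in-time idea behind an EPSS, the sketch below maps tasks to the supports an employee might pull up while working. The task names, resource entries, and `lookup_support` function are all hypothetical, not drawn from any particular EPSS product.

```python
# A rough sketch of the just-in-time lookup at the heart of an
# electronic performance support system: given the task an employee
# is performing, surface the relevant supports. All entries are
# hypothetical examples.

epss_resources = {
    "process_refund": [
        ("job aid", "Refund decision flowchart"),
        ("policy", "Returns and refunds policy, section 4"),
    ],
    "onboard_new_vendor": [
        ("e-learning module", "Vendor compliance basics"),
        ("procedure", "Vendor setup checklist"),
    ],
}

def lookup_support(task: str) -> list:
    """Return the supports registered for a task, or an empty list."""
    return epss_resources.get(task, [])

for kind, title in lookup_support("process_refund"):
    print(f"{kind}: {title}")
```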

Other examples of supportive systems include communities of practice and social forums where employees can seek out resources on an as-needed basis. Communities of practice bring together employees or individuals who perform similar tasks or share common interests (Davies et al., 2017; Wenger, 2000; Wenger, McDermott, & Snyder, 2002).

When selecting an intervention, it is important to select something that will solve the problem or address a particular need of the organization. It is also very important to gather commitment from leadership to implement the intervention and to secure buy-in from other members of the organization that the intervention will work (Rummler & Brache, 2013; Spitzer, 1992; Van Tiem et al., 2012).

Whether the intervention to improve performance is instructional or non-instructional, Spitzer (1992) identified 11 criteria for determining whether an intervention is successful (a simple checklist sketch follows the list):

  1. Design should be based on a comprehensive understanding of the situation. This is where previous performance and cause analyses come together.
  2. Interventions should be carefully targeted. Target the right people, in the right setting, and at the right time.
  3. An intervention should have a sponsor. A sponsor is someone who will champion the activity.
  4. Interventions should be designed with a team approach. The ability to draw upon expertise from all areas of the organization is vital to successful intervention selection.
  5. Intervention design should be cost-sensitive.
  6. Interventions should be designed on the basis of comprehensive, prioritized requirements, based on what is most important to both the individual and the organization.
  7. A variety of intervention options should be investigated because the creation of a new intervention can be costly.
  8. Interventions should be sufficiently powerful. Consider long-term versus short-term effectiveness. Use multiple strategies to effect change.
  9. Interventions should be sustainable. Thought must be given to institutionalizing the intervention over time. To really be successful, the intervention must become ingrained in the organization’s culture.
  10. Interventions should be designed with the viability of development and implementation in mind. An intervention needs human resources and organizational support.
  11. Interventions should be designed using an iterative approach. This occurs during the formative evaluation stage (discussed under the evaluation component of the HPT Model) when multiple revisions will generate interventions to fit the organization.
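As a rough way to operationalize these criteria, the following Python sketch screens a proposed intervention against them as a checklist. The criterion labels paraphrase Spitzer's list; the ratings, example data, and the "revisit anything unmet" rule are hypothetical illustrations rather than part of Spitzer's framework.

```python
# A sketch of screening a proposed intervention against Spitzer's
# (1992) criteria, recorded as yes/no judgments. Ratings and the
# revisit rule are hypothetical.

SPITZER_CRITERIA = [
    "Based on a comprehensive understanding of the situation",
    "Carefully targeted",
    "Has a sponsor",
    "Designed with a team approach",
    "Cost-sensitive",
    "Based on comprehensive, prioritized requirements",
    "Alternative interventions were investigated",
    "Sufficiently powerful",
    "Sustainable",
    "Viable to develop and implement",
    "Designed iteratively",
]

ratings = {criterion: True for criterion in SPITZER_CRITERIA}
ratings["Has a sponsor"] = False  # example: no champion identified yet

unmet = [c for c, met in ratings.items() if not met]
print(f"{len(SPITZER_CRITERIA) - len(unmet)}/{len(SPITZER_CRITERIA)} criteria met")
for criterion in unmet:
    print(f"Revisit: {criterion}")
```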

Forging a Relationship between Human Performance Technology and Instructional Design

While it is not necessary for instructional designers to engage in human performance technology, they may find that, over the course of their careers, they work more like performance technologists than they originally expected. In addition, those who apply human performance technology thinking may be better positioned to design sustainable solutions in whatever organization or system they serve. Human performance technology offers a systems view that allows instructional designers to consider the broader implications of their design decisions and actions. By recognizing the systemic implications of their actions, they may be more inclined to implement needs assessment and evaluation processes to ensure they are addressing organizational constraints while adding value. With the growing emphasis on design thinking in instructional design, we, as a field, are becoming more open to learning how other design fields (e.g., graphic design, architecture, and engineering) can influence our practice; human performance technology, as a design field in its own right, is one more discipline that can improve how we do our work as instructional designers.

Article Source:

Stefaniak, J. (2018). Performance Technology. In R. E. West (Ed.), Foundations of Learning and Instructional Design Technology. EdTech Books. https://edtechbooks.org/lidtfoundations/performance_technology

References

Association for Educational Communications and Technology (n.d.). Organizational training and performance. Retrieved from https://aect.org/ on August 1, 2018.

Association for Talent Development (n.d.). Retrieved from https://atd.org/ on August 1, 2018.

Bertalanffy, L. (1968). General systems theory: Foundations, development, applications. New York, NY: George Braziller.

Burner, K.J. (2010). From performance analysis to training needs assessment. In K.H. Silber & W.R. Foshay (Eds.), Handbook of improving performance in the workplace: Instructional design and training delivery (Vol. 1, pp. 144-183). San Francisco, CA: Pfeiffer.

Burton, R. M., Obel, B., & Håkonsson, D. D. (2015). Organizational design: A step-by-step approach. Cambridge, UK: Cambridge University Press.

Chevalier, R. (2003). Updating the behavior engineering model. Performance Improvement, 42(5), 8-14.

Davies, C., Hart, A., Eryigit-Madzwamuse, S., Stubbs, C., Aumann, K., & Aranda, K. (2017). Communities of practice in community-university engagement: Supporting co-productive resilience research and practice. In J. McDonald & A. Cater-Steel (Eds.), Communities of practice: Facilitating social learning in higher education (pp. 175-198). New York, NY: Springer.

Dick, W., Carey, L., & Carey, J.O. (2009). The systematic design of instruction (7th ed.). Upper Saddle River, NJ: Pearson.

Fine, S. A., & Getkate, M. (2014). Benchmark tasks for job analysis: A guide for functional job analysis (FJA) scales. New York, NY: Psychology Press.

Foshay, W. R., Villachica, S. W., & Stepich, D. A. (2014). Cousins but not twins: Instructional design and human performance technology in the workplace. In J. M. Spector, M. D. Merrill, J. Elen, & M. J. Bishop (Eds.), Handbook of research on educational communications and technology (pp. 39-49). New York, NY: Springer. https://doi.org/10.1007/978-1-4614-3185-5_4

Gilbert, T.F. (1978). Human competence: Engineering worthy performance. New York, NY: McGraw-Hill.

Harless, J. (1973). An analysis of front-end analysis. Improving Human Performance: A Research Quarterly, 4, 229-244.

International Society for Performance Improvement (n.d.). Standards of performance. Retrieved from https://ispi.org/ on August 1, 2018.

Jonassen, D.H., Tessmer, M., & Hannum, W.H. (1999). Task analysis methods for instructional design. New York, NY: Routledge.

Marker, A. (2007). Synchronized analysis model: Linking Gilbert’s behavioral engineering model with environmental analysis models. Performance Improvement, 46(1), 26-32.

Merrill, M. D. (2002). First principles of instruction. Educational Technology Research and Development, 50(3), 43-59.

Pershing, J.A. (2006). Human performance technology fundamentals. In J.A. Pershing (Ed.), Handbook of human performance technology (3rd ed., pp. 5-26). San Francisco, CA: Pfeiffer.

Richey, R.C., Klein, J.D., & Tracey, M.W. (2011). The instructional design knowledge base: Theory, research, and practice. New York, NY: Routledge.

Ross, M., & Stefaniak, J. (2018). The use of the behavioral engineering model to examine the training and delivery of feedback. Performance Improvement, 57(8), 7-20.

Rothwell, W. (2005). Beyond training and development: The groundbreaking classic on human performance enhancement (2nd ed.). New York, NY: Amacom.

Rummler, G.A. (1972). Human performance problems and their solutions. Human Resource Management, 11(4), 2-10.

Rummler, G.A. (2006). The anatomy of performance: A framework for consultants. In J.A. Pershing (Ed.), Handbook of human performance technology (3rd ed., pp. 986-1007). San Francisco, CA: Pfeiffer.

Rummler, G. A., & Brache, A. P. (2013). Improving performance: How to manage the white space on the organization chart (3rd ed.). San Francisco, CA: Jossey-Bass.

Schartel, S.A. (2012). Giving feedback—An integral part of education. Best Practice & Research Clinical Anaesthesiology, 26(1), 77-87.

Smith, P.A., & Ragan, T.L. (2005). Instructional design (3rd ed.). Hoboken, NJ: Wiley.

Spector, J. M., Merrill, M. D., Elen, J., & Bishop, M. J. (Eds.). (2013). Handbook of research on educational communications and technology (4th ed.). New York, NY: Springer.

Spitzer, D.R. (1992). The design and development of effective interventions. In H.D. Stolovitch & E.J. Keeps (Eds.), Handbook of human performance technology (pp. 114-129). San Francisco, CA: Pfeiffer.

Surry, D. W., & Stanfield, A. K. (2008). Performance technology. In M. K. Barbour & M. Orey (Eds.), The Foundations of Instructional Technology. Retrieved from https://edtechbooks.org/-cx

Tosti, D., & Jackson, S. D. (1997). The organizational scan. Performance Improvement, 36(10), 2-26.

Van Tiem, D., Moseley, J.L., & Dessinger, J.C. (2012). Fundamentals of performance improvement: A guide to improving people, process, and performance (3rd ed.). San Francisco, CA: Pfeiffer.

Wenger, E. (2000). Communities of practice and social learning systems. Organization, 7(2), 225-246.

Wenger, E., McDermott, R. A., & Snyder, W. (2002). Cultivating communities of practice: A guide to managing knowledge. Boston, MA: Harvard Business Press.
