Title: Can the FAA/Airline ASAP Self-Reporting Program Be Applied to the U.S. Healthcare Industry?
Despite detailed inspection regimes, incidents remain common in the airline industry, and some programs and systems for addressing them have gone unexplored. This situation appears to be changing: the government has been enthusiastic about implementing the recent ASAP program through the FAA, the airlines, the unions, and the public. The program depends entirely on the integrity of all parties to the process. If employees withhold relevant information or fail to report it in a timely fashion, or if an employee who meets the reporting criteria is subjected to punitive action, the process will quickly unravel. All parties understand that such a failure would be a major step backwards for the FAA, for all AA maintenance and engineering personnel, and, most significantly, for the flying public.
This paper analyzes the feasibility of applying the FAA's ASAP self-reporting program to the health care industry of the United States. In flight operations, all individuals are encouraged to report any event or observation they feel identifies a potential hazard or safety concern; the analogous expectation in health care would be that health personnel provide exact and detailed information about any defect they observe.
A BRIEF HISTORY OF AVIATION SELF-REPORTING PROGRAMS
In June 1970, a two-year test program was completed that introduced the concept of a joint safety data-sharing program, sponsored by the FAA and the aviation community, to reduce the general aviation accident rate. The test program demonstrated that such a reduction could be achieved.
The NASA Aviation Safety Reporting System (ASRS) was established in 1975 under a Memorandum of Agreement between the Federal Aviation Administration (FAA) and the National Aeronautics and Space Administration (NASA). FAA provides most of the program funding; NASA administers the program and sets its policies in consultation with the FAA and the aviation community. The ASRS collects, analyzes, and responds to voluntarily submitted aviation safety incident reports in order to lessen the likelihood of aviation accidents. ASRS data are used to:
· Identify deficiencies and discrepancies in the National Aviation System (NAS) so that these can be remedied by appropriate authorities.
· Support policy formulation and planning for, and improvements to, the NAS.
· Strengthen the foundation of aviation human factors safety research. This is particularly important since it is generally conceded that over two-thirds of all aviation accidents and incidents have their roots in human performance errors.
THE ASAP PROGRAM
Although the safety record of the U.S. airline industry was considered excellent by the 1990s, it had reached a plateau with minimal improvement. The industry and the regulators were jointly searching for ways to significantly reduce mishaps and incidents, and the Aviation Safety Action Program (ASAP) was developed as one of the solutions. ASAP was introduced to encourage airline flight crews and maintenance staff to voluntarily report safety information that might be critical to error management and to identifying potential precursors to accidents. The FAA's view was that such a program could reduce accidents to a level that would make commercial air travel virtually accident-free.
The primary objective of ASAP is to identify normally unreported “irregular events”. These events would otherwise not be likely to come to the attention of company management, the Federal Aviation Administration (FAA) or flight crews.
Under this program, issues are resolved using corrective methods that incorporate error management and human factors into processes, procedures, and design, rather than through punishment and discipline. These programs are based on partnerships between the employee, the company, the FAA, and, in some cases, the employee's labor organization. Persuading an employee to report his or her own safety issues would undoubtedly be difficult, but ASAP offers enforcement-related incentives and protections to promote participation and make employees more forthcoming.
A panel generally called the Event Review Committee (ERC) is appointed to meet on a regular (weekly) basis and review the issues in each report. The panel then recommends corrective measures to address each hazard.
ASAP also provides a vehicle whereby an airline employee or a member of the ground maintenance staff can safely identify and report safety issues to management and the FAA without prosecution. An ASAP is established through a Memorandum of Understanding (MOU) signed by the FAA, the certificate holder's management, and the employee's labor organization or a representative. The program encourages employees to disclose safety information, which may include possible violations of 14 CFR, without fear of punitive enforcement sanctions or disciplinary action.
The FAA encourages certificate holders to develop programs and submit them to the FAA for review and approval in accordance with the norms formulated. A certificate holder may also develop a demonstration program to measure its effectiveness and ensure that it meets the stringent safety objectives of that specific ASAP. The demonstration program is then assessed by the parties to the agreement for acceptance as a continuing program, subject to review and renewal by the FAA every two years.
As of December 2005, 33 airlines operated ASAPs for pilots, but there were only 10 ASAP MOUs for aircraft maintenance mechanics. Most of these maintenance ASAPs are considered by all stakeholders to be highly effective. Because of the potential benefits to safety, a major interest of the FAA is to determine whether the failure of maintenance ASAPs to spread to multiple operators as rapidly as pilot ASAPs is attributable to the FAA's ASAP policy, to other factors beyond the FAA's control, or to both.
Reports submitted under these programs are reviewed by the Event Review Committee (ERC), comprising a designated representative of the FAA, a representative of the certificate holder, and a member of the employee's labor organization or union, who reach a consensus on whether each report qualifies for inclusion in the ASAP. For the consensus process to succeed, the following must be adhered to:
· The ERC must reach a consensus when deciding whether a report is accepted into the program and when deciding on corrective action recommendations arising from the event, including any regulatory administrative action.
· Recognizing that the regulator holds the statutory authority to enforce the necessary rules and regulations, it is understood that the FAA retains all legal rights and responsibilities contained in Title 49 of the United States Code (49 USC) and FAA Order 2150.3A.
· When the ERC becomes aware of an issue involving the medical qualification or medical certification of an employee, it must immediately inform the appropriate Regional Flight Surgeon.
There are certain guidelines for the acceptance of reports under ASAP that would apply, by analogy, to certificated health care professionals:
· Participation is limited to the certificate holder's employees and to events occurring in that capacity.
· The employee must submit the report in the stipulated time frame.
· The alleged regulatory violation must be inadvertent and must not appear to involve an intentional disregard for safety; and finally,
· The reported event must not appear to involve criminal activity, substance abuse, controlled substances, alcohol, or intentional falsification.
If, however, an employee covered under the MOU has not filed a report and is found to be in possible violation, he or she will be asked to submit a report within 24 hours.
Employees who fail to report an incident within the stipulated period are denied the program’s protection. Under certain circumstances they may be referred to an appropriate office within FAA for additional investigation and/or enforcement action. They can also be referred to law enforcement agencies for any criminal activities.
For employees not covered under the MOU who are found to be in possible violation, the ERC may, on a case-by-case basis, allow the employee to submit a report within 24 hours of notification. Failure to respond in that time can lead to additional investigation and/or enforcement action, and the matter can also be referred to law enforcement agencies.
The following criteria must be met in order for a report to be covered under ASAP:
1. The employee must submit the report in accordance with the time limits specified.
2. Any possible noncompliance with 14 CFR disclosed in the report must be inadvertent and must not appear to involve an intentional disregard for safety.
3. The reported event must not appear to involve criminal activity, substance abuse, controlled substances, alcohol, or intentional falsification.
When a possible noncompliance with 14 CFR is disclosed in a non-sole-source ASAP report and supported by sufficient evidence, it is addressed with administrative action, such as a warning notice or an FAA letter of correction.
Before examining the role FAA/airline programs could play in the constructive development of the ailing U.S. healthcare industry, one point deserves mention: the synopsis above is included so that the reader understands the logic behind applying aviation safety programs and processes elsewhere.
AVIATION VOLUNTARY DISCLOSURE PROGRAMS APPLIED TO HEALTHCARE
Dr. Robert Helmreich represents the University of Texas Center of Excellence for Patient Safety Research and Practice. The center's goals are as follows:
· Translate and implement error measurement methods from aviation;
· Translate and implement error management and reduction methods from aviation;
· Disseminate information about error measurement and reduction to a broad variety of local and national audiences; and
· Build the capacity for patient safety research by integrating existing fellowship and graduate training programs in the UT System with the activities of the Center.
In a joint meeting of Aviation Safety community representatives and U.S Department of Health and Human Services, Dr. Helmreich discussed the application of the aviation error management experience to medicine. He discussed how his own research had evolved from the study of aviation to the study of medicine, and his methodology of observing individuals and teams in their natural work settings, such as the cockpit or the operating room. This experience has led him to classify errors into five categories:
1. Violating regulations (e.g., landing before the aircraft is fully stabilized);
2. Procedural errors (e.g., entering a wrong altitude into the airplane’s computer);
3. Communication (e.g., information is not provided, or misunderstood);
4. Proficiency (e.g., not knowing or being able to perform a necessary act);
5. Decision (e.g., choosing an option that unnecessarily increases risk).
Dr. Helmreich presented the overall distribution of aviation errors across these categories, and then presented a very different distribution for the subset of errors judged to be “consequential” because they led to the commission of additional errors. Violations of regulations were the most common error overall, but only 2% of these violations were consequential; most consequential errors were proficiency or decision errors. However, crews that committed a violation were 1.5 times more likely to commit another error, and those additional errors were 1.8 times more likely to be consequential.
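The relative-risk figures above can be computed directly from observation counts. The sketch below uses hypothetical counts (the underlying line-audit data are not given in this paper) purely to show the arithmetic:

```python
def relative_risk(events_a, total_a, events_b, total_b):
    """Ratio of the event rate in group A to the event rate in group B."""
    return (events_a / total_a) / (events_b / total_b)

# Hypothetical counts: of 100 crews that committed a violation, 30 went on to
# commit another error; of 100 crews with no violation, 20 committed an error.
rr = relative_risk(30, 100, 20, 100)
print(rr)  # a ratio of 1.5 would match the "1.5 times more likely" finding
```

A ratio above 1.0 indicates that the first group is at elevated risk relative to the second.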
It was noted that aviation operations tended to be very extensively proceduralized, whereas medicine was, quite often, at the other extreme. Additionally, discussion of empirical data indicated that the atmosphere of trust essential for aviation-type error reporting programs was not always found in medical environments.
Dr. Helmreich’s research noted some form of conflict in about 10% of the surgical operations he has observed, but in less than 1% of the flight operations. Another example of the cultural differences between medicine and aviation was the lower percentage of physicians than pilots who acknowledged that they:
1. Performed less efficiently in critical situations when fatigued;
2. Were unable to leave personal problems behind in professional situations;
3. Were not as good at making decisions in emergencies as in routine situations.
He then compared the distribution of the five types of errors he had observed in the cockpit to the distribution he had observed in the operating room. He found a much higher frequency of procedural and communication errors in the operating room. He suggested that the excess of procedural errors in medicine was caused by the uncoordinated introduction of new technology, and that the less efficient communication he observed might be due to greater status differences within the medical community than in most aviation professional organizations.
Dr. Stephen Small of Massachusetts General Hospital discussed the potential application of a self-reporting program for error management to other areas of medicine.
Differences were noted between the NASA ASRS and ASAP:
1. ASRS is anonymous, while ASAP is confidential;
2. ASRS is a research tool, while ASAP is a legal alternative to government enforcement of regulations.
Dr. Helmreich suggested that introduction of an error management system comparable to ASAP in one area of medicine, such as blood transfusion, might facilitate its acceptance in other medical areas. He noted that high reliability systems have a leadership committed to safety as a core value, redundancy in critical areas, workers that share management’s commitment to safety, and an organization that is continually in a learning mode; he implied that an ASAP-like program might promote comparable developments in medicine.
Two relatively new health care reporting systems are modeled after ASRS: (1) the VA Patient Safety Reporting System (PSRS) and (2) Applied Strategies for Improving Patient Safety (ASIPS). The Patient Safety Reporting System (PSRS) is a learning program being jointly developed by two federal agencies: the Department of Veterans Affairs (VA) and the National Aeronautics and Space Administration (NASA).
Applied Strategies for Improving Patient Safety (ASIPS) is a demonstration project designed to collect and analyze medical error reports from clinicians and staff in 2 practice-based research networks: the Colorado Research Network (CaReNet) and the High Plains Research Network (HPRN). A major component of ASIPS is a voluntary patient safety reporting system that accepts reports of errors anonymously or confidentially. Reports are coded using a multiaxial taxonomy.
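“Multiaxial taxonomy” means that one report receives independent codes along several dimensions rather than a single category. The sketch below illustrates the idea; the axis names and codes are invented for illustration and are not the actual ASIPS taxonomy:

```python
from dataclasses import dataclass, field

@dataclass
class PatientSafetyReport:
    """A voluntary error report coded along several independent axes."""
    narrative: str
    anonymous: bool = True                     # ASIPS accepts anonymous or confidential reports
    codes: dict = field(default_factory=dict)  # axis name -> code on that axis

    def code(self, axis: str, value: str) -> None:
        self.codes[axis] = value

# Illustrative (not actual ASIPS) axes and codes for one report.
report = PatientSafetyReport("Wrong dose transcribed from a phone order.")
report.code("event_type", "medication error")
report.code("setting", "ambulatory clinic")
report.code("contributing_factor", "communication")
print(len(report.codes))  # three independent axes coded for a single report
```

Coding along independent axes lets analysts slice the same pool of reports by event type, setting, or contributing factor without forcing each report into one bucket.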
Although both systems are based on the ASRS model, they confront different legal barriers to reporting. Physicians employed by VHA hospitals are not subject to the same threat of litigation as physicians who practice in other settings. ASIPS, a Denver-based program, has modified the ASRS model to gather data on medical errors in ambulatory settings (e.g., doctors’ offices).
Unlike their colleagues in the VHA, members of the ASIPS collaborative are engaged in protecting their data from disclosure in litigation. The system designers, anticipating a legal challenge or security breach, are developing methods to ensure that serial numbers and computer identifiers cannot be used to link ASIPS reports to particular medical errors (Pace et al., 2003). Hence, the choice of confidentiality protections varies with the potential exposure to litigation or other threats.
The University of Texas Center of Excellence for Patient Safety Research and Practice has developed a web-based anonymous close call reporting system (www.utccrs.org) that is currently being used in 10 hospitals. To date, over 450 close calls have been reported through the system, and hospitals are using the data to inform improvement efforts. Three Close Call Alerts have been sent to participating hospitals, the FDA, and USP regarding close calls due to labeling and packaging.
The goals of this research project are (a) to encourage the quick and honest flow of information from employees to the company, the TWU, and the FAA concerning potential violations of the FARs or any maintenance condition with safety implications, by freeing the employee from fear of FAA legal action or company disciplinary action in exchange for timely and accurate reporting; and (b) to relate the use of the same approach to the ailing U.S. healthcare industry for better results. The end result will be the quickest possible dissemination of all relevant information, a result that enhances the protection of the public.
Two basic deliverables for this proposal were as follows: (a) a web-based survey questionnaire and (b) research related to error management methodology related to the healthcare and aviation communities that would provide for successful partnering.
For example, discussions with three major U.S. airlines (American Airlines, United Airlines, Aloha Airlines) indicate that the broad issues include employee-management-FAA trust, the labor-management relationship, the level of knowledge about the ASAP program and process, and the workload involved in successfully managing an ASAP program. Based on these data, collected through management meetings and focus-group discussions, a survey questionnaire called the ASAP Questionnaire (AQ) was developed. It is a 12-item Likert-type questionnaire to be administered during the second phase of this project.
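Once administered, a Likert-type instrument like the AQ is typically summarized by item-level statistics. A minimal scoring sketch (the 0-5 response scale is taken from the survey reproduced later in this paper; the response data are made up):

```python
def item_means(responses):
    """Mean score per questionnaire item.

    responses: one list of item scores per respondent, all the same length.
    """
    n = len(responses)
    return [sum(scores) / n for scores in zip(*responses)]

# Three hypothetical respondents answering a four-item excerpt on the 0-5 scale.
answers = [
    [5, 4, 0, 3],
    [4, 4, 1, 2],
    [3, 5, 2, 4],
]
print(item_means(answers))
```

Item means (and, in practice, frequency counts for the “0 = I don't know” responses) give a quick view of where awareness of self-reporting programs is weakest.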
The U.S. health care industry is struggling to survive. Low pay, the decline of physician-owned practices, reduced federal Medicare inpatient and outpatient reimbursements, and changes in Medicaid reimbursements are some of the reasons for the steady decline in the financial performance of hospitals. The industry faces tight budgets, which can result in the hiring of less competent workers. Leaders of the healthcare industry have been looking for solutions amid growing concern among many professionals that health workers fear divulging information and therefore ignore immediate problems; unless something is done relatively quickly, health workers will remain silent for fear of punishment. Aviation offers a point of comparison. A total of 1,727 general aviation accidents occurred during calendar year 2001, involving 1,749 aircraft; this was 110 accidents (6%) fewer than in 2000. Of these accidents, 325 were fatal, resulting in 562 fatalities; both the number of fatal accidents and the number of fatalities decreased by 6% from 2000. In health care, a comparable failure to improve would mean more deaths and further shame for an already declining industry.
APPLYING AIRLINE SELF REPORTING PROGRAMS TO THE HEALTH INDUSTRY
Organizations seek to identify the factors that might cause or contribute to adverse events before these precursors result in further loss of life. But understanding incident precursors is difficult for organizations that must untangle those factors from the myriad of data uncovered after incidents and deaths. An organization attempts to learn from its history of accidents, but such infrequent adverse events yield limited data from which to draw conclusions, especially in the current Western airline industry. In addition, the process of detecting danger signals can be clouded by ambiguities and uncertainties and obscured by redundant layers of protection. Finally, when things go wrong, organizations may use the same data both to discipline those involved and to identify accident precursors; linking data collection with disciplinary enforcement inadvertently creates disincentives for the disclosure of information.

Despite these difficulties, or perhaps because of them, various industries have developed alternative models for detecting and identifying accident precursors. The types of adverse events vary among industries, from airplane crashes in aviation to lethal emissions in the chemical industry to patient injuries and deaths in health care. These harmful events differ in their probability of occurrence; in the distribution of their negative consequences among employees, clients, and the public; in the complexity and interdependence embedded in their technologies; and in the regulatory context in which they operate (Perrow, 1984). For example, although the estimated number of deaths and injuries attributed to preventable adverse events in health care far outnumbers the average loss of life in aircraft accidents, deaths in health care occur one or a few at a time and usually without media attention, whereas an aircraft accident can kill many people in one disastrous, highly publicized moment.
The following case study is used to illustrate the relevance of other types of aviation safety policies and programs in the healthcare industry. Allison P. is a charge nurse on a busy medical surgical unit. She is expecting the clinical instructor from the local university at 2 p.m. to review and discuss potential patient assignments for the nursing students the following day. Just as the university professor arrives, one of the patients on the unit develops a crisis requiring Allison’s attention. In order to expedite the student nurse assignments for the following day, Allison gives her electronic medical record access password to the instructor.
Examine the ethical dilemma
Allison made a commitment to meet with the university instructor to develop student assignments at 2 p.m. The patient emergency that developed prevented Allison from living up to that commitment. Allison had an obligation to provide patient care during the emergency and a competing obligation to the professor. She solved the dilemma of competing obligations by providing her electronic medical record access password to the university professor. By sharing her password, Allison most likely violated hospital policy related to the security of healthcare information. She may also have violated the American Nurses’ Association Code of Ethics in that nurses must judiciously protect information of a confidential nature. Since the university professor was also a nurse and had a legitimate interest in the protected healthcare information there may not be a Nurses’ Code of Ethics violation.
Thoroughly comprehend the possible alternatives available
Allison could have asked the professor to wait until the patient crisis was solved.
Allison could have delegated another staff member to assist the university professor.
Allison could have logged on to the system for the professor.
Do any rules nullify the alternatives? For each, weigh potential benefit against harm.

Alternative 1: Ask the professor to wait until the crisis is resolved.
· Benefits: no policy violation; patient rights are safeguarded; the patient in crisis is cared for.
· Harms: not the best use of the professor's time; may compromise student learning. Best case: the crisis will be short. Worst case: the crisis may take a long time.

Alternative 2: Delegate another staff member to assist the professor.
· Benefits: patient rights are protected; no policy is violated; assignments will be completed; confidentiality of the record is assured.
· Harms: other staff may be equally busy or may not be as familiar with all of the patients; the professor may not have the benefit of ‘expert’ advice. Patient rights may take precedence.

Alternative 3: Log on to the system for the professor.
· Benefits: the professor can begin making assignments; assignments can be completed; the crisis is cared for.
· Harms: may still be a violation of policy regarding system access and access to the medical record; possible compromise of records and of access to information. Patient rights may take precedence.
Hypothesize ethical arguments
Determine which of the five approaches apply to this dilemma.

The Utilitarian Approach
1. Identify the courses of action available to us;
2. Ask who will be affected by each action and what benefits or harms will be derived from each;
3. Choose the action that will produce the greatest benefits and the least harm.
An ethical action is one that provides the greatest good for the greatest number.
Principles: beneficence and nonmaleficence.

The Rights Approach
· Right of the individual to choose for her/himself (autonomy);
· Right to the truth (veracity);
· Right of privacy;
· Right not to be injured;
· Right to what has been promised (fidelity).
Does the action respect the moral rights of everyone?
Principles: autonomy, veracity, and fidelity.

The Fairness or Justice Approach
How fair is an action? Does it treat everyone in the same way, or does it show favoritism and discrimination?
Principles: justice and distributive justice.

The Common Good Approach
This approach assumes that one's own good is inextricably linked to the good of the community, and that community members are bound by the pursuit of common values and goals. It asks us to ensure that the social policies, social systems, institutions, and environments on which we depend are beneficial to all. Examples are affordable healthcare, effective public safety, a just legal system, and an unpolluted environment.
Principle: distributive justice.

The Virtue Approach
This approach assumes there are certain ideals toward which we should strive and which provide for the full development of our humanity. Virtues are attitudes or character traits that enable us to be and to act in ways that develop our highest potential; examples are honesty, courage, compassion, generosity, fidelity, integrity, fairness, self-control, and prudence. Like habits, they become characteristic of the person, and the virtuous person is the ethical person. The approach asks: What kind of person should I be? What will promote the development of character within myself and my community?
Principles: fidelity, veracity, beneficence, nonmaleficence, justice, and distributive justice.
In this case, there is a clear violation of institutional policy designed to protect the privacy and confidentiality of medical records. However, the professor had a legitimate interest in the information, and a legitimate right to the information. Allison trusted that the professor would not use the system password to obtain information outside the scope of the legitimate interest. However, Allison cannot be sure that the professor would not access inappropriate information. Further, Allison is responsible for how her access to the electronic system is used. Balancing the rights of the professor to the information with the rights of the patient to expect that the information be safeguarded and the right of the patient in crisis to expect the best possible care is the crux of the dilemma. Does the patient care obligation outweigh the obligation to the professor? Yes, probably. Allison did the right thing by caring for the patient in crisis. By giving out her system access password, Allison compromised the rights of the other patients on the unit to expect that the confidentiality and privacy would be safeguarded.
Virtue ethics suggests that individuals use power to bring about human benefit. One must consider the needs of others and the responsibility to meet those needs. Allison has to provide care, prevent harm and maintain professional relationships all at the same time.
Allison may want to effect a long-term change in hospital policy for the common good. It is reasonable to assume that this is not an isolated incident and that the problem may recur in the future. Can institutional policy be amended to include professors in the access to medical records system? As suggested in the HIPAA administrative guidelines, the professor could receive the same staff training regarding appropriate and inappropriate use of access and sign the agreement to safeguard the records. If the institution has tracking software, the access could be monitored to watch for inappropriate use.
Identify the moral principles that can be brought into play to support a conclusion as to what ought to be done ethically in this case or similar cases.
The International Council of Nurses Code of Ethics states that “The nurse holds in confidence personal information and uses judgment in sharing this information.” The Code also states, “The nurse uses judgment in relation to individual competence when accepting and delegating responsibilities.” Both of these statements apply to the current situation.
Ascertain whether the approaches generate converging or diverging conclusions about what ought to be done
From the analysis, it is clear that the best immediate solution is to delegate assisting the professor with assignments to another nurse on the unit.
Investigate, compare, and evaluate the arguments for each alternative
See table above.
Choose the alternative you would recommend
Best immediate solution is to delegate another staff member to assist the professor.
Best long-term solution is to change the hospital policy to include access for professors as described above.
Act on your chosen alternative
Allison should delegate another staff member to assist the professor in making assignments.
Look at the ethical dilemma and examine the outcomes while reflecting on the ethical decision.
As already indicated in the analysis of alternatives, delegation may not be an ideal solution, since the staff nurse who is assigned to assist the professor may not possess the same extensive information about all of the patients as the charge nurse. It is, however, the best immediate solution to the dilemma, and certainly safer than compromising computer system integrity. As noted above, Allison may want to pursue a long-term solution to a potentially recurring problem by helping the professor gain legitimate access to the computer system with her own password. That way, the system administrator would have the ability to track who used the system and what types of information were accessed.
Safety Self-Reporting Program Awareness Survey
I. Today’s Date: ___/___/___
II. BACKGROUND INFORMATION
1. Select Employer Type (select all that apply):
e. Other: ____________
2. Job Title:
c. Other: __________
3. Years at current employer:
4. Years in current position:
5. Years of medical experience:
6. Present Shift:
b. First Shift / Swing
c. Second Shift
e. Third Shift
f. Not on shift duty
7. Gender: Male / Female
8. Year of birth: __________
9. City and State: _______________________
10. Highest Education Level (Optional):
a. High School
b. Trade School / A.S. / A.A.
c. B.S. / B.A.
d. M.D. / M.S. / M.A. / Doctoral
11. Work Location:
b. Small practice
d. Other: __________

Response scale: 0 1 2 3 4 5 (0 = I don't know).
Using the scale above, please circle the number that best describes your opinion.
III. Health Care Self-Reporting Program Knowledge and Behaviors
I am familiar with the Aviation Safety Action Program (ASAP) self-disclosure program.
0 1 2 3 4 5
I am familiar with the VA Patient Safety Reporting System (PSRS).
0 1 2 3 4 5
I am familiar with the University of Colorado Applied Strategies for Improving Patient Safety (ASIPS).
0 1 2 3 4 5
I have the necessary academic background to critically review the professional literature and draw my own conclusions about the validity and utility of health care self-disclosure programs.
0 1 2 3 4 5
I currently feel comfortable with my level of knowledge of the terminology, research design, validity and reliability issues, and ethical issues in medical self-disclosure programs.
0 1 2 3 4 5
IV. Effectiveness of Self-Reporting Programs for Identifying Health Care Error Precursors
The research findings published in my professional journals cover the advantages of a medical self-reporting program.
0 1 2 3 4 5
I understand the concept of self-disclosure programs and how they would benefit the health care industry.
0 1 2 3 4 5
Medical error management should be based on data developed from incident and event reporting processes.
0 1 2 3 4 5
V. Incident or Event Reporting Preferences
I generally think that the current incident reporting system is effective and sufficient.
0 1 2 3 4 5
I think that the current incident reporting system could be improved for:
a. The user (health care professional)
b. The patient
c. The safety / error management analyst
0 1 2 3 4 5
I personally hope to be regularly involved in the research process for self-reporting in the future.
0 1 2 3 4 5
The health care professionals I have been exposed to in the field appear to place a high priority on lessons learned from incident reporting.
0 1 2 3 4 5
REFERENCES

Federal Aviation Administration, Aviation Safety Action Program, Advisory Circular 120-66B, November 15, 2002.
National Committee for Quality Assurance, The State of Health Care Quality 2004 (2004).
Rivers, R. M., Swain, D., & Nixon, W. R., "Using aviation safety measures to enhance patient outcomes," AORN Journal, January 2003.
Patients and a Substantial Increase in Health Care Costs, Oct. 7, 2003.