Special Focus Video: Dealing with ACGME Surveys
Video Transcription
Hello, my name is Jim Arrighi. I'm the Director of Graduate Medical Education at Rhode Island and The Miriam Hospitals at Brown University in Providence, Rhode Island, and I'm a former program director in cardiology. Today I'd like to review some tips for program directors on how to deal with ACGME surveys of fellows and faculty. First, let's review some basic background information about the ACGME survey. The survey is directed to all of your fellows and your core faculty. Core faculty are defined by the program director based on their engagement in the teaching and evaluation structure of the program, in accordance with the new common program requirements. Note that core faculty are no longer defined based on the number of hours they devote to the program. The survey is conducted between January and April of each academic year. The ACGME communicates the opening of the survey to programs and the program director. The program is responsible for notifying the fellows and faculty and sending them the proper links to the survey. The program director must also monitor compliance with the survey, which is available on the ACGME ADS website, to ensure that the required number of fellows and faculty respond. Results of the survey are usually released in May, which allows time for the program director to use the information during the annual program review process, which typically occurs toward the end of the academic year. The survey for fellows assesses a number of important areas. First, duty hours, which includes questions on all the major duty hour rules and gives the fellow an opportunity to explain why they are exceeding duty hours, for example, because of patient care needs or additional educational experiences they want to participate in.
The survey assesses faculty supervision, interest in resident education, and whether or not faculty create an environment of inquiry. The evaluation section of the survey asks questions about the ability to access evaluations, satisfaction that evaluations are confidential, that fellows are given the opportunity to evaluate the program, and that they are satisfied with feedback after assignments. The educational content section asks questions about whether or not fellows are provided with goals and objectives for their assignments, how they are instructed to manage fatigue, their opportunities for scholarly activity, whether there is an appropriate balance of service obligations versus education, and whether they see patients across a variety of relevant settings. The resources section asks questions about several resources typical of an academic medical center, such as access to reference materials and electronic medical records, but importantly also asks whether fellows are provided a way to transition care when fatigued, and whether fellows can raise concerns about their program without fear of intimidation. The section on patient safety and teamwork asks general questions about the culture as it relates to patient safety, the experience of the fellow in working on interprofessional teams, and whether or not the fellow has participated in quality improvement or patient safety related activities. Finally, the fellow is asked to provide an overall impression or evaluation of the program, which is quite important with regard to the ACGME's assessment of the program, as we will discuss in future slides. The faculty survey, which will not be the subject of in-depth discussion in this presentation, covers related areas that include faculty supervision and teaching, educational content of the program, resources, patient safety culture, teamwork, and overall satisfaction with the program.
These questions are asked from a faculty perspective and generally provide additional data that is complementary to the resident survey. It is much less common for programs to have issues on the faculty survey compared to the fellow survey, but when you do see a signal of potential issues on the faculty survey, it should be taken seriously and investigated fully. Now let's move on to the 10 tips for dealing with the ACGME survey. Tip number one, don't panic. The fellow survey is only one of the many annual data elements that the RRC uses to assess your program. Now, it is an important element, given that there is so much information contained in the survey, and it is one of the more rigorous methods of obtaining data about your program. But it is only one data element. The other data elements are listed here and have been reviewed in other presentations. It is important to note that to date, the RRC has not overreacted to survey issues. By that, I mean that should you see a negative signal on your survey, the RRC typically will allow you time to address that issue and will not immediately issue citations or take other harsh accreditation actions based on a survey alone, particularly if it is the first year of a particular negative signal. Tip number two, look at the bar graph, which is located at the top left of your survey. The bar graph is presented for each of the domains previously mentioned and compares your program, in blue, to national means, in yellow. These national means are specifically for cardiology or your particular specialty. Importantly, the mean in each of these domains already takes into account that the individual questions may be weighted differently by the RRC. Thus, the differences in weighting of various questions, for example in the evaluation domain, are accounted for in the overall mean as presented in these bar graphs.
Therefore, you want to pay particular attention when your program mean on these bar graphs falls significantly below national means. There is no magic number that defines what significantly below national means should translate to, but suffice it to say that if you are far below the mean, you should probably act to investigate. Tip number three, look at trends. I'll say it again, trends. The RRC looks for downward trending scores in each of the major domains. The program represented on the top has downward trending scores in many of the major domains that are measured. This program director should be concerned and should investigate why his or her program has these low scores. This kind of survey result may lead the RRC to assign an area for improvement, for example. If survey trends in the negative direction are sustained, a citation may be issued. The program represented in the bottom panel, however, demonstrates a downward trend in the survey numbers in one year with a recovery in the year following. This is exactly what the RRC wants to see, as this is evidence that the program noted that it was having issues on a survey and addressed those issues, resulting in a one-year downward blip but no real downward trend in any major domain. This kind of survey result is not a problem. Tip number four, the survey should be viewed as an instrument that is sensitive but not specific, much like a screening test is used medically. Thus, analysis of individual questions may show that you score low on a question such as supervision, for example. This does not necessarily mean you have a supervision problem.
The signal of low scores on supervision or instruction or environment of inquiry, in this case, an example given in the faculty domain, may represent non-specific signals that the fellows have some issues with the program, but in my experience, these issues could be in areas other than what is implied by the specific question topics. Therefore, it is important to investigate low signals on questions in a broad sense without making any assumptions about what the real problem is. Tip number five, since I noted that the survey is sensitive but not specific, it is important to investigate poor surveys. Investigation could be done in a number of ways, but often includes meetings with the fellows, as well as meetings with the faculty to review the survey results. It is probably a good idea to discuss survey results at your annual program evaluation and determine what areas of improvement are necessary. If the issues or potential issues noted on the survey are quite sensitive, or if you believe you may not be able to conduct a thorough investigation, do not hesitate to get your GME office involved. The GME office may perform a special program review and can do so independently of your own administrative, faculty, and division structure, and therefore may give you information that is unbiased and perhaps more informative than what you can obtain on your own. Eventually, what you really need to do is make a diagnosis on the basis of the survey as to what issues exist in your program and then generate an action plan to address them. Tip number six, develop an action plan. This is done, in part, through your annual program evaluation process, where you develop action plans and track them longitudinally over years.
In addition to this internal plan, and in particular if your survey results are substantially off in several domains, it is prudent, although not required, to put a brief summary of action plans related to substandard survey results in ADS in the major changes and other updates section. Said another way, do not be afraid to acknowledge in this section of ADS, when performing an annual update, that you have noted the survey has taken a turn for the worse, and indicate briefly what your plans are to address it. This sends the RRC a message that you recognize there may be an issue and that you are addressing it locally. It may prevent the RRC from giving you an area for improvement or a citation. Tip number seven, your survey will include specialty-specific questions on the last page of the survey. These questions are administered to senior fellows only, that is, those fellows who are near graduation. They represent a surrogate for clinical experience in a very general sense. They ask questions about space for teaching and whether the fellow, in the end, had a clinical experience covering the breadth and depth of the specialty. These questions are important in that they give an overall sense of the fellow's satisfaction with the program toward the end of the training period and, again, are used as a very rough gauge of the overall clinical experience that the program provides. Tip number eight, recently, a well-being section was added to the survey. The questions in this section relate to assessing wellness and burnout among your fellows. These survey questions are quite new, and their utility is still emerging and will develop over time. However, if you should see that most of your fellows report a sense of unwellness or burnout, it probably warrants investigation to determine whether there is a real issue in the program.
Tip number nine, although I previously mentioned the importance of comparing the bar graph means to national means and the importance of trends, there are some questions on the survey that are more highly weighted by the RRC. These weightings may well change in the years ahead, so this should be considered a reflection of the current status of the survey. The question that tends to be most highly weighted is the overall evaluation, at the top right of the survey form. It was weighted highly based on the ACGME's analysis of the performance of the survey over the past decade, which suggested that when residents evaluate the program overall as negative or very negative, there is very likely a serious issue in the program. The next questions that are weighted quite highly are the duty hour questions, in particular the 80-hour rule. Note that with the recent change in the duty hour requirements, it is likely that the ACGME enforces the 80-hour rule with particular rigor. Questions related to supervision are important, as supervision is considered key for patient safety. And finally, the question about whether the residents or fellows can raise concerns without fear of intimidation has always been important to the RRC. Tip number 10 pertains to programs with fewer than four fellows, which in cardiology are typically sub-subspecialty programs. The first difference is that the ACGME requires a 100% completion rate for several years in a row for these small programs in order to release the data to the program. Secondly, the program director must use the multi-year report feature in ADS to view reports, which are presented in aggregate over a three-year period. If the program director selects the usual single-year survey report, he or she will not see any data. The multi-year report feature must be selected in ADS to view these results.
The results are a bit more difficult to assess than the surveys I have shown in previous slides, mainly because the number of fellow respondents is smaller, and at present there is no trend feature in the presentation of the survey, nor are there easy comparators to national means. However, these multi-year surveys can give useful information to smaller programs, particularly when you compare the surveys from one year to another to identify locally whether there are questions trending in the negative direction. I hope that the information I have presented on dealing with ACGME surveys is helpful to you. I'd like to stress three points. First, if you think you have significant issues and are confused by the interpretation of the survey and how to handle it, don't hesitate to reach out to ACGME RRC staff. Secondly, please network with your fellow program directors through the American College of Cardiology's Training Directors Council and resources. Thirdly, don't hesitate to utilize the expertise of your local GME office, which typically has experience dealing with surveys from a variety of programs and issues. Thank you, and good luck.
Video Summary
The video provides tips for program directors on how to deal with ACGME surveys of fellows and faculty. The ACGME survey is directed to all fellows and core faculty, and it assesses various areas such as duty hours, faculty supervision, educational content, resources, patient safety, and overall program evaluation. The survey is conducted between January and April, with results usually released in May. The video emphasizes the importance of monitoring compliance with the survey and investigating any negative signals or downward trending scores. It suggests developing an action plan to address any issues identified and utilizing the expertise of the GME office and networking with other program directors for support. Overall, the video provides guidance on interpreting and managing the ACGME survey to improve program quality.
Keywords
ACGME surveys
program directors
fellows
faculty
compliance monitoring
program quality