It’s been several months since the last post, and in the interim we had a very successful match.  We are incredibly excited to have the opportunity to train the four individuals coming to our department.  From a personal perspective, match day is a polarizing affair – the thrill of first viewing the results and the opportunity to call and welcome our new trainees, mixed with the initial concern that our program wasn’t as high on their list as they were on ours.

More disheartening is scrolling through our overall list of ranked applicants and finding those who did not match.  It would be ignorant to think anything other than luck and circumstance separated my medical school match day from theirs.

Clearly, the match is an imperfect process.  Not only are there highly qualified and deserving unmatched applicants each year, but:

  • The number of applications per applicant continues to increase, likely driving even greater metrics-based screening of applications by programs.
  • The costs to applicants (and programs) are substantial.
  • It is a time-consuming process, occupying a sizable portion of the fourth year of medical school.

The San Francisco Match recently released the 2019 summary match report.  This document shows that the average matched applicant submitted 75 applications this past fall.  In 2004, this number was 41; in 2009, it was 50.  Despite this significant increase in the number of applications, the competitiveness of the match (percent matching) has not changed.

We recently performed a financial analysis of the 2018 match and found that, conservatively, the mean estimated cost to match for an ophthalmology applicant was $6,613, with an aggregate of $4,636,950 spent by all applicants.  We estimated that our department spent a total of $179,327 in direct and indirect costs over four interview days, or $3,736 per interviewed applicant.
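
For readers who like to check the arithmetic, the head counts implied by those averages fall out of simple division.  A back-of-envelope sketch in Python; the applicant and interviewee counts below are derived from the reported figures rather than quoted from the published analysis:

    # Back-of-envelope check of the 2018 cost figures (derived, not official counts).
    mean_cost_per_applicant = 6_613        # reported mean cost to match
    aggregate_applicant_cost = 4_636_950   # reported total spent by all applicants
    dept_total_cost = 179_327              # reported departmental cost over four interview days
    cost_per_interviewee = 3_736           # reported departmental cost per interviewed applicant

    implied_applicants = aggregate_applicant_cost / mean_cost_per_applicant
    implied_interviewees = dept_total_cost / cost_per_interviewee

    print(f"Implied applicant pool: ~{implied_applicants:.0f}")    # ~701 applicants
    print(f"Implied interviewees:   ~{implied_interviewees:.0f}")  # ~48 interviewed applicants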

In the current system, applicants are incentivized to apply to as many programs as possible, while programs respond in large part by limiting interview offers to candidates with pre-approved metrics and stronger objective criteria on their applications.  What can be done to stem the swell and improve this?  Multiple suggestions have been brought forth recently, and I’ll comment on a few of the more common themes.

A mutually beneficial option would be to limit the number of applications an individual can submit.  Data from the 2017 and 2018 ophthalmology residency matches found that the number of interviews offered did not increase beyond 40 applications.  Using this number as a cap, application costs would decrease from $1,665 to $410 per applicant.  Of course, this would come at the cost of the SF Match and its beneficiaries, with an estimated 80% loss in revenue if no further changes were made to the tiered cost structure.  It would similarly result in an average of 176 fewer applications received per program and an estimated 14.6 hours saved reviewing applications at the program level.  This approach invites several reasonable objections, reviewed in detail here.  Regardless of its potential merits, leaders in ophthalmology and the ACGME have both suggested this possibility is exceedingly unlikely given each applicant’s consumer right to apply to as many programs as financially feasible.
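
To make the tiered-cost point concrete, here is a toy sketch of how a cap interacts with a tiered fee schedule.  The registration fee and tier prices below are entirely hypothetical placeholders, not the actual SF Match schedule, and the outputs will not reproduce the $1,665 and $410 figures above; the point is only that when marginal applications cost more, a cap saves disproportionately.

    # Toy tiered application-fee model (hypothetical prices, NOT the real SF Match schedule).
    def application_fee(n_apps: int) -> int:
        """Total fee for n_apps applications under a made-up tiered price structure."""
        flat_registration = 60                                      # hypothetical flat fee
        tiers = [(10, 0), (20, 15), (30, 20), (float("inf"), 30)]   # (apps in tier, price per app)
        fee, remaining = flat_registration, n_apps
        for tier_size, price in tiers:
            in_tier = min(remaining, tier_size)
            fee += in_tier * price
            remaining -= in_tier
            if remaining == 0:
                break
        return fee

    print(application_fee(75), application_fee(75) / 75)  # at the current average of 75 applications
    print(application_fee(40), application_fee(40) / 40)  # under a hypothetical 40-application cap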

Another proposal is to conduct an interview match prior to the standard match process.  After applications have been submitted and reviewed, both applicants and programs would create rank lists and use the same matching algorithm to fill a more limited number of interview spots.  Individual programs would be able to adjust their interview limit based on competitiveness.  In this system, both parties would theoretically interview their preferred counterparts while requiring fewer total interviews.  This proposal was initially made for the surgical fellowship interview process and would likely need to be trialed first on a smaller scale (such as ophthalmology, perhaps) before widespread consideration.  Further, this system requires applicants to signal interest in a program before the interview itself; there are multiple examples (including my own) of a relatively surprising interview invitation and experience ultimately influencing an applicant’s rank list and match results.
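
For those curious what “the same matching algorithm” looks like under the hood, it is essentially deferred acceptance, the stable-matching procedure in the Gale–Shapley family that underlies the residency matches.  A minimal applicant-proposing sketch with per-program interview capacities, using made-up names and preferences purely for illustration:

    # Minimal applicant-proposing deferred acceptance for interview slots.
    # Programs tentatively hold their most-preferred proposers up to an interview capacity.
    def interview_match(applicant_prefs, program_prefs, capacity):
        """applicant_prefs / program_prefs: dicts of ordered preference lists.
        capacity: interview slots per program.  Returns program -> held applicants."""
        rank = {p: {a: i for i, a in enumerate(prefs)} for p, prefs in program_prefs.items()}
        next_choice = {a: 0 for a in applicant_prefs}        # next program each applicant will try
        held = {p: [] for p in program_prefs}                # tentatively offered interviews
        free = [a for a in applicant_prefs]                  # applicants still seeking a slot

        while free:
            a = free.pop()
            prefs = applicant_prefs[a]
            if next_choice[a] >= len(prefs):
                continue                                     # applicant has exhausted their list
            p = prefs[next_choice[a]]
            next_choice[a] += 1
            if a not in rank[p]:
                free.append(a)                               # program did not rank this applicant
                continue
            held[p].append(a)
            held[p].sort(key=lambda x: rank[p][x])           # keep the program's most-preferred proposers
            if len(held[p]) > capacity[p]:
                bumped = held[p].pop()                       # least-preferred held applicant is released
                free.append(bumped)
        return held

    # Hypothetical toy example: two programs with two interview slots each, three applicants.
    applicants = {"A": ["P1", "P2"], "B": ["P1", "P2"], "C": ["P1", "P2"]}
    programs = {"P1": ["C", "A", "B"], "P2": ["A", "B", "C"]}
    print(interview_match(applicants, programs, {"P1": 2, "P2": 2}))
    # -> {'P1': ['C', 'A'], 'P2': ['B']}

The hold-and-bump step is the key: no interview slot is finalized until displaced applicants have worked down their lists, which is what makes the resulting assignment stable.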

A final proposal tested a computer model of the 2014 Otolaryngology match and found that giving applicants the opportunity to signal preference to programs leads to an increase in overall interview invitations and allows programs to review applications more “holistically” rather than relying on strict cut-off parameters.  This system is entirely voluntary: at the time of initial application, the applicant can choose to reveal whether a program is among their top choices.  Early editorials on this approach have been very favorable.
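
A toy illustration of the mechanism, with an entirely hypothetical cutoff and data rather than anything from the published Otolaryngology model: a program that normally screens on a numeric cutoff also pulls any applicant who spent a signal on it into full review, so signaled applications reach human eyes that a strict cutoff would have filtered out.

    # Toy preference-signal screen (hypothetical cutoff and data, not the published model).
    def screen(applications, score_cutoff):
        """Each application is (name, score, signaled).  Without signals a program reviews
        only applicants at or above the cutoff; with signals, signaled applicants are also
        pulled into full ('holistic') review regardless of score."""
        by_cutoff = [a for a in applications if a[1] >= score_cutoff]
        with_signals = [a for a in applications if a[1] >= score_cutoff or a[2]]
        return by_cutoff, with_signals

    apps = [("A", 250, False), ("B", 238, True), ("C", 232, False), ("D", 245, True)]
    cutoff_only, holistic = screen(apps, 240)
    print([a[0] for a in cutoff_only])  # ['A', 'D'] pass the numeric cutoff alone
    print([a[0] for a in holistic])     # ['A', 'B', 'D'] signaled applicant B now gets reviewed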

While each of these suggestions has relative advantages, any change would require governing bodies to act.  Impetus aside, the financial implications of these and any other proposed changes will be important.  It is worth noting that 2015 fees for all Electronic Residency Application Service (ERAS) applications totaled $72 million, representing approximately 40% of the Association of American Medical Colleges’ operating revenue for that year.  It may be naïve to hope any substantial change favors the pockets of the students.

Lastly, one seemingly universal need in this process is increased transparency.  Programs should divulge the internal metrics used to screen applicants, along with other pertinent information, so that applicants have enough information to make educated decisions about where to apply selectively.  There has been movement within the Association of University Professors of Ophthalmology (AUPO), the voice of academic ophthalmology, to create standardized statistics and disclosures for all of our programs that would be readily available to applicants.  While still in the development phase, this is one change that appears quite likely in the near future.  I hope we start to see more.