The collapse of dying massive stars produces highly energetic supernovae (SNe), which deposit enriched material into the interstellar medium and ultimately shape the evolution of the host galaxy. It is therefore crucial that we understand how massive stars evolve and die, which in turn requires accurate knowledge of the physical properties of stars just before they explode. The mass-loss rates of red supergiants (RSGs) govern their evolution towards supernova and dictate the appearance of the resulting explosion. The amount of mass lost through stellar winds in the RSG phase that immediately precedes the SN is particularly important, both for how the star appears in the run-up to core collapse and for how the explosion itself will appear. Specifically, there have been many recent claims in the literature that stars with masses >17 Msun must experience an extended period of enhanced mass loss before the SN, in which the envelope is entirely stripped. To study how mass-loss rates change with evolution, we have measured the mass-loss rates of RSGs in a sample of clusters in the local Universe. The results indicate that there is little justification for substantially increasing the mass-loss rates during the RSG phase. In fact, I have shown that for the more massive RSGs the mass-loss rates used in evolutionary simulations must be *decreased* by up to a factor of 20. These results show that for most RSG progenitors quiescent mass loss is not effective at removing a significant amount of mass prior to core collapse, and hence there is no single-star evolutionary pathway for the formation of Type Ibc SNe.
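As a rough illustration of why quiescent winds remove so little mass, the total mass lost during the RSG phase is approximately the mass-loss rate multiplied by the duration of that phase. The short sketch below works through this estimate using assumed, order-of-magnitude values (a quiescent mass-loss rate of ~10^-6 Msun/yr, an RSG lifetime of ~10^6 yr, and an envelope mass of ~10 Msun); these numbers are illustrative assumptions, not results from the work summarised above.

```python
# Order-of-magnitude sketch (illustrative assumed values, not results from the talk):
# total mass lost in the RSG phase ~ mass-loss rate x RSG lifetime.

mdot = 1e-6       # assumed quiescent RSG mass-loss rate [Msun/yr]
tau_rsg = 1e6     # assumed duration of the RSG phase [yr]
env_mass = 10.0   # assumed hydrogen-envelope mass of the progenitor [Msun]

mass_lost = mdot * tau_rsg  # ~1 Msun for these assumed values

print(f"Mass lost to quiescent winds: ~{mass_lost:.1f} Msun")
print(f"Fraction of envelope removed: ~{mass_lost / env_mass:.0%}")
# With these numbers only ~10% of the envelope is removed, so quiescent
# winds alone cannot strip the star, consistent with the argument above.
```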