As I understand it, entropy was first introduced as a state function of a macroscopic system, based on observations such as those of Clausius and Kelvin concerning the directionality of spontaneous processes. Boltzmann then introduced the idea of statistical entropy (I think this is referred to as Boltzmann entropy) as a measure of the number of possible microstates that give rise to a given macrostate. While I understand why a macrostate would evolve so as to increase the Boltzmann entropy, based entirely on statistics/probability, I have not seen any statement or explanation of the numerical equivalence of classical entropy, $dS=\frac{dQ_{rev}}{T}$, and Boltzmann/statistical entropy, $S=k_B\ln W$, if indeed there is one. It is not at all clear to me that these should be the same. Perhaps they are not, and the fact that both are referred to as 'entropy' is simply because they both fundamentally relate to the number of possible microstates?
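To make my confusion concrete, the only numerical comparison I can produce myself (and I may well be mis-stating it) is the free expansion of an ideal gas of $N$ molecules from volume $V$ to $2V$. Thermodynamically, evaluating $\int \frac{dQ_{rev}}{T}$ along a reversible isothermal path gives
$$\Delta S = N k_B \ln 2,$$
while statistically the accessible volume per molecule doubles, so the multiplicity scales as $W \to 2^N W$ and
$$\Delta S = k_B \ln\!\left(2^N W\right) - k_B \ln W = N k_B \ln 2.$$
The two numbers happen to agree here, but I don't see why they should agree in general.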
Then there is also information (Shannon) entropy, which Wikipedia relates mathematically to Boltzmann entropy (although conceptually I still struggle to understand the connection).
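If it helps to pin down which relation I mean: as far as I can tell, the Shannon entropy $H = -\sum_i p_i \ln p_i$ reduces to $\ln W$ when all $W$ microstates are equally probable, so that $S = k_B H$ becomes the Boltzmann formula. A minimal sketch of that check (my own illustration, with an arbitrary choice of $W$, not taken from the article):

```python
import numpy as np

def shannon_entropy(p):
    """Shannon entropy H = -sum_i p_i ln p_i (in nats); zero-probability states are skipped."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return -np.sum(p * np.log(p))

W = 1000                          # hypothetical number of microstates in a macrostate
p_uniform = np.full(W, 1.0 / W)   # equal a priori probabilities over the microstates

# For a uniform distribution, H equals ln(W), so S = k_B * H recovers S = k_B * ln(W).
print(shannon_entropy(p_uniform), np.log(W))
```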
Now it seems that the three types of entropy above all concern the same property, namely the statistical likelihood of a macroscopic state made of many 'particles'/'elements'.
However, I have recently also come across quantum-mechanical entropy, topological entropy (both in a purely mathematical sense and in physics), and many other types of entropy whose names I no longer recall...
My question then is: what exactly is the scope of use of the term 'entropy'? In a physical/mathematical/quantum-information context, are the system properties referred to as 'entropy' mathematically equivalent (for the common uses of the term, at least), or are they merely related through being, fundamentally, about probabilities of microstates? And what about uses of the term in broader contexts such as those mentioned above, or in biology, etc.?
I understand that any term, even one originally defined in an entirely technical sense, will inevitably end up being used loosely in other contexts and by people who don't understand it very well. So my question relates only to well-established uses of the term 'entropy', albeit in different areas.