In the context of Thermodynamics and Statistical Mechanics we encounter, essentially, three different definitions of entropy:
First definition:
Consider an isolated macroscopic system; it has a macrostate and a microstate. The macrostate is the collection of its macroscopic properties, while the microstate is the precise condition of the constituent parts of the system. The entropy $S$ is then defined as the following quantity:
$$S= k_B\ln{\Gamma} \tag{1}$$
where $k_B$ is the Boltzmann constant and $\Gamma$ is the number of possible microstates compatible with the macrostate the system is in.
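To make the first definition a bit more concrete, here is a minimal sketch (a toy example of my own, not taken from any particular textbook): for $N$ two-state spins with the macrostate "exactly $n$ spins up", the compatible microstates are the $\binom{N}{n}$ possible arrangements, so $(1)$ gives $S = k_B \ln \binom{N}{n}$.

```python
from math import comb, log

k_B = 1.380649e-23  # Boltzmann constant in J/K

def boltzmann_entropy(N, n_up):
    """Entropy of definition (1) for a toy macrostate:
    N two-state spins with exactly n_up spins pointing up."""
    Gamma = comb(N, n_up)        # number of compatible microstates
    return k_B * log(Gamma)      # S = k_B ln(Gamma)

print(boltzmann_entropy(100, 50))  # roughly 9.2e-22 J/K
```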
Second definition:
Same setup as before, but now the entropy is defined as:
$$S=-k_B\sum _i p_i \ln{p_i} \tag{2}$$
where $p_i$ is the probability of the system being in the $i$-th microstate (again, drawn from the pool of compatible microstates).
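Again, just as a sanity-check sketch of my own (the biased distribution below is made up purely for illustration): definition (2) reduces to $k_B\ln\Gamma$ when the $p_i$ are uniform, and gives something smaller otherwise.

```python
import numpy as np

k_B = 1.380649e-23  # Boltzmann constant in J/K

def gibbs_entropy(p):
    """Entropy of definition (2) for a probability distribution over the compatible microstates."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]                      # convention: 0 * ln(0) = 0
    return -k_B * np.sum(p * np.log(p))

uniform = np.full(4, 0.25)                 # equal weights over Gamma = 4 microstates
biased  = np.array([0.7, 0.1, 0.1, 0.1])   # some non-uniform assignment

print(gibbs_entropy(uniform))  # equals k_B * ln(4)
print(gibbs_entropy(biased))   # strictly smaller than k_B * ln(4)
```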
Third definition:
We completely change the setup: consider a reversible cyclic thermodynamic process; then we simply define entropy as a generic function of the state variables (i.e. a state function) with the following property:
$$dS=\frac{dQ}{T} \tag{3}$$
where $T$ is the temperature of the system in which the thermodynamic process occurs and $dQ$ is the infinitesimal amount of heat reversibly transferred to the system.
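As a concrete (and, I believe, standard textbook) instance of (3): for $n$ moles of an ideal gas expanding reversibly and isothermally from $V_1$ to $V_2$, the internal energy does not change, so the heat absorbed equals the work done and
$$\Delta S=\int \frac{dQ}{T}=\frac{1}{T}\int_{V_1}^{V_2}p\,dV=\frac{1}{T}\int_{V_1}^{V_2}\frac{nRT}{V}\,dV=nR\ln{\frac{V_2}{V_1}}$$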
So you can see for yourself that, at least at first glance, the definition of entropy is quite fragmented and confusing.
Fortunately we can easily unify the first and second definitions by invoking the second fundamental postulate of Statistical Mechanics, the principle of indifference, which tells us that:
$$p_i=\frac{1}{\Gamma}$$
and then, with a little bit of work, we can easily show that the two definitions are equivalent.
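Explicitly, substituting $p_i=1/\Gamma$ into (2) and summing over the $\Gamma$ compatible microstates:
$$S=-k_B\sum_{i=1}^{\Gamma}\frac{1}{\Gamma}\ln{\frac{1}{\Gamma}}=-k_B\,\Gamma\,\frac{1}{\Gamma}\left(-\ln{\Gamma}\right)=k_B\ln{\Gamma}$$
which is exactly (1).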
But even with this improvement the picture remains fragmented, mainly because of two distinct problems:
Fragmentation between Classical and Quantum Mechanics: How do we count the number of possible microstates $\Gamma$ from a non-quantum, classical perspective? What I mean is: in the first two definitions we sort of took for granted that the number of compatible microstates is finite, but of course this can only be true in a quantum scenario, where we hope that the phase space of position and momentum is quantised. What if this is not the case? What if we want to define entropy statistically in a classical, non-quantum scenario? How do we define $\Gamma$ in the classical phase space of position and momentum? Can statistical entropy only be defined quantum mechanically?
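To make the problem concrete: the prescription I have seen quoted replaces the counting by a phase-space volume, something like
$$\Gamma \sim \frac{1}{N!\,h^{3N}}\int d^{3N}q\, d^{3N}p$$
where the integral runs over the region compatible with the macrostate; but the cell size $h^{3N}$ (and the $N!$) look like they are put in by hand from the quantum side, which is exactly the kind of ambiguity I am asking about.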
Fragmentation between Classical Thermodynamics and Statistical Mechanics: How do we show that the first two definitions are compatible with the third? This seems challenging, especially because the third definition appears to be really vague. This is sort of a bonus question, because I found this previous question tackling the same problem; but I think the answer provided there is rather poor and excessively concise, and it would be nice to complete this picture all in one question. Feel free to ignore this one if you think it would be a useless repetition of the answer I linked.