Edit: This answer is basically a mathematical supplement to the other two answers provided, which both make sense to me.
How do we define entropy?
- Thermodynamically: studying an ideal gas in a Carnot cycle led to the realization that the heat absorbed isothermally at some temperature $T_1$, divided by that temperature, equals the heat surrendered isothermally at some temperature $T_2$, divided by that temperature $(1)$. This eventually led to the Clausius inequality $(2)$ and ultimately a definition of entropy $(3)$.
$$\frac{\Delta Q_1}{T_1}=\frac{\Delta Q_2}{T_2} \tag{1}$$
$$\oint\frac{dQ}{T} \leq 0 \tag{2}$$
$$dS \geq \frac{dQ}{T} \tag{3}$$
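For concreteness, here is a minimal numerical check of $(1)$ for an ideal-gas Carnot cycle; all of the numbers, and the choice of a monatomic gas, are my own illustrative assumptions:

```python
import numpy as np

# Toy Carnot cycle for an ideal monatomic gas; all numbers are illustrative.
n, R = 1.0, 8.314             # mol, J/(mol K)
gamma = 5.0 / 3.0             # monatomic ideal gas
T_hot, T_cold = 500.0, 300.0  # K

# Isothermal expansion at T_hot from V_a to V_b (arbitrary volumes):
V_a, V_b = 1.0, 3.0
# The adiabats fix the cold-leg volumes via T V^(gamma-1) = const:
V_c = V_b * (T_hot / T_cold) ** (1.0 / (gamma - 1.0))
V_d = V_a * (T_hot / T_cold) ** (1.0 / (gamma - 1.0))

Q_hot = n * R * T_hot * np.log(V_b / V_a)    # heat absorbed at T_hot
Q_cold = n * R * T_cold * np.log(V_c / V_d)  # heat surrendered at T_cold

print(Q_hot / T_hot, Q_cold / T_cold)  # the two ratios in (1) agree
```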
- Through statistical mechanics: imagine two separate boxes of gas which can exchange heat, but not volume or particles. We discover that the thermodynamic beta, $\beta$, is equal for the two systems once they are allowed to reach equilibrium $(4)$. In addition, the conservation of energy $(5)$ may be restated in terms of the entropy given in $(3)$, assuming a reversible process. Combining these two facts and invoking Boltzmann's postulate that the number of microstates, $\Omega$, is maximized at equilibrium, we arrive at the Boltzmann entropy $(6)$.
$$\frac{\partial \ln \Omega}{\partial U} = \beta \tag{4}$$
$$dU = -PdV +dQ=-PdV+TdS \tag{5}$$
$$\frac{\partial S /\partial U}{\partial \ln \Omega / \partial U}= \frac{1}{\beta T} = k_B$$
Upon integrating this equation with $dV=0$ and ignoring the additive constant,
$$S=k_B \ln \Omega \tag{6}$$
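A minimal sketch of $(6)$ in action, assuming a toy system of $N$ two-level particles with $n$ excited (the function name and the numbers are my own illustration):

```python
from math import comb, log

k_B = 1.380649e-23  # J/K

def boltzmann_entropy(N, n):
    """S = k_B ln(Omega) for N two-level particles with n excited."""
    omega = comb(N, n)  # number of microstates in this macrostate
    return k_B * log(omega)

print(boltzmann_entropy(100, 0))   # one microstate: S = 0, the "floor"
print(boltzmann_entropy(100, 50))  # maximal multiplicity: largest S
```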
This answers one of your questions:
> Should we not only be talking about the "change" in the entropy of a system?
If using the Boltzmann entropy equation, its absolute value does have meaning, precisely because we set this additive constant to zero. As pointed out in other answers, when there is only one microstate the Boltzmann entropy is zero, which clearly defines the "floor" of the entropy; the most ordered system possible corresponds to the lowest entropy.
> but I don't understand what is this "order" we talk about?
In terms of the Boltzmann entropy, this order is clearly defined by counting microstates; the entropy is therefore a measure of the disorder, i.e. the multiplicity, of those microstates.
Consider a different scenario, in which a container of gas is in contact with a heat bath. The following probability distribution over energy levels can be derived in a variety of ways (the canonical ensemble):
$$\rho_i = \exp(\beta(F-E_i)) \tag{7}$$
We discover a new entropy equation by performing the following operation (noting that $U=\langle E_i \rangle$):
$$-k_B \langle \ln \rho_i \rangle = -k_B \langle \beta (F-E_i) \rangle = -k_B \beta F + k_B \beta U = (U - F)/T \tag{8}$$
By recognizing $(8)$ as the Legendre transform of our internal energy $U$, we can find a new form for the entropy. It is apparent that $U \equiv U(S,V)$ from $(5)$, so, holding $V$ constant, we need only imagine a function $F \equiv F(T,V)$ which transforms the energy function from $S \rightarrow T$,
$$U(S,V) - F(T,V) = TS \tag{9}$$
Therefore, from $(8)$ and $(9)$, and identifying $F$ as the Helmholtz free energy,
$$S = -k_B \langle \ln \rho_i \rangle = -k_B \sum_{i} \rho_i \ln \rho_i \tag{10}$$
This is the Gibbs entropy. It is also easily shown that this more general entropy equation reduces to $(6)$ when the probability distribution is uniform, $\rho_i = 1/\Omega$. There are numerous ways of validating that the Gibbs entropy equation is self-consistent.
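One such check can be done numerically. The sketch below, with a made-up five-level spectrum and temperature in units where $k_B = 1$, verifies that $(10)$ applied to $(7)$ reproduces $(U-F)/T$, and that the uniform case recovers $(6)$:

```python
import numpy as np

k_B = 1.0                                # units where k_B = 1, for clarity
E = np.array([0.0, 1.0, 1.0, 2.5, 4.0])  # made-up energy levels
T = 2.0
beta = 1.0 / (k_B * T)

Z = np.sum(np.exp(-beta * E))  # partition function
rho = np.exp(-beta * E) / Z    # canonical distribution (7)
F = -k_B * T * np.log(Z)       # Helmholtz free energy
U = np.sum(rho * E)            # internal energy U = <E_i>

S_gibbs = -k_B * np.sum(rho * np.log(rho))  # Gibbs entropy (10)
print(S_gibbs, (U - F) / T)                 # agree, as in (8) and (9)

# A uniform distribution rho_i = 1/Omega recovers the Boltzmann entropy (6):
omega = len(E)
rho_flat = np.full(omega, 1.0 / omega)
print(-k_B * np.sum(rho_flat * np.log(rho_flat)), k_B * np.log(omega))
```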
> but I don't understand what is this "order" we talk about?
The problem has now been reframed in terms of the probability distribution over the underlying energy levels. In principle these can be the underlying "anything" whose disorder we are trying to quantify. This entropy is the expectation of the negative logarithm of the probability; recalling that probabilities always fall on $[0,1]$, we know $-\ln \rho_i \geq 0$, and it grows without bound as $\rho_i \rightarrow 0$. Therefore:
- Lower average probability of a state occurring = high entropy
- Higher average probability of a state occurring = low entropy
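These two bullets can be made concrete with a small sketch; the ten-state distributions here are arbitrary illustrations:

```python
import numpy as np

def gibbs_entropy(rho):
    """-sum(rho ln rho) in units of k_B, skipping zero-probability states."""
    rho = rho[rho > 0]
    return -np.sum(rho * np.log(rho))

peaked = np.array([0.91] + [0.01] * 9)  # one state nearly certain
flat = np.full(10, 0.1)                 # all ten states equally likely

print(gibbs_entropy(peaked))  # low entropy (about 0.50)
print(gibbs_entropy(flat))    # high entropy, ln(10) (about 2.30)
```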
> What is this ordered state that we compare our system to and then decide that our system has more or less entropy than it?
This is the famous "arrow of time"; a recent Physics SE post has many answers discussing it and the apparent irreversibility of systems as $N \rightarrow \infty$.
Finally, $(10)$ is related to the Shannon entropy by the factor of $k_B$, which serves only to fix the physical units in which thermodynamic entropy is measured. For non-physical systems there is nothing fundamental about this factor, and the entropy might as well be,
$$\mathcal{S} = - \sum_{i} \rho_i \ln \rho_i$$
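As a quick non-physical illustration, assuming we count symbol frequencies in a string (entropies are in nats here, since the formula uses the natural logarithm):

```python
from collections import Counter
from math import log

def shannon_entropy(message):
    """Entropy (in nats) of the empirical symbol distribution of a string."""
    total = len(message)
    return -sum((c / total) * log(c / total)
                for c in Counter(message).values())

print(shannon_entropy("aaaaaaaa"))  # one symbol: entropy 0
print(shannon_entropy("abababab"))  # two equally likely symbols: ln 2
print(shannon_entropy("abcdefgh"))  # eight equally likely symbols: ln 8
```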
This helps cement the connection between entropy and probability described in the bullet points above.