Let me answer your question by recalling the following points, because the many aspects that your question amalgamates must be well distinguished.
(1) One should be cautious not to interpret the entropy of an arbitrary reversible thermodynamic system as information entropy, or vice versa. Any such identification needs a reasonable, well-argued justification. It is neither mathematically nor physically justified, by itself and as a general rule, to apply the Shannon information entropy $H$ or its gain or loss $dH$ to a thermodynamic system as a substitute for $dS$, or vice versa.
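For reference, and as a standard textbook statement rather than anything specific to your setup, the two quantities in play are

$$H = -\sum_i p_i \log_2 p_i \qquad \text{(Shannon information entropy, in bits)},$$
$$S = -k_B \sum_i p_i \ln p_i \qquad \text{(Gibbs entropy of a statistical ensemble)}.$$

They coincide only up to the factor $k_B \ln 2$, and only when the $p_i$ in the information-theoretic expression are identified with the probabilities of the physical microstates; that identification is exactly what has to be argued for each system, not assumed.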
(2) One should be extremely careful not to carry the Boltzmann entropy, which has a very specific and well-bounded definition in terms of reversibility, linearity, boundaries, and distribution form, over to every thermodynamic system and process, such as an irreversible, a non-linear, or an open one. The stochastic model must comply with the system.
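To illustrate this point with the standard form (again, nothing beyond textbook material): the Boltzmann expression

$$S = k_B \ln W$$

counts $W$ equally probable microstates of an isolated system in equilibrium. For an open, non-linear, or irreversible process that counting is not available in general, which is why the formula cannot simply be transplanted.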
(3) We must distinguish between what entropy (or information) is and what the maximum entropy principle (maximum information principle) is, whose application again depends very much on the observed system.
(4) Particularly when we elaborate on objectivity, we must define precisely what objectivity means in our context, to avoid mixing it up with cognitive or other aspects of objectivity which, for instance, could be inherent to the measurement apparatus.
As far as I understand, you are trying to define objectivity in the context of information gain, and particularly in the context of maximum information gain. At the same time, however, you switch back and forth between thermodynamic entropy and information gain.
For an “ideal” thermodynamic system such as a thermostat, we have no subjective influence, because noise and interference from our measurement apparatus are excluded: the system is ideally closed (I assume your thermostat is not semi-closed) and first order in its dynamics. With this in mind, we may interpret the inherent entropy change of the system as a kind of information gain or loss, depending on the point of view. The whole system tends to follow the maximum information principle under the given primitive constraints, which means we cannot gain more information than this maximum. The maximum itself is determined stochastically by the stochastic properties of the ensemble, i.e. of the respective microstates. Hence we can conclude that the objectivity you are talking about corresponds to the maximum possible information gain and has nothing to do with the observer. Positioned as observers at the macroscopic level, we simply get no more information from the system than this maximum.
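To make “maximum information under the primitive constraints” concrete, here is the textbook constrained maximization; the only assumptions are a normalized distribution over microstates with energies $E_i$ and a fixed mean energy, not anything particular to your thermostat:

$$\max_{\{p_i\}} \Big( -\sum_i p_i \ln p_i \Big) \quad \text{subject to} \quad \sum_i p_i = 1, \quad \sum_i p_i E_i = \langle E \rangle,$$

which yields, via Lagrange multipliers,

$$p_i = \frac{e^{-\beta E_i}}{Z}, \qquad Z = \sum_i e^{-\beta E_i},$$

where $\beta$ is the multiplier attached to the energy constraint. Any observer at the macroscopic level who knows only these constraints is driven to the same distribution, which is the sense in which the resulting information content is observer-independent.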
The maximum information principle is a straightforward and well-established formalism; you can pick it up from Wikipedia or the standard literature. Its discrete form uses the Shannon information entropy, while the continuous form uses Jaynes' formulation.
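For completeness, Jaynes' continuous form is usually written relative to an invariant measure $m(x)$ (a standard statement; the symbols here are generic, not tied to your system):

$$H_c[p] = -\int p(x)\,\ln\frac{p(x)}{m(x)}\,dx,$$

which, unlike the naive differential entropy $-\int p(x)\ln p(x)\,dx$, remains invariant under a change of variables; Jaynes obtained it as the limit of the discrete Shannon expression.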