If observer A moves at speed $v$ with respect to observer B, then there is a time dilation given by
$$t_B=t_A\gamma$$
as is known from special relativity. Here, $t_A$ is the time interval measured by A and $t_B$ is the time interval measured by B. However, a friend of mine asked me about the inverse case: what if we take the frame of reference in which A is stationary and B is moving? In that case, $$t_A=t_B\gamma$$
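To make both claims explicit, here is the derivation I have in mind (a sketch assuming the standard configuration, with A moving at speed $v$ along B's $x$-axis and $\gamma = 1/\sqrt{1-v^2/c^2}$). For two ticks of a clock at rest in A's frame, so that $\Delta x_A = 0$, the Lorentz transformation gives
$$\Delta t_B=\gamma\left(\Delta t_A+\frac{v\,\Delta x_A}{c^2}\right)=\gamma\,\Delta t_A,$$
while for two ticks of a clock at rest in B's frame, so that $\Delta x_B = 0$, the inverse transformation gives
$$\Delta t_A=\gamma\left(\Delta t_B-\frac{v\,\Delta x_B}{c^2}\right)=\gamma\,\Delta t_B.$$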
Taken at face value, the second equation says that A's time is dilated with respect to B's, since B is now the one moving away, so the two results conflict. I know that this reasoning must be erroneous somewhere, but I am not sure how to justify that. I suspect it has something to do with the fact that in the first equation $t_A$ is the proper time, but I have been unsuccessful at developing a convincing argument. How can I resolve this "paradox"?