A parameter is a number that does not "come out" of the theory but needs to be input into the theory by hand. The electron mass is a good example -- no one can (currently) calculate the electron mass; it is simply something we measure.
There are no absolute rules about things like "how many parameters is too many," though of course opinions differ. From the point of view of fundamental physics, progress is typically made by synthesis or unification: the development of a deeper theory that simultaneously explains two phenomena you originally thought were different. Such a synthesis often reveals that what was a free parameter in one of the original theories can actually be calculated in the unified theory.
There are several famous examples of this in undergraduate physics; here are a few (admittedly the last is not really an undergraduate-level example).
The acceleration due to gravity on the surface of the Earth, $g$, can be calculated from the mass $M_\oplus$ and radius $R_\oplus$ of the Earth and Newton's gravitational constant, $g = G\frac{M_\oplus}{R_\oplus^2}$. In Galileo's time, gravity on the Earth was not known to be connected to astronomical quantities like the mass and radius of the Earth, but Newton showed that gravity is one force underlying apples falling from trees and moons orbiting planets.
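As a quick numerical sanity check, here is a minimal Python sketch (the values of $G$, $M_\oplus$, and $R_\oplus$ are standard reference values, assumed here rather than quoted above) that recovers the familiar $g \approx 9.8\ \mathrm{m/s^2}$:

```python
# Quick check of g = G * M_earth / R_earth**2 with standard reference values
# (assumed here, not given in the text above).
G = 6.674e-11        # m^3 kg^-1 s^-2, Newton's gravitational constant
M_earth = 5.972e24   # kg, mass of the Earth
R_earth = 6.371e6    # m, mean radius of the Earth

g = G * M_earth / R_earth**2
print(f"g = {g:.2f} m/s^2")   # ~9.82 m/s^2
```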
The speed of light $c$ can be calculated from the electric and magnetic field coupling constants, which in SI units are $\epsilon_0$ and $\mu_0$: $c^2 = (\mu_0 \epsilon_0)^{-1}$. This follows from Maxwell's realization that light, electricity, and magnetism -- three apparently different subjects -- are all actually described by one unified theory of electromagnetism.
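A similar one-line check, again with standard SI values of $\mu_0$ and $\epsilon_0$ assumed rather than taken from the text:

```python
import math

# Check c = 1/sqrt(mu_0 * eps_0) with standard SI values (assumed).
mu_0 = 4e-7 * math.pi      # T m/A, vacuum permeability (pre-2019 defined value)
eps_0 = 8.8541878128e-12   # F/m, vacuum permittivity

c = 1.0 / math.sqrt(mu_0 * eps_0)
print(f"c = {c:.0f} m/s")  # ~299792458 m/s
```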
The gas constant $R$ is related to Avogadro's number $N_A$ and the Boltzmann constant $k$ by $R = k N_A$. Recognizing the atomic structure of matter explained many thermodynamic phenomena by reducing them to the statistical motion of individual atoms.
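And the same kind of check for $R = k N_A$, using the 2019 SI defined values of $k$ and $N_A$ (my assumption, not quoted above):

```python
# Check R = k * N_A with the 2019 SI defined values (assumed).
k = 1.380649e-23      # J/K, Boltzmann constant
N_A = 6.02214076e23   # 1/mol, Avogadro's number

R = k * N_A
print(f"R = {R:.4f} J/(mol K)")   # ~8.3145 J/(mol K)
```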
The peak wavelength of light emitted from a black body at temperature $T$ was known in the 1800s to be given by Wien's displacement law, $\lambda_{\rm peak} = b/T$, where the constant $b$ is known as Wien's displacement constant. With the development of quantum mechanics and the Planck distribution, it was shown that $b = h c / (x k)$, where $h$ is Planck's constant, $c$ is the speed of light, $k$ is Boltzmann's constant, and $x$ is a calculable order-one factor that comes out of some annoying algebra ($x = 4.965...$ solves $(x-5)e^x + 5 = 0$).
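Since this case involves a transcendental equation, here is a short sketch that solves $(x-5)e^x + 5 = 0$ by bisection and then assembles $b = hc/(xk)$ from standard SI values of $h$, $c$, and $k$ (assumed, not quoted above):

```python
import math

# Solve (x - 5) e^x + 5 = 0 by bisection, then build Wien's constant
# b = h c / (x k) from standard SI values (assumed).
def f(x):
    return (x - 5.0) * math.exp(x) + 5.0

lo, hi = 4.0, 5.0            # f(4) < 0 < f(5), so the root is bracketed
for _ in range(60):
    mid = 0.5 * (lo + hi)
    if f(lo) * f(mid) <= 0.0:
        hi = mid
    else:
        lo = mid
x = 0.5 * (lo + hi)          # ~4.965114

h = 6.62607015e-34    # J s, Planck's constant
c = 2.99792458e8      # m/s, speed of light
k = 1.380649e-23      # J/K, Boltzmann constant

b = h * c / (x * k)
print(f"x = {x:.6f}")
print(f"b = {b:.4e} m K")    # ~2.898e-3 m K, Wien's displacement constant
```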
While this is not as "basic" as the others, the unification of the electromagnetic and weak forces showed that the Fermi constant $G_F$, which controls the lifetime of the muon, is closely related to the Higgs vacuum expectation value $v$ via $v = (\sqrt{2} G_F)^{-1/2}$ (and therefore also to the mass and coupling of the W boson).
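One more check along the same lines, taking the measured value of $G_F$ (an assumption on my part, not given above) and recovering $v \approx 246$ GeV:

```python
import math

# Check v = (sqrt(2) * G_F)**(-1/2) with the measured Fermi constant (assumed).
G_F = 1.1663787e-5    # GeV^-2

v = (math.sqrt(2.0) * G_F) ** -0.5
print(f"v = {v:.2f} GeV")   # ~246.22 GeV, the Higgs vacuum expectation value
```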
Note that these examples of unification leading to a deeper understanding of an input parameter are correlated with major developments in physics -- gravity, electromagnetism, statistical mechanics, quantum mechanics, and the electroweak theory.
There is a widespread belief in theoretical physics that the Standard Model is not a fundamental theory (indeed, we know it cannot be, because it does not account for gravity or dark matter). Extrapolating the logic of unification that has always been associated with progress in physics, one hopes that at least some of the numbers that are currently input parameters of the Standard Model will turn out to be calculable outputs of a deeper theory.