Transmuting chemically significant quantities of one element to another using nuclear reactions is not cost effective for any naturally occurring element.
Nuclear physics is the end of alchemy.
Two examples I happen to have off the top of my head: the "Fat Man" and "Little Boy" nuclear weapons deployed in the Second World War each involved about $10^{24}$ fissions, and therefore produced 1–3 grams of free neutrons (depending on whether you count the neutrons that were reabsorbed in the chain reaction). That's roughly the same number of useful neutrons that will be produced at the Spallation Neutron Source (SNS) over its expected lifetime of 30 years.
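A quick sanity check on the mass scale (a sketch, not the original calculation; it assumes one net free neutron per fission, which is the low end of the 1–3 gram range above):

```python
AVOGADRO = 6.022e23          # atoms per mole
NEUTRON_MOLAR_MASS = 1.009   # g/mol, approximate

fissions = 1e24              # fissions per weapon, from the text
# Assume ~1 net free neutron per fission (low-end estimate).
neutron_mass_g = fissions * NEUTRON_MOLAR_MASS / AVOGADRO
print(f"{neutron_mass_g:.1f} g of free neutrons")  # ~1.7 g
```

Counting all 2–3 neutrons emitted per fission, including those reabsorbed in the chain reaction, pushes the figure toward the upper end of the range.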
Wikipedia reports that the highest neutron rate achieved by a fusor is about $10^{11}$ neutrons per second, a factor of a million smaller than the SNS. Alpha production rates would be comparable. Even running continuously at that record rate for a decade, a fusor would accumulate only of order a hundred micrograms of helium.
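The same arithmetic for the fusor, as a sketch (it assumes continuous operation at the record rate, so it is an upper bound):

```python
AVOGADRO = 6.022e23          # atoms per mole
HELIUM_MOLAR_MASS = 4.003    # g/mol

rate = 1e11                  # helium atoms per second, matching the record neutron rate
seconds_per_decade = 10 * 3.156e7
atoms = rate * seconds_per_decade            # ~3e19 atoms
helium_mass_ug = atoms / AVOGADRO * HELIUM_MOLAR_MASS * 1e6
print(f"{helium_mass_ug:.0f} micrograms of helium per decade")  # ~200 µg
```

A real fusor with a realistic duty cycle and a more typical (far lower) neutron rate would produce correspondingly less.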
To address your update:
the reaction
$$
\rm {}^2H + {}^3H \to {}^4He + n
$$
releases about 17 MeV. A gigawatt power plant would produce such reactions at a rate of about $10^{20}$ per second, or about two moles of alphas and neutrons per hour. I picked a gigawatt because that's a typical size for a large commercial reactor, but this rate would be for a reactor in a fairyland where all of the reaction energy went into power production. This is also the scale of the ITER reactor, but that machine will operate in pulses of less than 1000 seconds each.
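The rate quoted above follows directly from dividing the plant's power by the energy per reaction; a sketch of that arithmetic (assuming, as the text does, that all of the reaction energy goes into power production):

```python
AVOGADRO = 6.022e23
EV_TO_JOULES = 1.602e-19

power_w = 1e9                              # 1 GW plant
energy_per_reaction_j = 17e6 * EV_TO_JOULES  # 17 MeV per D-T fusion, ~2.7e-12 J

rate = power_w / energy_per_reaction_j     # reactions per second, ~3.7e20
moles_per_hour = rate * 3600 / AVOGADRO    # moles of alphas (and of neutrons)
print(f"{rate:.1e} reactions/s, {moles_per_hour:.1f} mol of alphas per hour")
```

This gives roughly $4 \times 10^{20}$ reactions per second and about two moles of alphas (and two of neutrons) per hour, consistent with the figures above.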