I strongly disagree that it is meaningful or interesting to discuss the "ontic status" of sets, or whether you consider the category or the set to be fundamental. The only thing that physicists ever use is computation; that's what we see in nature. Quantum mechanics is a procedure that sets up a computation that matches any given situation in nature. The sets and categories are an elaborate construction that you use in intermediate stages to make statements about computations, which are the observable things.
Neither (infinite) sets nor (infinite) categories are directly observable in and of themselves; they are concepts that clarify how computation works by setting up machinery that unifies proofs. Sets are important because they teach you about the ordinals, which order arithmetic theories by strength (and these arithmetic theories are obviously linked to computation). Sets are further useful because they allow you to talk about geometry and arithmetic in the same structure. Categories are important because they tell you how to calculate complicated homology, and at the same time they let you make extremely loose analogies precise, so that you can axiomatize things that are otherwise intuitively graspable but hard to write down. Both activities are productive in mathematics.
But physics does not live in the category "SET", it lives in the category "COMPUTER PROGRAM" (together with natural limits of potentially infinite size and running time). There are plenty of things people talk about in "SET" that cannot be physics under any circumstances, because they are uncomputable: for example, a theory where the electron swerves left or right at time t depending on the solution to the halting problem. One must formulate a theory as an in-principle computer program, not as a set-theory construction.
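To make the point concrete, here is the standard diagonal argument as a Python sketch (all the names and the program enumeration are mine, purely for illustration): a law of motion that consults a halting oracle cannot be turned into a program, because the oracle itself provably cannot be written.

```python
# Minimal sketch of the diagonal argument, assuming (for contradiction)
# that a total halting oracle existed.  All names here are illustrative.

def halts(source: str) -> bool:
    """Hypothetical oracle: True iff the Python program in `source`
    eventually halts.  Provably not implementable."""
    raise NotImplementedError("no such total computable function")

DIAGONAL = "if halts(DIAGONAL):\n    while True: pass"

def diagonal():
    # If halts() were computable, diagonal() would halt exactly when
    # the oracle says it doesn't -- a contradiction.
    if halts(DIAGONAL):
        while True:
            pass

def electron_direction(t: int, programs: list[str]) -> str:
    # The proposed 'law of physics': swerve left at time t iff the
    # t-th program halts.  Evaluating it requires the oracle, so it
    # is not an in-principle computer program.
    return "left" if halts(programs[t]) else "right"
```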
Also, physics certainly doesn't care about ontology or anything else that is not well defined in the logical-positivist sense.
Quantum time evolution is a category
The idea that time evolution in physics is a morphism in the category whose objects are the states is a fine and productive idea, and I heard it associated with Segal. Segal uses it to give an axiomatic formulation of what it means to be a topological field theory. To understand topological field theory you learn to compute, but the structure is categorical, and it is best expressed using the category whose objects are one-dimensional circles and lines (or two-dimensional surfaces, for 3d field theory) and whose morphisms are bordisms between them: one-higher-dimensional manifolds whose boundary components are the source and target objects of the morphism. The bordisms describe topological path integrals that evolve in time between different manifold states.
The composition axiom is a trivial but important part of the path-integral formalism--- it says that you can glue path integrals along common boundaries to make a path integral on the larger space (this is ultimately the statement that time evolution in quantum mechanics is a composition of linear operators). But the topological aspect of the theory is expressed compactly by saying that the bordisms are topological 2-manifolds with boundary, defined only up to topology, with no metric data. This idea is for-sure productive and really interesting, and it has contributed to 3d mathematics in the last few decades.
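As a toy illustration of this gluing-equals-composition axiom, here is a minimal numerical sketch in Python of the simplest 2d case, assuming the circle is assigned the two-dimensional Frobenius algebra A = C x C (that particular algebra is my choice for illustration, not anything Segal singles out):

```python
import numpy as np

# Toy 2d TQFT data: the circle gets the vector space A = C^2, viewed
# as the commutative Frobenius algebra C x C; a disjoint union of k
# circles gets the tensor power of A with itself k times.

# Pair-of-pants bordism (2 circles -> 1 circle): the multiplication
# map m: A (x) A -> A, componentwise in the basis e0, e1.
m = np.zeros((2, 4))
m[0, 0] = 1.0   # e0 * e0 = e0
m[1, 3] = 1.0   # e1 * e1 = e1  (cross terms vanish)

# Disk / cap bordism (empty boundary -> 1 circle): the unit 1 = e0 + e1.
unit = np.array([[1.0], [1.0]])

# Gluing along a common circle is composition of the linear maps.
# Capping one leg of the pants gives the cylinder, i.e. the identity:
assert np.allclose(m @ np.kron(unit, np.eye(2)), np.eye(2))

# The two ways of gluing two pairs of pants give the same surface,
# which is exactly associativity of the multiplication:
assert np.allclose(m @ np.kron(m, np.eye(2)),
                   m @ np.kron(np.eye(2), m))
```

The assertions are just matrix identities, which is the point: once the bordisms are translated into linear maps, gluing is ordinary composition.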
Anyway, the more general idea that one should replace 'state' with 'object' and 'time evolution' with 'morphism' is a fine thing, but it is only useful in making structural analogies between different computationally well-defined procedures, to see that they are related to each other. The actual thing you deal with at the end can't be so abstract--- it has to relate measurable results in meaningful ways.
This is an interesting point of view, but you further want to make a purely relational theory for the probabilities at the beginning and end, so that the quantum relations can be expressed as category-like relations between outcomes. This is not so productive (I think), because the actual structure on the outcomes is a probability for each outcome, and the mechanism for computing this probability by summing the amplitudes for intermediate states is already very concrete; a layer of abstraction doesn't seem to get you any new way of computing probabilities.
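To see how concrete the mechanism already is, here is the whole thing in a few lines, with made-up unitaries standing in for the dynamics:

```python
import numpy as np

def random_unitary(n, rng):
    # QR decomposition of a random complex matrix gives a unitary Q.
    q, r = np.linalg.qr(rng.standard_normal((n, n))
                        + 1j * rng.standard_normal((n, n)))
    return q

rng = np.random.default_rng(1)
U1 = random_unitary(3, rng)   # evolution over the first time interval
U2 = random_unitary(3, rng)   # evolution over the second

i, f = 0, 2   # initial and final basis states

# The direct amplitude <f|U2 U1|i> equals the sum over a complete set
# of intermediate states |k> of <f|U2|k><k|U1|i> -- "summing the
# amplitudes for intermediate states".
direct = (U2 @ U1)[f, i]
summed = sum(U2[f, k] * U1[k, i] for k in range(3))
assert np.isclose(direct, summed)

probability = abs(direct) ** 2   # Born rule
print(probability)
```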
But it might be nice to explore what type of structures can reproduce probability theory asymptotically (the way QM does) when you put a large number of things together in a decoherent way. Maybe there is another way of doing it using some alternate crazy intermediate structure other than complex amplitudes. I don't know; it is difficult to say precisely what you require of an amplitude calculus, and categories have in the past been useful for axiomatizing the conditions you require on a calculus for it to have given properties. So I don't want to say that this is a bad idea, I just don't know what categorical properties you want in order to get a quantum-like theory which generalizes QM in a meaningful way (which isn't density-matrix theory, or real/quaternionic quantum mechanics, for example, all of which can be formulated directly without a very abstract notion of what QM is).
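For what I mean by reproducing probability theory in a decoherent way, here is the standard two-path toy calculation (the amplitudes are arbitrary numbers I made up): correlating each path with an orthogonal environment state kills the interference cross term, and the quantum rule degenerates into ordinary probability addition.

```python
import numpy as np

# Two-path toy model: amplitudes a1, a2 for the two paths, and a
# detector that projects onto |+> = (|path1> + |path2>)/sqrt(2),
# the state that would show interference.
a1, a2 = 0.6 + 0.3j, 0.5 - 0.55j
plus = np.array([1.0, 1.0]) / np.sqrt(2)

# Coherent case: amplitudes add before squaring (interference term).
psi = np.array([a1, a2])
p_coherent = abs(plus @ psi) ** 2

# Decoherent case: each path gets correlated with an orthogonal
# environment state, giving the joint state a1|1>|e1> + a2|2>|e2>;
# summing |amplitude|^2 over the unobserved environment states kills
# the cross term.
joint = a1 * np.kron([1, 0], [1, 0]) + a2 * np.kron([0, 1], [0, 1])
joint = joint.reshape(2, 2)         # rows: path index, cols: env index
amps_for_plus = plus @ joint        # amplitude for |+>, per env state
p_decoherent = np.sum(np.abs(amps_for_plus) ** 2)

# The decohered probability is the classical sum of path probabilities.
assert np.isclose(p_decoherent, (abs(a1)**2 + abs(a2)**2) / 2)
print(p_coherent, p_decoherent)     # differ by the interference term
```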