It's not that unrealistic that the United States felt fission would cover their economic needs and didn't see the need to build large-scale fusion technology, which they probably only developed for their power armor because of its military applications.
My take is that they had a large fission infrastructure relying on a dwindling supply of uranium, and the microfusion technology came too late to fix the world. Even if the technology couldn't be made larger than a backpack-sized 60 kW unit, those could potentially be produced en masse and ganged together. But it came too late, and the rest of the world didn't have the technology, so the war for resources wasn't going to end.
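Just to put "en masse" into perspective, here's a quick back-of-envelope; the 1 GW figure is just a typical real-world plant size I'm assuming, nothing from the games:

```python
# Rough sanity check: how many 60 kW microfusion packs would it take
# to stand in for one typical large fission plant (~1 GW electric)?
pack_power_w = 60e3       # 60 kW per backpack unit (the lore figure)
plant_power_w = 1e9       # ~1 GW(e), typical big real-world reactor (my assumption)

packs_needed = plant_power_w / pack_power_w
print(f"Packs per 1 GW plant: {packs_needed:,.0f}")   # about 16,667 units
```

That's a lot of units, but not absurd for a country that mass-produced them for the military anyway.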
Also, I imagine that somehow they couldn't figure out large-scale fusion, so grid-scale fusion power wasn't economical yet. This would tie in with Fallout 4, where the Institute had a large-scale fusion reactor prototype. If that technology had been ready by the 2050s, things could have been different, I guess.
Based on the real-world idea of cold fusion, I always assumed that the microfusion technology was basically cold fusion as we know it. It's not an established real thing, but experiments have been and are still being done, even though it's often a bit of crank science.
Ok, so here's how fusion works: you take two lighter elements and bring their nuclei close enough that they merge into a single one, and that releases a shitload of energy, coming from the nuclear binding energy. The resulting nucleus is lighter than the original nuclei combined, so you're turning mass into energy. Fission does that too, but the mass defect per unit of fuel mass is lower, and fission is just much easier to accomplish, because for fusion you have to push positively charged nuclei together against their mutual repulsion.
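If you want to see the mass-to-energy part in actual numbers, here's the textbook D-T reaction worked through with standard atomic mass values; this is just generic physics, nothing Fallout-specific:

```python
# Worked example: mass defect of D + T -> He-4 + n, then E = delta_m * c^2
U_TO_KG = 1.66053906660e-27      # atomic mass unit in kg
C = 2.99792458e8                 # speed of light, m/s
J_PER_MEV = 1.602176634e-13

m_deuterium = 2.014102           # atomic masses in u, standard table values
m_tritium   = 3.016049
m_helium4   = 4.002602
m_neutron   = 1.008665

delta_m_u = (m_deuterium + m_tritium) - (m_helium4 + m_neutron)
energy_mev = delta_m_u * U_TO_KG * C**2 / J_PER_MEV
print(f"Mass defect: {delta_m_u:.6f} u")
print(f"Energy per reaction: {energy_mev:.1f} MeV")   # ~17.6 MeV
```

About 0.4% of the fuel mass comes out as energy, which is a few times better than fission manages per kilogram of fuel.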
So typical hot fusion just means that the fusion fuel is so hot, so dense, and confined for long enough that enough fusion reactions happen for a net energy gain. That's hard to accomplish, because the plasma is extremely hot and usually has to be held in place by magnetic fields, since otherwise it would touch the reactor walls and cool down. There are other approaches, with magnetic confinement and inertial confinement being the best known. The point is that you need to bring two nuclei close together with enough energy to overcome their Coulomb repulsion.
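The "hot, dense, and confined long enough" part is usually summed up as the Lawson criterion, or triple product. Here's a rough check with ballpark tokamak-ish numbers; the specific values are illustrative assumptions on my part, not any particular machine:

```python
# Rough Lawson (triple product) check for D-T fusion: n * T * tau_E needs to
# exceed roughly 3e21 keV*s/m^3 for ignition-grade conditions.
THRESHOLD = 3e21                 # approximate D-T ignition triple product, keV*s/m^3

n_ions = 1e20                    # ion density, m^-3 (ballpark tokamak assumption)
temperature_kev = 15.0           # ion temperature in keV (~170 million K)
tau_confinement_s = 3.0          # energy confinement time, s (illustrative)

triple_product = n_ions * temperature_kev * tau_confinement_s
print(f"Triple product: {triple_product:.2e} keV*s/m^3")
print("Roughly ignition-grade:", triple_product >= THRESHOLD)
```

You can trade the three factors off against each other, which is why both magnetic confinement (modest density, long confinement) and inertial confinement (insane density, tiny confinement time) are viable routes on paper.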
In the original cold fusion experiment by Fleischmann and Pons, the "confinement" was supposed to happen inside a palladium electrode (with a platinum counter-electrode) sitting in heavy water. Metals in that group can absorb a lot of hydrogen, and the idea was that deuterium sitting on interstitial sites in the lattice would be packed closer together than it normally would be. Run a current through the cell to drive deuterium into the palladium and provide some energy, add a bit of Heisenberg uncertainty, and they thought that would be enough to get fusion reactions within the lattice. This would work at basically room temperature, with the palladium rods in the heavy water heating up as the fusion reactions happen.
This isn't a very high power density approach, since it's limited by the melting point of palladium, so while it'd be very cheap, it wouldn't really be easy to scale up.
However, I assume that in the world of Fallout they found a way to cram something yielding 60 kW into a backpack, so that's pretty good.
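For fun, here's how little fuel mass 60 kW actually implies if you assume D-T-like yield and perfect conversion (both huge assumptions, obviously); it shows the fuel was never going to be the hard part, the reactor around it is:

```python
# Back-of-envelope: fuel mass to run a 60 kW unit for a year, assuming
# D-T-like yield (~17.6 MeV per ~5 u of fuel) and 100% conversion efficiency.
J_PER_MEV = 1.602176634e-13
U_TO_KG = 1.66053906660e-27

energy_per_reaction_j = 17.6 * J_PER_MEV
fuel_per_reaction_kg = (2.014 + 3.016) * U_TO_KG       # one deuteron + one triton
energy_per_kg = energy_per_reaction_j / fuel_per_reaction_kg   # ~3.4e14 J/kg

power_w = 60e3                    # the 60 kW backpack figure
seconds_per_year = 3.156e7
fuel_kg_per_year = power_w * seconds_per_year / energy_per_kg
print(f"Fuel burned per year at 60 kW: {fuel_kg_per_year * 1e3:.1f} g")   # a few grams
```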
Of course, nobody on the writing team, be it classic or modern, really thought this hard about it, but nuclear fusion is kinda my thing so I like to think about it.
And as a MacGuffin in the show, it's just kinda the wrong kind of handwavium. They did it kinda right in Fallout 4, but I guess they felt they had to do something different, and the handwavium doesn't quite work as well anymore.