WWII brought the United States out of the Great Depression. All of FDR's programs were actually keeping the United States IN the Depression.
War is just good for the economy. 
There's a truth to that, and there's a part of me that says maybe we embrace it. In other words, instead of fighting everyone else's wars for them, we just dominate the military arms and technology market. Make it to where it's understood that if you want to beat your rival tribe, you need to be using US-made weapons. No more foreign aid or foreign wars, just sales. We're neutral - we sell to both sides, and we keep the best stuff for ourselves. Not so very different from what we do now, just open, honest, and with intention, plus fewer tax dollars leaving our borders and a lot less American blood shed.