The attitude has always been the same, I think; it's just that the world used to submit to us. Not anymore.
Maybe in the colonial era, but I'm talking about post-war Britain, the one that devised the NHS, devolved the Commonwealth and had an international social conscience.
All of that is completely gone. Everything we've done well in the last 50 years has been painted as toxic, and I can't understand why.