How important it is isn't emphasized enough, I think.
It's not talked about enough in the media. Instead we're fed garbage about true love, etc. Am I bitter? No, just tired.
Health = Wealth, they said. But in order for me to eat healthy, I need money, because farm-to-table and non-GMO produce is a helluva lot more expensive than the usual stuff.
In order to get checked by a doctor, I need money to see one, only to be told I should be taking care of myself better, not working/stressing so hard, that "money isn't everything." Which ironically makes my blood pressure go up, because I just want to tell the doctor, "If I don't work at the job that gives me this level of stress, I wouldn't be able to pay to see you."
The love/relationships you form are more important than money, they said. Sureee they are. What could go wrong when you place more value on people, right? I mean, it's not like people are predominantly looking out for themselves, right?
Sorry, rant over. Point is, I agree with the OP.