Kind, peaceful, tolerant, nonviolent, helpful: these and many similar terms pop into our minds when we hear the word "moral". The West is often held up as a symbol of both material and moral progress. How true is that?
And how do we identify which acts are moral and which are not? More importantly, do nations revive on the strength of their morals, or are morals the outcome of something else? And how should we view the "moral decline" in our societies?