Corporate Social Responsibility in America
Corporate Social Responsibility (CSR) plays an important role in corporate life in the
United States (US) today. It is no longer enough for large corporations to maximize
profits; they are also expected to act in an ethically and socially responsible manner.
Citizens of the US, like those of other developed countries, expect conglomerates
operating in America to carry out their business operations accordingly.