White men dominate the highest jobs in the United States.

By BBC
