Glossary
Acting White: In the United States, acting white is a pejorative term, usually applied to African Americans, referring to a person’s perceived betrayal of their culture by assuming the social expectations of white society.