Tipping is a norm in the U.S., but it hasn't always been this way. A legacy of slavery and racism, tipping took off in the post-Civil War era. The case against tipping gained momentum in the early 1900s, yet what began as a movement to end an exploitative practice ended up perpetuating it.