I’m sorry if this offends anyone, but I really hate when people say they are southern or country but they do not live the lifestyle. Being country is not just something you choose one day to be. You are raised this way; it’s in your veins. It’s your grandparents taking you out on your first fishing trip when you’re 8 years old. It’s your grandma’s sweet tea that’s better than anything. It’s your dad teaching you how to hunt & skin a deer. It’s not the boots and the looks. It’s a life you were raised on.
So if you were not raised with a southern lifestyle, your state only serves sweet tea at McDonald’s, or you just one day became country, please stop saying you are country or southern.