Why Your Skin Is Worse In The Winter
While most of us bundle up to protect ourselves from the cold winter air, we should also remember that winter can take a toll on our skin and health. During the …