By now we all know what a selfie is. It’s almost impossible to walk down the street without seeing a teenager or a group of friends photographing themselves doing wacky things. Some people love the trend; others consider it narcissistic. But whatever your view, there is no escaping the world of selfie-taking. And maybe that’s a good thing. Some argue that taking pictures of yourself is a way for women, in particular, to take control of who looks at them and what those viewers see. According to some commentators, the art of selfie-taking can actually be beneficial. Here are a few reasons why.