

You should never feel that you have to act a certain way or become something you know you are not just to fit into society. Stepping into the adult industry has been the best thing I’ve ever done, despite so many people telling me that I’m making the “wrong” decision. I have always been an extremely sexual person & that’s undeniable. When I was younger, people used to ask me all the time, “Do you have to show that much skin? Do you have to post on social media like that?” I’ve tried SO many different things in pursuit of a “normal” career: a university degree, a trade, multiple office jobs, PT courses, & nothing made me feel alive until I followed what felt natural to me & pursued the adult industry. I do not understand the stigma towards something that is so natural to us. I know there’s a dark side to the industry, but there’s a dark side to EVERY SINGLE INDUSTRY that exists. Sex is a natural human desire, so why should an industry centred around sex carry such a stigma? Everyone dies, but not everyone lives, & this industry & what I do in it have made me feel more alive than anything I could’ve ever dreamed of.