As is the case with most things in our rapidly devolving technological climate, it wasn't until one bored iPhone user started idly poking around her phone that a creepy side of tech came to light.

A Twitter user by the name of @ellieeewbu (pronounced "ellie-weeeeeeeboowoo") just so happened to be looking at her photos and noticed that searching for "brassiere," like some sort of 1800s housemaid, reveals a "folder" with all the photos you've taken in said lingerie (that is, if you've worn bras or swim tops in any photos).

The revelation was shared so many times that celebrities got involved, further adding fuel to the rapidly growing fire.

However, as Express points out, the categorizing of sexy selfies isn't as creepy as it looks on the surface. 

"This is thanks to sophisticated AI that is able to recognize objects that are common, such as a bra, to more obscure things like an abacus or a zucchini," writes Dion Dassanayake. "The artificial intelligence was trained on a library of hundreds of thousands of labelled images. And it has the ability to accurately distinguish one object it has learnt about from another."

So instead of trying out other phrases like "birthday cake" or "hat," users worldwide went on microscopic tirades about sexism, privacy, and whatever else could make for a wave of virality.

The long and short of it is, yes, this looks unnecessarily creepy on the surface, but the reality is likely even more subversive than that: computers can now "see" everyday objects because they're learning from the photos we're all taking. It's supposed to be more convenient for the user, but … who knows where this could all end up years from now.

According to multiple outlets that reached out to Apple for comment, it appears the company is just going to let this one slide, which is equal parts hilarious and concerning. In the wake of the whole Weinstein debacle, brands have been taking sensitive topics far more seriously than they were a few weeks prior. So why Apple would just shrug its shoulders and never really provide any explanation as to why "bra" and "brassiere" work but not "underwear" or "man-junk" is bizarre.

Just keep in mind that we're all saying this type of intrusion is perfectly fine with every "AGREE" button clicked without actually reading the terms of service agreements. You're either in or you're out with tech.

Luckily, 2029 is the next time an asteroid could hit Earth. And if Jim Carrey is right (that none of this exists), what do a few creepy Apple dudes looking at your photos matter anyway?