Image recognition AI certainly isn't something new. It's a feature found in the products of many tech companies, including Apple, which introduced it with iOS 10 last year. But it became a controversial topic yesterday after a Twitter user claimed that searching for "brassiere" in the iPhone's Photos app 'showed' that Apple places explicit user images into a folder.

While the tweet by ellieeewbu is correct that Apple's software can identify certain images, the Photos app does not "save" photos of bras or move them into any special folder.

The Photos app can tag and categorize a number of different scenes and objects (along with facial expressions, places, and people) that appear in images, allowing users to search for specific photos using keywords. In a Medium post, developer Kenny Yin detailed all 4,432 of these categories, including "brassiere," "bra," and "bras." Strangely, there's no mention of any comparable category for men's clothing.

The most important thing to remember is that the object detection is all done locally on the device, and nothing is sent to Apple's servers. As pointed out by TechCrunch, there are exceptions for content such as child pornography, which is handled by classifiers that can reach beyond the confines of a handset.
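For developers curious about what that kind of on-device recognition looks like, here's a minimal sketch using Apple's public Vision and Core ML frameworks. This is not the private model the Photos app uses; the `MobileNet` class is assumed to be auto-generated by Xcode from a Core ML classification model bundled with the app, and is here purely for illustration of classification running entirely on the handset.

```swift
import UIKit
import CoreML
import Vision

// Sketch: classify an image locally with a bundled Core ML model.
// Assumes `MobileNet.mlmodel` has been added to the project, so Xcode
// generates the `MobileNet` class used below.
func classify(_ image: UIImage) {
    guard let cgImage = image.cgImage,
          let model = try? VNCoreMLModel(for: MobileNet().model) else { return }

    // The request runs the neural network on-device; no image data leaves the phone.
    let request = VNCoreMLRequest(model: model) { request, _ in
        guard let results = request.results as? [VNClassificationObservation] else { return }
        // The top labels behave like the searchable keywords in Photos ("bra", "beach", "dog", ...).
        for observation in results.prefix(3) {
            print("\(observation.identifier): \(observation.confidence)")
        }
    }

    let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
    try? handler.perform([request])
}
```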

The tweet's reference to a folder isn't strictly accurate, either. It implies that the images in question have been moved into a specific "brassiere" folder the system created on the phone, but that isn't what happens. The confusion likely stems from a UI that automatically creates categories from past searches.

You can't turn Apple's image recognition feature off, which isn't great, but it's still a safer system than Google Photos, which stores images on Google's servers.

For those who do want explicit photos moved into a separate (locked) folder on their device, there's always Nude: the sexiest app ever, which uses image recognition to automatically identify naughty pictures.