Presumably, this tweet is expressing a paranoia that a massive corporation like Apple can poke through your pictures using machine learning and fancy algorithms to know exactly what you’ve taken photos of, without you ever asking it to do so.

Theoretically, all those folders that your iPhone has created on your behalf are stored locally on your device. Apple likes to tout this feature as something that’ll help you sort through your thousands of photos, searching for keywords rather than scrolling around endlessly. But that paranoia isn’t totally unwarranted.

For example, a new feature in iOS 11 for app developers called “Core ML” allows programmers to use machine learning algorithms to learn all sorts of things about you based on what’s in your phone. The app Nude, for instance, uses Core ML to scan your photos for nudity, helping you pre-emptively file away any photos you wouldn’t want wandering eyes to see. (However, as Melanie Ehrenkranz wrote for Gizmodo, Nude’s algorithm is a little overeager.) It’s certainly possible that third parties could use a similar algorithm for more nefarious purposes.

Apple has rules in place to prevent developers from exploiting that data maliciously, but, as Wired noted, something sketchy could sneak through. “Think of a photo filter or editing app that you might grant access to your albums,” Lily Hay Newman wrote for Wired.