Yeah, it's got a nasty flavour to it. MS showed their intentions unintentionally when they messed up with Copilot, having it take screenshots every few seconds to build up a training data set without the user's knowledge of what was happening. When outed, they claimed the screenshots were only saved to the device's hard drive, but they forgot to mention what happened to that data after it was OCR'd and analysed by the AI, and where it all ended up being stored. Oh! It's in MS's data warehouses, what a surprise, must have just slipped their mind! (ho hum)
Essentially data theft in no uncertain terms, but with a neural network the data is stored in such a fashion that even if you know your data is in there, getting it out in a form that would satisfy a court is another matter altogether. The tech/AI firms know this, and that's one reason they feel fine with such 'flexible' ethics. I mean, how can it be stealing if no one can prove it?
Personally I find it boggling (though not surprising) that people are allowed to get away with this, for little more reason than that they can, and that there's lots of money to be made by the people who enabled this behaviour. It says far more about these companies than any of their mealy-mouthed PR messaging ever could.