Eleven years ago, Marco Slaviero presented "Sour Pickles". Even then, the Python documentation warned that pickles are insecure. Since that time, however, machine learning frameworks have begun saving models in pickle-based formats. Using machine learning models as an example, I will show how simple it is to add a backdoor to any pickled object, and then demonstrate how to save a model securely so that malicious code cannot be injected into it.
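The core mechanism the talk builds on can be sketched in a few lines. The backdoor relies on pickle's documented `__reduce__` protocol, which lets an object tell pickle to call an arbitrary callable at load time; the payload below is deliberately harmless. The "secure save" half is my illustration only: NumPy's `allow_pickle=False` stands in for a pickle-free weights format (the talk itself may use a different tool, e.g. safetensors).

```python
import io
import pickle
import numpy as np

# 1) The backdoor: any object whose __reduce__ returns (callable, args)
#    makes pickle.loads invoke that callable during deserialization.
class Backdoor:
    def __reduce__(self):
        # A harmless payload for the demo; in a real attack this could
        # be os.system(...) or any other code.
        return (eval, ("'code executed at load time'",))

tainted = pickle.dumps(Backdoor())
print(pickle.loads(tainted))  # the embedded expression runs on load

# 2) A safer alternative: store only raw weight arrays, never pickled
#    Python objects. allow_pickle=False makes loading refuse any
#    pickled content, so deserialization cannot run arbitrary code.
weights = np.array([[0.1, 0.2], [0.3, 0.4]])
buf = io.BytesIO()
np.save(buf, weights, allow_pickle=False)
buf.seek(0)
restored = np.load(buf, allow_pickle=False)
print(np.array_equal(weights, restored))
```

Note that the attack needs no vulnerability in the consuming code: simply calling `pickle.loads` (or a framework loader built on it) on untrusted bytes is enough.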