Microsoft Counterfit, a Tool for Security Testing AI Systems, Is Now Open Source
According to a blog post by Microsoft, Counterfit is a tool for securing AI systems used in industries such as healthcare, finance, and defence. Citing a survey of 28 organisations, spanning Fortune 500 companies, governments, non-profits, and small- and medium-sized businesses (SMBs), Microsoft says it found that 25 of the 28 organisations don’t have the right tools in place to secure their AI systems.
“Consumers must have confidence that the AI systems powering these important domains are secure from adversarial manipulation,” reads the blog post.
Microsoft says it engaged a diverse set of partners to test the tool against their machine learning models in their own environments and to ensure that Counterfit addresses the needs of a broader range of security professionals. Counterfit is also highlighted as a tool that empowers engineers to develop and deploy AI systems securely. Apart from having workflows and terminology similar to popular offensive security tools, Counterfit is said to make published attack algorithms accessible to the security community.
Microsoft has also published a Counterfit GitHub repository, and is holding a walk-through as well as a live tutorial on May 10. If you are a developer, or work in an organisation that wants to use the tool to secure AI systems, you can register for the webinar.