π’ Open-Source Evaluation & Testing for ML & LLM systems
-
Updated
Nov 12, 2024 - Python
π’ Open-Source Evaluation & Testing for ML & LLM systems
A Python package to assess and improve fairness of machine learning models.
The Python Risk Identification Tool for generative AI (PyRIT) is an open access automation framework to empower security professionals and machine learning engineers to proactively find risks in their generative AI systems.
moDel Agnostic Language for Exploration and eXplanation
Deliver safe & effective language models
A toolkit that streamlines and automates the generation of model cards
[ICCV 2023 Oral, Best Paper Finalist] ITI-GEN: Inclusive Text-to-Image Generation
PyTorch package to train and audit ML models for Individual Fairness
LangFair is a Python library for conducting use-case level LLM bias and fairness assessments
Credo AI Lens is a comprehensive assessment framework for AI systems. Lens standardizes model and data assessment, and acts as a central gateway to assessments created in the open source community.
Oracle Guardian AI Open Source Project is a library consisting of tools to assess fairness/bias and privacy of machine learning models and data sets.
Official code of "StyleT2I: Toward Compositional and High-Fidelity Text-to-Image Synthesis" (CVPR 2022)
Open-source toolkit to help companies implement responsible AI workflows.
Official code of "Discover and Mitigate Unknown Biases with Debiasing Alternate Networks" (ECCV 2022)
Package for evaluating the performance of methods which aim to increase fairness, accountability and/or transparency
FairBatch: Batch Selection for Model Fairness (ICLR 2021)
Official code of "Discover the Unknown Biased Attribute of an Image Classifier" (ICCV 2021)
Explore/examine/explain/expose your model with the explabox!
π€π‘οΈπππ Tiny package designed to support red teams and penetration testers in exploiting large language model AI solutions.
An in-depth performance profiling library for machine learning models
Add a description, image, and links to the responsible-ai topic page so that developers can more easily learn about it.
To associate your repository with the responsible-ai topic, visit your repo's landing page and select "manage topics."