# qat

Here are 30 public repositories matching this topic...

Model Compression Toolkit (MCT) is an open-source project for neural network model optimization targeting efficient deployment on constrained hardware. It provides researchers, developers, and engineers with advanced quantization and compression tools for deploying state-of-the-art neural networks.

  • Updated Nov 12, 2024
  • Python
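The MCT entry above centers on quantization tooling for constrained hardware. As a generic illustration only (this is not MCT's API), the sketch below shows what a quantization-aware training (QAT) workflow looks like using PyTorch's eager-mode quantization utilities; the tiny model, training loop, and random data are placeholders.

```python
# Minimal, generic QAT sketch with PyTorch eager-mode quantization.
# Not the MCT API; model and data are placeholders for illustration.
import torch
import torch.nn as nn
from torch.ao.quantization import (
    QuantStub, DeQuantStub, get_default_qat_qconfig, prepare_qat, convert
)

class TinyNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.quant = QuantStub()      # tensors enter the quantized region here
        self.fc1 = nn.Linear(16, 32)
        self.relu = nn.ReLU()
        self.fc2 = nn.Linear(32, 4)
        self.dequant = DeQuantStub()  # tensors leave the quantized region here

    def forward(self, x):
        x = self.quant(x)
        x = self.relu(self.fc1(x))
        x = self.fc2(x)
        return self.dequant(x)

model = TinyNet()
model.qconfig = get_default_qat_qconfig("fbgemm")  # int8 config for x86 backends
model_prepared = prepare_qat(model.train())        # insert fake-quant observers

# Short fine-tuning loop with fake quantization active in the forward pass.
optimizer = torch.optim.SGD(model_prepared.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()
for _ in range(10):
    x = torch.randn(8, 16)
    y = torch.randint(0, 4, (8,))
    optimizer.zero_grad()
    loss = loss_fn(model_prepared(x), y)
    loss.backward()
    optimizer.step()

model_int8 = convert(model_prepared.eval())        # materialize the int8 model
print(model_int8)
```

Toolkits like MCT automate and extend this kind of flow (mixed precision, hardware-aware configuration); the sketch only shows the underlying QAT idea of training through fake-quantized operations before converting to integer kernels.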

mi-optimize is a versatile tool for the quantization and evaluation of large language models (LLMs). The library integrates a range of quantization methods and evaluation techniques, so users can tailor their approach to specific requirements and constraints.

  • Updated Nov 1, 2024
  • Python
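mi-optimize's entry above describes quantizing LLMs with interchangeable methods. As a hedged, generic illustration (not mi-optimize's API), the sketch below applies PyTorch's built-in dynamic int8 quantization to the Linear layers that dominate LLM compute; the toy two-layer model is a stand-in for a real transformer.

```python
# Generic post-training dynamic-quantization sketch (not mi-optimize's API).
# Linear weights are quantized to int8 ahead of time; activations are
# quantized dynamically at runtime, so no calibration dataset is needed.
import torch
import torch.nn as nn

model_fp32 = nn.Sequential(
    nn.Linear(512, 2048),
    nn.GELU(),
    nn.Linear(2048, 512),
)

model_int8 = torch.ao.quantization.quantize_dynamic(
    model_fp32, {nn.Linear}, dtype=torch.qint8
)

x = torch.randn(1, 512)
print(model_int8(x).shape)   # torch.Size([1, 512])
print(model_int8)            # Linear layers replaced by dynamically quantized versions
```

Libraries in this space typically go further (per-group scales, GPTQ/AWQ-style weight quantization, accuracy evaluation harnesses); the point here is only the basic substitution of float Linear layers with int8 equivalents.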

The project focuses on Intel's enterprise AI and cloud-native foundation for the Red Hat OpenShift Container Platform (RHOCP): solution enablement and innovation covering Intel data center hardware features, Intel technology-enhanced AI platforms, and provisioning of reference AI workloads on OpenShift.

  • Updated Nov 6, 2024
  • Python
