# pretraining

Here are 180 public repositories matching this topic...

SparK

[ICLR'23 Spotlight🔥] The first successful BERT/MAE-style pretraining on any convolutional network; PyTorch implementation of "Designing BERT for Convolutional Networks: Sparse and Hierarchical Masked Modeling"

  • Updated Jan 23, 2024
  • Python
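SparK's actual sparse-masking implementation lives in the linked repository; as a rough illustration of the MAE/BERT-style idea it builds on (randomly hiding a fraction of image patches so the network must reconstruct them), here is a minimal sketch. The helper name and the 60% mask ratio are illustrative assumptions, not taken from the paper or repo.

```python
import numpy as np

def random_patch_mask(num_patches: int, mask_ratio: float,
                      rng: np.random.Generator) -> np.ndarray:
    """Illustrative helper (not SparK's API): return a boolean mask over
    patches, where True means the patch is hidden from the encoder."""
    num_masked = int(num_patches * mask_ratio)
    mask = np.zeros(num_patches, dtype=bool)
    # Choose which patches to hide uniformly at random.
    masked_indices = rng.permutation(num_patches)[:num_masked]
    mask[masked_indices] = True
    return mask

# Example: a 224x224 image split into 16x16 patches gives 14*14 = 196 patches.
rng = np.random.default_rng(0)
mask = random_patch_mask(196, mask_ratio=0.6, rng=rng)
```

During pretraining, the model would see only the unmasked patches and be trained to reconstruct the pixels (or features) of the masked ones; SparK's contribution is making this work on hierarchical convolutional backbones via sparse convolution, which this sketch does not show.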
Awesome-Robotics-3D

A curated list of 3D vision papers relating to the robotics domain in the era of large models (LLMs/VLMs), inspired by awesome-computer-vision; includes papers, code, and related websites

  • Updated Nov 4, 2024

Personal project: MPP-Qwen14B & MPP-Qwen-Next (Multimodal Pipeline Parallel based on Qwen-LM). Supports [video/image/multi-image] inputs and {sft/conversations} training. Don't let poverty limit your imagination! Train your own 8B/14B LLaVA-style MLLM on an RTX 3090/4090 with 24 GB of VRAM.

  • Updated Dec 9, 2024
  • Jupyter Notebook
