
p-tuning

Here are 10 public repositories matching this topic...


We unified the interfaces of instruction-tuning data (e.g., CoT data), multiple LLMs, and parameter-efficient methods (e.g., LoRA, P-Tuning) for easy use. We welcome open-source enthusiasts to initiate any meaningful PR on this repo and to integrate as many LLM-related technologies as possible. We have built a fine-tuning platform that makes it easy for researchers to get started with and use large models; we welcome open-source enthusiasts to submit any meaningful PR!

  • Updated Dec 12, 2023
  • Jupyter Notebook
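To make the p-tuning method named above concrete, here is a minimal sketch of its core idea in plain PyTorch. This is a hypothetical illustration, not the API of the repository above: p-tuning learns a small prompt encoder (an MLP or LSTM over learnable embeddings) whose output "virtual token" embeddings are prepended to the frozen model's input embeddings, so only the encoder's parameters are trained. The class and dimension names here are illustrative assumptions.

```python
import torch
import torch.nn as nn

class PromptEncoder(nn.Module):
    """Sketch of a p-tuning prompt encoder (illustrative, not a library API)."""

    def __init__(self, num_virtual_tokens=20, hidden=128, embed_dim=768):
        super().__init__()
        # Learnable base embeddings, one per virtual token
        self.embedding = nn.Embedding(num_virtual_tokens, embed_dim)
        # P-tuning reparameterizes the prompts through a small encoder
        # (an MLP here; the original work also uses an LSTM)
        self.mlp = nn.Sequential(
            nn.Linear(embed_dim, hidden),
            nn.ReLU(),
            nn.Linear(hidden, embed_dim),
        )
        self.register_buffer("indices", torch.arange(num_virtual_tokens))

    def forward(self, batch_size):
        prompts = self.mlp(self.embedding(self.indices))        # (T, D)
        return prompts.unsqueeze(0).expand(batch_size, -1, -1)  # (B, T, D)

# Prepend the virtual prompts to (stand-in) frozen token embeddings;
# in real use, token_embeds would come from the frozen LLM's embedding layer.
encoder = PromptEncoder()
token_embeds = torch.randn(4, 10, 768)
inputs = torch.cat([encoder(batch_size=4), token_embeds], dim=1)
print(tuple(inputs.shape))  # (4, 30, 768): 20 virtual tokens + 10 real tokens
```

During training, the base model's weights stay frozen and only the prompt encoder's parameters receive gradients, which is what makes the method parameter-efficient.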

This bootcamp is designed to give NLP researchers an end-to-end overview of the fundamentals of the NVIDIA NeMo framework, a complete solution for building large language models. It also includes hands-on exercises complemented by tutorials, code snippets, and presentations to help researchers get started with the NeMo LLM Service and Guardrails.

  • Updated Mar 7, 2024
  • Jupyter Notebook
