Notebooks
The notebook reference guide for Argilla tutorials.
This section contains:
- Monitor FastAPI predictions
- Extending weak supervision workflows with sentence embeddings
- Introduction
- Detailed Workflow
- Setup
- The dataset
- 1. Create an Argilla dataset with unlabelled data and test data
- 2. Defining rules
- 3. Building and analyzing weak labels
- 4. Using the weak labels
- 5. Extending the weak labels
- 6. Training a downstream model
- Summary
- Next steps
- Appendix: Visualize changes
- Appendix: Optimizing the thresholds
- Appendix: Plot extension
- Weak supervision in multi-label text classification tasks
- Let's apply weak labeling again
- Building a news classifier with weak supervision
- Introduction
- Setup
- 1. Load test and unlabelled datasets into Argilla
- 2. Define Rules
- 3. Denoise weak labels with Snorkel's Label Model
- 4. Prepare our training set
- 5. Train a downstream model with scikit-learn
- Summary
- Next steps
- Appendix I: Create rules and weak labels from Python
- Appendix II: Log datasets to the Hugging Face Hub
- Zero-shot NER with Flair
- Weakly supervised NER with skweak
- Explore and analyze spaCy NER pipelines
- Find label errors with cleanlab
- Analyzing predictions with model explainability methods
- Clean labels using your model loss
- Active learning with ModAL and scikit-learn
- Few-shot classification with SetFit and a custom dataset
- Active learning for text classification with small-text
- Label your data to fine-tune a classifier with Hugging Face
- Introduction
- Setup
- Preliminaries
- 1. Run the pre-trained model over the dataset and log the predictions
- 2. Explore and label data with the pretrained model
- 3. Fine-tune the pre-trained model
- 4. Testing the fine-tuned model
- 5. Run our fine-tuned model over the dataset and log the predictions
- 6. Explore and label data with the fine-tuned model
- 7. Fine-tuning with the extended training dataset
- Summary
- Next steps