Monitoring#

Here we describe the available monitors in Argilla:

Base Monitor#

class argilla.monitoring.base.BaseMonitor(*args, api: Argilla, dataset: str, sample_rate: float = 1.0, log_interval: float = 1.0, agent: Optional[str] = None, tags: Optional[Dict[str, str]] = None, **kwargs)#

A base monitor class for easy task model monitoring

Attributes:#

dataset:

Argilla dataset name

sample_rate:

The portion of the data to store in Argilla. Defaults to 1.0.

is_record_accepted() bool#

Return True if a record should be logged to Argilla.

shutdown()#

Stop consumers
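
The effect of sample_rate on is_record_accepted() can be pictured with a short, self-contained sketch. This is an illustration of the expected sampling behaviour only, not the class's internal implementation:

    import random

    # Illustration only: sampling behaviour comparable to what
    # BaseMonitor.is_record_accepted provides for a given sample_rate.
    def is_record_accepted(sample_rate: float) -> bool:
        # Accept (and therefore log) roughly `sample_rate` of all records.
        return random.uniform(0.0, 1.0) <= sample_rate

    accepted = sum(is_record_accepted(0.3) for _ in range(10_000))
    print(f"~{accepted / 10_000:.0%} of records would be logged to Argilla")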

class argilla.monitoring.base.DatasetRecordsConsumer(name: str, api: Argilla, tags: Optional[dict] = None, metadata: Optional[dict] = None, buffer_size: int = 10000, upload_size=256, upload_interval=1.0, retries=10, timeout=15, on_error=None)#

Consumes the records from the dataset queue.

log_next_batch()#

Upload the next batch of items, return whether successful.

pause()#

Pause the consumer.

run()#

Runs the consumer.

send(records: Iterable[Union[TextClassificationRecord, TokenClassificationRecord, Text2TextRecord, TextGenerationRecord]])#

Send records to the consumer
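
The consumer is created and managed internally by a monitor, so direct construction is rarely needed. The sketch below only illustrates the constructor and send() call documented above; the client setup (an Argilla client built from argilla.client.client.Argilla against a local server with the default API key) and the dataset name are assumptions:

    import argilla as rg
    from argilla.client.client import Argilla  # assumed import path for the client class
    from argilla.monitoring.base import DatasetRecordsConsumer

    # Assumption: a local Argilla server reachable with the default API key.
    api = Argilla(api_url="http://localhost:6900", api_key="argilla.apikey")

    consumer = DatasetRecordsConsumer(
        name="monitoring_dataset",  # target dataset name
        api=api,
        buffer_size=10_000,         # maximum number of queued records
        upload_size=256,            # records per upload batch
        upload_interval=1.0,        # seconds between scheduled uploads
    )
    consumer.send(
        [rg.TextClassificationRecord(text="example input", prediction=[("positive", 0.9)])]
    )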

exception argilla.monitoring.base.ModelNotSupportedError#

ArgillaLogHTTPMiddleware#

class argilla.monitoring.asgi.ArgillaLogHTTPMiddleware(api_endpoint: str, dataset: str, records_mapper: Optional[Callable[[dict, dict], Union[TextClassificationRecord, TokenClassificationRecord, Text2TextRecord, TextGenerationRecord]]], sample_rate: float = 1.0, log_interval: float = 1.0, agent: Optional[str] = None, tags: Optional[Dict[str, str]] = None, *args, **kwargs)#

A standard Starlette middleware that enables Argilla logging for HTTP prediction requests
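
A minimal FastAPI sketch of how the middleware is typically attached; the endpoint path, dataset name, and payload shapes used in records_mapper are assumptions to adapt to your own service:

    from fastapi import FastAPI

    import argilla as rg
    from argilla.monitoring.asgi import ArgillaLogHTTPMiddleware

    app = FastAPI()

    def records_mapper(request: dict, response: dict):
        # Assumed request/response payload shapes; adapt this to your endpoint.
        return rg.TextClassificationRecord(
            text=request["text"],
            prediction=[(response["label"], response["score"])],
        )

    app.add_middleware(
        ArgillaLogHTTPMiddleware,
        api_endpoint="/predict",       # only requests to this endpoint are logged
        dataset="monitoring_dataset",  # Argilla dataset that receives the records
        records_mapper=records_mapper,
    )

    @app.post("/predict")
    def predict(payload: dict):
        # Placeholder model call; the middleware logs the request/response pair.
        return {"label": "positive", "score": 0.9}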

class argilla.monitoring.asgi.CachedJsonRequest(scope: ~typing.MutableMapping[str, ~typing.Any], receive: ~typing.Callable[[], ~typing.Awaitable[~typing.MutableMapping[str, ~typing.Any]]] = <function empty_receive>, send: ~typing.Callable[[~typing.MutableMapping[str, ~typing.Any]], ~typing.Awaitable[None]] = <function empty_send>)#

We must use a cached version of incoming requests, since the request body cannot be read from a middleware directly. See https://github.com/encode/starlette/issues/847 for more information

TODO Remove usage of CachedRequest when https://github.com/encode/starlette/pull/848 is released

Framework Monitors#

argilla.monitoring.model_monitor.monitor(task_model: MissingType, dataset: str, sample_rate: float = 0.3, agent: Optional[str] = None, log_interval: float = 5) Union[BaseMonitor, MissingType]#

Automatically monitor (i.e. log) data fed through transformers pipelines, spaCy models, or Flair taggers.

Parameters:
  • task_model (Union[Language, Pipeline, SequenceTagger]) – The spaCy Language, transformers Pipeline, or Flair SequenceTagger.

  • dataset (str) – The Argilla dataset to log data into.

  • sample_rate (float, optional) – The portion of processed data to log. Defaults to 0.3.

  • agent (Optional[str], optional) – The name of the logging agent. Defaults to None.

  • log_interval (float, optional) – The interval for uploading in seconds. Defaults to 5.

Returns:

The monitor that acts equivalently to the input task_model.

Return type:

Union[BaseMonitor, Language, Pipeline, SequenceTagger]
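
In typical usage this function is called through the package-level alias (rg.monitor when argilla is imported as rg). A brief sketch with a transformers sentiment pipeline; the model and dataset names are illustrative:

    import argilla as rg
    from transformers import pipeline

    nlp = pipeline(
        "sentiment-analysis",
        model="distilbert-base-uncased-finetuned-sst-2-english",  # illustrative model
    )
    nlp = rg.monitor(nlp, dataset="nlp_monitoring", sample_rate=0.3, agent="distilbert-sst2")

    # The wrapped pipeline behaves as before; a sample of predictions is logged to Argilla.
    nlp("I love this movie!")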

Transformers Monitor#

class argilla.monitoring._transformers.HuggingFaceMonitor(*args, api: Argilla, dataset: str, sample_rate: float = 1.0, log_interval: float = 1.0, agent: Optional[str] = None, tags: Optional[Dict[str, str]] = None, **kwargs)#
class argilla.monitoring._transformers.TextClassificationMonitor(*args, api: Argilla, dataset: str, sample_rate: float = 1.0, log_interval: float = 1.0, agent: Optional[str] = None, tags: Optional[Dict[str, str]] = None, **kwargs)#

Configures monitoring over Hugging Face text classification pipelines

class argilla.monitoring._transformers.ZeroShotMonitor(*args, api: Argilla, dataset: str, sample_rate: float = 1.0, log_interval: float = 1.0, agent: Optional[str] = None, tags: Optional[Dict[str, str]] = None, **kwargs)#
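
These monitors are not meant to be instantiated directly; monitor() returns the appropriate one for the wrapped pipeline type. A sketch for a zero-shot classification pipeline, with illustrative model and dataset names:

    import argilla as rg
    from transformers import pipeline

    classifier = pipeline("zero-shot-classification", model="facebook/bart-large-mnli")
    classifier = rg.monitor(classifier, dataset="zeroshot_monitoring", sample_rate=0.5)

    # Calls behave as usual; sampled inputs and predictions are logged to Argilla.
    classifier(
        "I really enjoyed this product",
        candidate_labels=["positive", "negative", "neutral"],
    )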

spaCy Monitor#

class argilla.monitoring._spacy.SpacyNERMonitor(*args, api: Argilla, dataset: str, sample_rate: float = 1.0, log_interval: float = 1.0, agent: Optional[str] = None, tags: Optional[Dict[str, str]] = None, **kwargs)#

A spaCy Language wrapper for NER monitoring in Argilla

static doc2token_classification(doc: MissingType, agent: str, metadata: Optional[Dict[str, Any]]) TokenClassificationRecord#

Converts a spaCy Doc into a token classification record

Parameters:
  • doc – The spaCy Doc.

  • agent – Agent to use for the prediction_agent field. This can be the model path or the model language plus the model version.

  • metadata – Passed on to the argilla.TokenClassificationRecord.
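
A sketch of wrapping a spaCy pipeline with monitor(); the pipeline and dataset names are illustrative:

    import spacy

    import argilla as rg

    nlp = spacy.load("en_core_web_sm")  # any installed pipeline with a NER component
    nlp = rg.monitor(nlp, dataset="nlp_monitoring_spacy", sample_rate=1.0)

    # Each processed Doc is converted into a TokenClassificationRecord and, if sampled, logged.
    doc = nlp("Apple is opening a new store in San Francisco.")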

Flair Monitor#

class argilla.monitoring._flair.FlairMonitor(*args, api: Argilla, dataset: str, sample_rate: float = 1.0, log_interval: float = 1.0, agent: Optional[str] = None, tags: Optional[Dict[str, str]] = None, **kwargs)#
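
As with the other framework monitors, this wrapper is obtained through monitor(). A sketch with a Flair NER tagger; the model and dataset names are illustrative:

    import argilla as rg
    from flair.data import Sentence
    from flair.models import SequenceTagger

    tagger = SequenceTagger.load("flair/ner-english")
    tagger = rg.monitor(tagger, dataset="flair_ner_monitoring", sample_rate=1.0)

    # predict() works as usual; sampled sentences and their tags are logged to Argilla.
    sentence = Sentence("George Washington went to Washington.")
    tagger.predict(sentence)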