quaterion.loss.contrastive_loss module

class ContrastiveLoss(distance_metric_name: Distance = Distance.COSINE, margin: float = 0.5, size_average: bool = True)[source]

Bases: PairwiseLoss

Contrastive loss.

Expects as input a pair of embeddings and a label of either 0 or 1. If the label is 1 (a positive pair), the distance between the two embeddings is reduced. If the label is 0 (a negative pair), the distance between the embeddings is increased.

Further information:

http://yann.lecun.com/exdb/publis/pdf/hadsell-chopra-lecun-06.pdf

Parameters:
  • distance_metric_name – Name of the distance function to use; see Distance. Optional, defaults to Distance.COSINE.

  • margin – Negative samples (label == 0) should have a distance of at least the margin value.

  • size_average – If True, average the loss over the mini-batch; otherwise, sum it.
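To make the margin's role concrete, here is a minimal sketch of the contrastive loss computation from Hadsell, Chopra & LeCun (2006), given precomputed pairwise distances. This is an illustration, not Quaterion's actual implementation; the function name and NumPy usage are assumptions.

```python
import numpy as np

def contrastive_loss(distances, labels, margin=0.5, size_average=True):
    """Sketch of contrastive loss over a batch of pair distances.

    distances: distance between the two embeddings of each pair
    labels: 1 for positive pairs, 0 for negative pairs
    """
    # Positive pairs (label == 1): penalize the squared distance.
    # Negative pairs (label == 0): penalize only when the pair is
    # closer than the margin.
    neg_part = np.maximum(0.0, margin - distances)
    losses = 0.5 * (labels * distances ** 2 + (1.0 - labels) * neg_part ** 2)
    return losses.mean() if size_average else losses.sum()
```

A positive pair at distance 0.1 contributes 0.5 * 0.1² = 0.005, while a negative pair already farther apart than the margin contributes nothing.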

forward(embeddings: Tensor, pairs: LongTensor, labels: Tensor, subgroups: Tensor, **kwargs) Tensor[source]

Compute loss value.

Parameters:
  • embeddings – Batch of embeddings, first half of embeddings are embeddings of first objects in pairs, second half are embeddings of second objects in pairs.

  • pairs – Indices of corresponding objects in pairs.

  • labels – Pair labels: 1 for positive pairs, 0 for negative pairs.

  • subgroups – Subgroup identifiers used to distinguish objects that can and cannot be used as negative examples.

  • **kwargs – Additional keyword arguments for generalization of the loss call.

Returns:

Tensor – averaged or summed loss value
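The embedding layout described above (first halves, then second halves of the pairs) can be illustrated with a toy batch. This sketch uses NumPy and an explicit cosine-distance computation for clarity; the exact index layout is an assumption based on the parameter description.

```python
import numpy as np

# Toy batch: 2 pairs -> 4 embeddings. The first half of the batch holds the
# first object of each pair, the second half holds the second objects.
embeddings = np.array([
    [1.0, 0.0],   # pair 0, first object
    [0.0, 1.0],   # pair 1, first object
    [1.0, 0.0],   # pair 0, second object
    [0.0, -1.0],  # pair 1, second object
])
pairs = np.array([[0, 2], [1, 3]])   # (first, second) indices for each pair
labels = np.array([1.0, 0.0])        # pair 0 is positive, pair 1 is negative

first = embeddings[pairs[:, 0]]
second = embeddings[pairs[:, 1]]

# Cosine distance = 1 - cosine similarity.
cos_sim = (first * second).sum(axis=1) / (
    np.linalg.norm(first, axis=1) * np.linalg.norm(second, axis=1)
)
distances = 1.0 - cos_sim
```

Identical vectors give distance 0 (the positive pair), while opposite vectors give distance 2 (the negative pair), which the loss then compares against the margin.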

get_config_dict() Dict[str, Any][source]

Config used in saving and loading purposes.

Config object has to be JSON-serializable.

Returns:

Dict[str, Any] – JSON-serializable dict of params
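For illustration, a config dict mirroring the constructor parameters might look like the following. The key names and values here are assumptions for the sketch, not the library's exact output; the point is only that the dict must round-trip through JSON.

```python
import json

# Hypothetical config dict, assumed to mirror the constructor parameters.
config = {
    "distance_metric_name": "cosine",
    "margin": 0.5,
    "size_average": True,
}

# The config must survive a JSON round-trip unchanged.
restored = json.loads(json.dumps(config))
```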

training: bool
