Meta’s foundational model for computer vision is now open source



Summary

  • Meta releases DINOv2 as open source under the Apache 2.0 license.
  • Meta also introduces FACET (FAirness in Computer Vision EvaluaTion), a benchmark for bias in computer vision models.

Update, August 31, 2023:

Meta releases its computer vision model DINOv2 under the Apache 2.0 license to give developers and researchers more flexibility for downstream tasks. Meta also releases a collection of DINOv2-based dense prediction models for semantic image segmentation and monocular depth estimation.

Meta also introduces FACET, a benchmark for evaluating the fairness of computer vision models in tasks such as classification and segmentation. The dataset includes 32,000 images of 50,000 people, with demographic attributes such as perceived gender and age group, in addition to physical features.
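To illustrate the kind of evaluation a benchmark like FACET enables, here is a minimal sketch of a per-group accuracy comparison, a common fairness metric. The groups, labels, and predictions below are made-up placeholders, and this is not FACET's official evaluation protocol:

```python
# Illustrative sketch: per-group accuracy and the largest accuracy gap
# between groups. Data and group names are hypothetical.
from collections import defaultdict

def group_accuracies(records):
    """records: list of (group, y_true, y_pred) tuples."""
    correct = defaultdict(int)
    total = defaultdict(int)
    for group, y_true, y_pred in records:
        total[group] += 1
        correct[group] += int(y_true == y_pred)
    return {g: correct[g] / total[g] for g in total}

def accuracy_gap(accs):
    """Largest difference in accuracy between any two groups."""
    return max(accs.values()) - min(accs.values())

# Hypothetical predictions from a classifier, annotated with a
# demographic group attribute (as in FACET's perceived attributes).
records = [
    ("group_a", "person", "person"),
    ("group_a", "person", "person"),
    ("group_a", "person", "other"),
    ("group_b", "person", "person"),
    ("group_b", "person", "other"),
]
accs = group_accuracies(records)
print(accs)                # per-group accuracy
print(accuracy_gap(accs))  # disparity between best and worst group
```

A large gap between groups would indicate that the model performs unevenly across the populations it is applied to, which is exactly the kind of disparity FACET is designed to surface.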

FACET is intended to become a standard benchmark for evaluating the fairness of computer vision models and to encourage the design and development of models that take more people into account.


Original article from April 18, 2023:

Meta's DINOv2 is a foundation model for computer vision. The company demonstrates its strengths and plans to combine DINOv2 with large language models.

In May 2021, AI researchers at Meta presented DINO (self-distillation with no labels), a self-supervised AI model for image tasks such as classification and segmentation. With DINOv2, Meta is now releasing a significantly improved version.

Like DINO, DINOv2 is a computer vision model trained using self-supervised learning, and according to Meta, it performs as well as or better than most of today's specialized systems on all benchmarked tasks. Because the training is self-supervised, no labeled data is required, and the DINO models can be trained on large unlabeled image datasets.
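The self-distillation idea behind DINO can be sketched in a few lines: a student network is trained to match the output of a teacher network on different augmented views of the same unlabeled image, and the teacher's weights are an exponential moving average (EMA) of the student's. The following is a heavily simplified NumPy sketch of that training loop (linear "networks", random "views", and none of the centering/sharpening tricks of the real method):

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

dim, n_classes = 8, 4
# Student and teacher share an architecture; the teacher starts as a copy.
W_student = rng.normal(size=(n_classes, dim))
W_teacher = W_student.copy()

lr, momentum = 0.1, 0.99  # EMA momentum for the teacher

for step in range(100):
    x = rng.normal(size=dim)                  # an unlabeled "image"
    view1 = x + 0.1 * rng.normal(size=dim)    # two augmented views
    view2 = x + 0.1 * rng.normal(size=dim)

    p_teacher = softmax(W_teacher @ view1)    # target distribution (no gradient)
    p_student = softmax(W_student @ view2)

    # Cross-entropy gradient w.r.t. student logits is p_student - p_teacher.
    grad = np.outer(p_student - p_teacher, view2)
    W_student -= lr * grad

    # The teacher follows the student via EMA instead of gradient updates.
    W_teacher = momentum * W_teacher + (1 - momentum) * W_student
```

No labels appear anywhere in the loop; the learning signal comes entirely from the agreement between the two views, which is what lets DINO-style models be trained on large unlabeled image collections.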

Video: Meta


Meta provides demos for DINOv2. Code and checkpoints are available on GitHub.
