Hyperparameter Optimization in Novel Class Discovery

:::info
Authors:
(1) Troisemaine Colin, Department of Computer Science, IMT Atlantique, Brest, France, and Orange Labs, Lannion, France;
(2) Reiffers-Masson Alexandre, Department of Computer Science, IMT Atlantique, Brest, France;
(3) Gosselin Stephane, Orange Labs, Lannion, France;
(4) Lemaire Vincent, Orange Labs, Lannion, France;
(5) Vaton Sandrine, Department of Computer Science, IMT Atlantique, Brest, France.
:::
Table of Links
Abstract and Intro
Related work
Approaches
Hyperparameter optimization
Estimating the number of novel classes
Full training procedure
Experiments
Conclusion
Declarations
References
Appendix A: Additional result metrics
Appendix B: Hyperparameters
Appendix C: Cluster Validity Indices numerical results
Appendix D: NCD k-means centroids convergence study
4 Hyperparameter optimization
The success of machine learning algorithms (including NCD) can be attributed in part to the high flexibility afforded by their hyperparameters. In most cases, a target is available and approaches such as k-fold Cross-Validation (CV) can be employed to tune the hyperparameters and achieve optimal results. However, in a realistic Novel Class Discovery scenario, the labels of the novel classes are never available. We must therefore find a way to optimize hyperparameters without ever relying on the labels of the novel classes. In this section, we present a method that leverages the known classes to find hyperparameters applicable to the novel classes. This tuning method is designed specifically for NCD algorithms that require both labeled data (known classes) and unlabeled data (novel classes) during training[1]. This is the case for Projection-based NCD, described in Section 3.4.

To illustrate, in split 1 of Figure 4, the model is trained with the subset {C2, C3, C4} as known classes and {C0, C1, C5, . . . , C9} as novel classes. It is then evaluated on its performance on the hidden classes {C0, C1} only.

To evaluate a given combination of hyperparameters, this procedure is applied to all the splits and the performance on the hidden classes is averaged. After repeating this process for many combinations, the combination that achieved the best average performance is selected. In a realistic NCD scenario, the labels of the novel classes are never available for the final evaluation. However, in the datasets employed in this article, the novel classes are comprised of pre-defined classes. Therefore, even though these labels are not employed during training, they can still be used to assess the final performance of different models on the novel classes and to compare them against each other.
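The split-based tuning loop described above can be sketched as follows. This is an illustrative outline, not the authors' code: `train_and_score` is a hypothetical callback standing in for training the NCD model on one split and measuring its clustering performance on the hidden classes (whose labels are known, so the score is computable).

```python
from itertools import combinations

def known_class_splits(known_classes, n_hidden):
    """Enumerate splits in the spirit of Figure 4: in each split,
    `n_hidden` of the known classes are hidden and treated as
    pseudo-novel, while the rest remain known."""
    splits = []
    for hidden in combinations(sorted(known_classes), n_hidden):
        known = [c for c in known_classes if c not in hidden]
        splits.append({"known": known, "hidden": list(hidden)})
    return splits

def score_hyperparameters(hp, splits, train_and_score):
    """Average the pseudo-novel performance of one hyperparameter
    combination `hp` over all splits."""
    scores = [train_and_score(hp, s["known"], s["hidden"]) for s in splits]
    return sum(scores) / len(scores)

def tune(candidates, splits, train_and_score):
    """Return the candidate combination with the best average score."""
    return max(candidates,
               key=lambda hp: score_hyperparameters(hp, splits, train_and_score))
```

In practice, `candidates` would be a grid or random sample of hyperparameter combinations, and `train_and_score` would train the full model on the labeled remaining-known classes plus the unlabeled hidden and novel classes, then score only the hidden ones.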

In Table 1, we report, for all datasets used in our experiments, the number of known classes that are hidden in each split, as well as the number of splits. Note that when the number of known classes is small (e.g. 3 for the Human dataset), this approach may be difficult to apply.

Discussion. As in NCD, no labels are available in unsupervised clustering problems, which makes hyperparameter selection very difficult. To address this issue, clustering algorithms are sometimes tuned using internal metrics, which do not rely on labeled data. These metrics offer a means of comparing the results obtained by different clustering approaches; examples include the Silhouette coefficient, the Davies-Bouldin index and the Calinski-Harabasz index [33]. However, it is important to note that these metrics make assumptions about the structure of the data and can be biased towards algorithms that make similar assumptions. Unlike unsupervised clustering, however, the NCD setting provides known classes that are related to the novel classes we are trying to cluster.
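As an illustration, one such internal metric, the Calinski-Harabasz index, can be computed from the data and the cluster assignments alone. Below is a minimal numpy sketch of the standard formula (scikit-learn's `calinski_harabasz_score` computes the same quantity); it rewards clusters that are compact and far apart, which is exactly the structural assumption the text warns about.

```python
import numpy as np

def calinski_harabasz(X, labels):
    """Calinski-Harabasz index: ratio of between-cluster to
    within-cluster dispersion. Higher = denser, better-separated
    clusters. No ground-truth labels are needed."""
    X = np.asarray(X, dtype=float)
    labels = np.asarray(labels)
    n, k = len(X), len(np.unique(labels))
    overall = X.mean(axis=0)
    between = within = 0.0
    for c in np.unique(labels):
        Xc = X[labels == c]
        centroid = Xc.mean(axis=0)
        between += len(Xc) * np.sum((centroid - overall) ** 2)
        within += np.sum((Xc - centroid) ** 2)
    return (between / (k - 1)) / (within / (n - k))
```

On two tight, well-separated blobs, the correct partition scores far higher than an arbitrary one; but a clustering whose true classes are elongated or nested would be penalized by this same formula, illustrating the bias mentioned above.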

:::info
This paper is available on arxiv under CC 4.0 license.
:::

[1] To optimize purely unsupervised clustering methods for NCD, we refer the reader to the optimization process of Section 3.3.
