GLORIA

GEOMAR Library Ocean Research Information Access

  • 1
    Online Resource
    MIT Press ; 2023
    In: Neural Computation, MIT Press, Vol. 35, No. 7 (2023-06-12), p. 1288-1339
    Abstract: We consider the scenario of deep clustering in which the available prior knowledge is limited. In this scenario, few existing state-of-the-art deep clustering methods perform well on both noncomplex-topology and complex-topology data sets. To address this problem, we propose a constraint utilizing symmetric InfoNCE, which augments the objective of a deep clustering method so that the trained model is effective not only on noncomplex-topology but also on complex-topology data sets. Additionally, we provide several theoretical explanations of why the constraint can enhance the performance of deep clustering methods. To confirm the effectiveness of the proposed constraint, we introduce a deep clustering method named MIST, which combines an existing deep clustering method with our constraint. Our numerical experiments with MIST demonstrate that the constraint is effective. In addition, MIST outperforms other state-of-the-art deep clustering methods on most of the 10 commonly used benchmark data sets. (A minimal sketch of a symmetric InfoNCE term is given after this record.)
    Type of Medium: Online Resource
    ISSN: 0899-7667, 1530-888X
    Language: English
    Publisher: MIT Press
    Publication Date: 2023
    ZDB-ID: 1025692-1
    ZDB-ID: 1498403-9
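
Below is a minimal sketch of a symmetric InfoNCE term of the kind the abstract refers to, written in PyTorch. It is an illustration under assumptions, not the exact MIST constraint: how the paired embedding batches z1 and z2 are produced (for example, two augmented views of the same samples) and the temperature value are assumed here for concreteness.

import torch
import torch.nn.functional as F

def symmetric_infonce(z1, z2, temperature=0.1):
    # z1, z2: (batch, dim) embeddings whose i-th rows form a positive pair.
    # Normalize so the dot products below are cosine similarities.
    z1 = F.normalize(z1, dim=1)
    z2 = F.normalize(z2, dim=1)
    logits = z1 @ z2.t() / temperature               # (batch, batch) similarity matrix
    targets = torch.arange(z1.size(0), device=z1.device)
    # InfoNCE in the z1 -> z2 direction and in the z2 -> z1 direction,
    # averaged to make the term symmetric.
    loss_12 = F.cross_entropy(logits, targets)
    loss_21 = F.cross_entropy(logits.t(), targets)
    return 0.5 * (loss_12 + loss_21)

According to the abstract, MIST combines such a constraint with the objective of an existing deep clustering method; how the two terms are weighted is not stated in this record.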