Talk:Semi-supervised learning

What is
What is the difference between ‘transductive learning’ and ‘semi-supervised learning’?

Both use a mix of labeled and unlabeled examples, but the performance of the former is measured only on the unlabeled examples; it is a finite task. The performance of the latter is measured in the same way as in supervised learning, that is, the expected performance w.r.t. the distribution of the population the training set is sampled from. See also https://mitpress.mit.edu/sites/default/files/titles/content/9780262033589_sch_0001.pdf section 1.2.4. The Wikipedia entry takes the alternative view that semi-supervised learning includes transductive and inductive learning. The latter is clearly wrong, as fully supervised learning is also inductive (aiming for expected good performance over a potentially infinite population). Personally I see transductive as a special case of inductive, where the full (finite) population is specified by enumeration, and a subset thereof is labelled. There is nothing in inductive learning that requires populations to be infinite; it's just a very common case. — Preceding unsigned comment added by Piccolbo (talk • contribs) 03:10, 31 August 2017 (UTC)
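The distinction above can be sketched in code (a toy illustration with invented data and function names, not from any cited source): a transductive learner only assigns labels to the finite, enumerated population it was given, while an inductive learner returns a rule applicable to points never enumerated.

```python
def transduce(points, labels, n_rounds=10):
    """1-NN label propagation over an enumerated set of 1-D points.
    labels[i] is a class or None (unlabeled). Returns labels for exactly
    the given finite population -- the transductive setting."""
    labels = list(labels)
    for _ in range(n_rounds):
        for i, x in enumerate(points):
            if labels[i] is None:
                # copy the label of the nearest currently-labeled point
                known = [(abs(x - points[j]), labels[j])
                         for j in range(len(points)) if labels[j] is not None]
                if known:
                    labels[i] = min(known)[1]
    return labels

def induce(points, labels):
    """Inductive 1-NN: returns a rule usable on arbitrary new inputs,
    judged by expected performance over the whole population."""
    train = [(x, y) for x, y in zip(points, labels) if y is not None]
    return lambda x: min(train, key=lambda t: abs(t[0] - x))[1]

population = [0.0, 0.1, 0.2, 5.0, 5.1, 5.2]
seed_labels = ['a', None, None, None, None, 'b']

print(transduce(population, seed_labels))  # labels only the six given points
rule = induce(population, seed_labels)
print(rule(4.9))                           # applies to a point never enumerated
```

On this view, `transduce` is the special case where the whole finite population appears in `points`; `induce` aims beyond it.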

Self-taught learning
The paper "Self-taught learning: transfer learning from unlabeled data" by Raina et al. presents self-taught learning. It uses unlabeled data to improve predictions made via supervised learning. However, in contrast to other semi-supervised methods, it does not make the assumption that the unlabeled data's actual classes correspond to the ones given in the labeled data set. Instead, only higher-level features are extracted from the unlabeled data.
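A minimal sketch of the idea described above (all numbers and names invented for illustration; this is not the method of Raina et al., which uses sparse coding): the unlabeled pool, which need not share classes with the labeled task, is used only to learn a feature representation, and the classifier is then trained on the small labeled set in that representation.

```python
def learn_centroids(unlabeled, k=2, rounds=5):
    """Crude 1-D k-means: higher-level features learned without labels."""
    cents = unlabeled[:k]
    for _ in range(rounds):
        buckets = [[] for _ in range(k)]
        for x in unlabeled:
            buckets[min(range(k), key=lambda i: abs(x - cents[i]))].append(x)
        cents = [sum(b) / len(b) if b else cents[i]
                 for i, b in enumerate(buckets)]
    return cents

def features(x, cents):
    return [abs(x - c) for c in cents]  # distance-to-centroid features

# Unlabeled pool: no assumption that its classes match the labeled task.
unlabeled = [0.1, 0.2, 0.0, 9.8, 10.1, 9.9]
cents = learn_centroids(unlabeled)

labeled = [(0.3, 'low'), (9.5, 'high')]  # tiny supervised set

def classify(x):
    """1-NN in the learned feature space, trained only on labeled data."""
    fx = features(x, cents)
    return min(labeled,
               key=lambda t: sum(abs(a - b) for a, b in
                                 zip(features(t[0], cents), fx)))[1]

print(classify(1.0))
```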

Would this approach still be considered semi-supervised learning? — Preceding unsigned comment added by 188.74.81.25 (talk) 00:10, 19 May 2012 (UTC)

Generative models
My understanding was that generative models directly model the joint probability distribution? The article says: 'Generative approaches to statistical learning first seek to estimate $$p(x|y)$$, the distribution of data points belonging to each class.'

Response) This is true; the term is often used to refer to P(x,y) specifically. However, many people use 'generative model' to also refer to P(x|y). Either is fine so long as the context/meaning is understood. Even the Wiki page on generative models has a section mentioning this, and there are plenty of references and examples in the literature using each of the two meanings, so someone with some time on their hands who cares to get the dispute claim removed could do so. At least my 2 cents. — Preceding unsigned comment added by 65.158.32.123 (talk) 14:31, 2 September 2018 (UTC)

Models of the joint distribution P(x,y) are known as generative models. Determining the conditional probability p(y|x) is discriminative modelling. The definitions can also be seen in the Wikipedia articles "Generative model" and "Discriminative model". If the Method section of the semi-supervised learning article is not updated, then we have a contradiction of terms even within Wikipedia. — Preceding unsigned comment added by 131.155.186.255 (talk) 13:56, 15 November 2018 (UTC)
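The relationship behind this terminology dispute can be shown on a tiny discrete example (counts invented for illustration): a model of the joint P(x,y) yields both the class-conditional P(x|y) and, via Bayes' rule, the discriminative quantity p(y|x), which is why usage in the literature slides between the two.

```python
from collections import Counter

# Toy dataset of (x, y) pairs; the empirical joint is the generative model.
data = [('sunny', 'play'), ('sunny', 'play'), ('rain', 'stay'),
        ('rain', 'stay'), ('sunny', 'stay'), ('rain', 'play')]

n = len(data)
p_xy = {k: c / n for k, c in Counter(data).items()}  # joint P(x, y)

def p_x_given_y(x, y):
    """Class-conditional P(x | y), the sense used in the article text."""
    p_y = sum(v for (xi, yi), v in p_xy.items() if yi == y)
    return p_xy.get((x, y), 0.0) / p_y

def p_y_given_x(y, x):
    """Discriminative P(y | x), recovered from the joint via Bayes' rule."""
    p_x = sum(v for (xi, yi), v in p_xy.items() if xi == x)
    return p_xy.get((x, y), 0.0) / p_x

print(p_x_given_y('sunny', 'play'))  # 2/3
print(p_y_given_x('play', 'sunny'))  # 2/3
```

Both conditionals fall out of the same joint table, so calling either a "generative" quantity is defensible given context, as the response above notes.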

Merge with Weak supervision?
Is it appropriate to merge this article into the article on Weak Supervision? My first concern is that we don't have strong sources for the distinction between these two terms of art, and they may in fact be used interchangeably in much of the literature. However, even if it is decided that that is not the case, we claim here that one term is a generalization of the other, and it seems much more parsimonious to clarify that distinction inline with the article on the more general topic, and then include a section on that article for the more specific term.

130.20.195.92 (talk) 17:54, 3 August 2021 (UTC)
 * ✅ Klbrain (talk) 05:57, 30 October 2022 (UTC)

Move discussion in progress
There is a move discussion in progress on Talk:Semi-Supervised Learning which affects this page. Please participate on that page and not in this talk page section. Thank you. —RMCD bot 22:18, 22 March 2023 (UTC)