User talk:Jeblad

July 2013
Hello, I'm Amaury. I wanted to let you know that I undid one of your recent contributions, such as the one you made with this edit to THE One Total Home Experience, because it didn’t appear constructive to me. If you think I made a mistake, or if you have any questions, you can leave me a message on my talk page. Thanks. Amaury (talk) 09:40, 20 July 2013 (UTC)

ArbCom elections are now open!
MediaWiki message delivery (talk) 13:52, 23 November 2015 (UTC)

Welcome to The Wikipedia Adventure!

 * Hi Jeblad! We're so happy you wanted to play to learn, as a friendly and fun way to get into our community and mission.  I think these links might be helpful to you as you get started.
 * The Wikipedia Adventure Start Page
 * The Wikipedia Adventure Lounge
 * The Teahouse new editor help space
 * Wikipedia Help pages

-- 21:24, Friday, May 6, 2016 (UTC)

Nomination for deletion of Template:Categorize base module
Template:Categorize base module has been nominated for deletion. You are invited to comment on the discussion at the template's entry on the Templates for discussion page. Jc86035 (talk) Use {{re|Jc86035}} to reply to me 16:33, 26 May 2017 (UTC)

Nomination for deletion of Template:Categorize base module with documentation
Template:Categorize base module with documentation has been nominated for deletion. You are invited to comment on the discussion at the template's entry on the Templates for discussion page. Jc86035 (talk) Use {{re|Jc86035}} to reply to me 16:33, 26 May 2017 (UTC)

Nomination for deletion of Template:Categorize base module with test cases
Template:Categorize base module with test cases has been nominated for deletion. You are invited to comment on the discussion at the template's entry on the Templates for discussion page. Jc86035 (talk) Use {{re|Jc86035}} to reply to me 16:33, 26 May 2017 (UTC)

Nomination for deletion of Template:Categorize documentation for base module
Template:Categorize documentation for base module has been nominated for deletion. You are invited to comment on the discussion at the template's entry on the Templates for discussion page. Jc86035 (talk) Use {{re|Jc86035}} to reply to me 16:34, 26 May 2017 (UTC)

Draft:Capsule neural network
I have reworked this article and welcome your feedback. If I don't hear from you by tomorrow, I will submit it. Lfstevens (talk) 20:34, 29 December 2017 (UTC)


 * Just checked a few additions, and my first impression is that this article now contains several points that are not quite what I would have added. I'm not even sure the additions are correct. I don't think the article should be submitted in this state. Jeblad (talk) 23:45, 29 December 2017 (UTC)
 * Please make any changes you like or let me know what the issues are and I will try to address. Thx. Lfstevens (talk) 01:17, 30 December 2017 (UTC)
 * My text was in effect reverted, so I doubt that I'm going to touch that article again. Jeblad (talk) 03:33, 30 December 2017 (UTC)

Nomination for deletion of Template:Localization subpage
Template:Localization subpage has been nominated for deletion. You are invited to comment on the discussion at the template's entry on the Templates for discussion page. {{3x|p}}ery (talk) 19:17, 1 May 2018 (UTC)

roman numerals
Regarding [//en.wikipedia.org/w/index.php?title=Wikipedia:Talk_pages_consultation_2019&diff=885892358&oldid=885886140&diffmode=source this edit]: just a suggestion—I think it would be easier to read if you didn't number your points in the first two sentences using roman numerals. It's not a big deal, though; I just thought I'd mention it. isaacl (talk) 01:57, 3 March 2019 (UTC)

Nomination for deletion of Module:NLG
Module:NLG has been nominated for deletion. You are invited to comment on the discussion at the module's entry on the Templates for discussion page. * Pppery * it has begun... 19:38, 5 November 2019 (UTC)

Note
Once more: Jeblad (John Erling Blad) is also Agtfjott at some projects. A previous official account during my work for Wikimedia Deutschland was John Erling Blad (WMDE). Jeblad (talk) 13:38, 17 December 2019 (UTC)

Factual accuracy in Dropout article
Hello Jeblad! I saw that you placed a disputed factual accuracy template on the article for Dropout regularization. I assume you added it due to the sentence describing Google's patent of the regularization technique (please tell me if my assumption was wrong). In the note you added in the article it says that the patent is "most likely not valid," I assume due to the presence of the "dilution" technique in prior literature. Since Google's patent "System and method for addressing overfitting in a neural network" is currently active, I was wondering if you were suggesting that Google's patent is not specifically for the Dropout technique but is instead for a similar technique. Thanks for your time! Mocl125 (talk) 21:13, 10 June 2020 (UTC)


 * I wrote that it is most likely invalid. I did not suggest it was about some other technique. The article says this technique was invented by Google, and refers to a patent where Hinton is named, but the technique is already described in several books and papers. Note that the patent was published in 2016, while the prior art (strong and weak dilution) was published in 1987 and 1988.
 * If you want to write an updated article, then get the original sources, and add a note about dropout's similarities with strong dilution. You may write that Google has patented "dropout", in some form, but don't give the impression that they (or Hinton) were the first ones to get the idea. Google is neither the first nor the last company that tries to patent well-known machine learning techniques.
 * If you need an even older "source", then our neurons use a kind of dropout due to how they work. They fire with a probability less than one for each excitation, i.e. they do non-deterministic summation, a.k.a. dropout. Jeblad (talk) 21:41, 10 June 2020 (UTC)
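 * As a side note for readers of this thread: the technique discussed above is usually implemented as "inverted" dropout, where surviving activations are rescaled so their expected value is unchanged between training and inference. A minimal NumPy sketch (function and parameter names are illustrative, not from any source discussed here):

```python
import numpy as np

def dropout(activations, p_drop=0.5, training=True, rng=None):
    """Inverted dropout: zero each unit with probability p_drop during
    training, and rescale survivors by 1 / (1 - p_drop) so the expected
    activation matches the unscaled network used at inference time."""
    if not training or p_drop == 0.0:
        return activations
    rng = np.random.default_rng() if rng is None else rng
    mask = rng.random(activations.shape) >= p_drop  # True = unit survives
    return activations * mask / (1.0 - p_drop)

# During training, roughly half the units are zeroed and the rest doubled;
# at inference (training=False) the activations pass through untouched.
x = np.ones((4, 8))
y_train = dropout(x, p_drop=0.5, rng=np.random.default_rng(0))
y_infer = dropout(x, training=False)
```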


 * Thanks for your response, Jeblad! I understand what you are saying now, but I think the article says it is patented by Google and not necessarily invented by it (though the current wording may imply invention; perhaps it could be changed to "Google holds the patent to..." or something similar). Alternatively, I could move the part about Google owning the patent to a new section (perhaps "History" or something similar) which also includes a variation on the note you added in the article. I could mention that Google owns the patent but that the same/a similar technique is mentioned in literature that predates the patent.
 * What do you think? Would you have any objections to either possibility? Or we could post on the talk page to get the viewpoints of other users (though I doubt many would reply, since the article is not extremely popular). Mocl125 (talk) 22:32, 10 June 2020 (UTC)
 * The patent claim should be nothing more than a side note. The technique has been well-known for decades. Google's patent claim is fun, but mostly because it is plain rubbish and should never have been granted. Write about weak and strong dilution, and make a redirect from dropout as this is the common phrase. If you want to add a note about the patent, fine, but don't bother writing more than a paragraph. Jeblad (talk) 22:40, 10 June 2020 (UTC)
 * Over the past few hours, I have been looking into the resources you mentioned, including the book Introduction to the Theory of Neural Computation by Hertz, Krogh, and Palmer (weak and strong dilution in particular). Unfortunately, I do not think I am quite advanced enough in the field of machine learning/neural networks to be able to confidently write about strong/weak dilution on Wikipedia.
 * The edits I made did mention that there have been methods by which the severance of connections in a neural network layer has been made randomly in the past (citing the aforementioned book). I also mentioned Geoffrey Hinton's original (although I am unsure if this is really the first mention of randomly severed connections between neurons in a neural network during the training process to prevent overfitting) paper on the dropout technique, "Improving neural networks by preventing co-adaptation of feature detectors."
 * If you are able, I would appreciate it if you could add a bit more about weak/strong dilution into the dropout article; I just do not feel confident in my abilities to read through and discuss the technical machine learning literature yet. Thanks for your help, though! Mocl125 (talk) 04:31, 11 June 2020 (UTC)
 * It is not that uncommon to try to rephrase some technique to get a patent and monetize it. I've seen it with another network topology I reused in my own thesis. It was defined by two French researchers, but the patent is held by a South Korean. It was granted more than ten years later.
 * I can write about it, but my English isn't that good so it will be necessary to do some cleanup afterwards. Jeblad (talk) 09:50, 11 June 2020 (UTC)
 * The article contains the phrase “the dropout technique itself was introduced with this name by Geoffrey Hinton”, but I'm not sure this is correct. When I first learned about dilution it was called “dropout”. Jeblad (talk) 13:40, 11 June 2020 (UTC)
 * Thanks for the edits! No worries, I can go in and clean it up a bit. I see, so it is quite common for these techniques to not be patented by those who developed them. What I meant in the previous quotation is that the term "dropout" itself was first used in Hinton's paper (as far as I know, nobody before 2012 referred to the technique as "dropout"). I changed the wording a bit to make that more clear. Mocl125 (talk) 14:27, 11 June 2020 (UTC)
 * Eh, when I wrote "When I first learned about dilution it was called 'dropout'", I should have added that this was back in 1992 or '93 at the University of Oslo. We used the Hertz book. I'm not sure where the term "dropout" comes from, but I somehow recall an additional book of some kind. Also, I remember the discussion about use of dilution for dampening the weights. This has implications for ordinary regularizers, but I believe it will be too involved to go into all the details. Jeblad (talk) 15:56, 11 June 2020 (UTC)
 * Just finished the cleanup. The lead was also getting a bit long, so I also restructured the page and moved a bit of your lead into a new section. Do you have any objections to the removal of the factual accuracy dispute template? I also removed the article's stub classification. Mocl125 (talk) 15:34, 11 June 2020 (UTC)
 * I believe it is sufficiently clear now. Anyhow, it is a fact that Google has this crappy patent, but also that they weren't the first to use dilution. (I'm probably not going to get a star from Google anytime soon…) Jeblad (talk) 15:59, 11 June 2020 (UTC)
 * Just curious, I haven't included proofs in any article on Wikipedia, but I wonder if they should be added. Not too many would read them, or understand them, but they are a sort of verification for mathematical statements, so perhaps they should be added. Jeblad (talk) 16:46, 11 June 2020 (UTC)
 * I think that would be useful, just to include more background. It is true that not many people would read it, but since the proof is useful information, it will help somebody (I would be personally interested in it, at least). I also don't think it would be considered too technical/advanced, since dilution/dropout is a commonly-used technique (especially nowadays), and it is not considered to be "fringe" or uncommon. If you can include the proof without it being too long (not several pages of dense mathematics), I would support it! Mocl125 (talk) 17:01, 11 June 2020 (UTC)


 * Must do something else. Will add a small section on strong dilution later. Jeblad (talk) 16:00, 11 June 2020 (UTC)
 * Sounds good. Here is the first paper I can find that uses the term "dropout" specifically (it is Hinton's 2012 paper). I think it is better now that you have cleared up that Google did not invent the technique but that it existed in the form of "dilution" earlier. Mocl125 (talk) 16:43, 11 June 2020 (UTC)

Nomination for deletion of Module:LuaDoc
Module:LuaDoc has been nominated for deletion. You are invited to comment on the discussion at the entry on the Templates for discussion page. Izno (talk) 04:42, 27 December 2021 (UTC)