User talk:Ilmari Karonen/archives/12

Images loading twice
Hi. I have MediaWiki installed on my site and recently I upgraded to 1.11. After I made a page with a lot of images, I noticed that images load twice, including the bullet image for lists (navigation, toolbox, etc.) and the images for external links. (You have to reload the page in order to see it clearly.) I googled it and found your comments on this problem here. The two changes you made to wikibits in January 2007 are included in my wikibits.js. I have not made any changes to wikibits, and my common.js is empty. Any idea what could be causing this? 11:13, 21 November 2007 (UTC)


 * Hard to say. All I can be fairly sure of is that it doesn't really sound like the same problem.  I have a hunch that it might be related to the ASCIIMathML.js script you've got installed, but that's just a wild guess.  Anyway, have you checked whether it's actually loading the images twice, or just redrawing them?  (You might be able to see that from your web server logs.)  Oh, and what browser(s) are you seeing this problem on?  —Ilmari Karonen (talk) 15:50, 21 November 2007 (UTC)


 * Let me add my thanks to the rest :-) Your hunch proved dead on. It was ASCIIMathML, which I had upgraded along with MediaWiki. When I reverted to the previous version, the problem went away. I am off to the developer's site now to ask if he has any solution. Thanks a lot! (Or would you have any "hunches" about what could be wrong with it too? ;-) ) 17:59, 21 November 2007 (UTC)

You seem to be thanked quite a bit
And this just makes the matters worse! ;)

Thanks for the FUR fixer script. It will save me boatloads of time dealing with images.  spryde |  talk  12:52, 21 November 2007 (UTC)

Thanks
Thanks for the help with. I appreciate it. =) -- Gogo Dodo (talk) 20:56, 28 November 2007 (UTC)

WP:AN/I thread discussion regarding changing CSD policy
You don't think the change you have made to the policy should first be discussed on the CSD talk page? (Though I do like the addition.) Also, it still contradicts the sentence that says CSD applies to all mainspace areas (which includes userpages). I personally agree with the change, but there are still many different pages of policy that now contradict the change you have made to the test page CSD. I would like to discuss this further with you. Cheers! Tiptoety (talk) 19:21, 29 November 2007 (UTC)

Moon Sand
A proposed deletion template has been added to the article Moon Sand, suggesting that it be deleted according to the proposed deletion process. All contributions are appreciated, but this article may not satisfy Wikipedia's criteria for inclusion, and the deletion notice should explain why (see also "What Wikipedia is not" and Wikipedia's deletion policy). You may prevent the proposed deletion by removing the  notice, but please explain why you disagree with the proposed deletion in your edit summary or on its talk page. Also, please consider improving the article to address the issues raised. Even though removing the deletion notice will prevent deletion through the proposed deletion process, the article may still be deleted if it matches any of the speedy deletion criteria or it can be sent to Articles for Deletion, where it may be deleted if consensus to delete is reached. If you agree with the deletion of the article, and you are the only person who has made substantial edits to the page, please add  to the top of Moon Sand. Kannie | talk 03:55, 20 December 2007 (UTC)

Redirect of Moonsand
Hello, this is a message from an automated bot. A tag has been placed on Moonsand, by another Wikipedia user, requesting that it be speedily deleted from Wikipedia. The tag claims that it should be speedily deleted because Moonsand is a redirect to a non-existent page (CSD R1). To contest the tagging and request that administrators wait before possibly deleting Moonsand, please affix the template to the page, and put a note on its talk page. If the article has already been deleted, see the advice and instructions at WP:WMD. Feel free to contact the bot operator if you have any questions about this or any problems with this bot, bearing in mind that '''this bot is only informing you of the nomination for speedy deletion; it does not perform any nominations or deletions itself. To see the user who deleted the page, click here''' CSDWarnBot (talk) 11:01, 21 December 2007 (UTC)

Image system proposal
I saw you authored the image script at User talk:Ilmari Karonen/nfurbacklink.js. I've worked up a new script idea and image uploader idea here User:Mbisanz/ImageSystemProposal, but don't know coding to create such things. I'd appreciate any comments or help you could provide. Also, if you could write such a script, I'd be willing to host it and take the responsibility (read: complaints, blame, flame) for it. Mbisanz (talk) 06:10, 29 December 2007 (UTC)

Quick wikify
You commented almost a year ago: "wouldn't it make more sense to just check for wgCanonicalNamespace != '', or am I missing something?" The purpose is to stop execution (and, therefore, not add the button) if we're on a user or user talk page. -rayluT 07:30, 1 January 2008 (UTC)


 * Yes, I can see that. What I don't see is why those two namespaces should be excluded while others, such as "Talk" or "Wikipedia", aren't.  I suppose there may well be a good reason, but I can't really think of one.  —Ilmari Karonen (talk) 16:04, 2 January 2008 (UTC)

Restored page
Thanks for restoring and moving User:Byeboer wa/Byeboerdery. Byeboer (talk) 11:16, 6 January 2008 (UTC)

Traceless Biometrics
As with many rapidly expanding technologies that affect social life, biometrics has justifiably come under attack by civil libertarians. Privacy advocates argue that biometrics will lead to an even deeper erosion of personal privacy in both the real world and cyberspace. Many privacy concerns have emerged following the increase in use and popularity of biometric systems for identification and authentication purposes in digital and physical environments.

Biometrics must be uncompromisingly traceless and noninvasive with regard to personal privacy.

The U.S. Constitution does not explicitly guarantee a right to privacy. Privacy of personal data has traditionally been protected in two ways: through self-regulatory codes and through laws. If one biometric system were widely adopted, say fingerprinting, the many databases containing the digitized versions of the prints could be combined. While such a system is most likely to be developed by the commercial sector for use in financial transactions, government and law enforcement authorities would likely want to take advantage of these massive databases for other purposes, especially if we were to enter a time of social unrest. Indeed, government agencies and law enforcement are the top subscribers to the many databases compiled by private-sector ‘information brokers’. Privacy laws and policy in the United States were derived from a code of fair information practices developed in 1973 by the U.S. Department of Health, Education, and Welfare. This Code is ‘an organized set of values and standards about personal information defining the rights of record subjects and the responsibilities of record keepers.’ The Code highlights five principles of fair information practices:

1.	There must be no secret personal data record-keeping system.

2.	There must be a way for individuals to discover what personal information is recorded about them and how it is used.

3.	There must be a way for individuals to prevent personal information obtained for one purpose from being used or made available for other purposes without their consent.

4.	There must be a way for individuals to correct or amend information about themselves.

5.	Any organization creating, maintaining, using, or disseminating records of identifiable personal data must assure the reliability of the data for their intended use and must take precautions to prevent misuse of the data.

News stories of Internet privacy threats are commonplace these days. The Internet was designed as an inherently insecure communications vehicle.

- Hackers have easily penetrated the most secure facilities of the military and financial institutions.

- Internet companies have designed numerous ways to track Web users as they travel and shop throughout cyberspace. ‘Cookie’ is no longer a word associated solely with sweets; it is now associated with cyber-snooping.

- Identity thieves are able to shop online anonymously using the credit identities of others.

- Web-based ‘information brokers’ sell sensitive personal data, including Social Security numbers, relatively cheaply.

A long-time goal of computer scientists, specifically those specializing in Artificial Intelligence, has been to create computer systems that are able to simulate human intelligence. At the same time, researchers have continually been concerned with improving the identification and authentication methods used for access to computer systems and networks. Biometric authentication systems are a natural extension (to computers) of the recognition methods that humans have used since the beginning of time. In these systems, physical or behavioral characteristics of the person to be authenticated determine whether he is indeed who he declared himself to be - this is analogous to how people recognize each other (i.e. how they identify others and verify that the person is who he appears to be) by examining physical features that are essentially unique to the other person, like his face.

The Case against Biometrics: Critics argue that biometric authentication methods present a serious threat to privacy rights. These arguments fall into three categories:

1)	Anonymity

2)	Tracking and surveillance

3)	Data matching and profiling

Privacy advocates argue that individuals lose their anonymity in any system or digital environment that uses biometric authentication methods. Many people claim the option of anonymity in the marketplace (for electronic purchases) and in the political arena (for voting) as part of their expectation of privacy. Critics of biometrics feel that if this traceable technology were to gain widespread acceptance and proliferate further into daily life, much of our anonymity as we use different services and move from place to place would fade. Privacy advocates envision traceable biometrics as fostering ‘Big Brother’ monitoring of citizens by the State. This idea stems from the fact that traceable biometric measures can be used as universal identifiers for individuals, because each biometric measure is unique. Consider a driver's license with a magnetic strip that stores one's fingerprint. One could imagine being pulled over by a traffic policeman for a trivial traffic violation and being subject to harsh treatment because, after scanning your fingerprint, the police officer has access to your entire criminal record and knows all of your past offenses. Governments have used technology to intrude into the interior of individuals' privacy-circle, and critics of traceable biometrics argue that there is no reason to expect that the State will use traceable biometric technologies any differently. Isolated identifying and non-identifying information in different databases can be combined to create extensive records that profile people's shopping and spending habits. The biggest danger of traceable biometrics, according to privacy advocates, is that traceable biometric identifiers can be linked to databases of other information that people do not want dispersed. 
The threat to privacy arises from “the ability of third parties to access this data in identifiable form and link it to other information, resulting in secondary uses of the information, without the consent of the data subject.” This would be a violation of the Code of Fair Information Practices, since the individual would no longer have control over the dissemination of his personal information. People have generally frowned on biometrics, in particular fingerprint and face recognition systems, because of their long association with criminal identification, and more recently because of their use in state welfare schemes to prevent recipients from making double claims on their benefits. The argument is that people are reduced to mere codes and are subject to inanimate, unjust treatment. A similar argument against the use of biometrics is that traceable biometric identifiers are an "example of the state's using technology to reduce individuality." This type of identification corrupts the relationship between citizen and state because it empowers the state with control over its citizens. Religious groups argue that traceable biometric authentication methods are “the mechanism foretold in religious prophecy” (e.g. the Mark of the Beast). Further religious objections are based on the premise that individuals must give up themselves, or part of themselves, to a symbol of authority that has no spiritual significance. Though there are no documented cases of biometric technologies causing actual physical harm to users, certain methods are considered invasive. For example, retina scanning requires the user to place his eye as close as three inches from the scanner so that it can capture an image of his retina pattern. Fingerprint recognition devices are also deemed invasive because they require the user to actually touch a pad.

What remains to be determined is the following:

1.	Can the biometric information be collected, stored, or retrieved?

2.	Can the biometric information collected be used both for criminal and non-criminal searches and suspicionless searches?

3.	Can the system give the individual full control over his abandoned personal intrinsic information?

The following fact remains: there are no legal restrictions on biometrically identifying information or biometric authentication systems. However, there are severe restrictions on collecting, creating, maintaining, using, or disseminating records of identifiable personal data. One immediate conclusion we should draw is that biometric authentication must be traceless.

Traceless biometrics must clearly authenticate a user's identity without requiring the storage of any unique biometric information. Furthermore, the traceless biometrics solution should not link, write, or bind any unique information to an external device, smart card, or network of any kind, and at the same time should be able to positively recognize and identify a biometric identity only in real time, without violating the user's privacy and without leaving any intrinsic traces in any external system.

The main goal of the required solution is to demonstrate how traceless (non-unique) biometric systems can themselves be advocates of privacy, by answering the following questions: 1) How can traceless biometric systems be designed so as not to intrude into personal data sets? 2) How can government intervention through legislation guarantee privacy protection of users by adopting and enforcing the new traceless biometric authentication and identification systems? 3) In the absence of government regulation, how much reliance can users of biometric systems have on self-regulation for privacy protection?

Privacy and security are not the same: Roger Clarke of the Faculty of Engineering and Information Technology at the Australian National University explains privacy as "the interest that individuals have in sustaining a 'personal space', free from interference by other people and organizations." Clarke defines several dimensions to this interest. The two that are most relevant to this White Paper are: 1) Privacy of personal communications. "Individuals claim an interest in being able to communicate among themselves using various media without routine monitoring of their communications by other persons or organizations." 2) Privacy of personal data. "Individuals claim that data about themselves should not be automatically available to other individuals and organizations, and that, even where data is possessed by another party, the individual must be able to exercise a substantial degree of control over that data and its use." In other words, users of computer systems (especially those in networked environments) expect that those who store their personal information will not abuse it. They also expect that wherever their personal information is stored, it is safe, so that even if a hacker were to succeed in breaking into the computer or server on which the data is stored, it would be protected. Users expect also to be able to communicate anonymously. This is especially important for those who want to criticize the government or an employer without having to worry about victimization.

Biometrics violates privacy and is harmfully traceable: In the context of biometrics, privacy is a central issue because any biometric information about a person necessarily falls within the boundary of the privacy-circle. Hence, individuals are concerned about how any biometrically identifying information about them is controlled. Biometric properties, from the perspective of traces or permanent storage, can lead to undesired identification and tracing of the activities of an individual. Even if the biometric data is stored in an altered form that requires a complex algorithm to decipher, the uniqueness of the biometric specimen, together with the speed and computational power available today, makes any such protection scheme irrelevant.

Biometrics must benefit third-party trust: If unique biometric properties are stored somewhere, for example on a smart card or on a computer system, even in an encoded, scrambled, or ciphered form, the result is still a unique biometric identifier. Once a unique biometric identifier has been stored anywhere, at any time, on any external medium (including media associated with the boundaries of the individual, such as a smartcard held by the individual), the privacy of that biometric property's owner is violated. As noted previously, exposing or losing a biometric property is a permanent problem for the life of the individual, since there is no way to change the physiological or behavioral characteristics of the individual. Biometric technology is inherently individuating and interfaces easily to database technology, making privacy violations easier and more damaging.

Who can you trust? It may seem that one of the issues that plagues card-based ID systems -- the security or integrity of the card itself -- does not apply to biometric systems, because ‘you are your ID.’ But the question of the reliability of the card is really a question about trust. In an ID card system, the question is whether the system can trust the card. In a biometric system, the question is whether the individual can trust the system. If someone else captures an individual's physiological signature -- a fingerprint or voice print, for instance -- abuse by others is difficult to prevent. Any use of biometrics with a scanner run by someone else involves trusting someone's claim about what the scanner does and how the captured information will be used. Vendors and scanner operators may say that they protect privacy in some way, perhaps by hashing the biometric data or designing the database to enforce a privacy policy. But the end user typically has no way to verify whether such technical protections are effective or implemented properly. End users should be able to verify any such claims, and to leave the system completely if they are not satisfied. Exiting the system should at least include expunging the end user's biometric data and records. Despite these concerns, political pressure for wider deployment of biometrics is increasing. Much U.S. federal attention is devoted to deploying biometrics for border security. This is an easy sell, because immigrants and foreigners are, politically speaking, easy targets. But once a system is created, new uses are usually found for it, and those uses are not likely to stop at the border.
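The claim above that hashing biometric data is not, by itself, a privacy protection can be illustrated with a minimal Python sketch (the template bytes and database names are hypothetical, chosen only for illustration): a biometric template never changes, and a deterministic hash of it is therefore a stable identifier that can be joined across otherwise unrelated databases.

```python
import hashlib

# Hypothetical fingerprint template, reduced to bytes for illustration.
template = b"whorl-left-loop-ridge-count-42"

# Two unrelated services each hash the same template before storing it.
db_a_record = hashlib.sha256(template).hexdigest()
db_b_record = hashlib.sha256(template).hexdigest()

# Because the biometric never changes and the hash is deterministic,
# the two stored records are identical, so the databases can be joined
# on the digest: hashing alone does not prevent linkage.
print(db_a_record == db_b_record)  # True
```

Per-service salting would make the digests differ, but any scheme that must still match the same person on every visit has to preserve linkability in some form, which is the essay's point about stored unique identifiers.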

—Preceding unsigned comment added by MichaShafir (talk • contribs) 11:47, 10 January 2008 (UTC)


 * I'm not really sure why you posted this on my talk page, but it doesn't really belong here; User talk pages are meant for discussion between Wikipedia users, not for posting long essays.


 * In any case, if the term "traceless biometrics" has not received sufficient coverage in reliable sources to even establish a generally accepted definition, it probably does not belong on Wikipedia. As I noted before, I think it might deserve a brief mention in the biometrics article, but for now, an in-depth discussion of such an emerging topic does not (yet) belong here.  —Ilmari Karonen (talk) 04:58, 21 January 2008 (UTC)

Disputed fair use rationale for Image:Recycle-resin-logos-lr 01.png
Thanks for uploading Image:Recycle-resin-logos-lr 01.png. However, there is a concern that the rationale you have provided for using this image under "fair use" may be invalid. Please read the instructions at Non-free content carefully, then go to the image description page and clarify why you think the image qualifies for fair use. Using one of the templates at Fair use rationale guideline is an easy way to ensure that your image is in compliance with Wikipedia policy, but remember that you must complete the template. Do not simply insert a blank template on an image page.

If it is determined that the image does not qualify under fair use, it will be deleted within a couple of days according to our criteria for speedy deletion. If you have any questions please ask them at the media copyright questions page. Thank you. BetacommandBot (talk) 07:56, 21 January 2008 (UTC)

"fake move attack" comes back again

 * (See Archive118, please.) This guy has come back again; he uses IP 64.24.84.3 to add information about a fake movie, Tom & Jerry: The Great Beginning, to Barney Bear. Thanks for giving a hand with this. 123.193.12.44 (talk) 23:01, 26 January 2008 (UTC)

POTD notification
Hi Ilmari,

Just to let you know that the Featured Picture Image:1882 Kingston Fire.png is due to make an appearance as Picture of the Day on February 16, 2008. If you get a chance, you can check and improve the caption at Template:POTD/2008-02-16.  howcheng  {chat} 18:11, 5 February 2008 (UTC)

WP:OP
Hey, can you please help with the backlog at WikiProject on open proxies? (You're an admin on the verified users list). Calvin 1998  Talk   Contribs  04:18, 8 February 2008 (UTC)