User:Yvettetiff/sandbox

Article Evaluation
 * Is everything in the article relevant to the article topic? Is there anything that distracted you?
 * I found everything in the article to be relevant to the U.S. Census Bureau. It went into enough detail that I learned more about the Bureau's history and current work, but not so much detail that I got distracted or lost track of what I was reading.
 * Is the article neutral? Are there any claims, or frames, that appear heavily biased toward a particular position?
 * The article was written in a neutral tone, relying on factual evidence rather than a biased undertone.
 * Are there viewpoints that are overrepresented, or underrepresented?
 * The Organizational Structure section could have had a bit more information on certain employees higher up in the Bureau.
 * Check a few citations. Do the links work? Does the source support the claims in the article?
 * The links I checked all worked and the sources were legitimate and seemed to be reputable.
 * Is each fact referenced with an appropriate, reliable reference? Where does the information come from? Are these neutral sources? If biased, is that bias noted?
 * Each fact is referenced, but not always with a reliable reference. The information comes from a plethora of sources, but a startling number come directly from the Census Bureau itself, and that potential bias is not noted.
 * Is any information out of date? Is anything missing that could be added?
 * Some of the information is relatively old and could probably be updated with newer, better-sourced material. I think naming specific people within the Bureau could also be helpful or useful in the article.
 * Check out the Talk page of the article. What kinds of conversations, if any, are going on behind the scenes about how to represent this topic?
 * The Talk page has discussions of correct citations, the accuracy of the Bureau's name, the merging of two articles, and fact-checking. There seems to be more discussion about how to adequately display the information on this topic than about how to represent it; the representation seems largely agreed upon, with formatting and citation errors making up most of the discussion.
 * How is the article rated? Is it a part of any WikiProjects?
 * The article is rated a C-class and is a part of the following WikiProjects: United States/Government, United States Public Policy, Economics, and Elections and Referendums.


 * How does the way Wikipedia discusses this topic differ from the way we've talked about it in class?
 * In class we have focused more on what the U.S. Census Bureau does as opposed to what it actually is. This topic is covered more from an organizational standpoint rather than focusing on its actual functions. The way we've talked about it in class has been neutral as is this article. I found I learned more about the history and legalities of the Bureau from this article than class.

Edits

I would like to add some information to the "Support" and "Opposition" sections using articles and journals relating to the coverage and uncertainty of the survey's data. I would also like to add a section or elaborate on the "Data Availability" section with regard to estimates and the possible errors they can cause. Information from the American Community Survey cannot always be relied upon, and I would like to expand on some of the reasons why.

''' I like the direction that you're going with this; however, it's unclear how the readings below are specifically going to inform these changes. These are great sources BUT what do you think these will bring to the article? - Prof H '''

Bibliography

Glenn, Ezra Haber, Estimates with Errors and Errors with Estimates: Using the R 'ACS' Package for Analysis of American Community Survey Data (May 12, 2015). Available at SSRN: https://ssrn.com/abstract=2590391 or http://dx.doi.org/10.2139/ssrn.2590391

Kinney, Satkartar K. and Karr, Alan, Public-Use vs. Restricted-Use: An Analysis Using the American Community Survey (February 1, 2017). US Census Bureau Center for Economic Studies Paper No. CES-WP-17-12. Available at SSRN: https://ssrn.com/abstract=2909935 or http://dx.doi.org/10.2139/ssrn.2909935

Luque, Adela, Renuka Bhaskar, Sonya Rastogi, and James Noon. “Coverage and Agreement of Administrative Records and 2010 American Community Survey Demographic Data.” SocArXiv, 11 Jan. 2017. Web.

Seth E. Spielman & Alex Singleton (2015) Studying Neighborhoods Using Uncertain Data from the American Community Survey: A Contextual Approach, Annals of the Association of American Geographers, 105:5, 1003-1025, DOI: 10.1080/00045608.2015.1052335

Adding to American Community Survey

Data availability
The Census Bureau aggregates individual ACS responses (i.e. microdata) into estimates at many geographic summary levels. Among these summary levels are legal and administrative entities such as states, counties, cities, and congressional districts, as well as statistical entities such as metropolitan statistical areas, tracts, block groups, and census designated places. Estimates for census blocks are not available from ACS.

In order to balance geographic resolution, temporal frequency, statistical significance, and respondent privacy, ACS estimates released each year are aggregated from responses received in the previous calendar year or previous five calendar years. The Census Bureau provides guidance for data users about which data set to use when analyzing different population and geography sizes.

From 2007 to 2013, 3-year estimates were available for areas with 20,000 people or more. This data product was discontinued in 2015 due to budget cuts. The last 3-year release was the 2011-2013 ACS 3-year estimates.

Current data releases include:
 * 1-year estimates are available for areas with a population of at least 65,000 people. The 2015 ACS 1-year estimates were released in 2016 and summarize responses received in 2015 for all states but only 26% of counties due to the 65,000 minimum population threshold. This is most suitable for data users interested in shorter-term changes at medium to large geographic scales.
 * Supplemental estimates are shown in annual tables summarizing populations for geographies with populations of 20,000 or more.
 * 5-year estimates are available for areas down to the block group scale, on the order of 600 to 3000 people. The 2015 ACS 5-year estimates, summarizing data from 2011-2015, were released in 2016.

Within the last 10 years, the American Community Survey has collected and supplied data at the local level. This was a large breakthrough for the survey because it gives the American people more individualized, community-level data rather than extrapolations from data collected over a larger area. It has also made unparalleled information more accessible for local government planning and financing. The increase in data availability at a smaller scale is a necessary and welcome addition to the ACS. Still, the data do not always accurately reflect a smaller population: many conclusions for local areas are averaged from various information across the area and, while useful, are not always an adequate representation.
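The uncertainty in these small-area estimates can be quantified using the margins of error (MOEs) the Bureau publishes alongside each estimate. As a rough sketch (with made-up numbers), the Census Bureau's standard root-sum-of-squares approximation combines MOEs when smaller-area estimates, such as block groups, are aggregated up to a tract:

```python
import math

def aggregated_moe(moes):
    """Approximate the margin of error for a sum of ACS estimates
    using the Census Bureau's root-sum-of-squares formula."""
    return math.sqrt(sum(m ** 2 for m in moes))

# Hypothetical block-group population estimates and their MOEs
estimates = [650, 720, 580]
moes = [120, 95, 140]

tract_estimate = sum(estimates)   # 1950
tract_moe = aggregated_moe(moes)  # about 207.4

print(tract_estimate, round(tract_moe, 1))
```

Note that the aggregated MOE here (about 207) is large relative to the estimate itself, which illustrates why the Bureau cautions data users about relying on small-area figures.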

ACS estimates are available via a number of online data tools. American Fact Finder (AFF) is the primary tool for disseminating ACS data, allowing users to drill down to specific tables and geographies (starting with the 2013 estimates, AFF also includes block group data). A selection of the most popular tables is shown in QuickFacts. Other tools include OnTheMap for Emergency Management, Census Business Builder, and My Congressional District. My Tribal Area, featuring 5-year estimates for federally recognized tribes, launched in 2017. The Summary File is the most detailed data source and is available as a series of downloadable text files or through an application programming interface (API) for software developers.
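As a sketch of how a developer might use the API mentioned above, the following builds a query URL against the Census Bureau data API. The dataset path and variable code shown (B01003_001E, total population) are illustrative assumptions, not verified endpoints:

```python
from urllib.parse import urlencode

# Illustrative query against the Census Bureau data API
# (dataset path and variable code are examples for this sketch)
base = "https://api.census.gov/data/2015/acs/acs5"
params = {
    "get": "NAME,B01003_001E",  # B01003_001E: total population estimate
    "for": "county:*",          # all counties
    "in": "state:06",           # within California (FIPS code 06)
}
url = base + "?" + urlencode(params)
print(url)
```

Fetching this URL would return a JSON table of county names and population estimates, which is how the downloadable Summary File data can also be reached programmatically.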

Custom cross-tabulations of ACS questions can be made using the Public Use Microdata Sample (PUMS), freely accessible through the Census Bureau website and the Integrated Public Use Microdata Series. PUMS data contain responses to every question from a sample of respondents. To protect respondent privacy, PUMS data are anonymized and only available down to areas containing 100,000 people or more, known as Public Use Microdata Areas (PUMAs). The analysis of all ACS microdata without the sampling and anonymization in PUMS is restricted to qualified researchers at secure Federal Statistical Research Data Centers (FSRDCs).
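A custom cross-tabulation of the kind described above can be sketched in a few lines. The records here are invented for illustration; real PUMS rows carry a survey weight (such as the housing-unit weight WGTP) that must be summed, not simply counted, to estimate totals:

```python
from collections import defaultdict

# Hypothetical PUMS-style household records: (PUMA code, tenure, weight).
# The PUMA codes and weights are made up for this sketch.
records = [
    ("3701", "owner", 95.0),
    ("3701", "renter", 110.0),
    ("3702", "owner", 87.5),
    ("3702", "renter", 102.5),
    ("3702", "renter", 98.0),
]

# Weighted cross-tabulation: estimated households by PUMA and tenure
totals = defaultdict(float)
for puma, tenure, weight in records:
    totals[(puma, tenure)] += weight

print(totals[("3702", "renter")])  # 200.5
```

Because PUMAs contain 100,000 people or more, tabulations like this cannot be pushed down to the tract or block-group level, which is the privacy tradeoff the text describes.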

Opposition
Opponents of the American Community Survey disagree with the court's findings about its constitutionality. They believe the survey asks for more information, and at a higher frequency, than the simple enumeration required by Article 1, Section 2 of the U.S. Constitution. Despite the Government Accountability Office's conclusion that the Census Bureau has the authority to conduct the survey, several U.S. representatives have challenged the ACS as unauthorized by the Census Act and a violation of the Right to Financial Privacy Act. Rep. Ron Paul of Texas, who opposes the ACS, said that the founding fathers of the United States "never authorized the federal government to continuously survey the American people.”

Those who decline to complete the survey may receive visits to their homes from Census Bureau personnel. Because it is a mandatory survey, it is governed by federal laws that could impose a fine of as much as $5,000 for refusing to participate.

To date, no person has been prosecuted for refusing to answer the ACS. Former Director of the Census Bureau Kenneth Prewitt remarked that the Department of Commerce is "not an enforcement agency" and that "the Department of Justice would have to do the prosecution, and we don't recommend that." The Census Bureau prefers to gain cooperation by convincing respondents of the importance of participation, while acknowledging that the mandate improves response rates (and thus accuracy) and lowers the annual cost of survey administration by more than $90 million.

In 2014, the Census Project, a collaboration of pro-Census business and industry associations, gathered signatures from 96 national and local organizations urging the US House Committee on Oversight and Government Reform to reject a proposal to make the American Community Survey voluntary. Signers included the US Chamber of Commerce, the National Association of Realtors and the US Conference of Mayors. The letter cited results from a congressionally mandated test of a voluntary ACS that found that mail response rates would drop “dramatically,” by more than 20 percentage points. The resulting loss in quality and reliability would essentially eliminate data for 41 percent of U.S. counties, small cities, towns and villages, many school districts, neighborhoods, remote areas, and American Indian reservations.

Other opposition to the ACS has been in response to the accuracy, reliability, and accessibility of the data. Before release to the public, the collected data are altered to preserve confidentiality, which can potentially skew them. While the original data can be accessed, doing so requires certain status, money, or both. Because of these potential inaccuracies in the publicly available data, some question the reliability of using the ACS. The practicality of using the data comes into question once it has been changed, even if the change is made for privacy reasons. With regard to accuracy, some people are not taken into account by the ACS: in particular, members of the LGBTQ community are not counted, and certain resources therefore cannot be adequately distributed to them.

 You're on the right track with your additions to the article - but I would like to see you add in some additional info from your sources mentioned in the previous part of the assignment - Prof H