Talk:Second/Archive 1

Untitled
This page contains text taken from the public domain article at http://tycho.usno.navy.mil/leapsec.html

Gentlemen, interesting article. Would it be useful to include a historical note about why there are 60 seconds in a minute? Milton 22:03, 28 Jan 2004 (UTC)

Now, I'm pretty sure a second is actually defined as the time it takes light to travel a specific fraction of a meter in a vacuum. I'll google and add it. Thunderbolt16 22:17, Mar 20, 2004 (UTC)


 * No. However, a metre is defined as the length that light travels in a specific fraction of a second.  Morwen 22:19, Mar 20, 2004 (UTC)


 * Thanks, I misremembered. Thunderbolt16 —Preceding undated comment added 22:21, 20 March 2004 (UTC)

Not derived units?
Interesting... SI uses the term 'derived units', but apparently some other field does not? Which would then be the preferred term? See also SI derived units. Radiant_* 12:58, May 26, 2005 (UTC)


 * What are you talking about? In metrology jargon, a "derived unit" is a unit to measure some other quantity than the quantity measured by the "base units", and the "derived unit" is built up from some combination of those "base units".  For example, in SI the SI derived unit of the quantity force is the newton, which is built up from the base units for the quantities mass, length, and time as 1 kg·m/s².
 * The units formed by adding prefixes to the root word (note that this is not to the base unit, as we can see in the case of centimeters in cgs systems and kilograms in mks systems including SI) are something entirely different. Though they may be in some senses of the word "derived", they are not "derived units"—a term with a more specific meaning in metrology jargon. Gene Nygaard 13:21, 26 May 2005 (UTC)


 * Okay, thanks for the explanation. Radiant_* 13:24, May 26, 2005 (UTC)

Multiples and submultiples section
Is this really appropriate? This is just some info on metric prefixes. Notthe9 07:29, 13 August 2005 (UTC)

Hundredth of a second
Is there a word for a hundredth of a second? --Revolución (talk) 23:29, 13 August 2005 (UTC)


 * Combine centi- (1/100) and second: centisecond or cs. — Joe Kress 02:19, August 14, 2005 (UTC)

Can we add this information, as well as re-add the information about other ___seconds, such as was in http://en.wikipedia.org/w/index.php?title=Second&diff=14261799&oldid=14261381? Hyacinth 09:59, 14 December 2005 (UTC)


 * Just because it can be formed doesn't mean that it either is used or should be used. Centimeters are okay for your hat size, and cubic centimeters for volume. That's about the extent of the usefulness of any prefixes which are not powers of 1000. We don't need to encourage use of centiseconds, nor any other new use of any of the prefixes centi-, deci-, deka- (even if you spell it deca-), or hecto-. Gene Nygaard 12:54, 14 December 2005 (UTC)


 * What about nanosecond? Hyacinth 12:54, 14 December 2005 (UTC)


 * Use milliseconds (ms) instead of cs 82.94.1.175 10:42, 2 March 2006 (UTC)
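For illustration, the prefix arithmetic in this thread can be sketched in a few lines (my own example; the dictionary and function names are arbitrary, not from any standard library):

```python
# A rough sketch of SI prefix arithmetic for the second, illustrating the
# point above: a centisecond is just ten milliseconds, so the powers-of-1000
# prefixes (milli, micro, nano, ...) already cover the useful range.
import math

SI_PREFIXES = {
    "deci": 1e-1, "centi": 1e-2, "milli": 1e-3,
    "micro": 1e-6, "nano": 1e-9, "pico": 1e-12,
}

def to_seconds(value, prefix):
    """Convert a prefixed value (e.g. 5 centiseconds) to plain seconds."""
    return value * SI_PREFIXES[prefix]

# 1 cs == 10 ms, which is why "cs" sees essentially no use:
assert math.isclose(to_seconds(1, "centi"), to_seconds(10, "milli"))
```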

Is it True and Is it Possible???/
The article says "This definition refers to a caesium atom at rest at a temperature of 0 K". As far as I know, attaining this temperature is impossible, but can someone quote a source that this was indeed adopted as a requirement for the standard definition of the second? Holywarrior 12:28, 3 July 2006 (UTC)


 * It is part of the official BIPM definition of the second. All caesium clocks operate at room temperature. Measurements confirmed that the duration of the second varied depending on the ambient temperature surrounding the physical atomic clock. This sentence requires the duration of the second to be corrected, as if the clock was in an ambient temperature of 0 K. — Joe Kress 06:36, 4 July 2006 (UTC)

Table Needed
Much like other standard units of measurement, I believe a table for milliseconds, microseconds, nanoseconds etc. would be a useful addition. Of all the info on wikipedia, I was a little surprised to see no entry for nanosecond. Milliamp —Preceding undated comment added 10:56, 4 March 2007 (UTC)

History
The history section describes how certain ancients are known to have divided up the day, but surely why they divided up the day that way must be a matter of pure speculation. The section kind of implies that the ancients took a day, subdivided it, and the result of that subdivision was the second. I suspect that it is coming at the concept from completely the wrong direction.

The thing I would say about the time division we commonly refer to as a second is that it is about the time it takes to say a normal word at normal speed, as in counting. In other words, the time between one word and the second word is one second. As all humans can relate to speaking, I would conjecture that humans had the concept of a second long before anyone decided how many seconds there were in an average day and then devised the system for counting subdivisions of days into hours, minutes and seconds.

Etymologically speaking, the word second comes from Old French, but I am not sure if the word had both meanings in Old French as it still has in English. Deuxième is, I think, the modern word for the ordinal, whereas second is the time interval.--Tom (talk) 14:02, 26 January 2008 (UTC)


 * The second as we know it began during the Middle Ages and is simply the result of subdividing the hour sexagesimally. Previously, only the day had been subdivided sexagesimally. Your counting speed hypothesis would be prohibited original research, unless you can verify it in some published form. I doubt that any author has concluded that. The speed of counting is related to the speed that syllables, not words, are spoken. Small numbers usually have one syllable whereas large numbers are polysyllabic, so the speed they are spoken varies considerably. Monosyllabic counting words are spoken much faster than one per second. — Joe Kress (talk) 08:36, 27 January 2008 (UTC)


 * The issue I raise is just a reasonable hypothesis. I don't have a reference, but I am sure I am not the first to have thought about the issue. The point is that I think it highly unlikely that the starting point was a higher frame of reference, e.g. a day which was divided arbitrarily into 24ths and then 60ths twice to arrive at a set time span, the second. Why 60ths? I doubt very much that the concept of this short and humanly meaningful time span came into existence in the Middle Ages! It is age old. The second is an easy reference for humans to reckon with, as are days and months, whether it's the time to say a word or the time to take one pace forward. Intermediate times are not. It is therefore much more likely that the divisors were created to break the relatively high number of seconds in a day into meaningful chunks such as minutes and hours. Human scales for measuring things are perfectly normal, such as the use of "hands" and "feet" in measuring lengths. In Finnish the word for "inch" is "tumma" (thumb), i.e. approximately the width of a man's thumb.--Tom (talk) 11:11, 27 January 2008 (UTC)


 * The smallest Babylonian unit of time was the barleycorn, 3⅓ seconds, hence much larger than our second. Even though the Babylonians originated the sexagesimal system some 5000 years ago, the barleycorn was not related sexagesimally to any other unit, so it developed independently. Whether or not it was related to a human measure I do not know, but the barleycorn was not the ancestor of our second. Consider the Hindu units of time mentioned in the second millennium BC vedas. They used ratios much smaller than 60. The human measure nearest in size to the second was the nimesha or blink which was about half a second. But the nimesha is not the ancestor of our second. — Joe Kress (talk) 07:05, 28 January 2008 (UTC)

What cannot be ignored here is the divisions of a degree. Far more practical than minutes & seconds of time would have been minutes & seconds of arc. I have seen articles that assert that temporal units were derived from the geometric/geographical/astronomical ones --JimWae (talk) 07:11, 28 January 2008 (UTC)

If just a little more speculation were tolerated here, then I would add: there must be some historical reason why the second is so close to the heart rate of a very healthy, very relaxed person. Get a ticking clock for your workspace (as I just did) and you'll see what I mean. Okay, okay, no original research, but c'mon, someone somewhere must have studied a connection. Bob Stein - VisiBone (talk) 18:33, 30 September 2008 (UTC)

Is there any evidence that these tiny units (halakim, barley-corn, and double-hour) were units of time and not (only) units of arc? The term double-hour seems strange too - it's 4 minutes, but an hour was 60. In what base is 15 times smaller than something equal to double something? - --JimWae (talk) 06:28, 11 December 2008 (UTC) Looking again at "The Babylonians did not use the hour, but did use a double-hour": How could they use the term "double X" without using the term "X"? --JimWae (talk) 07:50, 11 December 2008 (UTC)


 * "Double-hour" for the Babylonian beru is a translation which makes it understandable to modern readers. Although the Babylonians did not use the equinoctial hour (our modern hour), a seasonal hour does occur in some Babylonian texts (I just read). The principal Babylonian unit of time was the uš (š is pronounced sh) which is translated as a time-degree because it is the time it takes for the Sun or a star to move 1° across the sky (1/360 of a nychthemeron, Greek for night+day), four modern minutes. Babylonian astronomical diaries gave observations of lunar eclipses in terms of uš or beru before or after sunrise or sunset. See Ancient and medieval values for the mean synodic month by Bernard R. Goldstein. The beru was 30 time-degrees or 1/12 of a nychthemeron, which is the time it takes a zodiacal sign (30°) to pass the observer's meridian, two modern hours. The cubit was 180 barleycorns or 10 time-degrees, 40 modern minutes. The finger was 1/12 of a time-degree or six barleycorns, 20 modern seconds. Thus the še (pronounced shay), translated as a "barleycorn", was 1/72 of a time-degree, 3⅓ modern seconds. See Why divide hours into 1080 parts? by Irv Bromberg. The most thorough discussion of these matters is by Otto Neugebauer, A history of ancient mathematical astronomy, 3 vols. 1975. However, the barleycorn was the Babylonian name for the smallest unit of any measurement, be it length, area, volume, weight, etc. See Old Babylonian Weights and Measures. — Joe Kress (talk) 21:46, 11 December 2008 (UTC)
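The main ratios in the comment above can be checked with exact arithmetic (a sketch of my own, not from any of the cited sources; variable names are mine):

```python
# Converting the Babylonian time units described above into modern seconds
# to check the stated ratios; exact fractions avoid any rounding.
from fractions import Fraction

DAY = 86400                  # modern seconds in a nychthemeron (night + day)
US = Fraction(DAY, 360)      # us, the "time-degree": 1/360 day = 240 s = 4 min
BERU = 30 * US               # beru, the "double-hour": 7200 s = 2 h
FINGER = US / 12             # finger: 20 s
BARLEYCORN = US / 72         # she, the "barleycorn": 10/3 s = 3 1/3 s

assert BERU == 2 * 3600 and FINGER == 20
assert BARLEYCORN == Fraction(10, 3)
```

This also makes JimWae's later confusion easy to resolve: the 4-minute unit is the time-degree (uš), while the double-hour (beru) is 30 of them.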


 * well, now it starts to make sense (though 4 minutes still does not look like 2 hours). We should probably add some of this to the article--JimWae (talk) 06:25, 12 December 2008 (UTC)
 * The sentence I have been having trouble with most is:
 * The Babylonians did not use the hour, but did use a double-hour, a time-degree lasting four of our minutes, and a barleycorn lasting 3⅓ of our seconds
 * I have been reading this with an appositive, such as
 * The Babylonians did not use the hour, but did use a double-hour (a time-degree lasting four of our minutes), and a barleycorn lasting 3⅓ of our seconds
 * taking "a time-degree lasting four of our minutes" as an explanation of what a "double-hour" is, rather than a continuation of a list (which DOES end with an explanation of a barleycorn).
 * The "time-degree" is what needs unpacking to make this clearer --JimWae (talk) 08:23, 12 December 2008 (UTC)
 * I clarified the double-hour. — Joe Kress (talk) 18:11, 26 January 2009 (UTC)

Day
"...such as the minute, hour, and day increase by multiples of 60 and 24" -- the day article uses the phrase "approximately" here 78.86.37.93 (talk) 00:06, 31 May 2009 (UTC)


 * I am removing "day" from that phrase because the day does not increase by either 60 or 24 as the minute and hour do. Furthermore, as a Julian day it is used in a base ten positional number, which inherently uses powers of ten. Furthermore, "approximately" in day is correct only if the atomic or astronomical day is regarded as the only possible day. — Joe Kress (talk) 02:56, 31 May 2009 (UTC)

3 1/3
What does 3 1/3 mean in the article?--Ssola (talk) 22:09, 23 October 2009 (UTC)


 * The barleycorn (3 1/3 seconds) is the smallest unit of time used by the Babylonians, emphasizing that they did not use the second, contrary to the statements of some. — Joe Kress (talk) 23:28, 23 October 2009 (UTC)

8640 x 10 = 86400
Multiplying each subsequent UNIQUE number from the composite number sequence http://www.research.att.com/~njas/sequences/b003418.txt as follows:

1x2x6x12x60

thus stopping before the first three-digit number, 420, results in 8640, a value exactly ten times smaller than 86400. That shows that alternate divisions of the day without altering the SI second are possible. That can be mentioned in the article. 83.22.141.99 (talk) 18:55, 12 January 2010 (UTC)
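The arithmetic itself is easy to verify (a sketch of my own; the linked sequence is OEIS A003418, lcm(1..n), and `math.lcm` requires Python 3.9+):

```python
# Checking the claim above: the distinct values of lcm(1..n) that precede
# the first three-digit term, 420, are 1, 2, 6, 12, 60, and their product
# is 8640 = 86400 / 10.
from math import lcm, prod

distinct, running = [], 1
for n in range(1, 20):
    running = lcm(running, n)
    if running >= 420:
        break
    if running not in distinct:
        distinct.append(running)

assert distinct == [1, 2, 6, 12, 60]
assert prod(distinct) * 10 == 86400
```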


 * Unsourced original research (click these links) cannot be mentioned in an article. Please have a look at your talk page, where you will find a few interesting things about how this Encyclopedia works. Cheers and enjoy. DVdm (talk) 19:33, 12 January 2010 (UTC)

Classical or quantum vacuum?

 * Copied from Talk:Speed of light:

I just looked at the article for Second, and it appears there is a distinction being made there between quantum vacuum and free space, and the implication that the definition of both the meter and the second is based on the latter, said there to be an unattainable abstraction, which I find surprising. This seems a bit like the controversy that caused such a ruckus here in Talk:speed of light, if I am understanding right. When I read the SI specification document, I don't see any mention of free space in either definition, but perhaps I am missing something. CosineKitty (talk) 21:47, 28 January 2010 (UTC)
 * The difference is that it's defined in free space, but that's impossible to achieve on Earth, so the next best thing is used. This is less of an issue than it seems, as the thing being measured was chosen because it's a very stable transition, so even in less than perfect conditions it gives the right result (though I can't remember where I read this).-- JohnBlackburne wordsdeeds 22:12, 28 January 2010 (UTC)


 * Interesting. It would make sense that those smart people would pick something that's easy to reproduce — at least, easy for crackerjack physicists! :) But I'm still skeptical about the claim being made in the article Second. They just don't say that, at least not explicitly. The SI specification (the one I linked above) defines a meter based on "light in a vacuum", and for the definition of a second, it just refers to a caesium 133 atom at 0K. One presumes that 0K is unattainable, but it is approachable as a limit, in which case there are no other atoms colliding with the caesium atom, and thus it is in a de facto vacuum. In either case, I don't see how vacuum means anything other than a literal laboratory container with (almost) all of the air pumped out of it. The gravitation in the laboratory distorts spacetime a little bit (as seen from someone far from Earth), but we assume that distance and time durations are both relative to an observer in the same frame. You would want to measure your distance and time standards horizontally, if done on Earth, to minimize the effects of climbing out of (or into) the gravity well. But once you made your calibrated meter-stick and second-clock, I would assume you could take them out into intergalactic space and use them, and they would work just as well as ones that were made in an intergalactic laboratory. CosineKitty (talk) 22:55, 28 January 2010 (UTC)


 * Don't worry about relativistic effects. As long as the Caesium and the observer are in the same frame of reference they can be ignored. E.g. you set up your Caesium, measure it, set your watch by it and you have an accurate watch. You take it to some other frame of reference it will still be an accurate watch and you'll still be able to tell the time and measure distances well. It may run slow if observed by someone in another frame of reference, due to time dilation, but the watch is still accurate.-- JohnBlackburne wordsdeeds 23:13, 28 January 2010 (UTC)


 * Yes, I think we are saying the same thing about relativistic effects. I'm not worried about them.  I may have gotten out in the weeds there; my real point was that the cited source talks about a vacuum, and there is no hint there that they mean anything other than an honest-to-goodness real-world container with as much air pumped out of it as possible, not some unattainable abstraction as free space.  Am I right or wrong (or just confused)?  CosineKitty (talk) 23:22, 28 January 2010 (UTC)
 * Well, they mean at least slightly more by vacuum. In the context of these definitions it is also meant to imply the absence of any background fields. TimothyRias (talk) 09:18, 29 January 2010 (UTC)
 * Yes, but the Second article seems to imply (but comes short of explicitly saying) that the definition is supposed to apply to (classical) free space rather than (quantum) vacuum so that you have to correct it for Lamb shift and similar. That sounds strange to me... I thought it just meant "no particles, including photons". Anyway this belongs to Talk:Second not here. ― A._di_M.2nd Dramaout (formerly Army1987) 11:30, 29 January 2010 (UTC)
 * End of copied discussion.

Wikipedia is about verifiability (WP:VERIFY), not truth. It is up to the people at BIPM to pursue truth. Wikipedia merely reports what they say. I happen to know that they are very concerned with "Can we measure it?". This is manifest in the definition of the kilogram - philosophically speaking, defining the kilogram in terms of a Carbon-12 atom is better than using a hunk of metal that is locked away on international territory just outside Paris. However there is a practical problem with counting several zillion atoms of carbon to make up one kilogram, so they still use the hunk of metal (maybe until 2011 when they might switch to a Hall Effect force-balancing method). I believe it to be the same when measuring the speed of light - they are using the "best" vacuum that can be achieved using practical means. —Preceding unsigned comment added by Martinvl (talk • contribs) 12:04, 29 January 2010 (UTC)
 * That is actually why I'm surprised by the fact that this article claims that in realizing the second one should correct for the Lamb shift of the energy levels. Since this shift is equal for any Caesium atom and the magnitude of the shift is actually quite hard to determine exactly, it'd make much more sense to define the second in terms of the actual energy difference between the hyperfine levels including the Lamb shift and any potentially unknown other corrections, since that is the thing you can measure in a lab. The current ref for this claim only supports the fact that the Lamb shift exists as far as I can tell, and does not actually support the claim that in realizing the second one should correct for the Lamb shift. I'd like to see an actual ref for that claim. TimothyRias (talk) 13:32, 29 January 2010 (UTC)
 * BTW, I was just thinking that if there were no quantum vacuum, there would be no transition either: the upper of those two levels would be an eigenstate of the Hamiltonian and an isolated atom would just stay in that state forever. So the definition which assumes classical free space rather than quantum vacuum makes no sense. (And I'd like a reliable source using contrastively "free space" to mean the classical one and "vacuum" for the quantum one, too.) ― A._di_M.2nd Dramaout (formerly Army1987) 17:07, 29 January 2010 (UTC)


 * Is everyone OK with a simplification of wording here? I don't think we need to go off on confusing (and unsupported by sources) tangents about vacuum.  I think that entire paragraph (the one starting "In practice, the transition is measured ...") can safely be removed from the article.  In fact, that paragraph makes a distinction between allegedly theoretical "free space" and actual "quantum vacuum", but this contradicts a quote later in the article saying "interrogation of neutral atom based optical standards has been carried out primarily in free space".  In the same article we appear to be using the same phrase "free space" to mean contradictory things.  CosineKitty (talk) 20:56, 29 January 2010 (UTC)


 * Clarification on my previous post: the paragraph does have a reference (labeled [4]) but there is no evidence that this has anything to do with the SI definition of the second; that is what I mean by "unsupported by sources". I want to be careful when I advocate the dreaded "removal of cited information"!  CosineKitty (talk) 21:06, 29 January 2010 (UTC)


 * I've read this whole discussion and agree with CosineKitty and A. di M. We should remove the paragraph in question. Not only is it making unreferenced (not to mention incorrect) claims, it's creating confusion where there should be none. Virtually no reader would ever wonder whether or not quantum vacuum corrections should be included. If we just say "caesium atom" people will intuitively understand it as a real caesium atom, and not a hypothetical semiclassical caesium atom. This intuitive understanding is the correct understanding, so there is no need to draw attention to it. :-)


 * I also added that the electric field is 0 in the definition of the hyperfine splitting. I'm sure there's a small Stark shift at some order of perturbation theory. --Steve (talk) 02:19, 30 January 2010 (UTC)

(redent) The paragraph has gone now, so there's nothing to worry about. I found this link from the UK National Physical Laboratory which strongly implies that no correction is made to the observed frequency for the Lamb shift, and indeed that precise measurement of the Lamb shift is an ongoing topic of metrology research. The 1997 addendum to the definition of the second makes it clear that there are two corrections to be made to the measured frequency: one for Lorentz time dilation (the caesium atoms are not at rest relative to the observer in a caesium fountain atomic clock) and the second for the perturbation due to black body radiation. I don't know if people make a correction for the Stark shift: possibly this was too obvious (or, alternatively, too small) to be relevant in the definition. Physchim62 (talk) 11:36, 30 January 2010 (UTC)

Second elsewhere than on the geoid
I've just noticed that the article states that the SI second is defined as the second on the geoid: that's not true! The SI second is the unit of proper time as defined by the caesium transition wherever you measure it. The second as corrected for gravitational time dilation to the geoid is the TAI second. I'll see if I can fix this, but I'll note it here in case people have any comments (or in case I'm making a huge blunder somewhere!) Physchim62 (talk) 11:36, 30 January 2010 (UTC)


 * Yes, as you said, the geoid specification is not talking about the "second" as a unit of measurement, it's talking about the "second" as a designation in International Atomic Time.


 * For other people unfamiliar with this: Two initially-synchronized clocks at different elevations will eventually get out of sync due to time dilation, so the geoid is a convention that says which clock we should use for the train schedule. The keepers of the atomic clocks that control international atomic time have all measured their absolute elevations very carefully. --Steve (talk) 17:44, 30 January 2010 (UTC)
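The size of that elevation effect can be roughly quantified with the weak-field approximation Δf/f ≈ g·Δh/c² (a sketch of my own; the 1000 m figure is an arbitrary example, not from the discussion above):

```python
# Rough illustration of why the keepers of atomic clocks measure their
# elevations carefully: by the weak-field approximation, the fractional
# rate difference between two clocks is about g * delta_h / c**2.
G_STD = 9.80665            # standard gravity, m/s^2
C = 299_792_458            # speed of light, m/s
SECONDS_PER_DAY = 86400

def drift_per_day(delta_h_m):
    """Seconds per day gained by a clock delta_h_m metres above another."""
    return (G_STD * delta_h_m / C**2) * SECONDS_PER_DAY

# A clock 1000 m higher runs fast by roughly 9.4 ns per day:
print(round(drift_per_day(1000) * 1e9, 1), "ns/day")
```

Tiny as that looks, it is far larger than the stability of a modern caesium fountain, which is why the convention matters.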

Caesium-133 article not helpful, nor is the link to it
Someone recently added a link to caesium-133, which I find singularly unhelpful compared to the existing links to caesium. Take a look at both of those and see if there is any justification for a dedicated article about the only stable isotope of this element, and if there is anything in the former article that is not far better covered in the latter. I removed the link, and I am supporting merging caesium-133 into caesium. CosineKitty (talk) 00:41, 18 March 2010 (UTC)
 * The place to complain about caesium-133 is really WP:ELEMENT. I agree that the current article is pretty naff, but I think there is room for improvement rather than simply merging. As for editors on this article, are there any gory details about the caesium transition which would be better treated in an article on Cs-133 than here? Physchim62 (talk) 00:58, 18 March 2010 (UTC)

Day and Earth's Rotation
The article defines the second as 1/86400 of the rotational period of the earth. The second is rather 1/86400 of the average length of a day (noon to noon). This is closely related but not exact. In a (Julian) year of 365.2425 days, the earth rotates 366.2425 times, which makes the rotational period about 23 hours 56 minutes. This is because the earth also revolves around the sun, advancing a little less than one degree a day. The earth has to rotate through 361 degrees before the sun is over the same meridian again, and it takes about 4 minutes to rotate the additional degree. As an additional complication, the angle the earth advances each day varies over the year, because the earth is in an elliptic orbit and moves faster when closer to the sun. That makes the time from noon to noon much less stable than the rotational period of the earth. Therefore the length of the day has to be averaged over one year to define a stable day on which the second was based. Sconden (talk) 17:00, 22 June 2010 (UTC)
 * I've rewritten it to make more sense. You're right - the earth rotates over the sidereal day, while time is measured based on the solar day. Rather than explain all that, I've just corrected it, added some more info I thought valuable, and provided a link to solar day for those interested in more detail.-- JohnBlackburne wordsdeeds 17:26, 22 June 2010 (UTC)
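Sconden's figures above can be checked in a couple of lines (a sketch of my own; the Julian-year value is taken from the comment, and the variable names are arbitrary):

```python
# In a Julian year of 365.2425 solar days the Earth makes one extra
# rotation relative to the stars, so the sidereal (rotational) day is
# about 4 minutes shorter than the 86400 s mean solar day.
SOLAR_DAYS_PER_YEAR = 365.2425
ROTATIONS_PER_YEAR = SOLAR_DAYS_PER_YEAR + 1   # one extra turn per orbit

sidereal_day = 86400 * SOLAR_DAYS_PER_YEAR / ROTATIONS_PER_YEAR
print(round(sidereal_day, 1))   # about 86164.1 s, i.e. 23 h 56 min 4 s
```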

Misuse of sources
Jagged 85 is one of the main contributors to Wikipedia (over 67,000 edits), and practically all of his edits have to do with Islamic science, technology and philosophy. This editor has persistently misused sources here over several years. This editor's contributions are always well provided with citations, but examination of these sources often reveals either a blatant misrepresentation of those sources or a selective interpretation, going beyond any reasonable interpretation of the authors' intent. Please see: Requests for comment/Jagged 85. That's an old and archived RfC. The point is still valid though, and his contribs need to be double-checked. I searched the page history and found one edit by Jagged 85 in July 2008. Tobby72 (talk) 22:59, 18 June 2010 (UTC)


 * The edit by Jagged 85 was correct as far as it went. Taqi al-Din actually stated "We constructed a mechanical clock with three dials which show the hours, the minutes, and the seconds. We divided each minute into five seconds." Jagged 85 omitted the last sentence which meant that Taqi al-Din divided each minute on his clock into twelve five-second periods. Although he did not divide each minute into 60 one-second periods, his clock did indeed "show seconds", which is all the Swiss clock citation stated. I don't know whether that Swiss clock marked every second, or only some, because its citation is quite inferior to the Taqi al-Din citation. — Joe Kress (talk) 02:57, 22 June 2010 (UTC)


 * Despite what Taqi SAID (and what the clock actually did), it is misleading for us to flatly say the clock showed "seconds" in an article about the second. The clock showed 1/12ths of a minute (or intervals of 5 seconds, or 12 equal subdivisions of a minute). This is really more relevant to the clock article than this one--JimWae (talk) 06:16, 23 June 2010 (UTC)


 * The earliest historical use of the second, both written and mechanical, and its historical size are appropriate subjects for this article. The "second" meant both 1/3600 hour and 1/3600 day throughout medieval and early modern astronomy (probably 830–1740). I cited both al-Biruni (1000) and Roger Bacon (1267) who used both meanings, but both may have been used as early as 830 by al-Khwarizmi and both were used as late as 1740 by Jacques Cassini. Modern seconds (seconds of hours) were not used by the Hellenistic astronomers Hipparchus (150 BC), Ptolemy (140 AD) and Theon (360) although all of them used seconds of days. All of them also used the modern hour, but did not subdivide it sexagesimally. No other seconds have been attested, certainly not a period of 12 modern seconds (1/5 minute). Babylonian astronomers first subdivided the day sexagesimally into minutes, seconds, thirds, fourths, fifths, and sixths (using English terms equivalent to medieval Latin terms). This second sexagesimal fraction of a day was 24 modern seconds. Thus medieval astronomers used two different "seconds" within the same book, and even on the same page. Because the English translation of Taqi al-Din used "five seconds" (or "five-second [intervals]") alongside hours, the seconds dial was subdivided into twelve five-second periods that I assume were marked 5, 10, 15, etc., even though individual seconds were not marked. This numbering is not in the article because it is only my opinion. Taqi al-Din was quite familiar with Western European time units and their numbering system because he complained that cheap Western European clocks were being imported into Istanbul. — Joe Kress (talk) 01:10, 27 June 2010 (UTC)


 * Do we know if the movement from one second marker on the dial to the next was smooth or pulsed? If the hand jumped, then saying the clock showed seconds would be quite misleading--JimWae (talk) 03:42, 27 June 2010 (UTC)


 * It was probably smooth because in an earlier 1559 publication he described a clock that also had three dials which displayed "hours, degrees and minutes" (probably mistranslated). Although it had a verge escapement that "ticked" every 2.38 seconds, the pulses controlled the hour dial by a gear train which would have smoothed them. Furthermore, the other two dials were sequentially driven by gears from the hour dial, smoothing it even more. See figure 18 of The astronomical clock of Taqi al-Din: Virtual reconstruction. — Joe Kress (talk) 07:18, 28 June 2010 (UTC)

Abbreviations for 'second'
I was wondering if there was the need for a short passage on the abbreviations for second (i.e. s, sec, ") because articles from these pages link to here. GoldenTie 10:53, Christmas Eve 2005 (GMT)

agree, something about " as an abbreviation should be added - this is one of those things that is so common that it is never noted —Preceding unsigned comment added by 108.7.165.143 (talk) 15:18, 8 October 2010 (UTC)


 * Wikipedia policy is to place redirects to a specific page in bold so that anyone who follows such a link knows that s/he has arrived at the main page. Thus I've put the symbol s in bold and added the informal abbreviation sec. " redirects to arcsecond, not to second. — Joe Kress 04:10, 26 December 2005 (UTC)

Relativity
I'm by no means an expert in this area, so I may be missing something obvious. The definition of a second seems to be ignoring issues regarding relativity as discussed in the GPS article. As noted there, a cesium clock in orbit will exhibit different behavior than one on the ground. Which is considered "correct" by the standard? The article makes mention of a cesium atom at rest, but at rest compared to what? —Preceding unsigned comment added by 71.146.43.85 (talk)


 * The article is indeed missing a crucial definition. The SI second has been defined on the rotating geoid (mean sea level) since the beginning of 1977. This was more precisely defined in 2000. The exact wording of this in the article remains to be determined. — Joe Kress 23:59, 9 August 2007 (UTC)


 * I think this is a wrong interpretation. The definition of the SI second does not depend on whether the clock rests at sea level or moves in space. But each clock measures its own proper time. So if they show different values, that is because they measure different time scales. It is the definition of the time scale International Atomic Time (TAI) which refers to sea level. --Trigamma (talk) 22:56, 4 January 2011 (UTC)


 * Indeed, the NIST definition of the second sort of explicitly does not mention the circumstances of whether the clock rests at sea level or moves in space or whatever. This definition is part of the very essence of relativity. For Newton time remained vague and undefined, whereas in modern physics time is unambiguously defined as what a clock reads. Every clock is considered to be "correct" in measuring its own time. Time of different clocks under different circumstances can be compared, and we have a theory that is very well capable of relating one clock's time to another clock's. DVdm (talk) 23:18, 4 January 2011 (UTC)

Abbreviations
I have removed the abbreviations from the lede. I do not believe that the abbreviation "sec" is a "common" abbreviation (as per WP:LEDE) other than in the spoken word (and is therefore probably slang). The double prime is used for seconds of arc. Martinvl (talk) 08:09, 24 March 2012 (UTC)


 * Perhaps not as common as it used to be, but one still sees "deg. min. sec." and "hr. min. sec."     D b f i r s   08:49, 24 March 2012 (UTC)


 * I do believe "sec" remains common in the US. It took me moments to find it in a range of US standards from ASME and API. A quick online search for "gravity 32 ft" throws up many instances of "ft/sec2". It's appropriate to keep it in the lede. However the current phrasing of the first sentence of the lede is quite narrow and technical; it discusses the second as an SI unit only and neglects its existence in US customary usage and the general history of time-keeping. I'll attempt a clarification. NebY (talk) 09:24, 24 March 2012 (UTC)


 * I'm removing some of User:JimWae's additions to the lede, which I should explain. I think we should try to keep the lede relatively short and direct without actually making it misleading. It's not the place to go into a lot of technical detail about how the second is now defined or just what measurement systems use the abbreviation "sec"; we don't want the reader's eyes to glaze over just yet. Plenty of time for that later. JimWae queries "fundamental" and "other systems". I used the term "fundamental" in the sense that it is one of the building-blocks of other units. We could link to Fundamental unit but I'm wary of overlinking, especially in the lede. Another system to which the second is fundamental - perhaps the main current one - is the Foot-pound-second system, the formalisation of United States customary units for scientific, engineering and other technical uses and likewise of Imperial units. The second also sat at the heart of other Systems of measurement especially predecessors of SI. Hope this helps - sorry to be long-winded. NebY (talk) 11:19, 24 March 2012 (UTC)


 * I looked at the Corpus of American English and found 66,013 uses of "s" as a word, compared to 8,335 for "sec". The search is not case sensitive, and does not consider punctuation, but does check that it is a word, not letter(s) in a word. I browsed the first 100 hits for each. None of the "s" hits were related to the second; they were mostly people's middle initial. Extrapolating to all the hits, I would expect fewer than 660 to refer to the second. All the "sec" hits appeared to be abbreviations of "second" in a somewhat informal context, like "USB 2.0 was fastest at only 9.6 sec, compared with 19.9 and 14.3 sec for USB 3.0 HDD and SSD, respectively" from Popular Mechanics.


 * So it's possible that in a combination of formal and informal writing, "sec" may be used much more as an abbreviation for second than "s". Jc3s5h (talk) 13:38, 24 March 2012 (UTC)
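The extrapolation in the corpus comment above is simple proportional scaling; a sketch of the arithmetic (Python, using the figures quoted in the comment):

```python
total_s_hits = 66_013   # corpus hits for "s" as a word
sample_size = 100       # hits actually browsed
relevant_in_sample = 0  # none of the sampled "s" hits meant the second

# Even if the true rate were just under 1 in 100 (the most the browsed
# sample can informally rule out), the total would scale to:
upper_bound = total_s_hits * (1 / sample_size)
print(round(upper_bound))  # 660 — hence "fewer than 660" in the comment
```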

Current definition
I think it is not a good idea to not have the current definition appear until the 3rd paragraph. WP:LEDE seems to make the same point.--JimWae (talk) 19:16, 24 March 2012 (UTC)


 * But the SI definition is not the only current definition. It is the de facto standard for time intervals and civil timekeeping, but astronomers still use the definition 1/86,400 of a UT1 day for many purposes. Some countries have not legally adopted UTC. Many residents of those countries neither engage in precision time keeping nor understand the difference between UTC and UT1; they just set their wrist watch when it seems too far off. So which definition of the second are those ordinary folks observing? Who can say?


 * In the recent dust-up over leap seconds at the World Radio Conference in January, a decision was postponed about abolishing leap seconds until 2015, because social and legal factors had not been fully studied. Jc3s5h (talk) 19:34, 24 March 2012 (UTC)


 * In that case, readers should not have to read right to the very end of the article to find out there are 2 currently used, distinct definitions. Btw, there's a positive leap second at the end of June.--JimWae (talk) 20:21, 24 March 2012 (UTC)


 * Good point. Should we recast the lede from the opening sentence? We have, jostling for attention, that
 * it's a unit of time in many systems, with a long history
 * it's an SI unit with one definition and abbreviation/symbol
 * it has other definitions and abbreviations.
 * I'm sorry to say I'm failing to pack that into one accessible sentence. NebY (talk) 22:22, 24 March 2012 (UTC)


 * But it can be done in one accessible paragraph - and that is in accordance with WP:LEDE--JimWae (talk) 00:59, 25 March 2012 (UTC)

There is a problem with the phrase "Between 1000 and 1960 the second was defined as 1/86,400 of a mean solar day" that Joe Kress introduced today. I'm pretty sure that in 1000 clocks were not good enough to distinguish between a mean solar day and an apparent solar day. The mean solar day came into use when good clocks came into use, which I suppose was the 18th century. Jc3s5h (talk) 13:24, 25 March 2012 (UTC)


 * Yes, I was puzzled about why the exact date of 1000 was chosen. Should we change it to "from the 18th century to 1960 the second was defined ..."?    D b f i r s   14:38, 25 March 2012 (UTC)


 * I was also puzzled by the choice of 1000 AD. May I suggest that we write "Until 1960 ..." giving no comment regarding the start date (unless of course somebody has a citation for the start date!). Martinvl (talk) 15:36, 25 March 2012 (UTC)


 * 1000 is the year that al-Biruni gave times in hours, minutes, seconds, thirds and fourths (yes, he wrote during that precise year). In 1267 Roger Bacon also gave times in hours, minutes, seconds, thirds and fourths. Both citations are given in "Before mechanical clocks", and a dup ref could be given in the lead. Seconds were definitely not used by either Ptolemy (about 140) or Theon (about 380), so I don't like the open-ended "until 1960". Al-Biruni is the earliest scholar known to have used our modern seconds, so the year he wrote should be acceptable. — Joe Kress (talk) 17:29, 25 March 2012 (UTC)


 * Given that 1000 is a round number, it might be appropriate then to write "Between 1000 (as reported by Al-Biruni) and 1960 ...". This will give the reader more confidence that this is not just a number plucked from the air.  Moreover it will emphasise the Arab connection - at that time the Arab world was the centre of mathematical knowledge. Martinvl (talk) 17:52, 25 March 2012 (UTC)


 * Done, although I changed the notice to "when al-Biruni used seconds", and I also added the dup ref. — Joe Kress (talk) 18:17, 25 March 2012 (UTC)


 * By the way, Ptolemy described both the mean solar day and the true or apparent solar day and their difference, the equation of time. Ptolemy was well aware of the two components of the equation of time, the Sun's eccentric anomaly and the obliquity of the ecliptic, so he calculated the mean solar day from the observed apparent solar day — he could not measure the mean solar day. Since that time all astronomers like al-Biruni used mean solar days when they wanted to be precise. See Ptolemy's Almagest, translated by G. J. Toomer, pp 23 & 169–172. — Joe Kress (talk) 19:54, 25 March 2012 (UTC)


 * I guess the unstated corollary is that since seconds are not detectable on sundials, there is no such thing as a second of apparent time. And yet, The Nautical Almanac claims that British publication used apparent solar time until 1833; navigators would have had second hands on their chronometers at that point, wouldn't they? So maybe seconds of apparent solar time did exist. Jc3s5h (talk) 23:29, 25 March 2012 (UTC)

Questionable sentence about definition
This edit by User:RoyGoldsmith modified a sentence to read "Thus the second became firmly defined as one-sixtieth of a minute or 1/86,400th of a day." in the context of good-quality pendulum clocks. But an excellent 17th century clock kept mean solar time. The added sentence does not specify what kind of day, a mean day or an apparent day. Jc3s5h (talk) 14:08, 24 May 2012 (UTC)


 * That phrase is false. The second was defined by Muslim scholars as a subdivision of the mean solar day, not the apparent solar day, hundreds of years earlier, long before it could be reliably measured — a device was not needed to "firmly define" the second. — Joe Kress (talk) 19:12, 24 May 2012 (UTC)


 * Perhaps what the editor was trying to convey was that about the time clocks were equipped with second hands, they (at least the good ones) became accurate enough that the time displayed by the clock was substantially closer to mean time than apparent time. Earlier clocks were so inaccurate it didn't matter whether you set them with a sundial or by observing stars and converting sidereal time to mean time. Jc3s5h (talk) 19:18, 24 May 2012 (UTC)


 * I've changed the sentence to: "Thus the second could now be reliably measured." This avoids any definition of the second. — Joe Kress (talk) 23:55, 24 May 2012 (UTC)


 * What I was trying to get at was what happened in the time between the establishment of pendulum clocks and the definition for the first time of the second as the base unit of time. This would have occurred sometime between the end of the 15th century and 1875 (the Metre Convention) or even later. I read through the articles on SI, the History of the metric system and various other material and I still can't find when SI or anyone of its predecessors formalized time or how it was decided to use the second as the unit of measurement. This is certainly a fit topic for the Wikipedia article on the Second — look at the lead paragraph.


 * I doubt (but am ready to be convinced by a reliable source) that just because al-Biruni used seconds 800 years in the past, this was a sufficient reason for what was basically, in those times, a European standard. I couldn't find any sources that said, for example, that Britain's supremacy at sea in the 17th and 18th centuries led to the division of almost all European nautical charts by degrees, minutes and seconds (based on the British longcase clock) and this, in turn, led to the second being chosen as the base unit of time. Does any one know how the second got from grandfather clocks to SI?


 * I concede that the quote in the first sentence of this section and the phrase "and other clockmakers soon followed" that immediately precedes it could be viewed as WP:OR. If anyone wishes to remove them, I have no objections. --RoyGoldsmith (talk) 13:04, 25 May 2012 (UTC)


 * I just added a more detailed account of recent history of the second to Leap second, which I'll amplify and add here. Al-Biruni's usage was adopted by European scholars as early as 1267 by Roger Bacon as already stated in the article at Second. However, for several centuries, the second was only a subdivision of the mean solar day, which was the fundamental unit of time in all European astronomical tables. — Joe Kress (talk) 00:43, 26 May 2012 (UTC)


 * The section at Leap second states "...the second was adopted around 1861 as the base unit of time in an early version of the metric system." Does anyone have a reference for that year, possibly explaining what body adopted the second (perhaps BAAS?) and why they used a word from 800 years in the past, rather than more modern terminology, like those used by Tycho or Johannes Hevelius? --RoyGoldsmith (talk) 13:00, 26 May 2012 (UTC)


 * The British Association for the Advancement of Science (BAAS) proposed the centimetre-gramme-second (CGS) system of units in 1874 after a series of meetings beginning in 1861. Meetings during the 1860s are reported in Reports of the committee on electrical standards (1873), most of which discusses electrical units, but page 61 states "The unit of time adopted in all physical researches is one second of mean solar time." and page 90 states "All men of science are agreed to use the second of mean solar time as the unit of time." Both quotes are in an appendix to the first report. The BAAS advocated its system in Illustrations of the centimetre-gramme-second (C.G.S.) system of units by J. D. Everett (1875), which includes the 1874 reports as an appendix beginning on page 83. The second as the unit of time appears on pages 10 and 84. Everett added several other fields where CGS units could be used in later editions, up to the fifth edition in 1902, Illustrations of the C.G.S. system of units. The second appears on pages 19 and 273 of this edition.


 * I am confused about your question on terminology, so I'll guess you are referring to the English word "second". Obviously, the BAAS used the appropriate English word. Al-Biruni wrote in Arabic so I assume he used terms virtually the same as the modern Arabic terms for hour, minute and second (saa'a, daqiiqa and thaaniya). The al-Biruni citation is to the 1879 English translation which used the English word second. The New Shorter Oxford English Dictionary states "second" as a unit of time entered the English language during the late 16th century, and is derived from old French seconde from medieval Latin secunda used as a noun (compare minuta minute), the feminine form of the Latin adjective secundus, the ordinal number word for "second" (being the result of the second operation of dividing by sixty). It also states that a "minute" of time entered English during the late Middle English period, derived from medieval Latin pars minuta prima first minute part, the 1/60 of a unit in a system of sexagesimal fractions. Roger Bacon in 1267 used hora (hour), minuta and secunda because he wrote in Latin. All later Latin writers, including Tycho, Kepler, etc., would have used the same terms. — Joe Kress (talk) 18:58, 27 May 2012 (UTC)


 * According to the SI brochure, the second first became part of the metric system in 1832 when Gauss used it - See History of the metric system for full reference. — Preceding unsigned comment added by Martinvl (talk • contribs) 23:17, 27 May 2012


 * Thanks for the cite, but according to the SI brochure the second did not become part of the metric system in 1832, instead Gauss "strongly promoted" the metric system with the second from astronomy as a coherent system of units for use in the physical sciences. This indicates that the second was already used in astronomy as a basic unit. The second replaced or began to replace the mean solar day as the basic unit of time in 1670 when William Clement began to regulate clocks with a seconds pendulum, which had a period of two seconds, one second to swing forward and another second to swing back. The resulting longcase or grandfather clock was able to tick every second with good accuracy. This is already mentioned in the article. At the same time, the Royal Observatory, Greenwich installed two Thomas Tompion clocks with pendulums 3.96 meters long and a period of four seconds in order to achieve much better accuracy. The last pendulum clocks to be used by observatories (1920s to 1940s) were Shortt clocks which also had a seconds pendulum. — Joe Kress (talk) 07:18, 28 May 2012 (UTC)
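The pendulum dimensions mentioned above follow from the small-swing period formula T = 2π√(L/g). A quick check (Python; the round value g = 9.81 m/s² is an assumption):

```python
import math

g = 9.81  # standard gravity in m/s^2 (assumed round value)

def pendulum_length(period_s):
    """Length of an ideal simple pendulum with the given small-swing period."""
    return g * (period_s / (2 * math.pi)) ** 2

# A seconds pendulum ticks once per second, i.e. a full period of 2 s.
print(f"{pendulum_length(2.0):.3f} m")  # 0.994 m, the familiar metre-long pendulum
# Tompion's Greenwich clocks had a four-second period.
print(f"{pendulum_length(4.0):.2f} m")  # 3.98 m, close to the 3.96 m quoted above
```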

Edit by JimWae
Jim Wae made an edit which changed "This definition abandoned any explicit relationship between the scientific second and the length of a day, as most people understand the term."

to

"There are 31,556,925.9747 seconds in 365.24220 24-hour days."

The point of the existing version was a change in the nature of the definition. The definition had been based on mean solar time, which in practice was found by recording the time when stars crossed the observing plane of a meridian circle, and converting the observed sidereal time to mean solar time according to an equation derived by Simon Newcomb. Any changes in the rotation rate of the Earth would be reflected in mean solar time.

The new definition essentially means that the second is the length it needs to be to make Newcomb's tables for the positions of the planets and the Moon come out right. So it was measured by observing the position of the planets, and especially, the position of the Moon. The rotation of the Earth became irrelevant. However, this definition was not very practical; the calculation of the results meant that time, Ephemeris Time, was not available for months after the observations. That explains why this definition was quickly abandoned in favor of atomic clocks, which can be used in real time. Jc3s5h (talk) 00:55, 6 June 2012 (UTC)
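The figure in the quoted edit is easy to sanity-check: dividing the tropical-year length in seconds by 86,400 recovers the day count. A Python sketch:

```python
SECONDS_PER_DAY = 24 * 60 * 60        # 86,400
TROPICAL_YEAR_1900 = 31_556_925.9747  # seconds, per the 1960 ephemeris definition

days = TROPICAL_YEAR_1900 / SECONDS_PER_DAY
print(f"{days:.5f}")  # 365.24220, matching the day count in the quoted edit
```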

Flashing GIF
Sorry, but I find the flashing GIF illustration very annoying - would it be possible to remove it, or at least to have it switched off by default with a button to switch it on if desired? — Preceding unsigned comment added by Samsite (talk • contribs) 12:31, 14 June 2011 (UTC)


 * I'm OK just deleting it. And I made it!! Not sure about other options. I know OGVs appear with a play button underneath them. But I don't think I have any OGV-creating software. (I could call for help from A/V-skilled wikipedia people.) Or, can just change the image to something less annoying that still illustrates "second". A spinning dial? Even just decreasing the light-dark contrast in the flash might lower the annoyance factor.
 * I only spent a minute or two making this animation, I figured it was (maybe) better than nothing but there's plenty room for improvement!! --Steve (talk) 16:57, 14 June 2011 (UTC)
 * For now I just shrunk it, which I think helps a little bit. --Steve (talk) 17:04, 14 June 2011 (UTC)


 * I also find it very annoying. - Emil — Preceding unsigned comment added by 88.88.94.189 (talk) 20:54, 14 August 2012 (UTC)


 * I changed the colors to gray (it may take a day to refresh though). I think it helps, do you agree? It could also be made even smaller. --Steve (talk) 21:27, 14 August 2012 (UTC)

Definition of Thirds, Fourths, etc
In the article:

Keeping Time: Why 60 Minutes? http://www.livescience.com/44964-why-60-minutes-in-an-hour.html

There is a quote:

"The 11th-century Persian scholar Al-Bīrūnī tabulated times of new moons on specific dates in hours, 60ths (minutes), 60ths of 60ths (seconds), 60ths of 60ths of 60ths (thirds), and 60ths of 60ths of 60ths of 60ths (fourths)."

that seems to suggest a different definition of "thirds" and "fourths".

Which is correct?--Filll (talk | wpc ) 00:34, 25 April 2014 (UTC)
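For what it's worth, the two descriptions coincide: a "third" as 1/60 of a second is exactly a 60th of a 60th of a 60th of an hour. A minimal check with exact fractions (Python):

```python
from fractions import Fraction

hour = Fraction(1)
minute = hour / 60    # 1/60 of an hour
second = minute / 60  # (1/60)^2 of an hour
third = second / 60   # 1/60 of a second ...
fourth = third / 60   # ... and 1/60 of a third

# The Livescience phrasing: successive 60ths of an hour.
assert third == hour / 60**3    # "60ths of 60ths of 60ths"
assert fourth == hour / 60**4   # "60ths of 60ths of 60ths of 60ths"
print(third, fourth)  # 1/216000 1/12960000
```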

sp "seond"
used in excess of 35 times in WP articles. GinAndChronically (talk) 05:42, 26 June 2014 (UTC)

Mix up between 'second' and 'year' in introduction?
The sentences below appear to mix up year and second definitions.

"The use of the word second in English began in the late 16th century. The definition remained unchanged (and still applies in some astronomical and legal contexts)[4][5] from 1000 until 1960, at which time it was defined as 'the period of the Earth's orbit around the sun in the year 1900'.[6]"

Not sure if the intent is to suggest that the 'year' upon which the second was defined changed, or that in 1960 the definition of a second was changed to something besides a fraction of a year. In any event, a year is not a second. — Preceding unsigned comment added by Teleksterling (talk • contribs) 22:18, 22 November 2015 (UTC)


 * I checked the cited source; at some point the quote was damaged so it no longer matched the source. I fixed it. Jc3s5h (talk) 00:00, 23 November 2015 (UTC)

External links modified
Hello fellow Wikipedians,

I have just added archive links to one external link on Second. Please take a moment to review my edit. If necessary, add after the link to keep me from modifying it. Alternatively, you can add to keep me off the page altogether. I made the following changes:
 * Added archive http://web.archive.org/web/20080314011450/http://inms-ienm.nrc-cnrc.gc.ca:80/research/optical_frequency_projects_e.html to http://inms-ienm.nrc-cnrc.gc.ca/research/optical_frequency_projects_e.html#optical

When you have finished reviewing my changes, please set the checked parameter below to true to let others know.

Cheers.—cyberbot II  Talk to my owner :Online 13:55, 27 February 2016 (UTC)

Should we explain the definition? Suggestion
I, for one, have absolutely no idea what a "period of the radiation corresponding to the transition between two hyperfine levels" is, and I doubt most of our readers do either. Should we not explain briefly - or at least try to explain briefly - what the current definition actually means? According to MOS:JARGON and WP:TECHNICAL, articles should strive to be understandable for a wide array of readers and minimize the use of jargon, and although this article overall is friendly enough, the definition is an exception. I say someone should write a subsection to International second such as Explanation of definition, which explains some fundamental particle physics to give the reader a minimal idea of what actually defines the second. Rich with wikilinks to articles about atomic physics for further reading, of course. Anyone that can give me and our readers a hand in explaining transitions of hyperfine levels? Gaioa (talk) 15:19, 14 August 2017 (UTC)


 * The lead and the article are stilted and awkward, to say the least. The second is an ordinary everyday thing, like a mouse or a baseball. Ordinary people probably don't know or care that it is an SI base unit, don't know anything about atomic clocks (maybe they're atomic time bombs?), and have never heard of the International System of Units, though they probably know the metric system. The meaning and definition of the second as we know it is an interval of clock time, 1/60 of a minute, which is 1/60 of an hour, which is 1/24 of a day. It's about the time between resting heartbeats. I'm going to rewrite the lead, and probably chunks of the article, so that anyone, say someone with only a couple of years of high school, can read the article, know every word and concept, and not need to go outside the article to look anything up. Sbalfour (talk) 01:39, 22 January 2018 (UTC)

SI multiples
This article is about the common clock interval second, and just barely about the SI unit. The section SI multiples says SI prefixes are commonly used to measure time less than a second, but rarely for multiples of a second, but then goes on to jabber: Thus a megasecond is 11 days, 13 hours, 46 minutes and 40 seconds, which is roughly of the order of a week.... No, 11-3/4 days isn't any kind of week. And listing powers of ten multiples of seconds is absolutely useless anyway. It's marginally useful or just interesting to list ordinary units of time in multiples of seconds, i.e., an hour is 3,600 seconds, a day is 86,400 seconds, a year is 31.5 million seconds, a century is a little over 3 billion (3.15×10⁹) seconds, and the age of the universe is about 4.35×10¹⁷ seconds. It may also be useful conceptually to list a few common events in terms of seconds: the fastest human sprinters take about 10 seconds to run 100 meters; a stone falling from rest falls about 4.9 meters in a second; sound travels about 343 meters in a second; ocean waves in deep water travel about 23 meters in a second.
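The everyday equivalences suggested above are easy to tabulate; a minimal Python sketch (assuming the Julian year of 365.25 days and an age of the universe of 13.8 billion years):

```python
SECOND = 1
MINUTE = 60 * SECOND
HOUR = 60 * MINUTE               # 3,600 s
DAY = 24 * HOUR                  # 86,400 s
YEAR = 365.25 * DAY              # Julian year, ~3.156e7 s
CENTURY = 100 * YEAR             # ~3.156e9 s
AGE_OF_UNIVERSE = 13.8e9 * YEAR  # ~4.35e17 s

for name, value in [("hour", HOUR), ("day", DAY), ("year", YEAR),
                    ("century", CENTURY), ("age of universe", AGE_OF_UNIVERSE)]:
    print(f"1 {name:16s} = {value:.4g} s")
```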

Listing all metric prefixes here (and presumably in all 7 + 22 articles on metric base and derived units) is purely duplicative. There's a place to list those, and that would be in the article on the SI itself, or an article on metric prefixes. Here, it's just filler - nothing better to say, so say anything.

Sbalfour (talk) 17:44, 22 January 2018 (UTC)
 * One thing at a time - fixed. Sbalfour (talk) 18:28, 22 January 2018 (UTC)

"Atomic" second
Yuck. I don't think this can be fixed in any form. It's better to start over. What goes here? Atomic clocks keep better time than the earth. And if it makes sense, how much better time. Saying 10^10 of any unit doesn't make sense because nothing in our ordinary experience is 10^10 bigger or smaller than something else in our ordinary experience. At that level, it's just "millions of times better". Even mention of microwaves is suspect - while microwave ovens are an everyday experience, most people don't have a clue how they do what they do.

Sbalfour (talk) 04:24, 22 January 2018 (UTC)
 * Replaced. This is good. It certainly is readable by someone without a post-graduate degree in atomic physics. Maybe the physicists should write about mice or baseball. Sbalfour (talk) 20:00, 22 January 2018 (UTC)


 * After consideration that this section on hard science is the largest section in an article about an everyday thing, even though it is now reasonably accessible, it seems to me better placed in a science article, or moved into a footnote or sidebar "How an atomic clock works". We don't need to know anything about atomic clocks to completely understand the article. Most people don't have a good idea how a simple mechanical clock works, either; we don't explain it, and they don't need to know.

Sbalfour (talk) 18:04, 23 January 2018 (UTC)


 * Since the seconds used for everyday timekeeping are SI (atomic) seconds, they are the most commonly encountered seconds. A precise definition exists. Therefore the definition should be stated (in a reasonably understandable way) in this article. Jc3s5h (talk) 19:44, 23 January 2018 (UTC)
 * I've retained the definition in terms of that 9.2 GHz number, and simply state that it's a frequency of an excited cesium atom. Sbalfour (talk) 20:25, 23 January 2018 (UTC)

Equivalence to other units
This section is bullet items, like we'd find in an appendix or technical footnote. An encyclopedia article, especially one about an everyday thing, is expected to be narrative text. There's a readable way to write this, so I'm going to replace the subsections with text. An SI unit like becquerel isn't an everyday thing, so probably should be omitted; a baud is kind of an antiquated electronic term, and isn't that familiar either, in the sense that most people probably couldn't give it an adequate definition.

Sbalfour (talk) 17:48, 23 January 2018 (UTC)
 * Deleted, since covered in other sections. Sbalfour (talk) 17:27, 24 January 2018 (UTC)

History
The History section is over 80% of the article, but this is an article about an everyday thing, not an article on history. For that, we have articles like History of timekeeping devices. Proportionately, history would occupy only a fraction of a topical article about a common thing, like mouse or baseball. The section is elaborate and distracting. It would perhaps be plausible to merge the entire section into the named article above, because the history of the second is mostly the history of devices that realize the second, i.e. clocks. The article as it stands cannot support such a section. It needs to shrink by about a factor of 5, and even if it did, it'd still be almost half the article. Sbalfour (talk) 18:57, 22 January 2018 (UTC)

Fraction of lunar month section is curious for having only two obscure references to natural philosophers "defining" seconds. In that sense, anyone writing and publicizing (expensive in those days) could define a second. But definition and realization were different things: it'd require a clock that could keep time to the second for a month between recalibrations. Not only that, but a meaningful definition is one that is adopted by law, by practice, or realized by a scientific or metrologic body for use by others. France in the 18th century briefly flirted with decimal time adopted by law, for example. The experiment lasted less than three months. I'd consider that a definition of units of time. There's no evidence I can find of seconds as part of a lunar cycle in popular timekeeping systems or devices. I think this section should be moved to a footnote at best. It's scholarship in the abstract; lunar calendars are notable, but seconds (and minutes) as part of lunar timekeeping are not. Sbalfour (talk) 17:46, 24 January 2018 (UTC)
 * Moved to footnote. Sbalfour (talk) 21:44, 24 January 2018 (UTC)

Realization by mechanical clocks Squishy-fudgy. The second was only ever based on three things: 1/86,400 of a mean solar day, a fraction of the ephemeris year, and the 9.2 GHz cesium clock frequency. Mechanical clocks realized the second, but when they failed to keep good time, they were recalibrated to the solar day.

Fraction of a solar day section is essentially about sundials, though it muddles about other things. It's unrealistic to try to divide 1/12 of a circle, or 30° (an hour), by more than about ten divisions. That's still 6 minutes. There was no effective way to time repeatable short events. As noted, none of these old divisions of time was a predecessor of the modern second. I think we can stop there; the only thing we borrowed from the Babylonians and their predecessors was their sexagesimal counting system and history of dividing calendar time as well as arcs in terms of it. Sbalfour (talk) 21:06, 24 January 2018 (UTC)

Fraction of an ephemeris year section contains some superfluous, wordy and partly duplicative text in the last two paragraphs. Tables of the sun/moon are stated three times for example. Sbalfour (talk) 00:00, 25 January 2018 (UTC)

clock time, calendar time, etc
This article on second is too small to form a proper article. Other than history, which is debatably WAY too big, the article is more or less an extended dictionary definition. I think there should be one article for clock time, including seconds, minutes, hours, and maybe quarter hours and half-hours, and another article calendar time including days, weeks, months, and years, and another article for larger time intervals: decade, century, millennium (1000), epoch (million) and eon (billion). There's already a threadbare article on units of time, though including historical units of time, there are just too many of these to cover in detail in a single article. Sbalfour (talk) 22:27, 26 January 2018 (UTC)


 * That could quickly become complicated, because in the past various cultures have divided the day into different units. As for the calendar, we have a number of articles on individual calendars. See Category:Calendars. So an article about days, weeks, months, and years would have to distill which calendars use them and which don't. Jc3s5h (talk) 00:35, 27 January 2018 (UTC)

Ambiguous definition of ground state
The article twice defines the ground state of the cesium atom. First, as the state when there is zero magnetic flux, and then as the state when there is zero magnetic field. I don't think these two statements are equivalent, and I'm curious if anyone knows which definition is more correct? --ABQCat 00:31, 30 Aug 2004 (UTC)


 * Magnetic flux is not applicable here, whereas magnetic field is. Magnetic flux is the total magnetic field passing through some area measured in webers, thus is a distributed quantity, whereas a magnetic field is applicable to a point, like an atom. We need not be concerned with the distinction between B and H (both point quantities) because it only exists in the metric system, or rationalized mks units. In CGS electromagnetic units they are one and the same, because permeability is the dimensionless unit 1. However, zero magnetic field need not be stated twice in the article. — Joe Kress 16:10, Aug 31, 2004 (UTC)


 * Thank you - I just didn't have sufficient technical knowledge to be able to correct the ambiguity. I agree with you that we don't need double definition of the second.  The article is a good candidate for cleanup by the community - it's messy and seems redundant towards the end.  Unless I'm missing something, I think the article could be re-written as much more concise and clear.  --ABQCat 16:44, 31 Aug 2004 (UTC)

January 0
What does January 0 mean in this article?? Georgia guy 02:38, 27 Feb 2005 (UTC)


 * 1900 January 0 = 1899 December 31. Furthermore, the rest of the definition has a specific astronomical meaning which is not obvious. I'll revise the subsequent paragraph. — Joe Kress 03:31, Feb 27, 2005 (UTC)