Wikipedia:Reference desk/Archives/Computing/2014 February 26

= February 26 =

Discussing technology
What does it mean to discuss technology with MIT and Apple? Dismas |(talk) 04:32, 26 February 2014 (UTC)
 * According to the cited Irish Times article (which you can read here), an Apple representative was going to "discuss how it can develop software for people who have disabilities", and she apparently wants MIT's robotics lab to make her artificial limbs. -- BenRG (talk) 07:30, 26 February 2014 (UTC)

Synchronized video replay
I'm wondering if there is an open-source solution to replay several synchronized audio/video streams, either by composing them into one large video stream (with non-overlapping tiles showing each video stream), or even by synchronously replaying them in different windows, possibly even on several screens (think "surround video"). Any ideas? --Stephan Schulz (talk) 14:09, 26 February 2014 (UTC)


 * Some options (including one using ffmpeg) for montaging different streams together into a single tiled stream are discussed at this StackOverflow question. Or you can use AviSynth (or its Linux port avxsynth), which is in essence a GUI-less (i.e. script-driven) nonlinear video editing and compositing system (you'd use StackHorizontal and StackVertical). -- Finlay McWalterჷTalk 14:48, 26 February 2014 (UTC)
 * Ok, thanks a lot! --Stephan Schulz (talk) 16:21, 26 February 2014 (UTC)


 * Let us know how you actually get on (avisynth is reportedly rather hard to install). -- Finlay McWalterჷTalk 16:26, 26 February 2014 (UTC)
 * I don't install, I have people install for me (well, sometimes ;-). But I'll let you know if something useful emerges from this - so far, it's just a first feasibility test for a possible project. --Stephan Schulz (talk) 16:32, 26 February 2014 (UTC)

Dear Apple, why are we still programming as if we knew what the computer was going to do NeXT?
http://www.zdnet.com/apples-goto-fail-needs-a-massive-culture-change-to-fix-7000026783/

Why exactly are we still using programming languages that go step by step, when the CPUs are really doing Out-of-order execution?

What's the best, most generally useful programming language that lets one code out of order? Not "do these things in this order", but simply "check off this to-do list in whatever order works best for you". Hcobb (talk) 16:47, 26 February 2014 (UTC)


 * We don't, actually. Even C does not specify the order of execution of code between sequence points. Typically, functional languages like Haskell or the ML family give even more freedom for the order of execution. But typically, some results depend on earlier work, so simple "todo" lists don't work - you need something more like a project plan with dependencies. Indeed, you need more than that, as a proper programming language will allow you to specify unlimited, or even infinite, amounts of work in a finite program text via loops and/or recursion. --Stephan Schulz (talk) 18:29, 26 February 2014 (UTC)


 * The fact that a bug in Apple's code was caused by a goto is something of a red herring. For sure, very few programmers still use 'goto' - it's widely recognized as "A Bad Thing", mainly because it makes code hard to read, and that makes it hard to debug. However, a problem like this could equally have been caused by other code constructs. So all I take from this is that Apple maybe have just one very old-school programmer on their team - and either that person screwed up because they are sloppy, or someone subsequently altered that code and didn't even think of the possibility of there being a "goto" somewhere further up the code...which would have been very understandable. But *ANY* sufficiently long piece of code will, for 100% sure, have one or more bugs in it! The blame here most likely lies with the QA department or some security expert for not finding the bug and punting it back to the programmers to get fixed before the code went gold.


 * So extrapolating from that news story to "Why don't we completely change how programs are written?" is a bit of a stretch.


 * However, the question of whether we could make new programming languages to make better use of out-of-order execution (OOOE) is an interesting one.  I have a couple of thoughts about this.  Firstly, the amount of benefit to be derived from OOOE is limited to making better use of parallel structures within the CPU (for example, reordering instructions such that a floating point arithmetic operation can be performed in parallel with an integer arithmetic operation).   These benefits generally only require fine-grained changes to the execution order.   It's unlikely that rearranging more than a dozen lines of code would produce significant savings.


 * Changing our programming languages to somehow improve this seems unnecessary. Compilers are very adept at creating execution order dependency maps and rearranging code for optimum OOOE performance - and humans are absolutely terrible at doing it.  The changes I could see to be beneficial would be in terms of things like branch prediction.   Right now, if I write "if (this) do_that ; else do_something_else ;", the compiler would really like to know which of those two outcomes is the most probable to occur at runtime.   Very often, I (as a programmer) know that one of these two branches is some kind of error condition that's almost never going to happen - and the other branch is going to happen almost 100% of the time.  So there would be value in having a language construct to tell the compiler that information.


 * There are any number of minor-use or experimental languages that pop up from time to time - they generally produce a flurry of interest, then die out in favor of languages like C++ that have been around for a long time. It's an evolutionary process. C++ was a major improvement over C, so its adoption rate skyrocketed and C died out accordingly. That wasn't because some academic said "C++ is better because..." - it was because practical programming showed it to be significantly better. One of the most heavily "designed" languages in history is probably Ada - yet it is hated by programmers and project managers alike - and it actually required a change in military procurement rules to get people to use it at all. When that rule was eventually dropped, everyone breathed a sigh of relief and went back to C++ in droves.


 * Another example is PHP - which (on the face of it) is a diabolically awful language - but which happens to fit the mindset of the people who make web sites - so it's taken over from the seemingly better alternatives.


 * So if there was a significantly better way to program, I'm fairly sure that one or other experimental language would pop up to exploit it - and that language would take over the world in much the way that C++ did.  Since that hasn't happened and languages like Haskell and ML aren't at all widely used, we may deduce that they simply aren't a better fit for translating what's in the mind of a programmer into what a machine can execute efficiently.


 * SteveBaker (talk) 16:02, 27 February 2014 (UTC)


 * How about fully exploiting Simultaneous multithreading? Why should the programmer be the best judge of the tradeoffs of memory access vs recomputation? Hcobb (talk) 17:59, 27 February 2014 (UTC)
 * See automatic parallelization. Most code is not parallel enough to benefit from it once you consider the overhead of thread synchronization. Automatic vectorization (using SIMD instructions in a single thread) is easier and many C compilers do it. For what it's worth, I think there was research on automatic multithreaded execution of Haskell programs, but I don't know the status of it. -- BenRG (talk) 08:43, 28 February 2014 (UTC)
 * Regarding giving branch hints to the C compiler, GCC has __builtin_expect. -- BenRG (talk) 08:43, 28 February 2014 (UTC)


 * C is "out of order" in the same sense as x86 machine language. Compilers are free to reorder statements if they don't depend on one another. C's ordering constraints are actually weaker than x86's; for example, x86 guarantees that memory writes will become visible to other threads in order, while C doesn't.
 * It's common and mostly unproblematic to use goto to jump to error recovery code at the end of a function, as was done in the offending code. If the function were rewritten to eliminate the goto fails, they would probably be replaced with something like return false, with the error recovery happening in a wrapper function. Duplicating the return false would have caused the same bug. From my perspective, the real worries here are that Apple apparently doesn't do basic security testing of its Safari builds, and that they apparently allow code to be checked in without review, because I can't imagine any programmer looking at this diff and failing to notice the problem. -- BenRG (talk) 08:43, 28 February 2014 (UTC)


 * I don't have much to say about the original question, but regarding the Apple thing, this post from someone who works on similar sorts of stuff for Google is not so harsh about the testing issue. They agree it ideally should have been picked up, but point out that because there are so many things to test, there's always a possibility your test suite won't cover one case, and that they're not sure their own testing did so at the time.
 * I know most in this discussion would understand it better than me if they looked into it (and even I haven't looked that well), but I wonder if there is some confusion about the nature of the bug - in particular, when it fails. I was initially under the impression Safari would report most things as secure when they aren't, but reading a bit more, as I understand it, it's the sort of bug that's easy to exploit once it's known, but not necessarily the sort of attack you would always test for.
 * The writer does agree about the apparent lack of review though.
 * P.S. In case there is some confusion from SB's post, as emphasised by BenRG and his? source and also my source, the goto statement was duplicated, i.e. consecutive. It wasn't simply a case of another goto statement somewhere else, but one right on the next line.
 * Nil Einne (talk) 20:12, 28 February 2014 (UTC)

skype problem
Hello, This morning my Skype bugged out and had some update or such, but now when I voice chat, my voice comes out in reverse and the other person hears it backwards, reversed. :(

What can i do ? ;_; I don't want to stop talking to her. :( — Preceding unsigned comment added by 31.209.159.215 (talk) 18:04, 26 February 2014 (UTC)
 * Do you have an app like Clownfish installed? (see Tools -> Options -> Advanced -> Manage programs' access to Skype) --&mdash;  Rhododendrites talk  |  19:53, 26 February 2014 (UTC)


 * This seems a bit unlikely - I mean, Skype is interactive - you turn it on, start speaking, and the audio comes out at the other end some fraction of a second later. Clearly, your entire conversation can't come out backwards because then your voice would start saying things (backwards) that you haven't yet decided to say!    So is this "glitch" reversing each word?  Each sentence?  Every block of (say) 5 seconds?  This all seems kinda unlikely to me.  It can't possibly be any kind of bug in Skype.


 * Are you absolutely sure this effect is really happening?  I think it's overwhelmingly more likely that the person at the other end is joking with you.
 * SteveBaker (talk) 15:03, 27 February 2014 (UTC)
 * I agree true time-reversing of the signal is unlikely. But I have heard people describe various types of audio noise/distortion as "sounds almost backwards". I use skype a lot, and have had similar problems with distortion/clipping, etc. Usually they just go away after a few days. I've always assumed it's "internet weather" getting in the way. SemanticMantis (talk) 16:01, 27 February 2014 (UTC)
 * Clownfish is a translation app. There are incidents people have reported where it's done this. --&mdash;  Rhododendrites talk  |  07:22, 28 February 2014 (UTC)