Wikipedia:Reference desk/Archives/Computing/2018 November 11

= November 11 =

== mass conversion to MP3 ==
I drive a vehicle that can play MP3 files from a thumb drive. As fate would have it, most of my music files are in another format, being from CDs untimely ripp'd by iTunes, so I didn't have quite enough music for a recent six-hour trip.

Is there (in macOS) a convenient way to convert thousands of AAC files to MP3? —Tamfang (talk) 05:09, 11 November 2018 (UTC)


 * Depending on your definition of convenient, it can be done in iTunes. HenryFlower 15:30, 11 November 2018 (UTC)
 * Or not. Where that page says I should look for “Create MP3 Version” in the menu, I find only “Create AAC Version” (from AAC files)! —Tamfang (talk) 23:07, 11 November 2018 (UTC)
 * On the assumption that iTunes hasn't changed too much of late, you're not wrong: the "Create xxx Version" option is relative to the encoder settings in preferences. Whatever iTunes is set to rip to (from CD), that is the option you get in that menu. I don't have iTunes on the machine I'm currently on so can't check (and even if I did it'd probably be an old version). MIDI (talk) 08:45, 20 November 2018 (UTC)
 * Ah. Thanks, that appears to have worked. —Tamfang (talk) 18:00, 21 November 2018 (UTC)


 * If you are willing to work in a command box, FFmpeg can convert audio files (not just video). Graeme Bartlett (talk) 22:36, 11 November 2018 (UTC)
 * If by "command box" you mean Unix shell, that's fine, I didn't get where I am today without writing bash scripts! —Tamfang (talk) 23:07, 11 November 2018 (UTC)
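For anyone finding this thread later, a minimal bash-plus-ffmpeg sketch along the lines suggested above (the source directory, the `.m4a` extension, and the 256 kbps bitrate are assumptions; adjust them to your own library):

```shell
#!/usr/bin/env bash
# Sketch: batch-convert iTunes AAC (.m4a) files to MP3 with ffmpeg.
# SRC defaults to ~/Music; override with SRC=/path/to/library.
SRC="${SRC:-$HOME/Music}"

mp3name() {                       # e.g. song.m4a -> song.mp3
  printf '%s\n' "${1%.m4a}.mp3"
}

# -print0 / read -d '' handles spaces in iTunes filenames safely.
find "$SRC" -type f -name '*.m4a' -print0 2>/dev/null |
while IFS= read -r -d '' f; do
  out="$(mp3name "$f")"
  # Skip files already converted; -vn drops embedded cover art.
  [ -e "$out" ] || ffmpeg -nostdin -i "$f" -vn -codec:a libmp3lame -b:a 256k "$out"
done
```

The resulting `.mp3` files land next to the originals and can then be copied to the thumb drive. Note this requires an ffmpeg build with the libmp3lame encoder compiled in (the common Homebrew package on macOS includes it).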


 * I have found Foobar2000 to be a great program for mass-converting to MP3. It can resample, filter, even out the loudness so it doesn't vary so much from song to song, etc. I just run it on my Windows box (I use Windows 10 where I have to and Slackware whenever I can -- love Neovim, hate Word, use LibreOffice when a client insists on docs in Word format), but they claim to have a version for Mac [ https://www.foobar2000.org/mac ]. I haven't tried it, but Foobnix might be an equivalent for Linux: [ http://foobnix.com/en/description.html ]. If you try either of those, please drop me a line on my talk page so that, if possible, I can do one less thing on Windows and one more thing on Linux. --Guy Macon (talk) 23:26, 11 November 2018 (UTC)

== CAD and gaming graphics cards ==
I was told that CAD software and gaming software use somewhat different graphics cards: CAD professionals would prefer Quadro, and gamers would rather buy GeForce. Basically the rationale is that CAD needs fidelity down to the pixel, while games need FPS, the more the better. However, what's the big deal if CAD software gets one pixel wrong? It's not as if we could see individual pixels. How big of an issue is this distinction? --Doroletho (talk) 20:41, 11 November 2018 (UTC)
 * CAD users work on large monitors where a single wrong pixel probably is visible. If the freehand drawing is off, then the design will be wrong, so the monitor should accurately represent what the designer wants to draw. I don't expect graphics cards to make this kind of error anyway, but the CAD designer will need large and multiple monitors, possibly with good colour resolution and depth. Graeme Bartlett (talk) 22:34, 11 November 2018 (UTC)
 * The needs are different, but why does that require a different GPU? Couldn't they make a multi-purpose GPU? What makes designing an architecture for speed different from designing one for precision? After all, whether for CAD or games, the GPU must be flexible enough to deal with many different kinds of CAD packages and games. --Doroletho (talk) 22:53, 11 November 2018 (UTC)


 * The requirements really are different. With games it's all about speed, speed, and more speed, and if the trees whipping by in a blur or the rocket that is about to kill you look a bit off, nobody cares. With CAD, you do care.


 * Drivers are also a big difference. The CAD vendors write drivers that are highly tuned to the exact specifications of the CAD cards, and there is very little variation for them to deal with. Gaming cards come from a multitude of vendors who often deviate from the original reference design. This is really the main dealbreaker: if you are using, say, SolidWorks, you use the computer and graphics card they recommend and support.


 * Another huge difference involves power consumption. Gaming cards produce a lot of heat, but they generally do it in short bursts. CAD cards are used for 8-12 hour work days, and are often given 24/7 rendering tasks in the background. They are also quieter; a bunch of screaming fans pumping a ton of heat into the room might be OK for a gamer wearing headphones, but not in an office with 30 workstations in one room.


 * And don't even get me started on the difference between mechanical CAD and PWB layout... --Guy Macon (talk) 23:50, 11 November 2018 (UTC)
 * Another major consideration is whether the vendor of a specific piece of software - say, Autodesk (the makers of AutoCAD and many other popular tools) - has specifically certified that they design and test their software on a specific graphics hardware and software configuration.
 * The vendor may publish generic requirements - for example, the Autodesk AutoCAD tool is compatible with any "Direct3D®-capable workstation class graphics card" - but when you're buying software that costs more than your house, it's worth knowing that a team of engineers will stake their professional reputation on the specific details, and will publish documentation and provide support for your configuration.
 * Nimur (talk) 18:47, 12 November 2018 (UTC)
 * I won't restrict this answer to CAD, since workstation cards are used for a lot more than CAD work. One thing that wasn't mentioned above is that workstation cards often have feature sets that are excluded from gaming cards, sometimes for market-segmentation reasons, sometimes for cost reasons. (In the fairly distant past, some of the cards were similar enough that you could BIOS-mod a GeForce so that it appeared as a Quadro, or, I believe, likewise for ATI. This enabled whatever was disabled, often specifically driver support, although for the latter, simply hacking the driver was another solution. But things have moved on since then.) A common recent difference is in double-precision performance or support, although of course not all applications (by which I don't just mean programs) will use those features; see e.g. these discussions [//forums.autodesk.com/t5/revit-ideas/stop-pretending-quadro-cards-are-better-than-geforce/idi-p/6906270] [//www.reddit.com/r/NukeVFX/comments/7p88c9/quadro_vs_gtx_vs_amd/]. Deep colour support is, I believe, an even more recent feature in this vein. ECC RAM is another, although that one is obviously primarily cost-related. (Although I suspect vendors would add ECC to a GeForce or AMD 'consumer' card if they could, so I guess you could still say the exclusion from the chips is most likely largely for market-segmentation reasons.) This comment [//www.anandtech.com/comments/13217/nvidia-announces-turing-powered-quadro-rtx-family/612361], although just a user comment, is IMO illustrative of the complexity. At the high end, there are also differences in the best-specced card available, especially regarding RAM, since as crazy as gamers can be, they still often have budgets, and Nvidia or AMD likewise need to make a profit, so they aren't going to release something few are going to buy. And adding 48 GB of high-performing RAM to a card is going to make it somewhat expensive.
(And it needs to be the top card, since, again, as crazy as some gamers are, most are going to realise they will be loled out the door when they show off their fancy rig with a fancy card with 48 GB of RAM which gets a lower FPS in every single game than that other fellow's rig with a far cheaper card.) Nil Einne (talk) 16:34, 13 November 2018 (UTC)