OVH IPv6, RAM vs Disk, Repetier, 10x Programmer, Singularity, WhatsApp, Maintenance & Cleanup

Post date: Dec 3, 2017 7:41:05 AM

  • OVH Management Console IPv6 zero compression is still working incorrectly. I'm laughing so much. Once again I don't know which is harder: understanding how to compress addresses, or writing working code. And when they're unable to produce a correctly working program, how hard is it to fix? This is ridiculous, but so common that it's business as usual. I can also confirm that this bug only affects the management console. The initial email + API do return the correct uncompressed address. (from backlog)
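Zero compression per RFC 5952 is a solved problem; Python's standard library gets it right out of the box. A minimal sketch, using a hypothetical address for illustration:

```python
import ipaddress

# RFC 5952 rules: collapse the longest run of zero groups to "::",
# drop leading zeros from each group.
full = "2001:0db8:0000:0000:0000:ff00:0042:8329"
addr = ipaddress.IPv6Address(full)

print(addr.compressed)  # compressed form: 2001:db8::ff00:42:8329
print(addr.exploded)    # full uncompressed form, as the API returns it
```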
  • RAM vs Disk - Faster for cache? Might not be that obvious. It would be nice to hear guesses and speculation & reasoning why either option would be faster or not and how much.
    • Let's assume that the cache server is used for the following setup: 1 KB - 1 MB contiguous data blocks, with a short 64-bit key or something. The data is cache / temporary data, so there's no requirement for persistence. It's OK to lose all data if the process is restarted. The server's got 2 GB of RAM, and let's say that the cached data is limited to 64 GB.
    • Now, is it faster to allocate memory to store the cached data and let the system swap it out as required? Or is it faster to save the data blobs as files and let the system cache the file system to RAM? Which option is faster, and why? - Is there an obvious answer? The next interesting question could be whether, if the test is actually run on Linux and Windows systems, the operating system makes any difference when exactly the same Python, Go or C code is run. Reads and writes would follow a random pattern with a Pareto address access distribution: 90% read, 10% write. Blocks would be randomly sized between 1 KB and 1 MB with uniformly distributed sizes. Would it change anything if a Pareto distribution preferring small blocks were used? Would it change anything if the reads and writes were balanced at a 50% / 50% ratio? Thoughts? Guesses? This is just a thought play, which could be tested trivially.
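The "tested trivially" part could start from something like the sketch below: the same random workload run against a plain in-memory dict and against per-key files. The parameters are scaled way down from the 64 GB scenario so it runs quickly, which means it cannot show the swap-vs-page-cache effect that is the actual question; treat it only as a skeleton to scale up.

```python
import os
import random
import tempfile
import time

# Scaled-down assumptions for illustration; the real test would use
# 1 KB - 1 MB blocks and enough data to exceed physical RAM.
BLOCK_MIN, BLOCK_MAX = 1024, 64 * 1024
N_KEYS = 200
N_OPS = 2000
READ_RATIO = 0.9  # 90% reads, 10% writes


def run_ram_cache():
    """Strategy 1: keep blobs in process memory, let the OS swap."""
    cache = {}
    rng = random.Random(42)
    start = time.perf_counter()
    for _ in range(N_OPS):
        key = rng.randrange(N_KEYS)
        if key not in cache or rng.random() >= READ_RATIO:
            cache[key] = os.urandom(rng.randrange(BLOCK_MIN, BLOCK_MAX))
        else:
            _ = cache[key]
    return time.perf_counter() - start


def run_file_cache(directory):
    """Strategy 2: one file per key, let the OS page cache do the work."""
    rng = random.Random(42)
    start = time.perf_counter()
    for _ in range(N_OPS):
        key = rng.randrange(N_KEYS)
        path = os.path.join(directory, f"{key:016x}")
        if not os.path.exists(path) or rng.random() >= READ_RATIO:
            with open(path, "wb") as f:
                f.write(os.urandom(rng.randrange(BLOCK_MIN, BLOCK_MAX)))
        else:
            with open(path, "rb") as f:
                _ = f.read()
    return time.perf_counter() - start


if __name__ == "__main__":
    with tempfile.TemporaryDirectory() as tmp:
        print(f"RAM dict: {run_ram_cache():.3f} s")
        print(f"Files:    {run_file_cache(tmp):.3f} s")
```

To answer the Pareto questions above, swap the uniform `rng.randrange` calls for a skewed distribution and re-run.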
  • Some 3D printing stuff. Repetier on Linux (Mono, System.Windows.Forms 4.0) & Windows. FreeCAD, CuraEngine, Slic3r, 3D printing, PLA, ABS. It took quite a while to get everything tuned, but now things are working pretty nicely.
  • I totally agree with this post. Yes, there are 'mythical' 10x programmers. They are 10x faster and more productive than others. I'm also sure there are 0.1x programmers too. I've seen some, where even extremely simple basic tasks end up taking weeks and still failing.
  • The singularity in the toilet stall - The article contradicts itself in several places. Maybe that's just to show that these things aren't simple? But yes, generally I agree that some low-level things work extremely well and reliably. Over-engineering some stuff just makes it unstable, fragile and brittle. This is just the elegance in programming: it's easy to make something extremely cool which is actually something horrible. Keeping it simple, functional and reliable is the way to go. - The point I especially started wondering about was 'Is online news full of falsehoods? Add machine-learning AI to separate the wheat from the chaff'. Wow. It seems that we don't need courts any more; we just need an AI which tells us what the truth is. That's neat. It's impossible to tell what's a falsehood and what isn't when we're talking about something which isn't exact science. Actually, why isn't there a global 'the truth' site, which would make sense of all the scandals and wars and tell us the actual honest truth about the situation? It would be so nice.
  • WhatsApp and storage space management. It seems that they've got fail programmers there. How about deleting non-referenced lingering junk from the storage media? Nope? Too hard for you guys? - This is the exact opposite of the ridiculous vacuuming of the database by Spotify. - I just deeply hate applications and programs which do not remove all the crap, junk and $hit they scatter around the file system. Yuk! So when something becomes unnecessary, delete it immediately, or at least clean it up in some kind of cleanup batch run a few hours or days later.
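The "non-referenced lingering junk" case is really just a set difference between files on disk and records in the database. A minimal sketch of the idea; `referenced_ids` is a hypothetical stand-in for whatever the app's database actually tracks:

```python
import os

def remove_unreferenced(media_dir, referenced_ids):
    """Delete files in media_dir whose names no database record references.

    referenced_ids: set of file names the application still needs.
    Returns the list of removed file names.
    """
    removed = []
    for name in os.listdir(media_dir):
        if name not in referenced_ids:
            os.remove(os.path.join(media_dir, name))
            removed.append(name)
    return removed
```

In a real app this would run as the deferred cleanup batch mentioned above, after taking care that no transfer is mid-flight for the files being removed.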
  • I always add automated maintenance tasks to my projects: taking care of junk, logs, temp files and unnecessary database records, and compacting the database at some sane interval. It might be daily, weekly, monthly or yearly, depending on the task at hand. But it's always included.
    • Btw. Just as bad are some system admins and operators, who just seem to poop around the system without any kind of logic.
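The temp-file part of such a maintenance task is small enough to sketch. This is a generic retention-based sweep, not any particular project's code; the directory and retention period are assumptions for illustration:

```python
import os
import time

RETENTION_SECONDS = 7 * 24 * 3600  # assumed retention: one week

def cleanup_temp_files(directory, retention=RETENTION_SECONDS, now=None):
    """Delete files in `directory` not modified within `retention` seconds.

    Returns the list of removed file names, which makes the run easy
    to log as part of a scheduled maintenance batch.
    """
    now = time.time() if now is None else now
    removed = []
    for name in os.listdir(directory):
        path = os.path.join(directory, name)
        if os.path.isfile(path) and now - os.path.getmtime(path) > retention:
            os.remove(path)
            removed.append(name)
    return removed
```

Hooked to cron or a scheduler at whatever interval makes sense for the project, this covers the temp-file half; database record pruning and compaction would be a similar periodic job on the database side.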