VNC, Binary, Quality, Win10, Java

  • RealVNC doesn't work with bad networking. UltraVNC viewer did. Usage is naturally painful, but its auto reconnect actually works. Also the application itself seems to hang, but you just wait it out and it works after all. - Haha, sometimes networking is just beautiful. - I managed to get it done with UltraVNC. The network connectivity got lost about every two seconds. If you sync the reconnect correctly, you'll get a few key presses / one mouse click done per connection. Who says it wouldn't work? Got the job done. - I was really upset that RealVNC was so bad in this situation. But UltraVNC was better than I expected. - Legendary stuff, this is so beautiful. Why would losing the network connection every two seconds be a problem? You just need to do it right and you can get the job done in sub-tasks. Selecting stuff from drop-down menus was quite painful, especially for menus which don't show completely but require scrolling. In that case, hitting the first letter instead of trying to use the mouse was the best approach. VNC
  • It's not so hard to deal with short bit-mapped binary blobs. It seems that people have no clue about hex, binary, bitwise operations and in general handling binary data nowadays. They said that HEX is so hard to understand. Also one guy claimed that transferring XML data over a serial link is slow. Sure it is, how about not sending bloated junk over links known to be slow? And using a saner protocol, without insane amounts of overhead. (There's a small packing sketch after this list.) - To sum it up: no understanding of binary, no understanding of hexadecimal, no understanding of computing basics. Well, after I showed them that you can use HxD instead of Notepad, working with binary data is much easier.
  • Even more quality software: it seems that at times Transmission loses a lot of data while saving the finished file. Awesome achievement, corrupting data when everything has been verified to be ok. Soo much FAIL! This seems to happen when you've defined an alternate download location. The part files are written as usual, but then the completed output file is corrupted or lacks some data. Interestingly, it also seems that the file in part storage isn't fully completed before this, because if you then try to recover from that situation, the system keeps re-downloading some parts of the file, even though it was supposed to be complete. I've encountered this issue several times. It seems that the safe(r?) way is not to use alternate download location(s) at all. I've gotta run a few diffs to verify a few things (there's a hash-comparison sketch after this list). After a few tests, it seems probable that Transmission uses some strange way of copying the file to its final destination while seeding it, which might lead to the file handle getting closed and the storage medium ejected while the file isn't yet complete. I've tried several times with cp and other methods, and I don't see the problem. It only happens when Transmission itself is copying files. The source code would reveal its secrets, if I cared enough to check it out.
  • Sometimes I think that software quality is so disastrous that you should actually be extremely happy if something ever happens to work even halfway as it's supposed to. So many things are broken, or absolutely broken, or so badly implemented that it's like absolute trolling and/or booby-trapped code to annoy people.
  • How is it possible that Windows 10 computers are sold with a 32 GB SSD? It's not enough to run the updates. It's barely enough, and might not actually be enough, to run updates even with an external drive. Horrible setup. Sure, there are ways experts can update the system, but for normal users, using the media creation tool and basically 'reinstalling a new Windows' isn't a recommended approach, even if it's doable.
  • Java serialization issues. Well, that's obvious. Usually there should be two kinds of mechanisms: one, very fast, very powerful, which is used for trusted data, and then something which validates everything in detail and is therefore much slower. The first option is great and a must-have for performance and versatility. The second one is for situations where untrusted data sources are being processed. It doesn't make sense to ruin the first option due to its "security issues", because there are no security issues: it's being used only for trusted data. Duh! Sure, programmers do mess up these things every now and then, but it's not a technology flaw. (There's a filtering sketch after this list.)
  • Something different: Number stations
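
About the bit-mapped binary blobs above: a minimal sketch of what I mean by packing a few fields with plain bitwise operations instead of wrapping them in XML. The record layout and field names are invented purely for illustration; a real protocol would define its own.

```java
public class PackedRecordDemo {
    // Hypothetical 4-byte record, layout invented for this example:
    //   bits 31..24  device id     (8 bits)
    //   bits 23..12  sensor value  (12 bits)
    //   bits 11..4   battery level (8 bits)
    //   bits  3..0   status flags  (4 bits)
    static int pack(int deviceId, int sensor, int battery, int flags) {
        return (deviceId & 0xFF) << 24
             | (sensor & 0xFFF) << 12
             | (battery & 0xFF) << 4
             | (flags & 0xF);
    }

    static int[] unpack(int word) {
        return new int[] {
            (word >>> 24) & 0xFF,   // device id
            (word >>> 12) & 0xFFF,  // sensor value
            (word >>> 4) & 0xFF,    // battery level
            word & 0xF              // status flags
        };
    }

    public static void main(String[] args) {
        int word = pack(17, 2047, 93, 0b1010);
        // Four bytes on the wire instead of a hundred-plus bytes of
        // <record><device>17</device><sensor>2047</sensor>... markup.
        System.out.printf("packed: 0x%08X%n", word);
        System.out.println("unpacked: " + java.util.Arrays.toString(unpack(word)));
    }
}
```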
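
On the Transmission corruption issue: the "few diffs" mentioned above boil down to comparing the copy in part storage against the copy at the final destination. A rough sketch of that kind of check, with placeholder paths taken from the command line:

```java
import java.io.IOException;
import java.io.InputStream;
import java.nio.file.Files;
import java.nio.file.Path;
import java.security.MessageDigest;
import java.security.NoSuchAlgorithmException;

public class CompareCopies {
    // SHA-256 of a file, read in 64 KiB chunks so large downloads don't need to fit in RAM.
    static String sha256(Path file) throws IOException, NoSuchAlgorithmException {
        MessageDigest md = MessageDigest.getInstance("SHA-256");
        try (InputStream in = Files.newInputStream(file)) {
            byte[] buf = new byte[64 * 1024];
            int n;
            while ((n = in.read(buf)) > 0) {
                md.update(buf, 0, n);
            }
        }
        StringBuilder hex = new StringBuilder();
        for (byte b : md.digest()) {
            hex.append(String.format("%02x", b));
        }
        return hex.toString();
    }

    public static void main(String[] args) throws Exception {
        // Placeholder arguments: the file in the part / incomplete storage
        // and the copy written to the final destination.
        Path partCopy = Path.of(args[0]);
        Path finalCopy = Path.of(args[1]);
        String a = sha256(partCopy);
        String b = sha256(finalCopy);
        System.out.println(partCopy + "  " + a);
        System.out.println(finalCopy + "  " + b);
        System.out.println(a.equals(b) ? "MATCH" : "MISMATCH - final copy differs from part copy");
    }
}
```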
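
About the Java serialization point: the split described above more or less exists already. Plain ObjectInputStream is the fast path for trusted data, and since Java 9 (JEP 290) an ObjectInputFilter can be attached for untrusted input to whitelist classes and limit the object graph. A rough sketch, where the filter pattern and the limits are just example values:

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InvalidClassException;
import java.io.ObjectInputFilter;
import java.io.ObjectInputStream;
import java.io.ObjectOutputStream;
import java.io.Serializable;
import java.util.ArrayList;

public class SerializationModes {
    // Trivial class to serialize for the demo.
    static class Message implements Serializable {
        private static final long serialVersionUID = 1L;
        final String text;
        Message(String text) { this.text = text; }
    }

    static byte[] serialize(Object o) throws IOException {
        ByteArrayOutputStream bos = new ByteArrayOutputStream();
        try (ObjectOutputStream oos = new ObjectOutputStream(bos)) {
            oos.writeObject(o);
        }
        return bos.toByteArray();
    }

    // Fast path: trusted data, no filtering at all.
    static Object readTrusted(byte[] data) throws IOException, ClassNotFoundException {
        try (ObjectInputStream ois = new ObjectInputStream(new ByteArrayInputStream(data))) {
            return ois.readObject();
        }
    }

    // Careful path: untrusted data, JEP 290 filter with a class whitelist and size limits.
    static Object readUntrusted(byte[] data) throws IOException, ClassNotFoundException {
        ObjectInputFilter filter = ObjectInputFilter.Config.createFilter(
            "maxdepth=5;maxrefs=100;maxbytes=10000;SerializationModes$Message;java.lang.*;!*");
        try (ObjectInputStream ois = new ObjectInputStream(new ByteArrayInputStream(data))) {
            ois.setObjectInputFilter(filter);
            return ois.readObject();
        }
    }

    public static void main(String[] args) throws Exception {
        byte[] ok = serialize(new Message("hello"));
        System.out.println(((Message) readUntrusted(ok)).text);  // passes the whitelist

        byte[] notListed = serialize(new ArrayList<Object>());   // not on the whitelist
        try {
            readUntrusted(notListed);
        } catch (InvalidClassException e) {
            System.out.println("rejected: " + e.getMessage());
        }
    }
}
```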

2019-11-03