Torrent, DMARC, Outlook, Baud, U2F, Software, Cloudflare
Tested IPv6-only trackers with Deluge 2.0.3 and Transmission 2.94 BitTorrent (@ Wikipedia) clients with Linux on IPv6-only networking. Both seemed to work well. Good to know. I've sometimes wondered how few IPv6 peers there are out there. A simple and light test: peer announce, and discovery of that announced peer with another client. Using my self-published test file to make sure there were no peers around other than IPv6 ones. I also tested Tails and Ubuntu images over both networks. Sure, all software should work with IPv6 without any problems by now, but unfortunately it isn't always that obvious.
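Deluge and Transmission of course do the announce themselves, but as an illustration of what an IPv6 tracker hands back: BEP 7 defines the compact "peers6" field as 18-byte entries (16 bytes of IPv6 address + 2-byte big-endian port). A minimal parsing sketch; the address and port in the test are documentation values, not real peers:

```python
import ipaddress
import struct

def parse_peers6(blob: bytes):
    """Parse a BEP 7 compact 'peers6' blob: each peer entry is 16 bytes
    of IPv6 address followed by a 2-byte big-endian port (18 bytes total)."""
    if len(blob) % 18 != 0:
        raise ValueError("peers6 blob length must be a multiple of 18")
    peers = []
    for off in range(0, len(blob), 18):
        addr = ipaddress.IPv6Address(blob[off:off + 16])
        (port,) = struct.unpack("!H", blob[off + 16:off + 18])
        peers.append((str(addr), port))
    return peers
```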
Found out funny stuff: the darknetdiaries.com _dmarc (@ dmarc.org) subdomain TXT record says "v=DMARC1; rua = mailto: address _at_ yourdomain.com". Copy-paste hacking.
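A receiver splits a DMARC TXT record on ';' into tag=value pairs, which is why that copy-pasted template still "parses" even though the rua value isn't a usable mailto: URI. A minimal sketch of that tag splitting (the report address is a placeholder, not any real mailbox):

```python
def parse_dmarc(txt: str) -> dict:
    """Split a DMARC TXT record into tag/value pairs.
    Whitespace around '=' and ';' is tolerated, per RFC 7489's
    loose formatting rules."""
    tags = {}
    for part in txt.split(";"):
        part = part.strip()
        if not part:
            continue
        tag, _, value = part.partition("=")
        tags[tag.strip().lower()] = value.strip()
    return tags
```

Run it against the record above and you get a syntactically valid rua tag whose value is garbage, so no reports ever get delivered anywhere.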
Thank you Microsoft / Outlook - Again, you've been eating around 50% of my emails, and even bounces don't work. But I guess this is nothing new, if you've been reading my blog. In general, the email services are straight out of ... The delisting portal also supports only TLSv1.0, fail again. Secure connection failed. How much Special High Intensity Training are you guys getting there? - Anyway, it seems that you can use the cloud and get a really bad experience, or you can self-host and get a great experience. All this cloud lobbying sometimes makes me cringe.
Aha! Noticed a techie who got the details wrong. He said 2400 baud (@ Wikipedia). Nope; after asking whether he meant baud or bit rate (@ Wikipedia) in bits per second (bit/s), it turned out my guess was absolutely right. I assume you know the difference, if you're reading this. Also see: Baudot code (@ Wikipedia) and Émile Baudot (@ Wikipedia).
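To make the distinction concrete: gross bit rate is the symbol rate (baud) times bits per symbol, so the two numbers only match when each symbol carries exactly one bit. A tiny worked example (the modulations in the comments are just illustrations):

```python
import math

def bit_rate(baud: float, symbol_states: int) -> float:
    """Gross bit rate in bit/s: symbol rate (baud) times bits per symbol,
    where bits per symbol = log2(number of distinct symbol states)."""
    return baud * math.log2(symbol_states)

# 2 states (1 bit/symbol): baud and bit/s happen to be equal.
# 16 states (4 bits/symbol, e.g. 16-QAM): 2400 baud carries 9600 bit/s.
```

So a "2400 baud" line with a multi-bit constellation is not a 2400 bit/s line, which is exactly the mix-up above.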
Found one site which is broken, so FIDO U2F (@ Wikipedia) token registration doesn't work. So classic, so classic. They've implemented a new feature which doesn't even work. Yes, I'm using FIDO U2F and passwordless login with many, many services.
Software and practices are sometimes downright scary. I wonder if banks and other institutions are similar. A few guys in a hurry do something, and it's put straight into production. Sure, it might work, but most probably there will be issues. The worst part is how serious those issues are, and how easy (or hard!) they are to fix later. I personally would prefer much higher standards, where I know it's going to work. Of course there will be the unknown unknowns, but at least there won't be so many known unknowns. Sure, this approach also "works". But it's kind of strange how important things are handled with code that's "slightly tested", "we've done something similar before" and "I'd guess it would work". I can't tell which case this is all about. But in general, I kind of wish key personnel wouldn't get put in this kind of situation. It's also a way to create fires, because well, we built this electric device and it might burn when we turn it on, but nobody has bothered to pre-check whether it actually starts a fire. - Viewed from another perspective, sometimes doing everything in a procedurally perfect way would just make it 20x more expensive and a lot slower; the end result probably wouldn't change, and it would of course get delivered a lot later. - I've also seen those cases, where things which could be done in a few weeks end up taking years. - After all, it's a delicate balancing act. There's always a golden mean somewhere.
Fine-tuned DMARC settings; now I'm getting reports only when abuse is detected. After slowly ramping it up, and verifying that everything that should pass, does pass. I also found a few abusers while rolling out the DMARC features, so that paid off as well. Hopefully it'll also lead to a better domain spam reputation.
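A ramp-up like this is typically done by tightening the policy in stages while watching the aggregate reports. A sketch of what the TXT records could look like at each stage; example.com and the report mailbox are placeholders, not my actual records:

```
; Stage 1: monitor only, receive aggregate reports for all mail
_dmarc.example.com. IN TXT "v=DMARC1; p=none; rua=mailto:dmarc-reports@example.com"
; Stage 2: quarantine a fraction of failing mail while verifying legit flows pass
_dmarc.example.com. IN TXT "v=DMARC1; p=quarantine; pct=25; rua=mailto:dmarc-reports@example.com"
; Stage 3: full reject; fo=1 requests failure reports whenever any check fails
_dmarc.example.com. IN TXT "v=DMARC1; p=reject; fo=1; ruf=mailto:dmarc-reports@example.com"
```

Dropping rua and keeping ruf with fo=1, as in the last stage, is one way to end up with reports only when something actually fails.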
Cloudflare's new network speed test seems a bit wonky. Strange that they didn't test it before release. Well, this is just the launch-early, fix-later approach. Of course, just like with any test, it all boils down to: what's being tested and how. Also, the CF test uses short bursts, which achieve very high speeds because the data gets through before traffic shaping (throttling) kicks in. In my case the CF speed test shows a 10x higher speed than what the actual prolonged transfer speed will be, due to traffic shaping / speed limiting.
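The burst effect is easy to reproduce with token-bucket arithmetic: a test short enough to fit inside the shaper's burst allowance sees close to line rate, while a long transfer converges on the shaped rate. A toy model; all the numbers in the comments are made up, not Cloudflare's or my ISP's actual parameters:

```python
def measured_speed(total_bytes: float, line_rate: float,
                   shaped_rate: float, burst_bytes: float) -> float:
    """Average throughput a speed test sees through a token-bucket shaper:
    the first `burst_bytes` pass at `line_rate`, the remainder at
    `shaped_rate`. Rates in bytes/s, sizes in bytes."""
    fast = min(total_bytes, burst_bytes)   # portion served from the burst bucket
    slow = total_bytes - fast              # portion throttled to the shaped rate
    duration = fast / line_rate + slow / shaped_rate
    return total_bytes / duration

# A 10 MB test against a 10 MB burst bucket reports the full line rate;
# a 1 GB transfer averages out to roughly the shaped rate.
```

This is why a burst-based test and a prolonged download can disagree by 10x on the same link.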