posted Jan 5, 2014, 10:27 AM by Sami Lehtinen
updated Jan 5, 2014, 10:28 AM
- Studied the GNU Name System (GNS) and watched a video about it.
- Checked out the HEVC, Daala and VP9 video codecs on a basic level.
- Studied the concept of parallel construction. It's neat: they can use whatever information they have. Nothing new yet.
- A funny article about "can do" versus "can't do" culture. Historically, it seems, rejecting new concepts and technologies has not been a good idea.
- Watched many USENIX FAST'13 Technical Sessions, including: the keynote, SSD reliability under power fault, caching, a fast file system checker (rethinking how things should be done can make them more efficient by an order of magnitude!), memory-efficient sanitization and deduplication of data, HARDFS, Horus, deduplication, file recipe compression, virtual machine workloads and NAS performance, improving chunk-based backup restore speed (it would be nice if Duplicati utilized this technique), to ZIP or not to ZIP (real-time compression), SSD error correction codes, performance improvements and measurements, a study of Linux file system evolution, workload-independent storage using VT-Trees, warming up storage-level caches with Bonfire, unioning of the buffer cache and journaling layers with non-volatile memory, and write policies for host-side flash caches.
These are really high-quality, easy-to-understand talks, and recommended viewing for every IT person. There was good stuff to watch for several days, instead of mindless junk on TV.
- Really old stories: In one customer case, we talked about using a virtual server from the customer's private cloud. Well, it turned out that getting a VPS from the private cloud would take about three weeks. So what did we do? We just went to a shop, bought a powerful workstation, and used that. Problem solved in under two hours, instead of three weeks. So, is it faster to get physical or cloud services? It depends: all the benefits of a private cloud can be completely negated by complex and slow policies and processes.
- Really old stories: Once, while installing software in a data center from a CD-RW disc, I caused quite a panic among their IT staff. What did I do? Well, I tried to run my application as Administrator from the CD-RW disc, and got the message "Program too big to fit in memory". They went berserk. They thought the binary was virus-infected and that their network would now get infected from the inside. But I knew what the problem was. It wasn't my binary that was broken, and it wasn't the disc: it was the Compaq server's CD-ROM drive. I don't really know what's wrong with those, but I have seen the same problem on many occasions; those drives do corrupt data. It's clear that something is wrong with their CD error correction. In this case, the disc I brought was first fully scanned for viruses, many servers were checked, and firewall monitoring and logs were reviewed. Nothing was found, obviously. After I used another CD drive to read the files and save them to the server, everything worked as usual. By the way, I encountered this exact problem at least with Compaq servers, so I guess they shared similarly buggy CD-drive firmware.
- Have you encountered enragingly crappy mobile sites? I know a few
Finnish news sites whose mobile implementation is especially bad. The
first major mistake: if I'm browsing news on the desktop and then open
the same article URL with my mobile, the site asks if I want to open
the mobile version. I answer yes. But then the front page of the site
opens in mobile mode. Why on earth don't they open the news article I
was originally trying to visit? Instead they ruined my user experience
by offering the mobile version of the site.
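Getting this right is trivial: when switching to the mobile site, keep the path and query string of the URL the visitor was already on and only swap the host. A minimal sketch of the idea (the host names here are hypothetical examples, not any real site's):

```python
from urllib.parse import urlsplit, urlunsplit

def mobile_url(desktop_url, mobile_host="m.example.com"):
    """Map a desktop article URL to its mobile counterpart.

    Preserves the path, query string and fragment, so the reader lands
    on the same article instead of the mobile front page. The host name
    is an assumed example; a real site would use its own host mapping.
    """
    scheme, _netloc, path, query, fragment = urlsplit(desktop_url)
    return urlunsplit((scheme, mobile_host, path, query, fragment))

print(mobile_url("http://news.example.com/article/12345?page=2"))
# http://m.example.com/article/12345?page=2
```

The broken sites effectively throw away everything after the host and redirect to `http://m.example.com/`, which is exactly the bug described above.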