posted Sep 18, 2016, 5:56 AM by Sami Lehtinen
updated Sep 18, 2016, 5:57 AM
- DiskFiltration: Data Exfiltration over Hard Drive Noise - Yes. Anything over anything, as the saying goes. If a channel allows "any form of transfer of data", it can be used to bridge anything over it. Nothing new in the field of data exfiltration. Air-gapping alone won't protect your systems. Also don't forget the classic TEMPEST leaks. Some people claimed that wired keyboards don't transmit wireless signals, but that's total BS. Most electronic devices do emit wireless signals. Devices COULD be made so that they don't transmit wireless signals, or transmit much weaker ones, but that would be much more expensive, and therefore manufacturers naturally just don't care. Every wire is technically just an antenna. This is also yet another (not new) covert communication channel example. Unsurprisingly they list stuff like electromagnetic, optical, thermal and acoustic (physical vibration) based communication channels. All of those allow communication in some form or another. If a data center has an Internet connected temperature monitoring system and racked air-gapped servers, the temperature monitoring system can be used to extract data from the air-gapped servers, etc. But isn't that slow? Sure it is. But stuff like encryption keys doesn't actually require a lot of data. There's other obvious stuff too, like a LED being visible to a security camera.
- Short summary of stuff we've been talking about with friends: Web of Trust (WoT) and identity management. White listing, black listing, different trust models, approval processes and process automation. Signatures, identity verification. Effects of Sybil attacks. How to detect promiscuous users. Trust should be based on context and perspective. Trust scoring. Detecting possible identity & reputation manipulation attacks. Trust and data crowd screening and reporting. Different rating models. Stakeholders, proof of reserves. Liquidity, bidding systems and auctions. Possibly timed English auctions. DSA/ECDSA threshold signature schemes. WoT can be screwed up by careless users. Some people simply aren't able to maintain even basic key management privacy requirements, or they misidentify the trust level of keys in the WoT. Because there are always disputes, it's a great question whether it's about terrorists or freedom fighters. Which view is the 'correct' one, and how is truth defined after all? Who says that the majority's opinion is the right one? Pressuring peers by giving negative feedback, either as an attack or as a way to coerce co-operation. There are several problems with revocation messages and distributing the actual revocation data, if other peers are supposed to update data which they can't properly sign without having the private key. Of course in this case it would be possible to delegate the signing, using a per-signer secondary public / private key pair to sign the final document, and to limit the authorization in the signature made with my own key to the individual document being signed with the secondary key. That would work. So someone else can be authorized by me to partially update a document which I've actually signed, and the updated part is signed using the delegated key. Just like I described in the OpenPGP / GnuPG ephemeral keys blog post. That's trivial.
- In one case, a user access token was still valid a year later, even though the user had been fired. That's the sad reality of user access control. Vendors can claim all kinds of things, but it doesn't matter, because proper revocation just won't happen anyway.