In anticipation of his keynote at HITB Security Conference 2020 in Amsterdam, we talked to Jon Callas, a world-renowned cryptographer, software engineer, UX designer, and entrepreneur.
Before joining the ACLU as senior technology fellow, he was at Apple, where he helped design the encryption system to protect data stored on a Mac. Jon also worked on security, UX, and crypto for Kroll-O’Gara, Counterpane, and Entrust. He has launched or worked on the launches of many tools designed to encrypt and secure personal data, including PGP, Silent Circle, Blackphone, DKIM, ZRTP, Skein, and Threefish.
You’ve been in the cybersecurity industry for a long time, taking on a variety of roles. What advice would you give to those just entering this industry? What pitfalls can they expect?
There are things that have been true for technical people for decades and will continue to be true.
Expertise gets common, gets automated, and then the people who push buttons on the automated tool think they are experts; they might be. About half the things you know will be obsolete after five years, so you’ll have to learn new things and maybe pivot your career.
The best thing to work on is always something that excites you. Everyone does a good job on the things they like and a bad job on the things that bore them. When (not if) you need to make a change, it might take a couple of years. A once-in-a-lifetime opportunity will come to you every year or two. If you miss this one, there will be another. And yet, the right opportunity never comes at the perfect time.
Technology changes, people are the same. People will always be lazy. They’ll always forget things and lose things. Assume stupidity over malice. Build your systems so they take advantage of people’s flaws when you can, or at least won’t be destroyed when they don’t know and don’t care.
Year after year, data breach losses continue to rise. What is the cybersecurity industry doing wrong? There’s plenty of innovation, yet most organizations fail at basic security hygiene.
I think you’re hitting on the exact thing. It’s closely related to what we were talking about before — people are lazy, stupid, and don’t want to spend money. They will want to know why they need to buy a lock if no one has broken in.
A cybersecurity company will have a brilliant idea, and that brilliant idea will be a solution to some problem, and often prevention would have worked better. Meanwhile, it’s really hard to sell prevention both as a company and as a cybersecurity group. It’s hard to show metrics about what was prevented.
Thus we have a kind of evolutionary process here. The companies we see being successful are the ones selling things people want to buy. There are a lot of companies selling things people need but don’t want to buy, and those companies struggle.
That’s why what we see of the cybersecurity industry is not addressing these basic issues. And yet, the organizations that are failing are failing because they don’t want to do those basic things.
I snark that CISO stands for Chief Intrusion Scapegoat Officer. The CISO is the person you fire when the bad thing they warned would happen unless measures were improved actually happens. It’s their fault that measures weren’t improved, right? I know security officers who have left their job because they weren’t being listened to and knew that the inevitable breach would be blamed on them.
What’s your take on the global privacy erosion brought on by large social networks?
I’m really glad to see policy reactions coming from that. I like GDPR. I like CCPA (the new California privacy act). No, they’re not perfect. As time goes on, we will likely need to tweak them or come up with interpretations of the gray areas in each, but they’re good. We need both policy and technology to protect us, along with privacy norms. We technical people tend to scoff, but norms work.
Today, most websites use TLS; we expect that a site will use TLS, and that expectation is a norm. The technical backing for that new norm is that browsers changed from presenting a lock icon for TLS to flagging its absence as not secure.
How do you expect encryption technologies to evolve in the next decade? What would you like to see implemented/created?
I expect that we’ll see a number of things sorted out in choices for post-quantum public key crypto, but that we’ll still be talking about the eventuality of quantum computers. I expect we’ll still be waiting for homomorphic encryption to be efficient enough for the uses we’d like, as well as waiting for multiparty computation to speed up more. I expect we’re still going to have law enforcement wanting to get into encryption, as well.
On related fronts, I’m hoping we’ll have more verification, like certificate and key transparency, formally verified implementations of important algorithms, and a number of interesting new protocols.
I think that the important thing for us all to remember is that encryption is a technology that implicitly rearranges power. It is implicitly political as well as personal. I think that this is why everyone finds it alluring.