Title: Developers as a Malware Distribution Vehicle
Speaker: Guy Podjarny @GuyPod
See the table of contents for more blog posts from the conference.
Developers have more power than ever – they can get more done, faster. They can also do more harm.
XCodeGhost – 2015
- XCode went from 3GB to 5GB
- Too slow to download in China
- Developers use a local mirror
- Have to trust unofficial download
- XCodeGhost is XCode plus a malicious component that gets compiled into every app built with it. It targets the linker.
- Went undetected for 4 months
- Contaminated hundreds of Chinese apps and dozens of US apps
- The US got it from Chinese-built apps and via a library
- Got up to 1.4M active victims a day
- Apple fixed it in the App Store immediately, but it took months for users to update, including enterprises
- The real “fix” was to take down the websites the malware was contacting
- Apple fixed the root problem by hosting an official XCode download in China
- Because it targeted the linker, developers were the distribution vehicle.
Delphi virus – Induc – 2009
- Targets Delphi
- Every program compiled on the machine is affected
- Even if you uninstall and reinstall Delphi, it stays
- Took 10 minutes to find
- No app store, so harder to remove
- Affected millions
First instance of this concept – 1984
- “Reflections on Trusting Trust” – Ken Thompson
- Modify C compiler to “miscompile”
- Three trojans – allow a hard-coded password, replicate the logic when compiling the C compiler, and use the disassembler to hide, deleting itself from the source code
- Wrote a proof of concept. Thompson thinks it didn’t escape Bell Labs
- Can’t find it – it’s not in the source code and you can’t disassemble it
- Best solution is to compile on two computers/compilers and compare the output. Not practical.
- npm bad dependency
- PyPI bad dependency this year
- Docker bad image this month
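The self-replicating trick Thompson described can be sketched with strings standing in for real compilation – everything here (the fake `evil_compile` step, the backdoor password) is a toy illustration, not Thompson’s actual code:

```python
# Toy model of the "Trusting Trust" trojan: the trojan lives only in
# the compiler *binary* (modeled here as evil_compile), never in source.

BACKDOOR = 'if password == "joshua": grant_access()'

def evil_compile(source: str) -> str:
    """A compromised compiler: output = input plus hidden extras."""
    binary = source  # stand-in for honest code generation
    if "def check_login" in source:
        # Trojan 1: inject a hard-coded backdoor password into login
        binary += "\n" + BACKDOOR
    if "def compile" in source:
        # Trojan 2: when compiling a clean compiler, re-insert the
        # trojan into the new binary, so it survives a clean rebuild
        binary += "\n# [trojan replicated into new compiler binary]"
    return binary

clean_login_source = "def check_login(password): ..."
clean_compiler_source = "def compile(source): ..."

print(BACKDOOR in evil_compile(clean_login_source))     # backdoor injected
print("trojan" in evil_compile(clean_compiler_source))  # trojan propagates
```

Both inputs are clean source, yet both outputs are compromised – which is exactly why inspecting source code can never find it.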
Must trust the people who write the software.
We ship code faster. It’s hard to catch code a developer introduces, whether maliciously or accidentally.
Developers have access to user data. Be careful.
Syrian Electronic Army and the Financial Times
- phishing email
- link redirects to a spoofed Financial Times page
- now have email accounts, so send emails that look like they’re from the Financial Times
- IT attempted to warn users.
- Attackers sent an identical email with malicious links
- Gained access to the official Twitter account
- Syrian Electronic Army used it to make statements
- A developer noted that people think they’re wise to this and still fall for it. We all fall for this.
- Salesforce ran an internal phishing test and developers were the second-highest clickers
Uber – 2016
- Attackers got driver and user data
- Uber paid a $100K ransom, and later agreed it shouldn’t have
- The public found out a year later
- Developers had stored an S3 token in a private GitHub repo
- They were not using 2FA
- Developers can access extremely sensitive data, and share it too often
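Leaks like the S3 token above are often caught by scanning repositories for credential-shaped strings. A minimal sketch, assuming only the documented `AKIA…` prefix format of long-term AWS access key IDs (real scanners such as gitleaks or truffleHog use many more patterns plus entropy checks):

```python
import re

# Long-term AWS access key IDs are "AKIA" followed by 16 uppercase
# alphanumeric characters (per AWS's documented format).
AWS_KEY_RE = re.compile(r"\bAKIA[0-9A-Z]{16}\b")

def find_aws_keys(text: str) -> list[str]:
    """Return all AWS-access-key-shaped strings found in the text."""
    return AWS_KEY_RE.findall(text)

# AWS's own documented example key, as it might appear in a config file
leaked = 'aws_access_key_id = "AKIAIOSFODNN7EXAMPLE"'
print(find_aws_keys(leaked))                 # ['AKIAIOSFODNN7EXAMPLE']
print(find_aws_keys("nothing secret here"))  # []
```

Running a check like this in CI (or a pre-commit hook) catches the token before it ever reaches a repo – private or not.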
As we get more power, we need to get more responsible
Causes of insecure decisions:
- Different motivations – focus on functionality. Security is a constraint; need to be cognizant of it
- Cognitive limitations – we move fast and break things
- Lack of expertise – don’t always understand security implications
- Developers are overconfident. It’s harder to train people who think they already know it.
- “It doesn’t happen to me.” Security breaches happen to everyone.
- Learn from past incidents
- Automate security controls
- Make it easy to be secure
- Developer education
- Manage access like the tech giants
- Challenge access requests: When is it needed? For how long? What happens without the access? What can go wrong with it? How would you find out it was compromised?
- All access routes through a corporate proxy
- Proxy grants access per device – limits what can be done from a Starbucks
- Monitoring access
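The per-device grant above can be sketched as a simple policy check – the enrolled devices, grants, and all names here are illustrative, not any real proxy’s API:

```python
# Sketch of a per-device access check: the proxy grants access only
# when both the user's grant AND the specific device are enrolled.

ENROLLED_DEVICES = {("alice", "laptop-1234")}  # (user, device) pairs
GRANTS = {("alice", "prod-db")}                # (user, resource) grants

def proxy_allows(user: str, device: str, resource: str) -> bool:
    # Both checks must pass: a password stolen via phishing and used
    # from an unmanaged machine (say, at a Starbucks) fails the
    # device check even though the user's grant is valid.
    return (user, device) in ENROLLED_DEVICES and (user, resource) in GRANTS

print(proxy_allows("alice", "laptop-1234", "prod-db"))    # True
print(proxy_allows("alice", "random-cafe-pc", "prod-db")) # False
```

Logging every decision this function makes is the monitoring piece: you can see who accessed what, from where, and when.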
Microsoft Privileged Access Workstations (PAW)
- Access to production can only be from a secure machine
- No internet from the secure machine
- Your regular machine runs as a VM on the secure machine
Great start to the day. I had known about some of these, but not others. For some reason, this reminds me of developer ghost stories.