[devnexus 2026] privacy by design (PbD) in devsecops

Speaker: Anita Dakamarri

See the DevNexus live blog table of contents for more posts


Why developers

  • Developers are first line of defense
  • Gap between dev and security teams. They were rivals a decade ago; now they are on the same team. The real rivals are the attackers outside
  • Developers use untrusted images to meet deadlines
  • Business people/leadership want it fast and bypass security. Then blame developers and security people
  • No software is vulnerability-free. Goal is to reduce vulnerabilities.

Famous Data Privacy Breaches

  • Equifax – in 2017 – unpatched Apache Struts vulnerability. Exposed Social Security numbers, birthdates, etc. for 147 million people. Cost over $1.3 billion. Security staff and executives were fired
  • UnitedHealth – in 2024 – ransomware attack on a Citrix remote access portal without MFA. Exposed medical, insurance, billing and personal data of 192 million people. Billions in recovery costs, a $22 million ransom and lost revenue
  • BadeSaba Calendar App – in late February/early March – Iranian app hacked to include messages like “help has arrived”

Privacy By Design Principles

  • Proactive, not reactive. Preventative not remedial
  • Privacy as the default setting (ex: car automatically has a seatbelt)
  • Privacy embedded into design
  • Full functionality, positive sum, not zero sum
  • End to end security; full lifecycle protection
  • Visibility and transparency; keep it open
  • Respect for user privacy; keep it user centric

Requirements/planning

  • Identify personal data early
  • Minimize data collection
  • Define lawful purpose and retention
  • Conduct privacy assessments PIA (privacy impact assessment) /DPIA (data protection impact assessment)
  • Translate privacy laws into requirements – ex: GDPR, CCPA (California Consumer Privacy Act)/CPRA (California Privacy Rights Act)/HIPAA. Requirements include consent, access and deletion

Code with privacy

  • Avoid hardcoding sensitive data – never embed secrets, API keys or personal data in code/configs
  • Mask/redact personal data – especially in logs/error messages/debugging
  • Implement strong encryption – use modern, vetted crypto libraries
  • Validate data inputs – prevent injection/data poisoning attacks
  • Build deletion and portability features – ensure data can be deleted/exported programmatically
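The mask/redact bullet above is easy to sketch. A minimal example of scrubbing personal data out of log messages, assuming a simple regex approach (a real system would want a vetted PII-detection library and much broader pattern coverage):

```python
import logging
import re

# Illustrative patterns only, not exhaustive PII coverage.
EMAIL = re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+")
SSN = re.compile(r"\b\d{3}-\d{2}-\d{4}\b")

def redact(text: str) -> str:
    """Mask common personal-data patterns before they hit a log line."""
    text = EMAIL.sub("[EMAIL]", text)
    return SSN.sub("[SSN]", text)

class RedactingFilter(logging.Filter):
    """Scrub every record so raw values never reach handlers."""
    def filter(self, record: logging.LogRecord) -> bool:
        record.msg = redact(record.getMessage())
        record.args = None  # args already folded into msg above
        return True

logger = logging.getLogger("app")
handler = logging.StreamHandler()
handler.addFilter(RedactingFilter())
logger.addHandler(handler)
logger.warning("login failed for jane@example.com, SSN 123-45-6789")
```

The filter runs before formatting, so even debug output and error messages go out redacted.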

Testing

  • Test privacy requirements
  • Use anonymous or synthetic test data
  • Perform security and privacy testing
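The synthetic test data point can be sketched with the standard library alone; in practice a library like Faker generates richer records. The field names and value pools here are assumptions for illustration, not a real schema:

```python
import random
import uuid

def synthetic_user(rng: random.Random) -> dict:
    """Generate a fake user record containing no real personal data."""
    first = rng.choice(["Alex", "Sam", "Riva", "Chen"])
    return {
        "id": str(uuid.UUID(int=rng.getrandbits(128))),  # random, not a real identifier
        "name": first,
        # .test is a reserved TLD, so these addresses can never be real
        "email": f"{first.lower()}{rng.randrange(10000)}@example.test",
        "birth_year": rng.randrange(1950, 2005),
    }

rng = random.Random(42)  # seeded so test fixtures are reproducible
users = [synthetic_user(rng) for _ in range(3)]
```

Seeding the generator keeps fixtures stable across test runs while guaranteeing no production data ever leaks into a test environment.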

Deploy and Release

  • Secure configurations
  • Enforce encryption everywhere
  • Apply access controls
  • Document privacy notices

Runtime and Operations

  • Runtime privacy practices – monitor access to sensitive data, alert on anomalous data queries, automatically enforce retention policies, tokenize/anonymize analytics data
  • Incident readiness – breach detection hooks, pre-defined response playbooks, forensic-ready audit logs
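Automatic retention enforcement from the runtime list above can be reduced to a small sketch. The 90-day window and record shape are assumptions, not details from the talk:

```python
from datetime import datetime, timedelta, timezone

RETENTION = timedelta(days=90)  # assumed policy window

def purge_expired(records: list[dict], now: datetime) -> list[dict]:
    """Keep only records still inside the retention window."""
    return [r for r in records if now - r["created_at"] <= RETENTION]

now = datetime(2026, 3, 1, tzinfo=timezone.utc)
records = [
    {"id": 1, "created_at": now - timedelta(days=10)},   # kept
    {"id": 2, "created_at": now - timedelta(days=120)},  # past retention, dropped
]
kept = purge_expired(records, now)
```

In a real system this would run as a scheduled job against the data store, but the point is the same: deletion is enforced by code, not by someone remembering to do it.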

Shift left

  • Security requirements
  • Diagrams – dataflow, network flow, authentication flow
  • Supply chain risks
  • SAST
  • SCA/BOM
  • Secure coding standards
  • Secure coding training
  • DAST
  • Autonomous testing
  • Document security issues

Key takeaways

  • Privacy must be treated like availability – a must-have requirement
  • Privacy + security + usability at the same time is possible and cost effective
  • Challenge is invisible data flows in modern architectures
  • Privacy is blurred with AI – ex: cameras on street, airport
  • Shortage of privacy + AI skilled engineers

My take

Good end to the day. I learned some acronyms like DPIA and CCPA. So nice to see a session about privacy and not just security overall. I like the checklist by lifecycle phase slides.