GDPR/DSGVO Expert
Navigate EU GDPR and German DSGVO compliance — data processing agreements, DPIAs, privacy policies, consent management, and data subject rights workflows.
What this skill does
Ensure your product meets European privacy standards by automatically identifying data risks and generating the necessary compliance documentation. You can produce privacy impact reports, track user data requests, and get clear instructions on fixing potential risks before they become compliance issues. Reach for this tool when launching new features or preparing for an audit to stay compliant with GDPR and DSGVO regulations.
name: gdpr-dsgvo-expert
description: GDPR and German DSGVO compliance automation. Scans codebases for privacy risks, generates DPIA documentation, tracks data subject rights requests. Use for GDPR compliance assessments, privacy audits, data protection planning, DPIA generation, and data subject rights management.
GDPR/DSGVO Expert
Tools and guidance for EU General Data Protection Regulation (GDPR) and German Bundesdatenschutzgesetz (BDSG) compliance.
Table of Contents
- Tools
- Reference Guides
- Workflows
- Key GDPR Concepts
GDPR Compliance Checker
Scans codebases for potential GDPR compliance issues including personal data patterns and risky code practices.
# Scan a project directory
python scripts/gdpr_compliance_checker.py /path/to/project
# JSON output for CI/CD integration
python scripts/gdpr_compliance_checker.py . --json --output report.json
Detects:
- Personal data patterns (email, phone, IP addresses)
- Special category data (health, biometric, religion)
- Financial data (credit cards, IBAN)
- Risky code patterns:
- Logging personal data
- Missing consent mechanisms
- Indefinite data retention
- Unencrypted sensitive data
- Disabled deletion functionality
Output:
- Compliance score (0-100)
- Risk categorization (critical, high, medium)
- Prioritized recommendations with GDPR article references
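As a rough illustration of the pattern-based detection and scoring described above, here is a minimal sketch. The regexes and the scoring formula are simplified assumptions for illustration, not the checker's actual implementation:

```python
import re

# Illustrative patterns only; real detection needs validation, context
# awareness, and false-positive handling.
PATTERNS = {
    "email": re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"),
    "ipv4": re.compile(r"\b(?:\d{1,3}\.){3}\d{1,3}\b"),
    "iban": re.compile(r"\b[A-Z]{2}\d{2}[A-Z0-9]{11,30}\b"),
}

def scan_line(line: str) -> list[str]:
    """Return the names of personal-data patterns found in one line."""
    return [name for name, rx in PATTERNS.items() if rx.search(line)]

def compliance_score(findings: int, lines_scanned: int) -> int:
    """Toy 0-100 score: fewer findings per scanned line is better."""
    if lines_scanned == 0:
        return 100
    return max(0, 100 - round(100 * findings / lines_scanned))
```

A finding would then be bucketed into critical/high/medium and paired with the relevant GDPR article reference in the report.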
DPIA Generator
Generates Data Protection Impact Assessment documentation following Art. 35 requirements.
# Get input template
python scripts/dpia_generator.py --template > input.json
# Generate DPIA report
python scripts/dpia_generator.py --input input.json --output dpia_report.md
Features:
- Automatic DPIA threshold assessment
- Risk identification based on processing characteristics
- Legal basis requirements documentation
- Mitigation recommendations
- Markdown report generation
DPIA Triggers Assessed:
- Systematic monitoring (Art. 35(3)(c))
- Large-scale special category data (Art. 35(3)(b))
- Automated decision-making (Art. 35(3)(a))
- WP29 high-risk criteria
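The WP29 "two or more criteria" rule that drives the threshold assessment can be sketched as a simple set intersection (the criterion names here are illustrative, not the tool's actual identifiers):

```python
# Criteria loosely matching the WP29 high-risk list; names assumed.
WP29_CRITERIA = {
    "evaluation_or_scoring", "automated_decisions_legal_effect",
    "systematic_monitoring", "sensitive_data", "large_scale",
    "data_matching", "vulnerable_subjects", "innovative_technology",
    "third_country_transfer", "blocks_service_access",
}

def dpia_required(processing_criteria: set[str]) -> bool:
    """A DPIA is likely required if processing meets 2+ WP29 criteria."""
    matched = processing_criteria & WP29_CRITERIA
    return len(matched) >= 2
```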
Data Subject Rights Tracker
Manages data subject rights requests under GDPR Articles 15-22.
# Add new request
python scripts/data_subject_rights_tracker.py add \
--type access --subject "John Doe" --email "[email protected]"
# List all requests
python scripts/data_subject_rights_tracker.py list
# Update status
python scripts/data_subject_rights_tracker.py status --id DSR-202601-0001 --update verified
# Generate compliance report
python scripts/data_subject_rights_tracker.py report --output compliance.json
# Generate response template
python scripts/data_subject_rights_tracker.py template --id DSR-202601-0001
Supported Rights:
| Right | Article | Deadline |
|---|---|---|
| Access | Art. 15 | 30 days |
| Rectification | Art. 16 | 30 days |
| Erasure | Art. 17 | 30 days |
| Restriction | Art. 18 | 30 days |
| Portability | Art. 20 | 30 days |
| Objection | Art. 21 | 30 days |
| Automated decisions | Art. 22 | 30 days |
Features:
- Deadline tracking with overdue alerts
- Identity verification workflow
- Response template generation
- Compliance reporting
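The deadline tracking above can be sketched as follows. This is a simplified model assuming the tool's fixed 30/90-day approximation of the GDPR deadline (Art. 12(3): one month, extendable by two further months):

```python
from datetime import date, timedelta

RESPONSE_DAYS = 30   # standard response window (approximation of one month)
EXTENDED_DAYS = 90   # extended window for complex requests

def due_date(received: date, extended: bool = False) -> date:
    """Deadline for responding to a data subject request."""
    return received + timedelta(days=EXTENDED_DAYS if extended else RESPONSE_DAYS)

def is_overdue(received: date, today: date, extended: bool = False) -> bool:
    """True once the response deadline has passed."""
    return today > due_date(received, extended)
```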
Reference Guides
GDPR Compliance Guide
references/gdpr_compliance_guide.md
Comprehensive implementation guidance covering:
- Legal bases for processing (Art. 6)
- Special category requirements (Art. 9)
- Data subject rights implementation
- Accountability requirements (Art. 30)
- International transfers (Chapter V)
- Breach notification (Art. 33-34)
German BDSG Requirements
references/german_bdsg_requirements.md
German-specific requirements including:
- DPO appointment threshold (§ 38 BDSG - 20+ employees)
- Employment data processing (§ 26 BDSG)
- Video surveillance rules (§ 4 BDSG)
- Credit scoring requirements (§ 31 BDSG)
- State data protection laws (Landesdatenschutzgesetze)
- Works council co-determination rights
DPIA Methodology
references/dpia_methodology.md
Step-by-step DPIA process:
- Threshold assessment criteria
- WP29 high-risk indicators
- Risk assessment methodology
- Mitigation measure categories
- DPO and supervisory authority consultation
- Templates and checklists
Workflows
Workflow 1: New Processing Activity Assessment
Step 1: Run compliance checker on codebase
→ python scripts/gdpr_compliance_checker.py /path/to/code
Step 2: Review findings and compliance score
→ Address critical and high issues
Step 3: Determine if DPIA required
→ Check references/dpia_methodology.md threshold criteria
Step 4: If DPIA required, generate assessment
→ python scripts/dpia_generator.py --template > input.json
→ Fill in processing details
→ python scripts/dpia_generator.py --input input.json --output dpia.md
Step 5: Document in records of processing activities
Workflow 2: Data Subject Request Handling
Step 1: Log request in tracker
→ python scripts/data_subject_rights_tracker.py add --type [type] ...
Step 2: Verify identity (proportionate measures)
→ python scripts/data_subject_rights_tracker.py status --id [ID] --update verified
Step 3: Gather data from systems
→ python scripts/data_subject_rights_tracker.py status --id [ID] --update in_progress
Step 4: Generate response
→ python scripts/data_subject_rights_tracker.py template --id [ID]
Step 5: Send response and complete
→ python scripts/data_subject_rights_tracker.py status --id [ID] --update completed
Step 6: Monitor compliance
→ python scripts/data_subject_rights_tracker.py report
Workflow 3: German BDSG Compliance Check
Step 1: Determine if DPO required
→ 20+ employees processing personal data automatically
→ OR processing requires DPIA
→ OR business involves data transfer/market research
Step 2: If employees involved, review § 26 BDSG
→ Document legal basis for employee data
→ Check works council requirements
Step 3: If video surveillance, comply with § 4 BDSG
→ Install signage
→ Document necessity
→ Limit retention
Step 4: Register DPO with supervisory authority
→ See references/german_bdsg_requirements.md for authority list
Key GDPR Concepts
Legal Bases (Art. 6)
- Consent: Marketing, newsletters, analytics (must be freely given, specific, informed)
- Contract: Order fulfillment, service delivery
- Legal obligation: Tax records, employment law
- Legitimate interests: Fraud prevention, security (requires balancing test)
Special Category Data (Art. 9)
Requires explicit consent or Art. 9(2) exception:
- Health data
- Biometric data
- Racial/ethnic origin
- Political opinions
- Religious beliefs
- Trade union membership
- Genetic data
- Sexual orientation
Data Subject Rights
All rights must be fulfilled within one month of receipt, extendable by two further months for complex or numerous requests (Art. 12(3)); the 30-day figure used by the tracker is a conservative approximation:
- Access: Provide copy of data and processing information
- Rectification: Correct inaccurate data
- Erasure: Delete data (with exceptions for legal obligations)
- Restriction: Limit processing while issues are resolved
- Portability: Provide data in machine-readable format
- Object: Stop processing based on legitimate interests
German BDSG Additions
| Topic | BDSG Section | Key Requirement |
|---|---|---|
| DPO threshold | § 38 | 20+ employees = mandatory DPO |
| Employment | § 26 | Detailed employee data rules |
| Video | § 4 | Signage and proportionality |
| Scoring | § 31 | Explainable algorithms |
DPIA Methodology
Data Protection Impact Assessment process, criteria, and checklists following GDPR Article 35 and WP29 guidelines.
Table of Contents
- When DPIA is Required
- DPIA Process
- Risk Assessment
- Consultation Requirements
- Templates and Checklists
When DPIA is Required
Mandatory DPIA Triggers (Art. 35(3))
A DPIA is always required for:
- Systematic and extensive evaluation of personal aspects (profiling) with legal or similarly significant effects
- Large-scale processing of special category data (Art. 9) or criminal conviction data (Art. 10)
- Systematic monitoring of publicly accessible areas on a large scale
WP29 High-Risk Criteria
A DPIA is likely required if processing involves two or more of the following criteria:
| # | Criterion | Examples |
|---|---|---|
| 1 | Evaluation or scoring | Credit scoring, behavioral profiling |
| 2 | Automated decision-making with legal effects | Auto-reject job applications |
| 3 | Systematic monitoring | Employee monitoring, CCTV |
| 4 | Sensitive data | Health, biometric, religion |
| 5 | Large scale | City-wide surveillance, national database |
| 6 | Data matching/combining | Cross-referencing datasets |
| 7 | Vulnerable subjects | Children, patients, employees |
| 8 | Innovative technology | AI, IoT, biometrics |
| 9 | Data transfer outside EU | Cloud services in third countries |
| 10 | Blocking access to service | Credit blacklisting |
DPIA Not Required When
- Processing unlikely to result in high risk
- Similar processing already assessed
- Legal basis in EU/Member State law with DPIA done during legislative process
- Processing on supervisory authority's exemption list
Threshold Assessment Workflow
1. Is processing on supervisory authority's mandatory list?
→ YES: DPIA required
→ NO: Continue
2. Is processing covered by Art. 35(3) mandatory categories?
→ YES: DPIA required
→ NO: Continue
3. Does processing meet 2+ WP29 criteria?
→ YES: DPIA required
→ NO: Continue
4. Could processing result in high risk to individuals?
→ YES: DPIA recommended
→ NO: Document reasoning, no DPIA needed
DPIA Process
Phase 1: Preparation
Step 1.1: Identify Need
- Complete threshold assessment
- Document decision rationale
- If DPIA needed, proceed
Step 1.2: Assemble Team
- Project/product owner
- IT/security representative
- Legal/compliance
- DPO consultation
- Subject matter experts as needed
Step 1.3: Gather Information
- Data flow diagrams
- Technical specifications
- Processing purposes
- Legal basis documentation
Phase 2: Description of Processing
Step 2.1: Document Scope
| Element | Description |
|---|---|
| Nature | How data is collected, used, stored, deleted |
| Scope | Categories of data, volume, frequency |
| Context | Relationship with subjects, expectations |
| Purposes | What processing achieves, why necessary |
Step 2.2: Map Data Flows
Document:
- Data sources (from subject, third parties, public)
- Collection methods (forms, APIs, automatic)
- Storage locations (databases, cloud, backups)
- Processing operations (analysis, sharing, profiling)
- Recipients (internal teams, processors, third parties)
- Retention and deletion
Step 2.3: Identify Legal Basis
For each processing purpose:
- Primary legal basis (Art. 6)
- Special category basis if applicable (Art. 9)
- Documentation of legitimate interests balance (if Art. 6(1)(f))
Phase 3: Necessity and Proportionality
Step 3.1: Necessity Assessment
Questions to answer:
- Is this processing necessary for the stated purpose?
- Could the purpose be achieved with less data?
- Could the purpose be achieved without this processing?
- Are there less intrusive alternatives?
Step 3.2: Proportionality Assessment
Evaluate:
- Data minimization compliance
- Purpose limitation compliance
- Storage limitation compliance
- Balance between controller needs and subject rights
Step 3.3: Data Protection Principles Compliance
| Principle | Assessment Question |
|---|---|
| Lawfulness | Is there a valid legal basis? |
| Fairness | Would subjects expect this processing? |
| Transparency | Are subjects properly informed? |
| Purpose limitation | Is processing limited to stated purposes? |
| Data minimization | Is only necessary data processed? |
| Accuracy | Are there mechanisms for keeping data accurate? |
| Storage limitation | Are retention periods defined and enforced? |
| Integrity/confidentiality | Are appropriate security measures in place? |
| Accountability | Can compliance be demonstrated? |
Phase 4: Risk Assessment
Step 4.1: Identify Risks
Risk categories to consider:
- Unauthorized access or disclosure
- Unlawful destruction or loss
- Unlawful modification
- Denial of service to subjects
- Discrimination or unfair decisions
- Financial loss to subjects
- Reputational damage to subjects
- Physical harm
- Psychological harm
Step 4.2: Assess Likelihood and Severity
| Level | Likelihood | Severity |
|---|---|---|
| Low | Unlikely to occur | Minimal impact, easily remedied |
| Medium | May occur occasionally | Significant inconvenience |
| High | Likely to occur | Serious impact on daily life |
| Very High | Expected to occur | Irreversible or very difficult to overcome |
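A sketch of how the likelihood and severity ratings combine into an overall risk level, with the mapping assumed to follow the matrix in Step 4.3 (level names are illustrative):

```python
# Overall risk by (likelihood, severity); values mirror the Step 4.3 matrix.
RISK_MATRIX = {
    ("low", "low"): "low",             ("low", "medium"): "low",
    ("low", "high"): "medium",         ("low", "very_high"): "medium",
    ("medium", "low"): "low",          ("medium", "medium"): "medium",
    ("medium", "high"): "high",        ("medium", "very_high"): "high",
    ("high", "low"): "medium",         ("high", "medium"): "high",
    ("high", "high"): "high",          ("high", "very_high"): "very_high",
    ("very_high", "low"): "medium",    ("very_high", "medium"): "high",
    ("very_high", "high"): "very_high", ("very_high", "very_high"): "very_high",
}

def overall_risk(likelihood: str, severity: str) -> str:
    """Look up the combined risk level for one identified risk."""
    return RISK_MATRIX[(likelihood, severity)]
```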
Step 4.3: Risk Matrix
| Likelihood \ Severity | Low | Medium | High | Very High |
|---|---|---|---|---|
| Low | Low | Low | Medium | Medium |
| Medium | Low | Medium | High | High |
| High | Medium | High | High | Very High |
| Very High | Medium | High | Very High | Very High |
Phase 5: Risk Mitigation
Step 5.1: Identify Measures
For each identified risk:
- Technical measures (encryption, access controls)
- Organizational measures (policies, training)
- Contractual measures (DPAs, liability clauses)
- Physical measures (building security)
Step 5.2: Evaluate Residual Risk
After mitigations:
- Re-assess likelihood
- Re-assess severity
- Determine if residual risk is acceptable
Step 5.3: Accept or Escalate
| Residual Risk | Action |
|---|---|
| Low/Medium | Document acceptance, proceed |
| High | Implement additional mitigations or consult DPO |
| Very High | Consult supervisory authority before proceeding |
Phase 6: Documentation and Review
Step 6.1: Document DPIA
Required content:
- Processing description
- Necessity and proportionality assessment
- Risk assessment
- Measures to address risks
- DPO advice
- Data subject views (if obtained)
Step 6.2: DPO Sign-Off
DPO should:
- Review DPIA completeness
- Verify risk assessment adequacy
- Confirm mitigation appropriateness
- Document advice given
Step 6.3: Schedule Review
Review DPIA when:
- Processing changes significantly
- New risks emerge
- Annually (minimum)
- After incidents
Risk Assessment
Common Risks by Processing Type
Profiling and Automated Decisions:
- Discrimination
- Inaccurate inferences
- Lack of transparency
- Denial of services
Large Scale Processing:
- Data breach impact
- Difficulty ensuring accuracy
- Challenge managing subject rights
- Aggregation effects
Sensitive Data:
- Social stigma
- Employment discrimination
- Insurance denial
- Relationship damage
New Technologies:
- Unknown vulnerabilities
- Lack of proven safeguards
- Regulatory uncertainty
- Subject unfamiliarity
Mitigation Measure Categories
Technical Measures:
- Encryption (at rest, in transit)
- Pseudonymization
- Anonymization where possible
- Access controls (RBAC)
- Audit logging
- Automated retention enforcement
- Data loss prevention
Organizational Measures:
- Privacy policies
- Staff training
- Access management procedures
- Incident response procedures
- Vendor management
- Regular audits
Transparency Measures:
- Clear privacy notices
- Layered information
- Just-in-time notices
- Easy rights exercise
Consultation Requirements
DPO Consultation (Art. 35(2))
When: During DPIA process
DPO role:
- Advise on whether DPIA is needed
- Advise on methodology
- Review assessment
- Monitor implementation
Data Subject Views (Art. 35(9))
When: Where appropriate
Methods:
- Surveys
- Focus groups
- Public consultation
- User testing
Not required if:
- Disproportionate effort
- Confidential commercial activity
- Would prejudice security
Supervisory Authority Consultation (Art. 36)
Required when:
- Residual risk remains high after mitigations
- Controller cannot sufficiently reduce risk
Process:
- Submit DPIA to authority
- Include information on controller/processor responsibilities
- Authority responds within 8 weeks (extendable to 14)
- Authority may prohibit processing or require changes
Templates and Checklists
DPIA Screening Checklist
Project Information:
- Project name documented
- Processing purposes defined
- Data categories identified
- Data subjects identified
Threshold Assessment:
- Checked against mandatory list
- Checked against Art. 35(3) criteria
- Counted WP29 criteria (need 2+)
- Decision documented with rationale
DPIA Content Checklist
Section 1: Processing Description
- Nature of processing described
- Scope defined (data, volume, geography)
- Context documented
- All purposes listed
- Data flows mapped
- Recipients identified
- Retention periods specified
Section 2: Legal Basis
- Legal basis identified for each purpose
- Special category basis documented (if applicable)
- Legitimate interests balance documented (if applicable)
- Consent mechanism described (if applicable)
Section 3: Necessity and Proportionality
- Necessity justified for each processing operation
- Alternatives considered and documented
- Data minimization demonstrated
- Proportionality assessment completed
Section 4: Risks
- All risk categories considered
- Likelihood assessed for each risk
- Severity assessed for each risk
- Overall risk level determined
Section 5: Mitigations
- Technical measures identified
- Organizational measures identified
- Residual risk assessed
- Acceptance or escalation determined
Section 6: Consultation
- DPO consulted
- DPO advice documented
- Data subject views considered (where appropriate)
- Supervisory authority consulted (if required)
Section 7: Sign-Off
- Project owner approval
- DPO sign-off
- Review date scheduled
Post-DPIA Actions
- Implement identified mitigations
- Update privacy notices if needed
- Update records of processing
- Schedule review date
- Monitor effectiveness of measures
- Document any changes to processing
GDPR Compliance Guide
Practical implementation guidance for EU General Data Protection Regulation compliance.
Table of Contents
- Legal Bases for Processing
- Data Subject Rights
- Accountability Requirements
- International Transfers
- Breach Notification
Legal Bases for Processing
Article 6 - Lawfulness of Processing
Processing is lawful only if at least one basis applies:
| Legal Basis | Article | When to Use |
|---|---|---|
| Consent | 6(1)(a) | Marketing, newsletters, cookies (non-essential) |
| Contract | 6(1)(b) | Fulfilling customer orders, employment contracts |
| Legal Obligation | 6(1)(c) | Tax records, employment law requirements |
| Vital Interests | 6(1)(d) | Medical emergencies (rarely used) |
| Public Interest | 6(1)(e) | Government functions, public health |
| Legitimate Interests | 6(1)(f) | Fraud prevention, network security, direct marketing (B2B) |
Consent Requirements (Art. 7)
Valid consent must be:
- Freely given: No imbalance of power, no bundling
- Specific: Separate consent for different purposes
- Informed: Clear information about processing
- Unambiguous: Clear affirmative action
- Withdrawable: As easy to withdraw as it was to give
Consent Checklist:
- Consent request is clear and plain language
- Separate from other terms and conditions
- Granular options for different processing purposes
- No pre-ticked boxes
- Record of when and how consent was given
- Easy withdrawal mechanism documented
- Consent refreshed periodically
Special Category Data (Art. 9)
Additional safeguards required for:
- Racial or ethnic origin
- Political opinions
- Religious or philosophical beliefs
- Trade union membership
- Genetic data
- Biometric data (for identification)
- Health data
- Sex life or sexual orientation
Processing Exceptions (Art. 9(2)):
- Explicit consent
- Employment/social security obligations
- Vital interests (subject incapable of consent)
- Legitimate activities of associations
- Data made public by subject
- Legal claims
- Substantial public interest
- Healthcare purposes
- Public health
- Archiving/research/statistics
Data Subject Rights
Right of Access (Art. 15)
What to provide:
- Confirmation of processing (yes/no)
- Copy of personal data
- Supplementary information:
- Purposes of processing
- Categories of data
- Recipients or categories
- Retention period or criteria
- Rights information
- Source of data
- Automated decision-making details
Process:
- Receive request (any form acceptable)
- Verify identity (proportionate measures)
- Gather data from all systems
- Provide response within 30 days
- First copy free; a reasonable fee may be charged for further copies
Right to Rectification (Art. 16)
When applicable:
- Data is inaccurate
- Data is incomplete
Process:
- Verify claimed inaccuracy
- Correct data in all systems
- Notify third parties of correction
- Respond within 30 days
Right to Erasure (Art. 17)
Grounds for erasure:
- Data no longer necessary for original purpose
- Consent withdrawn
- Objection to processing (no overriding grounds)
- Unlawful processing
- Legal obligation to erase
- Data collected from child for online services
Exceptions (erasure NOT required):
- Freedom of expression
- Legal obligation to retain
- Public health reasons
- Archiving in public interest
- Establishment/exercise/defense of legal claims
Right to Restriction (Art. 18)
Applicable when:
- Accuracy contested (during verification)
- Processing unlawful but erasure opposed
- Controller no longer needs data but subject needs for legal claims
- Objection pending verification of legitimate grounds
Effect: Data can only be stored; other processing requires consent
Right to Data Portability (Art. 20)
Requirements:
- Processing based on consent or contract
- Processing by automated means
Format: Structured, commonly used, machine-readable (JSON, CSV, XML)
Scope: Data provided by subject (not inferred or derived data)
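A minimal sketch of an Art. 20 export, assuming a flat profile dictionary and a known list of subject-provided fields (derived data such as internal scores is excluded):

```python
import json

def export_portable_data(profile: dict, provided_fields: list[str]) -> str:
    """Export only subject-provided fields in machine-readable JSON."""
    portable = {k: v for k, v in profile.items() if k in provided_fields}
    return json.dumps(portable, indent=2, ensure_ascii=False)
```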
Right to Object (Art. 21)
Processing based on legitimate interests/public interest:
- Subject can object at any time
- Controller must demonstrate compelling legitimate grounds
Direct marketing:
- Absolute right to object
- Processing must stop immediately
- Must inform subject of right at first communication
Automated Decision-Making (Art. 22)
Right not to be subject to decisions:
- Based solely on automated processing
- Producing legal or similarly significant effects
Exceptions:
- Necessary for contract
- Authorized by law
- Based on explicit consent
Safeguards required:
- Right to human intervention
- Right to express point of view
- Right to contest decision
Accountability Requirements
Records of Processing Activities (Art. 30)
Controller must record:
- Controller name and contact
- Purposes of processing
- Categories of data subjects
- Categories of personal data
- Categories of recipients
- Third country transfers and safeguards
- Retention periods
- Technical and organizational measures
Processor must record:
- Processor name and contact
- Categories of processing
- Third country transfers
- Technical and organizational measures
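One way to model a controller's Art. 30(1) record as a data structure — field names follow the controller list above but are otherwise illustrative; a real register also needs versioning and review dates:

```python
from dataclasses import dataclass, field

@dataclass
class ProcessingRecord:
    controller: str
    purpose: str
    data_subject_categories: list[str]
    data_categories: list[str]
    recipient_categories: list[str] = field(default_factory=list)
    third_country_transfers: list[str] = field(default_factory=list)
    retention: str = ""
    security_measures: list[str] = field(default_factory=list)

    def complete(self) -> bool:
        """Minimal completeness check before the record enters the register."""
        return bool(self.controller and self.purpose
                    and self.data_subject_categories and self.data_categories
                    and self.retention)
```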
Data Protection by Design and Default (Art. 25)
By Design principles:
- Data minimization
- Pseudonymization
- Purpose limitation built into systems
- Security measures from inception
By Default requirements:
- Only necessary data processed
- Limited collection scope
- Limited storage period
- Limited accessibility
Data Protection Impact Assessment (Art. 35)
Required when:
- Systematic and extensive profiling with significant effects
- Large-scale processing of special categories
- Systematic monitoring of public areas
- Two or more high-risk criteria from WP29 guidelines
DPIA must contain:
- Systematic description of processing
- Assessment of necessity and proportionality
- Assessment of risks to rights and freedoms
- Measures to address risks
Data Processing Agreements (Art. 28)
Required clauses:
- Process only on documented instructions
- Confidentiality obligations
- Security measures
- Sub-processor requirements
- Assistance with subject rights
- Assistance with security obligations
- Return or delete data at end
- Audit rights
International Transfers
Adequacy Decisions (Art. 45)
Current adequate countries/territories:
- Andorra, Argentina, Canada (commercial), Faroe Islands
- Guernsey, Israel, Isle of Man, Japan, Jersey
- New Zealand, Republic of Korea, Switzerland
- UK, Uruguay
- EU-US Data Privacy Framework (participating companies)
Standard Contractual Clauses (Art. 46)
New SCCs (2021) modules:
- Module 1: Controller to Controller
- Module 2: Controller to Processor
- Module 3: Processor to Processor
- Module 4: Processor to Controller
Implementation requirements:
- Complete relevant modules
- Conduct Transfer Impact Assessment
- Implement supplementary measures if needed
- Document assessment
Transfer Impact Assessment
Assess:
- Circumstances of transfer
- Third country legal framework
- Contractual and technical safeguards
- Whether safeguards are effective
- Supplementary measures needed
Breach Notification
Supervisory Authority Notification (Art. 33)
Timeline: Within 72 hours of becoming aware
Required unless: Unlikely to result in risk to rights and freedoms
Notification must include:
- Nature of breach
- Categories and approximate numbers affected
- DPO contact details
- Likely consequences
- Measures taken or proposed
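The 72-hour clock can be sketched as follows. A simplified model: when the controller "became aware" is itself a legal judgment, not a timestamp lookup:

```python
from datetime import datetime, timedelta

# Art. 33: notify the supervisory authority within 72 hours of awareness.
NOTIFICATION_WINDOW = timedelta(hours=72)

def notification_deadline(aware_at: datetime) -> datetime:
    return aware_at + NOTIFICATION_WINDOW

def hours_remaining(aware_at: datetime, now: datetime) -> float:
    """Hours left before notification is late (negative if already late)."""
    return (notification_deadline(aware_at) - now).total_seconds() / 3600
```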
Data Subject Notification (Art. 34)
Required when: High risk to rights and freedoms
Not required if:
- Appropriate technical measures in place (encryption)
- Subsequent measures eliminate high risk
- Disproportionate effort (public communication instead)
Breach Documentation
Document ALL breaches:
- Facts of breach
- Effects
- Remedial action
- Justification for any non-notification
Compliance Checklist
Governance
- DPO appointed (if required)
- Data protection policies in place
- Staff training conducted
- Privacy by design implemented
Documentation
- Records of processing activities
- Privacy notices updated
- Consent records maintained
- DPIAs conducted where required
- Processor agreements in place
Technical Measures
- Encryption at rest and in transit
- Access controls implemented
- Audit logging enabled
- Data minimization applied
- Retention schedules automated
Subject Rights
- Access request process
- Erasure capability
- Portability capability
- Objection handling process
- Response within deadlines
German BDSG Requirements
German-specific data protection requirements under the Bundesdatenschutzgesetz (BDSG) and state laws.
Table of Contents
- BDSG Overview
- DPO Requirements
- Employment Data
- Video Surveillance
- Credit Scoring
- State Data Protection Laws
- German Supervisory Authorities
BDSG Overview
The Bundesdatenschutzgesetz (BDSG) supplements the GDPR with German-specific provisions under the opening clauses.
Key BDSG Additions to GDPR
| Topic | BDSG Section | GDPR Opening Clause |
|---|---|---|
| DPO appointment threshold | § 38 | Art. 37(4) |
| Employment data | § 26 | Art. 88 |
| Video surveillance | § 4 | Art. 6(1)(f) |
| Credit scoring | § 31 | Art. 22(2)(b) |
| Consumer credit | § 31 | Art. 22(2)(b) |
| Research processing | §§ 27-28 | Art. 89 |
| Special categories | § 22 | Art. 9(2)(g) |
BDSG Structure
- Part 1 (§§ 1-21): Common provisions
- Part 2 (§§ 22-44): Implementation of GDPR
- Part 3 (§§ 45-84): Implementation of Law Enforcement Directive
- Part 4 (§§ 85-91): Special provisions
DPO Requirements
Mandatory DPO Appointment (§ 38 BDSG)
A Data Protection Officer must be appointed when:
- At least 20 employees are constantly engaged in automated processing of personal data
- Processing requires a DPIA under Art. 35 GDPR (regardless of employee count)
- The business purpose involves personal data transfer or market research (regardless of employee count)
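The appointment test above can be sketched as a simple predicate (a simplification; the statutory wording of § 38 BDSG controls):

```python
def dpo_required(employees_in_automated_processing: int,
                 needs_dpia: bool,
                 transfers_or_market_research: bool) -> bool:
    """§ 38 BDSG: any one of the three conditions triggers appointment."""
    return (employees_in_automated_processing >= 20
            or needs_dpia
            or transfers_or_market_research)
```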
DPO Qualifications
Required qualifications:
- Professional knowledge of data protection law and practices
- Ability to fulfill tasks under Art. 39 GDPR
- No conflict of interest with other duties
Recommended qualifications:
- Certification (e.g., TÜV, DEKRA, GDD)
- Legal or IT background
- Understanding of business processes
DPO Independence (§ 38(2) BDSG)
- Cannot be dismissed for performing DPO duties
- Protection extends 1 year after end of appointment
- Entitled to resources and training
- Reports to highest management level
Employment Data
§ 26 BDSG - Processing of Employee Data
Lawful processing for employment purposes:
Establishment of employment (recruitment)
- CV processing
- Reference checks
- Background verification (limited scope)
Performance of employment contract
- Payroll processing
- Working time recording
- Performance evaluation
Termination of employment
- Exit interviews
- Reference provision
- Legal claims handling
Consent in Employment Context
Special requirements:
- Consent must be voluntary (difficult in employment relationship)
- Power imbalance must be considered
- Written or electronic form required
- Employee must receive copy
When consent may be valid:
- Additional voluntary benefits
- Photo publication (with genuine choice)
- Optional surveys
Employee Monitoring
Permitted (with justification):
- Email/internet monitoring (with policy and proportionality)
- GPS tracking of company vehicles (business use)
- CCTV in certain areas (not changing rooms, toilets)
- Time and attendance systems
Prohibited:
- Covert monitoring (except criminal investigation)
- Keystroke logging without notice
- Private communication interception
Works Council Rights
Under Betriebsverfassungsgesetz (BetrVG):
- Co-determination on technical monitoring systems (§ 87(1) No. 6)
- Information rights on data processing
- Must be consulted before implementation
Video Surveillance
§ 4 BDSG - Video Surveillance of Public Areas
Permitted for:
- Public authorities - for their tasks
- Private entities - for:
- Protection of property
- Exercising domiciliary rights
- Legitimate purposes (documented)
Requirements:
- Signage indicating surveillance
- Retention limited to purpose
- Regular review of necessity
- Access limited to authorized personnel
Technical Requirements
Signs must include:
- Fact of surveillance
- Controller identity
- Contact for rights exercise
Data retention:
- Delete when no longer necessary
- Typically maximum 72 hours
- Longer retention requires specific justification
Balancing Test Documentation
Document for each camera:
- Purpose served
- Alternatives considered
- Privacy impact
- Proportionality assessment
- Technical safeguards
Credit Scoring
§ 31 BDSG - Credit Information
Requirements for scoring:
- Scientifically recognized mathematical procedure
- Core elements must be explainable
- Not solely based on address data
Data subject rights:
- Information about score calculation (general logic)
- Factors that influenced score
- Right to explanation of decision
Creditworthiness Assessment
Permitted data sources:
- Payment history with data subject consent
- Public registers (Schuldnerverzeichnis)
- Credit reference agencies (Auskunfteien)
Prohibited practices:
- Social media profile analysis for credit decisions
- Using health data
- Processing special categories for scoring
Credit Reference Agencies (Auskunfteien)
Major agencies:
- SCHUFA Holding AG
- Creditreform
- infoscore Consumer Data GmbH
- Bürgel
Data subject rights with agencies:
- Free self-disclosure once per year
- Correction of inaccurate data
- Deletion after statutory periods
State Data Protection Laws
Landesdatenschutzgesetze (LDSG)
Each German state has its own data protection law for public bodies:
| State | Law | Supervisory Authority |
|---|---|---|
| Baden-Württemberg | LDSG BW | LfDI BW |
| Bayern | BayDSG | BayLDA |
| Berlin | BlnDSG | BlnBDI |
| Brandenburg | BbgDSG | LDA Brandenburg |
| Bremen | BremDSGVOAG | LfDI Bremen |
| Hamburg | HmbDSG | HmbBfDI |
| Hessen | HDSIG | HBDI |
| Mecklenburg-Vorpommern | DSG M-V | LfDI M-V |
| Niedersachsen | NDSG | LfD Niedersachsen |
| Nordrhein-Westfalen | DSG NRW | LDI NRW |
| Rheinland-Pfalz | LDSG RP | LfDI RP |
| Saarland | SDSG | UDZ Saarland |
| Sachsen | SächsDSG | SächsDSB |
| Sachsen-Anhalt | DSG LSA | LfD LSA |
| Schleswig-Holstein | LDSG SH | ULD |
| Thüringen | ThürDSG | TLfDI |
Public vs Private Sector
Public sector (Länder laws apply):
- State government agencies
- State universities
- State healthcare facilities
- Municipalities
Private sector and federal bodies (BDSG applies):
- Private companies
- Associations
- Private healthcare providers
- Federal public bodies
German Supervisory Authorities
Federal Level
BfDI - Bundesbeauftragte für den Datenschutz und die Informationsfreiheit
- Responsible for federal public bodies
- Responsible for telecommunications and postal services
- Representative in EDPB
State Level Authorities
Competence:
- Private sector entities headquartered in the state
- State public bodies
Determining Competent Authority
For private sector:
- Identify main establishment location
- That state's DPA is lead authority
- Cross-border processing involves cooperation procedure
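The lookup described above reduces to a mapping from the state of the main establishment to its DPA. A minimal sketch using entries from the table above (subset shown for brevity):

```python
# Lead supervisory authority by state of main establishment (subset of the
# table above; extend with the remaining states as needed).
STATE_DPA = {
    "Baden-Württemberg": "LfDI BW",
    "Bayern": "BayLDA",
    "Berlin": "BlnBDI",
    "Hamburg": "HmbBfDI",
    "Nordrhein-Westfalen": "LDI NRW",
}

def lead_authority(main_establishment_state: str) -> str:
    """Resolve the lead DPA for a private-sector controller."""
    try:
        return STATE_DPA[main_establishment_state]
    except KeyError:
        raise ValueError(f"Unknown state: {main_establishment_state!r}")

print(lead_authority("Bayern"))  # → BayLDA
```

For cross-border processing, this only identifies the lead authority; the Art. 60 cooperation procedure still involves the other concerned DPAs.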
Fines and Enforcement
BDSG fine provisions (§ 43):
- Up to €50,000 for certain national-law violations (supplementing the GDPR)
- GDPR fines of up to €20 million or 4% of worldwide annual turnover (whichever is higher) apply in parallel
German enforcement characteristics:
- Generally cooperative approach first
- Written warnings common
- Fines increasing since GDPR
- Public naming of violators
Compliance Checklist for Germany
BDSG-Specific Requirements
- DPO appointed if 20+ employees process personal data
- DPO registered with supervisory authority
- Employee data processing documented under § 26
- Works council consultation completed (if applicable)
- Video surveillance signage in place
- Scoring procedures documented (if applicable)
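The DPO item in the checklist above is the most mechanical to verify. A hedged sketch of the § 38 BDSG test — a DPO must be designated when, as a rule, at least 20 persons are permanently engaged in automated processing of personal data, and the GDPR Art. 37 risk-based triggers apply independently of the headcount:

```python
def dpo_required_bdsg(persons_in_automated_processing: int,
                      gdpr_art37_trigger: bool = False) -> bool:
    """True if a DPO must be designated under § 38 BDSG or Art. 37 GDPR."""
    return persons_in_automated_processing >= 20 or gdpr_art37_trigger

print(dpo_required_bdsg(25))        # headcount threshold met
print(dpo_required_bdsg(5))         # below threshold, no other trigger
print(dpo_required_bdsg(5, True))   # Art. 37 trigger applies regardless
```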
Documentation Requirements
- Records of processing activities (German language)
- Employee data processing policies
- Video surveillance assessment
- Works council agreements
Supervisory Authority Engagement
- Competent authority identified
- DPO notification submitted
- Breach notification procedures in German
- Response procedures for authority inquiries
Key Differences from GDPR-Only Compliance
| Aspect | GDPR | German BDSG Addition |
|---|---|---|
| DPO threshold | Risk-based | 20+ employees |
| Employment data | Art. 88 opening clause | Detailed § 26 requirements |
| Video surveillance | Legitimate interests | Specific § 4 rules |
| Credit scoring | Art. 22 | Detailed § 31 requirements |
| Works council | Not addressed | Co-determination rights |
| Fines | Art. 83 | Additional § 43 fines |
#!/usr/bin/env python3
"""
Data Subject Rights Tracker
Tracks and manages data subject rights requests under GDPR Articles 15-22.
Monitors deadlines, generates response templates, and produces compliance reports.
Usage:
python data_subject_rights_tracker.py list
python data_subject_rights_tracker.py add --type access --subject "John Doe" --email "john@example.com"
python data_subject_rights_tracker.py status --id DSR-202401-0001
python data_subject_rights_tracker.py report --output compliance_report.json
"""
import argparse
import json
from datetime import datetime, timedelta
from pathlib import Path
from typing import Dict, List, Optional
# GDPR Articles for each right
RIGHTS_TYPES = {
"access": {
"article": "Art. 15",
"name": "Right of Access",
"deadline_days": 30,
"description": "Data subject has the right to obtain confirmation of processing and access to their data",
"response_includes": [
"Purposes of processing",
"Categories of personal data",
"Recipients or categories of recipients",
"Retention period or criteria",
"Right to lodge complaint",
"Source of data (if not collected from subject)",
"Existence of automated decision-making"
]
},
"rectification": {
"article": "Art. 16",
"name": "Right to Rectification",
"deadline_days": 30,
"description": "Data subject has the right to have inaccurate personal data corrected",
"response_includes": [
"Confirmation of correction",
"Details of corrected data",
"Notification to recipients"
]
},
"erasure": {
"article": "Art. 17",
"name": "Right to Erasure (Right to be Forgotten)",
"deadline_days": 30,
"description": "Data subject has the right to have their personal data erased",
"grounds": [
"Data no longer necessary for original purpose",
"Consent withdrawn",
"Objection to processing (no overriding grounds)",
"Unlawful processing",
"Legal obligation to erase",
"Data collected from child"
],
"exceptions": [
"Freedom of expression",
"Legal obligation to retain",
"Public health reasons",
"Archiving in public interest",
"Legal claims"
]
},
"restriction": {
"article": "Art. 18",
"name": "Right to Restriction of Processing",
"deadline_days": 30,
"description": "Data subject has the right to restrict processing of their data",
"grounds": [
"Accuracy contested (during verification)",
"Processing is unlawful (erasure opposed)",
"Controller no longer needs data (subject needs for legal claims)",
"Objection pending verification"
]
},
"portability": {
"article": "Art. 20",
"name": "Right to Data Portability",
"deadline_days": 30,
"description": "Data subject has the right to receive their data in a portable format",
"conditions": [
"Processing based on consent or contract",
"Processing carried out by automated means"
],
"format_requirements": [
"Structured format",
"Commonly used format",
"Machine-readable format"
]
},
"objection": {
"article": "Art. 21",
"name": "Right to Object",
"deadline_days": 30,
"description": "Data subject has the right to object to processing",
"applies_to": [
"Processing based on legitimate interests",
"Processing for direct marketing",
"Processing for research/statistics"
]
},
"automated": {
"article": "Art. 22",
"name": "Rights Related to Automated Decision-Making",
"deadline_days": 30,
"description": "Data subject has the right not to be subject to solely automated decisions",
"includes": [
"Right to human intervention",
"Right to express point of view",
"Right to contest decision"
]
}
}
# Request statuses
STATUSES = {
"received": "Request received, pending identity verification",
"verified": "Identity verified, processing request",
"in_progress": "Gathering data / processing request",
"pending_info": "Awaiting additional information from subject",
"extended": "Deadline extended (complex request)",
"completed": "Request completed and response sent",
"refused": "Request refused (with justification)",
"escalated": "Escalated to DPO/legal"
}
class RightsTracker:
"""Manages data subject rights requests."""
def __init__(self, data_file: str = "dsr_requests.json"):
self.data_file = Path(data_file)
self.requests = self._load_requests()
def _load_requests(self) -> Dict:
"""Load requests from file."""
if self.data_file.exists():
with open(self.data_file, "r") as f:
return json.load(f)
return {"requests": [], "metadata": {"created": datetime.now().isoformat()}}
def _save_requests(self):
"""Save requests to file."""
self.requests["metadata"]["updated"] = datetime.now().isoformat()
with open(self.data_file, "w") as f:
json.dump(self.requests, f, indent=2)
def _generate_id(self) -> str:
"""Generate unique request ID."""
count = len(self.requests["requests"]) + 1
return f"DSR-{datetime.now().strftime('%Y%m')}-{count:04d}"
def add_request(
self,
right_type: str,
subject_name: str,
subject_email: str,
details: str = ""
) -> Dict:
"""Add a new data subject request."""
if right_type not in RIGHTS_TYPES:
raise ValueError(f"Invalid right type. Must be one of: {list(RIGHTS_TYPES.keys())}")
right_info = RIGHTS_TYPES[right_type]
now = datetime.now()
deadline = now + timedelta(days=right_info["deadline_days"])
request = {
"id": self._generate_id(),
"type": right_type,
"article": right_info["article"],
"right_name": right_info["name"],
"subject": {
"name": subject_name,
"email": subject_email,
"verified": False
},
"details": details,
"status": "received",
"status_description": STATUSES["received"],
"dates": {
"received": now.isoformat(),
"deadline": deadline.isoformat(),
"verified": None,
"completed": None
},
"notes": [],
"response": None
}
self.requests["requests"].append(request)
self._save_requests()
return request
def update_status(
self,
request_id: str,
new_status: str,
note: str = ""
) -> Optional[Dict]:
"""Update request status."""
if new_status not in STATUSES:
raise ValueError(f"Invalid status. Must be one of: {list(STATUSES.keys())}")
for req in self.requests["requests"]:
if req["id"] == request_id:
req["status"] = new_status
req["status_description"] = STATUSES[new_status]
if new_status == "verified":
req["subject"]["verified"] = True
req["dates"]["verified"] = datetime.now().isoformat()
elif new_status == "completed":
req["dates"]["completed"] = datetime.now().isoformat()
                elif new_status == "extended":
                    # GDPR Art. 12(3) allows a two-month extension for complex
                    # requests; approximated here as 60 days on top of the
                    # original one-month deadline.
                    original_deadline = datetime.fromisoformat(req["dates"]["deadline"])
                    req["dates"]["deadline"] = (original_deadline + timedelta(days=60)).isoformat()
if note:
req["notes"].append({
"timestamp": datetime.now().isoformat(),
"note": note
})
self._save_requests()
return req
return None
def get_request(self, request_id: str) -> Optional[Dict]:
"""Get request by ID."""
for req in self.requests["requests"]:
if req["id"] == request_id:
return req
return None
def list_requests(
self,
status_filter: Optional[str] = None,
overdue_only: bool = False
) -> List[Dict]:
"""List requests with optional filtering."""
results = []
now = datetime.now()
for req in self.requests["requests"]:
if status_filter and req["status"] != status_filter:
continue
deadline = datetime.fromisoformat(req["dates"]["deadline"])
is_overdue = deadline < now and req["status"] not in ["completed", "refused"]
if overdue_only and not is_overdue:
continue
req_summary = {
**req,
"is_overdue": is_overdue,
"days_remaining": (deadline - now).days if not is_overdue else 0
}
results.append(req_summary)
return results
def generate_report(self) -> Dict:
"""Generate compliance report."""
now = datetime.now()
total = len(self.requests["requests"])
status_counts = {}
for status in STATUSES:
status_counts[status] = sum(1 for r in self.requests["requests"] if r["status"] == status)
type_counts = {}
for right_type in RIGHTS_TYPES:
type_counts[right_type] = sum(1 for r in self.requests["requests"] if r["type"] == right_type)
overdue = []
completed_on_time = 0
completed_late = 0
        for req in self.requests["requests"]:
            deadline = datetime.fromisoformat(req["dates"]["deadline"])
            if req["status"] in ["completed", "refused"]:
                # A refused request may have no completion date; avoid
                # fromisoformat(None) and count it as closed on time.
                completed_raw = req["dates"].get("completed")
                if completed_raw is None or datetime.fromisoformat(completed_raw) <= deadline:
                    completed_on_time += 1
                else:
                    completed_late += 1
elif deadline < now:
overdue.append({
"id": req["id"],
"type": req["type"],
"subject": req["subject"]["name"],
"days_overdue": (now - deadline).days
})
compliance_rate = (completed_on_time / (completed_on_time + completed_late) * 100) if (completed_on_time + completed_late) > 0 else 100
return {
"report_date": now.isoformat(),
"summary": {
"total_requests": total,
"open_requests": total - status_counts.get("completed", 0) - status_counts.get("refused", 0),
"overdue_requests": len(overdue),
"compliance_rate": round(compliance_rate, 1)
},
"by_status": status_counts,
"by_type": type_counts,
"overdue_details": overdue,
"performance": {
"completed_on_time": completed_on_time,
"completed_late": completed_late,
"average_response_days": self._calculate_avg_response_time()
}
}
def _calculate_avg_response_time(self) -> float:
"""Calculate average response time for completed requests."""
response_times = []
for req in self.requests["requests"]:
if req["status"] == "completed" and req["dates"]["completed"]:
received = datetime.fromisoformat(req["dates"]["received"])
completed = datetime.fromisoformat(req["dates"]["completed"])
response_times.append((completed - received).days)
return round(sum(response_times) / len(response_times), 1) if response_times else 0
def generate_response_template(self, request_id: str) -> Optional[str]:
"""Generate response template for a request."""
req = self.get_request(request_id)
if not req:
return None
right_info = RIGHTS_TYPES.get(req["type"], {})
template = f"""
Subject: Response to Your {right_info.get('name', 'Data Subject')} Request ({req['id']})
Dear {req['subject']['name']},
Thank you for your request dated {req['dates']['received'][:10]} exercising your {right_info.get('name', 'data protection right')} under {right_info.get('article', 'GDPR')}.
We have processed your request and respond as follows:
[RESPONSE DETAILS HERE]
"""
if req["type"] == "access":
template += """
As required under Article 15, we provide the following information:
1. Purposes of Processing:
[List purposes]
2. Categories of Personal Data:
[List categories]
3. Recipients:
[List recipients or categories]
4. Retention Period:
[Specify period or criteria]
5. Your Rights:
- Right to rectification (Art. 16)
- Right to erasure (Art. 17)
- Right to restriction (Art. 18)
- Right to object (Art. 21)
- Right to lodge complaint with supervisory authority
6. Source of Data:
[Specify if not collected from you directly]
7. Automated Decision-Making:
[Confirm if applicable and provide meaningful information]
Enclosed: Copy of your personal data
"""
elif req["type"] == "erasure":
template += """
We confirm that your personal data has been erased from our systems, except where:
- We are legally required to retain it
- It is necessary for legal claims
- [Other applicable exceptions]
We have also notified the following recipients of the erasure:
[List recipients]
"""
elif req["type"] == "portability":
template += """
Please find attached your personal data in [JSON/CSV] format.
This includes all data:
- Provided by you
- Processed based on your consent or contract
- Processed by automated means
You may transmit this data to another controller or request direct transmission where technically feasible.
"""
template += f"""
If you have any questions about this response, please contact our Data Protection Officer at [DPO EMAIL].
If you are not satisfied with our response, you have the right to lodge a complaint with the supervisory authority:
[SUPERVISORY AUTHORITY DETAILS]
Yours sincerely,
[CONTROLLER NAME]
Data Protection Team
Reference: {req['id']}
"""
return template
def main():
parser = argparse.ArgumentParser(
description="Track and manage data subject rights requests"
)
parser.add_argument(
"--data-file",
default="dsr_requests.json",
help="Path to requests data file (default: dsr_requests.json)"
)
subparsers = parser.add_subparsers(dest="command", help="Commands")
# Add command
add_parser = subparsers.add_parser("add", help="Add new request")
add_parser.add_argument("--type", "-t", required=True, choices=RIGHTS_TYPES.keys())
add_parser.add_argument("--subject", "-s", required=True, help="Subject name")
add_parser.add_argument("--email", "-e", required=True, help="Subject email")
add_parser.add_argument("--details", "-d", default="", help="Request details")
# List command
list_parser = subparsers.add_parser("list", help="List requests")
list_parser.add_argument("--status", choices=STATUSES.keys(), help="Filter by status")
list_parser.add_argument("--overdue", action="store_true", help="Show only overdue")
list_parser.add_argument("--json", action="store_true", help="JSON output")
# Status command
status_parser = subparsers.add_parser("status", help="Get/update request status")
status_parser.add_argument("--id", required=True, help="Request ID")
status_parser.add_argument("--update", choices=STATUSES.keys(), help="Update status")
status_parser.add_argument("--note", default="", help="Add note")
# Report command
report_parser = subparsers.add_parser("report", help="Generate compliance report")
report_parser.add_argument("--output", "-o", help="Output file")
# Template command
template_parser = subparsers.add_parser("template", help="Generate response template")
template_parser.add_argument("--id", required=True, help="Request ID")
# Types command
subparsers.add_parser("types", help="List available request types")
args = parser.parse_args()
tracker = RightsTracker(args.data_file)
if args.command == "add":
request = tracker.add_request(
args.type, args.subject, args.email, args.details
)
print(f"Request created: {request['id']}")
print(f"Type: {request['right_name']} ({request['article']})")
print(f"Deadline: {request['dates']['deadline'][:10]}")
elif args.command == "list":
requests = tracker.list_requests(args.status, args.overdue)
if args.json:
print(json.dumps(requests, indent=2))
else:
if not requests:
print("No requests found.")
return
print(f"{'ID':<20} {'Type':<15} {'Subject':<20} {'Status':<15} {'Deadline':<12} {'Overdue'}")
print("-" * 95)
for req in requests:
overdue_flag = "YES" if req.get("is_overdue") else ""
print(f"{req['id']:<20} {req['type']:<15} {req['subject']['name'][:20]:<20} {req['status']:<15} {req['dates']['deadline'][:10]:<12} {overdue_flag}")
elif args.command == "status":
if args.update:
req = tracker.update_status(args.id, args.update, args.note)
if req:
print(f"Updated {args.id} to status: {args.update}")
else:
print(f"Request not found: {args.id}")
else:
req = tracker.get_request(args.id)
if req:
print(json.dumps(req, indent=2))
else:
print(f"Request not found: {args.id}")
elif args.command == "report":
report = tracker.generate_report()
output = json.dumps(report, indent=2)
if args.output:
with open(args.output, "w") as f:
f.write(output)
print(f"Report written to {args.output}")
else:
print(output)
elif args.command == "template":
template = tracker.generate_response_template(args.id)
if template:
print(template)
else:
print(f"Request not found: {args.id}")
elif args.command == "types":
print("Available Request Types:")
print("-" * 60)
for key, info in RIGHTS_TYPES.items():
print(f"\n{key} ({info['article']})")
print(f" {info['name']}")
print(f" Deadline: {info['deadline_days']} days")
else:
parser.print_help()
if __name__ == "__main__":
main()
#!/usr/bin/env python3
"""
DPIA Generator
Generates Data Protection Impact Assessment documentation based on
processing activity inputs. Creates structured DPIA reports following
GDPR Article 35 requirements.
Usage:
python dpia_generator.py --interactive
python dpia_generator.py --input processing_activity.json --output dpia_report.md
python dpia_generator.py --template > template.json
"""
import argparse
import json
from datetime import datetime
from typing import Dict, List
# DPIA threshold criteria (Art. 35(3) and WP29 Guidelines)
DPIA_TRIGGERS = {
"systematic_monitoring": {
"description": "Systematic monitoring of publicly accessible area",
"article": "Art. 35(3)(c)",
"weight": 10
},
"large_scale_special_category": {
"description": "Large-scale processing of special category data (Art. 9)",
"article": "Art. 35(3)(b)",
"weight": 10
},
"automated_decision_making": {
"description": "Automated decision-making with legal/significant effects",
"article": "Art. 35(3)(a)",
"weight": 10
},
"evaluation_scoring": {
"description": "Evaluation or scoring of individuals",
"article": "WP29 Guidelines",
"weight": 7
},
"sensitive_data": {
"description": "Processing of sensitive data or highly personal data",
"article": "WP29 Guidelines",
"weight": 7
},
"large_scale": {
"description": "Data processed on a large scale",
"article": "WP29 Guidelines",
"weight": 6
},
"data_matching": {
"description": "Matching or combining datasets",
"article": "WP29 Guidelines",
"weight": 5
},
"vulnerable_subjects": {
"description": "Data concerning vulnerable data subjects",
"article": "WP29 Guidelines",
"weight": 7
},
"innovative_technology": {
"description": "Innovative use or applying new technological solutions",
"article": "WP29 Guidelines",
"weight": 5
},
"cross_border_transfer": {
"description": "Transfer of data outside the EU/EEA",
"article": "GDPR Chapter V",
"weight": 5
}
}
# Risk categories and mitigation measures
RISK_CATEGORIES = {
"unauthorized_access": {
"description": "Risk of unauthorized access to personal data",
"impact": "high",
"mitigations": [
"Implement access controls and authentication",
"Use encryption for data at rest and in transit",
"Maintain audit logs of access",
"Implement least privilege principle"
]
},
"data_breach": {
"description": "Risk of data breach or unauthorized disclosure",
"impact": "high",
"mitigations": [
"Implement intrusion detection systems",
"Establish incident response procedures",
"Regular security assessments",
"Employee security training"
]
},
"excessive_collection": {
"description": "Risk of collecting more data than necessary",
"impact": "medium",
"mitigations": [
"Implement data minimization principles",
"Regular review of data collected",
"Privacy by design approach",
"Document purpose for each data element"
]
},
"purpose_creep": {
"description": "Risk of using data for purposes beyond original scope",
"impact": "medium",
"mitigations": [
"Clear purpose limitation policies",
"Consent management for new purposes",
"Technical controls on data access",
"Regular purpose review"
]
},
"retention_violation": {
"description": "Risk of retaining data longer than necessary",
"impact": "medium",
"mitigations": [
"Implement retention schedules",
"Automated deletion processes",
"Regular data inventory audits",
"Document retention justification"
]
},
"rights_violation": {
"description": "Risk of failing to fulfill data subject rights",
"impact": "high",
"mitigations": [
"Implement subject access request process",
"Technical capability for data portability",
"Deletion/erasure procedures",
"Staff training on rights requests"
]
},
"inaccurate_data": {
"description": "Risk of processing inaccurate or outdated data",
"impact": "medium",
"mitigations": [
"Data quality checks at collection",
"Regular data verification",
"Easy update mechanisms for subjects",
"Automated accuracy validation"
]
},
"third_party_risk": {
"description": "Risk from third-party processors",
"impact": "high",
"mitigations": [
"Due diligence on processors",
"Data Processing Agreements",
"Regular processor audits",
"Clear processor instructions"
]
}
}
# Legal bases under Article 6
LEGAL_BASES = {
"consent": {
"article": "Art. 6(1)(a)",
"description": "Data subject has given consent",
"requirements": [
"Consent must be freely given",
"Specific to the purpose",
"Informed consent with clear information",
"Unambiguous indication of wishes",
"Easy to withdraw"
]
},
"contract": {
"article": "Art. 6(1)(b)",
"description": "Processing necessary for contract performance",
"requirements": [
"Contract must exist or be in negotiation",
"Processing must be necessary for the contract",
"Cannot process more than contractually needed"
]
},
"legal_obligation": {
"article": "Art. 6(1)(c)",
"description": "Processing necessary for legal obligation",
"requirements": [
"Legal obligation must be binding",
"Must be EU or Member State law",
"Processing must be necessary to comply"
]
},
"vital_interests": {
"article": "Art. 6(1)(d)",
"description": "Processing necessary to protect vital interests",
"requirements": [
"Life-threatening situation",
"No other legal basis available",
"Typically emergency situations"
]
},
"public_interest": {
"article": "Art. 6(1)(e)",
"description": "Processing necessary for public interest task",
"requirements": [
"Task in public interest or official authority",
"Legal basis in EU or Member State law",
"Processing must be necessary"
]
},
"legitimate_interests": {
"article": "Art. 6(1)(f)",
"description": "Processing necessary for legitimate interests",
"requirements": [
"Identify the legitimate interest",
"Show processing is necessary",
"Balance against data subject rights",
"Not available for public authorities"
]
}
}
def get_template() -> Dict:
"""Return a blank DPIA input template."""
return {
"project_name": "",
"version": "1.0",
"date": datetime.now().strftime("%Y-%m-%d"),
"controller": {
"name": "",
"contact": "",
"dpo_contact": ""
},
"processing_activity": {
"description": "",
"purposes": [],
"legal_basis": "",
"legal_basis_justification": ""
},
"data_subjects": {
"categories": [],
"estimated_number": "",
"vulnerable_groups": False,
"vulnerable_groups_details": ""
},
"personal_data": {
"categories": [],
"special_categories": [],
"source": "",
"retention_period": ""
},
"processing_operations": {
"collection_method": "",
"storage_location": "",
"access_controls": "",
"automated_decisions": False,
"profiling": False
},
"data_recipients": {
"internal": [],
"external_processors": [],
"third_countries": []
},
"dpia_triggers": [],
"identified_risks": [],
"mitigations_planned": []
}
def assess_dpia_requirement(input_data: Dict) -> Dict:
"""Assess whether DPIA is required based on triggers."""
triggers_present = input_data.get("dpia_triggers", [])
total_weight = 0
triggered_criteria = []
for trigger in triggers_present:
if trigger in DPIA_TRIGGERS:
trigger_info = DPIA_TRIGGERS[trigger]
total_weight += trigger_info["weight"]
triggered_criteria.append({
"trigger": trigger,
"description": trigger_info["description"],
"article": trigger_info["article"]
})
# Also check data characteristics
if input_data.get("data_subjects", {}).get("vulnerable_groups"):
if "vulnerable_subjects" not in triggers_present:
total_weight += DPIA_TRIGGERS["vulnerable_subjects"]["weight"]
triggered_criteria.append({
"trigger": "vulnerable_subjects",
"description": DPIA_TRIGGERS["vulnerable_subjects"]["description"],
"article": DPIA_TRIGGERS["vulnerable_subjects"]["article"]
})
if input_data.get("personal_data", {}).get("special_categories"):
if "sensitive_data" not in triggers_present:
total_weight += DPIA_TRIGGERS["sensitive_data"]["weight"]
triggered_criteria.append({
"trigger": "sensitive_data",
"description": DPIA_TRIGGERS["sensitive_data"]["description"],
"article": DPIA_TRIGGERS["sensitive_data"]["article"]
})
if input_data.get("data_recipients", {}).get("third_countries"):
if "cross_border_transfer" not in triggers_present:
total_weight += DPIA_TRIGGERS["cross_border_transfer"]["weight"]
triggered_criteria.append({
"trigger": "cross_border_transfer",
"description": DPIA_TRIGGERS["cross_border_transfer"]["description"],
"article": DPIA_TRIGGERS["cross_border_transfer"]["article"]
})
# DPIA required if 2+ triggers or weight >= 10
dpia_required = len(triggered_criteria) >= 2 or total_weight >= 10
return {
"dpia_required": dpia_required,
"risk_score": total_weight,
"triggered_criteria": triggered_criteria,
"recommendation": "DPIA is mandatory" if dpia_required else "DPIA recommended as best practice"
}
def assess_risks(input_data: Dict) -> List[Dict]:
"""Assess risks based on processing characteristics."""
risks = []
# Check each risk category
processing = input_data.get("processing_operations", {})
recipients = input_data.get("data_recipients", {})
personal_data = input_data.get("personal_data", {})
# Unauthorized access risk
if processing.get("storage_location") or processing.get("collection_method"):
risks.append({
**RISK_CATEGORIES["unauthorized_access"],
"likelihood": "medium",
"residual_risk": "low" if processing.get("access_controls") else "medium"
})
# Data breach risk (always present)
risks.append({
**RISK_CATEGORIES["data_breach"],
"likelihood": "medium",
"residual_risk": "medium"
})
# Third party risk
if recipients.get("external_processors") or recipients.get("third_countries"):
risks.append({
**RISK_CATEGORIES["third_party_risk"],
"likelihood": "medium",
"residual_risk": "medium"
})
# Rights violation risk
risks.append({
**RISK_CATEGORIES["rights_violation"],
"likelihood": "low",
"residual_risk": "low"
})
# Retention violation risk
if not personal_data.get("retention_period"):
risks.append({
**RISK_CATEGORIES["retention_violation"],
"likelihood": "high",
"residual_risk": "high"
})
# Automated decision risk
if processing.get("automated_decisions") or processing.get("profiling"):
risks.append({
"description": "Risk of unfair automated decisions affecting individuals",
"impact": "high",
"likelihood": "medium",
"residual_risk": "medium",
"mitigations": [
"Human review of automated decisions",
"Transparency about logic involved",
"Right to contest decisions",
"Regular algorithm audits"
]
})
return risks
def generate_dpia_report(input_data: Dict) -> str:
"""Generate DPIA report in Markdown format."""
requirement = assess_dpia_requirement(input_data)
risks = assess_risks(input_data)
project = input_data.get("project_name", "Unnamed Project")
controller = input_data.get("controller", {})
processing = input_data.get("processing_activity", {})
subjects = input_data.get("data_subjects", {})
personal_data = input_data.get("personal_data", {})
operations = input_data.get("processing_operations", {})
recipients = input_data.get("data_recipients", {})
legal_basis = processing.get("legal_basis", "")
legal_info = LEGAL_BASES.get(legal_basis, {})
report = f"""# Data Protection Impact Assessment (DPIA)
## Project: {project}
| Field | Value |
|-------|-------|
| Version | {input_data.get('version', '1.0')} |
| Date | {input_data.get('date', datetime.now().strftime('%Y-%m-%d'))} |
| Controller | {controller.get('name', 'N/A')} |
| DPO Contact | {controller.get('dpo_contact', 'N/A')} |
---
## 1. DPIA Threshold Assessment
**Result: {requirement['recommendation']}**
Risk Score: {requirement['risk_score']} (DPIA mandatory when two or more criteria apply or the score reaches 10)
### Triggered Criteria
"""
if requirement['triggered_criteria']:
for criteria in requirement['triggered_criteria']:
report += f"- **{criteria['description']}** ({criteria['article']})\n"
else:
report += "- No mandatory triggers identified\n"
report += f"""
---
## 2. Description of Processing
### Purpose of Processing
{processing.get('description', 'Not specified')}
### Purposes
"""
for purpose in processing.get('purposes', ['Not specified']):
report += f"- {purpose}\n"
report += f"""
### Legal Basis
**{legal_info.get('article', 'Not specified')}**: {legal_info.get('description', processing.get('legal_basis', 'Not specified'))}
**Justification**: {processing.get('legal_basis_justification', 'Not provided')}
"""
if legal_info.get('requirements'):
report += "**Requirements to satisfy:**\n"
for req in legal_info['requirements']:
report += f"- {req}\n"
report += f"""
---
## 3. Data Subjects
| Aspect | Details |
|--------|---------|
| Categories | {', '.join(subjects.get('categories', ['Not specified']))} |
| Estimated Number | {subjects.get('estimated_number', 'Not specified')} |
| Vulnerable Groups | {'Yes - ' + subjects.get('vulnerable_groups_details', '') if subjects.get('vulnerable_groups') else 'No'} |
---
## 4. Personal Data Processed
### Data Categories
"""
for category in personal_data.get('categories', ['Not specified']):
report += f"- {category}\n"
if personal_data.get('special_categories'):
report += "\n### Special Category Data (Art. 9)\n\n"
for category in personal_data['special_categories']:
report += f"- **{category}** - Requires Art. 9(2) exception\n"
report += f"""
### Data Source
{personal_data.get('source', 'Not specified')}
### Retention Period
{personal_data.get('retention_period', 'Not specified')}
---
## 5. Processing Operations
| Operation | Details |
|-----------|---------|
| Collection Method | {operations.get('collection_method', 'Not specified')} |
| Storage Location | {operations.get('storage_location', 'Not specified')} |
| Access Controls | {operations.get('access_controls', 'Not specified')} |
| Automated Decisions | {'Yes' if operations.get('automated_decisions') else 'No'} |
| Profiling | {'Yes' if operations.get('profiling') else 'No'} |
---
## 6. Data Recipients
### Internal Recipients
"""
for recipient in recipients.get('internal', ['Not specified']):
report += f"- {recipient}\n"
report += "\n### External Processors\n\n"
for processor in recipients.get('external_processors', ['None']):
report += f"- {processor}\n"
if recipients.get('third_countries'):
report += "\n### Third Country Transfers\n\n"
report += "**Warning**: Transfers require Chapter V safeguards\n\n"
for country in recipients['third_countries']:
report += f"- {country}\n"
report += """
---
## 7. Risk Assessment
"""
for i, risk in enumerate(risks, 1):
report += f"""### Risk {i}: {risk['description']}
| Aspect | Assessment |
|--------|------------|
| Impact | {risk.get('impact', 'medium').upper()} |
| Likelihood | {risk.get('likelihood', 'medium').upper()} |
| Residual Risk | {risk.get('residual_risk', 'medium').upper()} |
**Recommended Mitigations:**
"""
for mitigation in risk.get('mitigations', []):
report += f"- {mitigation}\n"
report += "\n"
report += """---
## 8. Necessity and Proportionality
### Assessment Questions
1. **Is the processing necessary for the stated purpose?**
- [ ] Yes, no less intrusive alternative exists
- [ ] Alternative considered: _______________
2. **Is the data collection proportionate?**
- [ ] Only necessary data is collected
- [ ] Data minimization applied
3. **Are retention periods justified?**
- [ ] Retention period is necessary
- [ ] Deletion procedures in place
---
## 9. DPO Consultation
| Aspect | Details |
|--------|---------|
| DPO Consulted | [ ] Yes / [ ] No |
| DPO Name | |
| Consultation Date | |
| DPO Opinion | |
---
## 10. Sign-Off
| Role | Name | Signature | Date |
|------|------|-----------|------|
| Project Owner | | | |
| Data Protection Officer | | | |
| Controller Representative | | | |
---
## 11. Review Schedule
This DPIA should be reviewed:
- [ ] Annually
- [ ] When processing changes significantly
- [ ] Following a data incident
- [ ] As required by supervisory authority
Next Review Date: _______________
---
*Generated by DPIA Generator - This document requires completion and review by qualified personnel.*
"""
return report
def main():
parser = argparse.ArgumentParser(
description="Generate DPIA documentation"
)
parser.add_argument(
"--input", "-i",
help="Path to JSON input file with processing activity details"
)
parser.add_argument(
"--output", "-o",
help="Path to output file (default: stdout)"
)
parser.add_argument(
"--template",
action="store_true",
help="Output a blank JSON template"
)
parser.add_argument(
"--interactive",
action="store_true",
help="Run in interactive mode"
)
args = parser.parse_args()
if args.template:
print(json.dumps(get_template(), indent=2))
return
if args.interactive:
print("DPIA Generator - Interactive Mode")
print("=" * 40)
print("\nTo use this tool:")
print("1. Generate a template: python dpia_generator.py --template > input.json")
print("2. Fill in the template with your processing details")
print("3. Generate DPIA: python dpia_generator.py --input input.json --output dpia.md")
return
if not args.input:
print("Error: --input required (or use --template to get started)", file=sys.stderr)
sys.exit(1)
input_path = Path(args.input)
if not input_path.exists():
print(f"Error: Input file not found: {input_path}", file=sys.stderr)
sys.exit(1)
with open(input_path, "r") as f:
input_data = json.load(f)
report = generate_dpia_report(input_data)
if args.output:
with open(args.output, "w") as f:
f.write(report)
print(f"DPIA report written to {args.output}")
else:
print(report)
if __name__ == "__main__":
main()
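For reference, a minimal input file for the generator might look like the sketch below. The top-level key names are inferred from how `generate_dpia_report` reads its input and are illustrative only; run `dpia_generator.py --template` to get the authoritative structure.

```python
import json

# Illustrative DPIA input covering fields the report template reads.
# Key names are inferred from the generator code; the --template flag
# emits the authoritative blank structure.
dpia_input = {
    "personal_data": {
        "categories": ["Name", "Email address"],
        "special_categories": [],
        "source": "User registration form",
        "retention_period": "24 months after account closure",
    },
    "operations": {
        "collection_method": "Web form",
        "storage_location": "EU data center",
        "access_controls": "Role-based access",
        "automated_decisions": False,
        "profiling": False,
    },
    "recipients": {
        "internal": ["Support team"],
        "external_processors": ["Email delivery provider"],
        "third_countries": [],
    },
    "risks": [
        {
            "description": "Unauthorized access to contact data",
            "impact": "medium",
            "likelihood": "low",
            "residual_risk": "low",
            "mitigations": ["Encryption at rest", "Access logging"],
        }
    ],
}

# Write the file that --input expects.
with open("input.json", "w") as f:
    json.dump(dpia_input, f, indent=2)
```

With `input.json` in place, `python dpia_generator.py --input input.json --output dpia.md` produces the Markdown report.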
#!/usr/bin/env python3
"""
GDPR Compliance Checker
Scans codebases, configurations, and data handling patterns for potential
GDPR compliance issues. Identifies personal data processing, consent gaps,
and documentation requirements.
Usage:
python gdpr_compliance_checker.py /path/to/project
python gdpr_compliance_checker.py . --json
python gdpr_compliance_checker.py /path/to/project --output report.json
"""
import argparse
import json
import os
import re
import sys
from pathlib import Path
from typing import Dict, List, Optional, Tuple
# Personal data patterns to detect
PERSONAL_DATA_PATTERNS = {
"email": {
"pattern": r"[a-zA-Z0-9._%+-]+@[a-zA-Z0-9.-]+\.[a-zA-Z]{2,}",
"category": "contact_data",
"gdpr_article": "Art. 4(1)",
"risk": "medium"
},
"ip_address": {
"pattern": r"\b(?:\d{1,3}\.){3}\d{1,3}\b",
"category": "online_identifier",
"gdpr_article": "Art. 4(1), Recital 30",
"risk": "medium"
},
"phone_number": {
"pattern": r"(?:\+\d{1,3}[-.\s]?)?\(?\d{3}\)?[-.\s]?\d{3}[-.\s]?\d{4}",
"category": "contact_data",
"gdpr_article": "Art. 4(1)",
"risk": "medium"
},
"credit_card": {
"pattern": r"\b(?:\d{4}[-\s]?){3}\d{4}\b",
"category": "financial_data",
"gdpr_article": "Art. 4(1)",
"risk": "high"
},
"iban": {
"pattern": r"\b[A-Z]{2}\d{2}[A-Z0-9]{4}\d{7}[A-Z0-9]{0,16}\b",
"category": "financial_data",
"gdpr_article": "Art. 4(1)",
"risk": "high"
},
"german_id": {
"pattern": r"\b[A-Z0-9]{9}\b",
"category": "government_id",
"gdpr_article": "Art. 4(1)",
"risk": "high"
},
"date_of_birth": {
"pattern": r"\b(?:birth|dob|geboren|geburtsdatum)\b",
"category": "demographic_data",
"gdpr_article": "Art. 4(1)",
"risk": "medium"
},
"health_data": {
"pattern": r"\b(?:diagnosis|treatment|medication|patient|medical|health|symptom|disease)\b",
"category": "special_category",
"gdpr_article": "Art. 9(1)",
"risk": "critical"
},
"biometric": {
"pattern": r"\b(?:fingerprint|facial|retina|biometric|voice_print)\b",
"category": "special_category",
"gdpr_article": "Art. 9(1)",
"risk": "critical"
},
"religion": {
"pattern": r"\b(?:religion|religious|faith|church|mosque|synagogue)\b",
"category": "special_category",
"gdpr_article": "Art. 9(1)",
"risk": "critical"
}
}
# Code patterns indicating GDPR concerns
CODE_PATTERNS = {
"logging_personal_data": {
"pattern": r"(?:log|print|console)\s*\.\s*(?:info|debug|warn|error)\s*\([^)]*(?:email|user|name|address|phone)",
"issue": "Potential logging of personal data",
"gdpr_article": "Art. 5(1)(c) - Data minimization",
"recommendation": "Review logging to ensure personal data is not logged or is properly pseudonymized",
"severity": "high"
},
"missing_consent": {
"pattern": r"(?:track|analytics|marketing|cookie)(?!.*consent)",
"issue": "Tracking without apparent consent mechanism",
"gdpr_article": "Art. 6(1)(a) - Consent",
"recommendation": "Implement consent management before tracking",
"severity": "high"
},
"hardcoded_retention": {
"pattern": r"(?:retention|expire|ttl|lifetime)\s*[=:]\s*(?:null|undefined|0|never|forever)",
"issue": "Indefinite data retention detected",
"gdpr_article": "Art. 5(1)(e) - Storage limitation",
"recommendation": "Define and implement data retention periods",
"severity": "medium"
},
"third_party_transfer": {
"pattern": r"(?:api|http|fetch|request)\s*\.\s*(?:post|put|send)\s*\([^)]*(?:user|personal|data)",
"issue": "Potential third-party data transfer",
"gdpr_article": "Art. 28 - Processor requirements",
"recommendation": "Ensure Data Processing Agreement exists with third parties",
"severity": "medium"
},
"encryption_missing": {
"pattern": r"(?:password|secret|token|key)\s*[=:]\s*['\"][^'\"]+['\"]",
"issue": "Hardcoded secret or credential (potentially unencrypted sensitive data)",
"gdpr_article": "Art. 32(1)(a) - Encryption",
"recommendation": "Encrypt sensitive data at rest and in transit",
"severity": "critical"
},
"no_deletion": {
"pattern": r"(?:delete|remove|erase).*(?:disabled|false|TODO|FIXME)",
"issue": "Data deletion may be disabled or incomplete",
"gdpr_article": "Art. 17 - Right to erasure",
"recommendation": "Implement complete data deletion functionality",
"severity": "high"
}
}
# Configuration files to check for GDPR-relevant settings
CONFIG_PATTERNS = {
"analytics_config": {
"files": ["analytics.json", "gtag.js", "google-analytics.js"],
"check": "anonymize_ip",
"issue": "IP anonymization should be enabled for analytics",
"gdpr_article": "Art. 5(1)(c)"
},
"cookie_config": {
"files": ["cookie.config.js", "cookies.json"],
"check": "consent_required",
"issue": "Cookie consent should be required before non-essential cookies",
"gdpr_article": "Art. 6(1)(a)"
}
}
# File extensions to scan
SCANNABLE_EXTENSIONS = {
".py", ".js", ".ts", ".jsx", ".tsx", ".java", ".kt",
".go", ".rb", ".php", ".cs", ".swift", ".json", ".yaml",
".yml", ".xml", ".html", ".env", ".config"
}
# Files/directories to skip
SKIP_PATTERNS = {
"node_modules", "vendor", ".git", "__pycache__", "dist",
"build", ".venv", "venv", "env"
}
def should_skip(path: Path) -> bool:
"""Check if path should be skipped."""
return any(skip in path.parts for skip in SKIP_PATTERNS)
def scan_file_for_patterns(
filepath: Path,
patterns: Dict
) -> List[Dict]:
"""Scan a file for pattern matches."""
findings = []
try:
with open(filepath, "r", encoding="utf-8", errors="ignore") as f:
content = f.read()
lines = content.split("\n")
for pattern_name, pattern_info in patterns.items():
regex = re.compile(pattern_info["pattern"], re.IGNORECASE)
for line_num, line in enumerate(lines, 1):
matches = regex.findall(line)
if matches:
findings.append({
"file": str(filepath),
"line": line_num,
"pattern": pattern_name,
"matches": len(matches),
**{k: v for k, v in pattern_info.items() if k != "pattern"}
})
except OSError:
pass  # Skip files that can't be read
return findings
def analyze_project(project_path: Path) -> Dict:
"""Analyze project for GDPR compliance issues."""
personal_data_findings = []
code_issue_findings = []
config_findings = []
files_scanned = 0
# Scan all relevant files
for filepath in project_path.rglob("*"):
if filepath.is_file() and not should_skip(filepath):
if filepath.suffix.lower() in SCANNABLE_EXTENSIONS:
files_scanned += 1
# Check for personal data patterns
personal_data_findings.extend(
scan_file_for_patterns(filepath, PERSONAL_DATA_PATTERNS)
)
# Check for code issues
code_issue_findings.extend(
scan_file_for_patterns(filepath, CODE_PATTERNS)
)
# Check for specific config files
for config_name, config_info in CONFIG_PATTERNS.items():
for config_file in config_info["files"]:
config_path = project_path / config_file
if config_path.exists():
try:
with open(config_path, "r") as f:
content = f.read()
if config_info["check"] not in content.lower():
config_findings.append({
"file": str(config_path),
"config": config_name,
"issue": config_info["issue"],
"gdpr_article": config_info["gdpr_article"]
})
except Exception:
pass
# Calculate risk scores
critical_count = sum(1 for f in personal_data_findings if f.get("risk") == "critical")
critical_count += sum(1 for f in code_issue_findings if f.get("severity") == "critical")
high_count = sum(1 for f in personal_data_findings if f.get("risk") == "high")
high_count += sum(1 for f in code_issue_findings if f.get("severity") == "high")
medium_count = sum(1 for f in personal_data_findings if f.get("risk") == "medium")
medium_count += sum(1 for f in code_issue_findings if f.get("severity") == "medium")
# Determine compliance score (100 = compliant, 0 = critical issues)
score = 100
score -= critical_count * 20
score -= high_count * 10
score -= medium_count * 5
score -= len(config_findings) * 5
score = max(0, score)
# Determine compliance status
if score >= 80:
status = "compliant"
status_description = "Low risk - minor improvements recommended"
elif score >= 60:
status = "needs_attention"
status_description = "Medium risk - action required"
elif score >= 40:
status = "non_compliant"
status_description = "High risk - immediate action required"
else:
status = "critical"
status_description = "Critical risk - significant GDPR violations detected"
return {
"summary": {
"files_scanned": files_scanned,
"compliance_score": score,
"status": status,
"status_description": status_description,
"issue_counts": {
"critical": critical_count,
"high": high_count,
"medium": medium_count,
"config_issues": len(config_findings)
}
},
"personal_data_findings": personal_data_findings[:50], # Limit output
"code_issues": code_issue_findings[:50],
"config_issues": config_findings,
"recommendations": generate_recommendations(
personal_data_findings, code_issue_findings, config_findings
)
}
def generate_recommendations(
personal_data: List[Dict],
code_issues: List[Dict],
config_issues: List[Dict]
) -> List[Dict]:
"""Generate prioritized recommendations."""
recommendations = []
seen_issues = set()
# Critical issues first
for finding in code_issues:
if finding.get("severity") == "critical":
issue_key = finding.get("issue", "")
if issue_key not in seen_issues:
recommendations.append({
"priority": "P0",
"issue": finding.get("issue"),
"gdpr_article": finding.get("gdpr_article"),
"action": finding.get("recommendation"),
"affected_files": [finding.get("file")]
})
seen_issues.add(issue_key)
# Special category data
special_category_files = set()
for finding in personal_data:
if finding.get("category") == "special_category":
special_category_files.add(finding.get("file"))
if special_category_files:
recommendations.append({
"priority": "P0",
"issue": "Special category personal data (Art. 9) detected",
"gdpr_article": "Art. 9(1)",
"action": "Ensure explicit consent or other Art. 9(2) legal basis exists",
"affected_files": list(special_category_files)[:5]
})
# High priority issues
for finding in code_issues:
if finding.get("severity") == "high":
issue_key = finding.get("issue", "")
if issue_key not in seen_issues:
recommendations.append({
"priority": "P1",
"issue": finding.get("issue"),
"gdpr_article": finding.get("gdpr_article"),
"action": finding.get("recommendation"),
"affected_files": [finding.get("file")]
})
seen_issues.add(issue_key)
# Config issues
for finding in config_issues:
recommendations.append({
"priority": "P1",
"issue": finding.get("issue"),
"gdpr_article": finding.get("gdpr_article"),
"action": f"Update configuration in {finding.get('file')}",
"affected_files": [finding.get("file")]
})
return recommendations[:15]
def print_report(analysis: Dict) -> None:
"""Print human-readable report."""
summary = analysis["summary"]
print("=" * 60)
print("GDPR COMPLIANCE ASSESSMENT REPORT")
print("=" * 60)
print()
print(f"Compliance Score: {summary['compliance_score']}/100")
print(f"Status: {summary['status'].upper()}")
print(f"Assessment: {summary['status_description']}")
print(f"Files Scanned: {summary['files_scanned']}")
print()
counts = summary["issue_counts"]
print("--- ISSUE SUMMARY ---")
print(f" Critical: {counts['critical']}")
print(f" High: {counts['high']}")
print(f" Medium: {counts['medium']}")
print(f" Config Issues: {counts['config_issues']}")
print()
if analysis["recommendations"]:
print("--- PRIORITIZED RECOMMENDATIONS ---")
for i, rec in enumerate(analysis["recommendations"][:10], 1):
print(f"\n{i}. [{rec['priority']}] {rec['issue']}")
print(f" GDPR Article: {rec['gdpr_article']}")
print(f" Action: {rec['action']}")
print()
print("=" * 60)
print("Note: This is an automated assessment. Manual review by a")
print("qualified Data Protection Officer is recommended.")
print("=" * 60)
def main():
parser = argparse.ArgumentParser(
description="Scan project for GDPR compliance issues"
)
parser.add_argument(
"project_path",
nargs="?",
default=".",
help="Path to project directory (default: current directory)"
)
parser.add_argument(
"--json",
action="store_true",
help="Output in JSON format"
)
parser.add_argument(
"--output", "-o",
help="Write output to file"
)
args = parser.parse_args()
project_path = Path(args.project_path).resolve()
if not project_path.exists():
print(f"Error: Path does not exist: {project_path}", file=sys.stderr)
sys.exit(1)
analysis = analyze_project(project_path)
if args.json:
output = json.dumps(analysis, indent=2)
if args.output:
with open(args.output, "w") as f:
f.write(output)
print(f"Report written to {args.output}")
else:
print(output)
else:
print_report(analysis)
if args.output:
with open(args.output, "w") as f:
json.dump(analysis, f, indent=2)
print(f"\nDetailed JSON report written to {args.output}")
if __name__ == "__main__":
main()
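For CI/CD integration, the `--json` report can gate a pipeline. Below is a minimal sketch under stated assumptions: the `gdpr_gate` helper and its 60-point threshold are illustrative, not part of the script, but the dictionary shape matches the `summary` block that `analyze_project` returns.

```python
def gdpr_gate(report: dict, min_score: int = 60) -> bool:
    """Pass only if there are no critical findings and the
    compliance score meets the threshold (illustrative policy)."""
    summary = report["summary"]
    return (summary["issue_counts"]["critical"] == 0
            and summary["compliance_score"] >= min_score)

# Typical use: load the file produced by
#   python gdpr_compliance_checker.py . --json --output report.json
# and fail the build when gdpr_gate(report) is False.
sample = {"summary": {"compliance_score": 75, "issue_counts": {"critical": 0}}}
print("pass" if gdpr_gate(sample) else "fail")  # prints "pass"
```

Because a single critical finding subtracts 20 points, gating on both the score and the critical count avoids a borderline score masking a serious issue.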
Install this Skill
npx skills add alirezarezvani/claude-skills --skill ra-qm-team/gdpr-dsgvo-expert
Details
- Category: Compliance
- License: MIT
- Author: @alirezarezvani
- Source: GitHub, ra-qm-team/gdpr-dsgvo-expert/SKILL.md