LightBeam Documentation

Data Flow Diagram

A Data Flow Diagram (DFD) is a visual representation of the flow of data within a system or process. It illustrates how data moves from one entity to another, showing the processes, data stores, and external entities involved in the system. In the context of the GDPR (General Data Protection Regulation), a DFD is particularly useful for understanding and analyzing how personal data is handled within an organization or system, and for ensuring compliance with GDPR requirements.

Data moves through the following five stages:

Collection: This stage involves gathering data from various sources, such as online forms, sensors, databases, social media platforms, or manual data entry. Under the GDPR, it is crucial that data collection practices comply with the regulation's requirements, such as obtaining explicit consent from data subjects where necessary and providing transparent information about the purposes of data collection.

Processing: Once data is collected, it undergoes processing, which involves manipulating, analyzing, or transforming it in some way to derive insights or fulfill specific purposes. Processing activities can include data cleansing, aggregation, analysis, and enrichment. It's essential to implement appropriate security measures during data processing to safeguard personal data and ensure compliance with GDPR principles such as data accuracy and confidentiality.

Storage: After processing, data is typically stored in data repositories or databases for future use. This stage involves choosing suitable storage mechanisms and implementing security measures to protect data from unauthorized access, alteration, or deletion. GDPR requires organizations to ensure the security and integrity of personal data during storage, including measures such as encryption, access controls, and regular data backups.

Exchange: Data exchange refers to the transfer of data between different systems, applications, or organizations. This stage may involve sharing data with third parties, partners, or other internal departments. GDPR imposes restrictions on cross-border data transfers and requires organizations to ensure that any data transfers outside the European Economic Area (EEA) comply with GDPR's data transfer mechanisms, such as Standard Contractual Clauses (SCCs) or Binding Corporate Rules (BCRs).

Archival: In this stage, data that is no longer actively used but may still have historical or regulatory value is archived for long-term retention. Archiving involves storing data in a secure and accessible manner, often in offline or offsite storage facilities. GDPR imposes specific requirements on data retention and deletion, requiring organizations to establish policies and procedures for the timely disposal of personal data once it is no longer needed for its original purpose.
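To make the model concrete, the sketch below shows one way a single flow in a data flow diagram could be represented as data, with a source, a target, and one of the five stages. This is an illustrative sketch only; the Stage and DataFlow names and the example values are hypothetical and are not part of the LightBeam product or its APIs.

```python
from dataclasses import dataclass
from enum import Enum


class Stage(Enum):
    """The five stages a flow of personal data can belong to."""
    COLLECTION = "collection"
    PROCESSING = "processing"
    STORAGE = "storage"
    EXCHANGE = "exchange"
    ARCHIVAL = "archival"


@dataclass
class DataFlow:
    """One edge in a data flow diagram: data moving from a source to a target at a given stage."""
    source: str   # where the data originates, e.g. a web signup form
    target: str   # where the data ends up, e.g. a CRM database
    stage: Stage  # which of the five stages this movement belongs to


# Hypothetical example: customer data collected from a web form, processed into a
# warehouse, and eventually archived to long-term storage.
flows = [
    DataFlow("Web signup form", "CRM database", Stage.COLLECTION),
    DataFlow("CRM database", "Analytics warehouse", Stage.PROCESSING),
    DataFlow("Analytics warehouse", "Cold storage bucket", Stage.ARCHIVAL),
]

for flow in flows:
    print(f"{flow.source} -> {flow.target} ({flow.stage.value})")
```

Each entry corresponds to one arrow in the diagram built in the steps below.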

In the Data Flow Diagram section, the process owner and the collaborator add data sources and map the flows between them:

  1. Click on Add Data Source as shown in Fig 1.

Fig 1 Add Data Source

  2. Next, select the stage for this data source. A drop-down list will appear as shown in Fig 2. Click on the appropriate stage.

Fig 2 Select stage for data source

  3. Select the source and the target of the data source, and the processing stage, as shown in Fig 3.

Source: Within a system or process, a data source is typically the starting point where data is initially collected or obtained before being transferred or processed further. For example, in a customer relationship management (CRM) system, data sources could include online forms, customer interactions, or data imports from external systems.

Target: A data target is the endpoint of data movement, where data is delivered or stored after being processed or transferred from a data source. For example, in an e-commerce system, the data target for order information could be a database where order details are stored for further processing and analysis.

  4. Click on Diagram as shown in Fig 4. This displays a diagram of the data flow map.

Fig 4 Data Flow Diagram

  5. Click on Save or Proceed to Reports to move forward.

  6. Generate a report as shown in Fig 5.

Fig 5 Generate a report