
SAP HANA

Connecting SAP HANA to LightBeam


Overview

LightBeam Spectra users can connect various data sources to the LightBeam application, and these data sources are continuously monitored for PII and PHI data.

Examples: SAP HANA, AWS Glue, Looker, DynamoDB, etc.


About SAP HANA

SAP HANA is an in-memory, column-oriented relational database management system that enables real-time analytics and application processing.

A schema in SAP HANA is a logical container for database objects (tables, views, procedures, etc.) and is the unit that is onboarded with LightBeam. Sensitive data in SAP HANA lives inside schemas, and access to it is managed through schema privileges and roles. Users onboard schemas along with their associated tables, and LightBeam scans all the tables inside those schemas.

LightBeam works directly with SAP HANA's in-memory database engine to execute SQL queries and sample data from tables. Each schema must have the appropriate SELECT privileges assigned for LightBeam to access and scan its data for sensitive information.


Features

Datasource Registration

SAP HANA admins can create a user with restricted permissions and provide that user's username and password for registration. During registration, users are presented with a list of schemas and can filter the schemas they wish to scan.
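The schemas offered for selection are those visible to the registered user. As a rough illustration (not LightBeam's exact query), the available schemas can be listed from the SYS.SCHEMAS system view:

SELECT SCHEMA_NAME FROM SYS.SCHEMAS ORDER BY SCHEMA_NAME;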

Metadata Scanning

LightBeam scans the tables present in the schemas configured in the scan conditions. For each table, it retrieves the list of columns and their data types, and also fetches the row count and table size where available.
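For illustration only, the queries below sketch where this metadata lives in SAP HANA's standard system views (TABLE_COLUMNS and M_TABLES); they are an assumption about equivalent queries, not LightBeam's internal implementation:

-- Columns and data types for every table in a schema
SELECT TABLE_NAME, COLUMN_NAME, DATA_TYPE_NAME, LENGTH
FROM SYS.TABLE_COLUMNS
WHERE SCHEMA_NAME = '<schema_name>'
ORDER BY TABLE_NAME, POSITION;

-- Row count and table size, where available
SELECT TABLE_NAME, RECORD_COUNT, TABLE_SIZE
FROM SYS.M_TABLES
WHERE SCHEMA_NAME = '<schema_name>';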

PII Detection

PII detection requires sample data from all the columns of a table. When reading data, LightBeam samples 5,000 rows per table.
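As a sketch of what such a bounded read looks like (the exact sampling query LightBeam issues may differ), 5,000 rows of a table can be fetched with:

SELECT * FROM "<schema_name>"."<table_name>" LIMIT 5000;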


Onboarding SAP HANA Data Source

  1. Log in to your LightBeam instance.

  2. Click on DATASOURCES in the top navigation bar.

  3. Click on “Add a data source”.

  4. Search for SAP HANA.

  5. Click on SAP HANA.

  6. Configure Basic Details

    In the Basic Details section, enter the following information:

    • Instance Name: Provide a unique name for the SAP HANA data source (e.g., sap-hana-datasource).

    • Primary Owner: Enter the email address of the individual responsible for this data source (e.g., demo@lightbeam.ai).

    • Source of Truth (Optional): Toggle this option on if this database serves as a single source of truth for entity validation.

    • Description (Optional): Add a brief description of the database (e.g., "SAP HANA Datasource Instance").

  7. Enter Connection Details

Provide the following details in the Connection section:

  • Username: The username for database authentication.

  • Password: The corresponding password for the username.

  • Host: The SAP HANA server hostname or IP address (e.g., sap-hana.mycompany.com or 192.168.1.100).

  • Port: The SAP HANA connection port (default 30015).

  8. Click Test Connection to validate the credentials (if the test fails, see the note after these steps).

  9. Additional Details (Optional)

In this section, you can specify metadata attributes related to the data source:

  • Location: The location of the data source.

  • Purpose: The purpose of the data being collected/processed.

  • Stage: The stage of the data source. Example: Source, Processing, Archival, etc.

  10. Verify that the message Connection Success! appears on the screen, then click Next.

  11. On the next screen, a dropdown presents the list of available schemas. Select the schemas you wish to scan.

Verify that all schemas selected for scanning appear in the list, and ensure you have made your desired selections before connecting the data source.

  12. Finally, click Start Sampling to connect to the SAP HANA data source.
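If the Test Connection check in step 8 fails, it can help to rule out credential or network issues outside LightBeam: connect with any SAP HANA SQL client (for example hdbsql or SAP HANA Studio) using the same host, port, username, and password, and run a trivial query against the built-in DUMMY table:

SELECT * FROM DUMMY;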


APPENDIX

Minimal permissions setup

LightBeam requires the following permissions to scan only a subset of the schemas in the instance:

  • Connect permissions

  • For each schema to be scanned - SELECT and SELECT METADATA permissions

Use the following scripts to:

  1. Create a database user with a username and a password.

CREATE USER <username> PASSWORD <password> NO FORCE_FIRST_PASSWORD_CHANGE;
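For example (a sketch with a hypothetical LBADMIN user; choose your own username and a password that satisfies your password policy):

CREATE USER LBADMIN PASSWORD "Str0ng#Passw0rd1" NO FORCE_FIRST_PASSWORD_CHANGE;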
  2. Grant permissions to the schemas that are to be scanned.

For granting permission to a single schema:

GRANT SELECT, SELECT METADATA ON SCHEMA <schema_name> TO <username>;
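For example, to allow the hypothetical LBADMIN user to scan a schema named SALES:

GRANT SELECT, SELECT METADATA ON SCHEMA SALES TO LBADMIN;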

To grant permissions to multiple schemas at once, execute the following block, replacing the schema list with the schemas that need permissions and lbadmin with the LightBeam username:

DO BEGIN
    DECLARE schema_name NVARCHAR(256);
    DECLARE schema_list NVARCHAR(256) ARRAY;
    DECLARE i INT;
    schema_list = ARRAY('SCHEMA_1', 'SCHEMA_2'); -- replace with the schemas to be scanned
    FOR i IN 1 .. CARDINALITY(:schema_list) DO
        schema_name = :schema_list[i];
        EXEC 'GRANT SELECT, SELECT METADATA ON SCHEMA ' || :schema_name || ' TO lbadmin';
    END FOR;
END;

Validate permissions to the data source.

Next, validate these permissions against the data source to confirm that the credentials provided have authorized access. Once the permissions are validated, SAP HANA can be onboarded in LightBeam.
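As a quick manual check before (or in addition to) the validation steps below, the grants can be confirmed from SAP HANA's privilege views; a sketch, assuming the lbadmin user from the script above and the standard GRANTED_PRIVILEGES system view:

SELECT SCHEMA_NAME, PRIVILEGE, IS_VALID
FROM SYS.GRANTED_PRIVILEGES
WHERE GRANTEE = 'LBADMIN'
  AND PRIVILEGE IN ('SELECT', 'SELECT METADATA');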

Steps

  1. Clone the repository: https://github.com/lightbeamai/lb-installer

  2. Go into the sql_user_check_sap_hana directory.

  3. Refer to the README.md file in the directory for detailed instructions.


About LightBeam

LightBeam automates Privacy, Security, and AI Governance so businesses can accelerate their growth in new markets. Leveraging generative AI, LightBeam has rapidly gained customers’ trust by pioneering a unique privacy-centric and automation-first approach to security. Unlike siloed solutions, LightBeam ties together sensitive data cataloging, control, and compliance across structured and unstructured data applications, providing 360° visibility, redaction, self-service DSRs, and automated RoPA reporting. This ensures protection against ransomware and accidental exposures while meeting data privacy obligations efficiently. LightBeam is on a mission to create a secure, privacy-first world, helping customers automate compliance against a patchwork of existing and emerging regulations.


For any questions or suggestions, please get in touch with us at support@lightbeam.ai.
