MongoDB
Guide to integrating your MongoDB with Sprinkle
Data Import Concepts
Before setting up the data import, learn about data import concepts here
Step-by-Step Guide
STEP-1: Allow MongoDB to accept connections from Sprinkle
Network Connectivity:
If the MongoDB server is on a public network and accessible over a public IP, allow inbound connections on the MongoDB port (default is 27017) from the Sprinkle IPs (34.93.254.126, 34.93.106.136); a quick reachability check is sketched below.
If the MongoDB server is on a private network, configure an SSH Tunnel in Advanced Settings.
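To confirm connectivity before proceeding, you can check that the port is reachable from outside your network. A minimal sketch, assuming a hypothetical host name; run it from a machine outside your firewall:

```python
import socket

# Attempt a TCP connection to the MongoDB port.
# "mongo-host.example.com" is a placeholder for your server's address.
try:
    sock = socket.create_connection(("mongo-host.example.com", 27017), timeout=5)
    sock.close()
    print("Port 27017 is reachable")
except OSError as exc:
    print(f"Port 27017 is not reachable: {exc}")
```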
Create a read-only user, providing any name such as "sprinkle"
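A minimal sketch of creating such a user with pymongo; the admin credentials, host, and database name are placeholders, and the equivalent createUser command can also be run from a mongo shell:

```python
from pymongo import MongoClient

# Connect as an administrative user (placeholder credentials).
client = MongoClient("mongodb://admin:admin_pass@mongo-host.example.com:27017/?authSource=admin")

# Create a "sprinkle" user with read-only access to the database to be ingested.
client["admin"].command(
    "createUser",
    "sprinkle",
    pwd="a-strong-password",
    roles=[{"role": "read", "db": "mydb"}],
)
```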
STEP-2: Configure MongoDB Connection
To learn about Connections, refer here
Log into the Sprinkle application
Navigate to Ingest -> Connections Tab -> Setup Connections
Select the Mongo database
Provide all the mandatory details:
Distinct Name: Name to identify this connection
Host: Provide the IP address or hostname.
Port: Provide the port number.
Database: Provide a database name if there is one; it must be an existing database.
Username
Password
Advanced Settings: If Yes:
Connection Scheme: Enter the connection scheme (for example, mongodb or mongodb+srv).
Connection Properties: Optional connection properties in the form key1=value1&key2=value2. The most common property used for MongoDB is authSource=admin.
Binlog Replication: If enabled, the MongoDB operation log (oplog) is used to fetch changed and new rows. This setting cannot be changed once selected. For more details, see this; a conceptual sketch is shown below.
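To illustrate what log-based replication means, here is a conceptual sketch using a MongoDB change stream, which reads from the oplog and requires a replica set. This only illustrates the mechanism; it is not necessarily how Sprinkle implements it, and all names are placeholders:

```python
from pymongo import MongoClient

client = MongoClient("mongodb://sprinkle:password@mongo-host.example.com:27017/?authSource=admin")
coll = client["mydb"]["orders"]

# Receive inserts, updates, and deletes as they happen, instead of
# re-reading the whole collection on every run.
with coll.watch(full_document="updateLookup") as stream:
    for change in stream:
        print(change["operationType"], change.get("fullDocument"))
```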
Connect via SSH Host: If Yes:
SSH Host: IP address or hostname of the SSH server.
SSH Public Key: Add this public key to the ~/.ssh/authorized_keys file on the ssh host machine.
SSH Login Username
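For reference, an SSH tunnel forwards traffic from a local port, through the SSH host, to the private MongoDB server. Sprinkle establishes the tunnel for you from the settings above; this minimal sketch with the sshtunnel package, using placeholder hosts and paths, only illustrates the mechanism:

```python
from pymongo import MongoClient
from sshtunnel import SSHTunnelForwarder

# Forward a local port through the SSH host to the private MongoDB server.
with SSHTunnelForwarder(
    ("ssh-host.example.com", 22),             # public SSH server
    ssh_username="sprinkle",
    ssh_pkey="/path/to/private_key",          # counterpart of the public key in authorized_keys
    remote_bind_address=("10.0.0.5", 27017),  # private MongoDB address
) as tunnel:
    client = MongoClient("127.0.0.1", tunnel.local_bind_port)
    print(client.admin.command("ping"))
```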
Test Connection
Create
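If Test Connection fails, the same details can be checked from any machine that can reach the server. A minimal sketch with placeholder credentials, including the authSource=admin property mentioned above:

```python
from pymongo import MongoClient

uri = "mongodb://sprinkle:a-strong-password@mongo-host.example.com:27017/mydb?authSource=admin"
client = MongoClient(uri, serverSelectionTimeoutMS=5000)

# Raises ServerSelectionTimeoutError if the host, port, or credentials are wrong.
print(client.admin.command("ping"))
```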
STEP-3: Configure MongoDB data import
To learn about data import, refer here
Navigate to Ingest -> Data Imports Tab -> Setup Sources
Select the Mongo database
Provide the name -> Create
Connection Tab:
From the drop-down, select the name of the connection created in STEP-2
Update
STEP-4: Create a Dataset
Datasets Tab: To learn about Datasets, refer here. Add a Dataset for each collection that you want to replicate, providing the following details:
Database Name (Required): Database to select the collection from
Collection Name (Required): Collection to ingest
Destination Schema (Required): Data warehouse schema where the table will be ingested
Destination Table Name (Required): The table name to be created in the warehouse. If not given, Sprinkle will create it as ds_<datasourcename>_<tablename>.
Destination Create Table Clause: Provide additional clauses for the warehouse CREATE TABLE query, such as clustering or partitioning, useful for optimizing DML statements. Learn more on how to use this field.
Ingestion Mode (Required):
Complete: Ingest the full data from the source table in every ingestion job run. Choose this option if your table is small (under 1 million rows) and you want to ingest it infrequently (a few times a day).
Incremental: Ingest only the changed or inserted rows in every ingestion job run. Choose this option if your table is large and you want to ingest in near real-time. A time column needs to be provided here:
Time Column Type: Can be one of three data types: String, Date, or Numeric.
Time Column Name: The column used to identify changed or newly inserted rows.
To know more about Ingestion Modes, refer here. A conceptual sketch of incremental fetching is shown after these options.
Periodic Full Fetch (Required): If Yes, the full table is periodically fetched from the source. This is useful if data gets deleted at the source and you want to keep the warehouse in sync; if No, only incremental data is pulled in every run.
Full Fetch Interval (Required): Fetches all rows in the table at the selected interval. The interval is chosen by day of the week, or every night; for example, Every Sunday, Every Monday, or Every Night.
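As an illustration of incremental mode, a minimal sketch that fetches only rows newer than a stored high-water mark; the collection and column names are placeholders, and Sprinkle does this bookkeeping for you:

```python
from datetime import datetime, timezone
from pymongo import MongoClient

client = MongoClient("mongodb://sprinkle:password@mongo-host.example.com:27017/?authSource=admin")
coll = client["mydb"]["orders"]

# High-water mark saved from the previous run (placeholder value).
last_seen = datetime(2024, 1, 1, tzinfo=timezone.utc)

# Fetch only rows whose time column is newer than the last run, oldest first.
for doc in coll.find({"updated_at": {"$gt": last_seen}}).sort("updated_at", 1):
    last_seen = doc["updated_at"]  # advance the mark as rows are processed
```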
Automatic Schema (Required):
Yes: The schema is automatically discovered by Sprinkle (Recommended)
Flatten Level (Required): Select One Level or Multi Level. With One Level, flattening is not applied to complex types; they are stored as strings. With Multi Level, flattening is applied to complex types recursively until they become simple types. A sketch of the difference is shown below.
No: A warehouse schema must be provided. The format is: Col1 datatype, Col2 datatype, Col3 datatype. Datatypes should be warehouse specific.
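To make the two flatten levels concrete, a minimal sketch of how a nested document could be flattened at each level; the document and the resulting column names are hypothetical, not Sprinkle's exact output:

```python
import json

doc = {"id": 1, "customer": {"name": "Ada", "address": {"city": "Pune"}}}

# One Level: complex values are not flattened; they are stored as strings.
one_level = {k: json.dumps(v) if isinstance(v, dict) else v for k, v in doc.items()}
# -> {"id": 1, "customer": '{"name": "Ada", "address": {"city": "Pune"}}'}

# Multi Level: recurse into complex values until every value is a simple type.
def flatten(d, prefix=""):
    out = {}
    for k, v in d.items():
        key = f"{prefix}{k}"
        out.update(flatten(v, key + "_") if isinstance(v, dict) else {key: v})
    return out
# -> {"id": 1, "customer_name": "Ada", "customer_address_city": "Pune"}

print(one_level)
print(flatten(doc))
```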
Filter: For a date column filter, select the Date type and give the value as a string (format: yyyy-mm-dd) or in long format. For other data types, the filter value must be given in the appropriate format.
Date Type Filter, Other Data Type Filter:
Filter JSON: Provide the filter JSON enclosed in {}, for example {"key1":{"$gte":value1},"key2":{"$gt":value2}}. For more details, refer here. A sketch applying such a filter is shown below.
No Filter: No filter is required
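The Filter JSON uses ordinary MongoDB query syntax. A minimal sketch applying the same kind of filter with pymongo, with placeholder collection and column names, to show what a configured filter selects:

```python
from pymongo import MongoClient

client = MongoClient("mongodb://sprinkle:password@mongo-host.example.com:27017/?authSource=admin")
coll = client["mydb"]["orders"]

# Date stored as a string column: compare using the yyyy-mm-dd format.
print(coll.count_documents({"order_date": {"$gte": "2024-01-01"}}))

# Numeric column: values go in unquoted, as in the Filter JSON example above.
print(coll.count_documents({"amount": {"$gt": 1000}}))
```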
STEP-5: Run and schedule Ingestion
In the Ingestion Jobs tab:
Trigger the job using the Run button
To schedule, enable Auto-Run. Change the frequency if needed