Adapters
Import semantic models from Cube, MetricFlow, LookML, Hex, Rill, Superset, Omni, BSL, Malloy, Snowflake Cortex, OSI, AtScale SML, GoodData, Holistics, and ThoughtSpot into Sidemantic
Sidemantic can import semantic models from other popular semantic layer formats, letting you use your existing metric definitions with Sidemantic's query engine and features.
Supported Formats
| Format | Import | Notes |
|---|---|---|
| Sidemantic (native) | ✅ | Full feature support |
| Cube | ✅ | No native segments |
| MetricFlow (dbt) | ✅ | No native segments or hierarchies |
| LookML (Looker) | ✅ | Liquid templating (not Jinja) |
| Hex | ✅ | No segments or cross-model derived metrics |
| Rill | ✅ | No relationships, segments, or cross-model metrics; single-model only |
| Superset (Apache) | ✅ | No relationships in datasets |
| Omni | ✅ | Relationships in separate model file |
| BSL (Boring Semantic Layer) | ✅ | Ibis-style expressions; supports roundtrip export |
| Malloy | ✅ | Parsed via .malloy files |
| Snowflake Cortex | ✅ | Snowflake semantic model YAML |
| OSI (Open Semantic Interchange) | ✅ | Vendor-agnostic semantic model YAML; supports import/export |
| AtScale SML | ✅ | Repository-style SML import/export |
| GoodData LDM | ✅ | Cloud and legacy LDM JSON import/export |
| Holistics AML | ✅ | AML model import/export |
| ThoughtSpot TML | ✅ | Table and worksheet TML import/export |
Feature Compatibility
This table shows which Sidemantic features are supported when importing from other formats:
| Feature | Sidemantic | Cube | MetricFlow | LookML | Hex | Rill | Superset | Omni | BSL | Notes |
|---|---|---|---|---|---|---|---|---|---|---|
| Models | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | All formats support models/tables |
| Dimensions | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | All formats support dimensions |
| Simple Metrics | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | All formats support sum, count, avg, min, max |
| Time Dimensions | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | All formats support time dimensions with granularity |
| Relationships | ✅ | ✅ | ✅ | ✅ | ✅ | ❌ | ❌ | ✅ | ✅ | Rill/Superset: single-model only; Omni: in model file |
| Derived Metrics | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | All formats support calculated metrics |
| Metric Filters | ✅ | ✅ | ❌ | ✅ | ✅ | ⚠️ | ❌ | ✅ | ❌ | Rill has basic support; Superset/BSL lack filters |
| Ratio Metrics | ✅ | ✅ | ✅ | ✅ | ✅ | ❌ | ❌ | ✅ | ❌ | Rill/Superset/BSL don't have native ratio metric type |
| Segments | ✅ | ✅ | ❌ | ✅ | ❌ | ❌ | ❌ | ❌ | ❌ | Only Cube and LookML have native segment support |
| Cumulative Metrics | ✅ | ✅ | ✅ | ❌ | ❌ | ❌ | ❌ | ❌ | ❌ | Cube has rolling_window; MetricFlow has cumulative; others lack native support |
| Time Comparison | ✅ | ❌ | ✅ | ❌ | ❌ | ❌ | ❌ | ❌ | ❌ | Only MetricFlow has native time comparison metrics |
| Jinja Templates | ⚠️ | ✅ | ✅ | ⚠️ | ✅ | ✅ | ✅ | ✅ | ❌ | Sidemantic renders templates only in filters; LookML uses Liquid; BSL uses Ibis |
| Hierarchies | ✅ | ⚠️ | ❌ | ⚠️ | ❌ | ❌ | ❌ | ⚠️ | ❌ | Cube/LookML/Omni: via drill_fields |
| Inheritance | ✅ | ❌ | ❌ | ✅ | ❌ | ❌ | ❌ | ❌ | ❌ | Only LookML has native extends support |
| Metadata Fields | ✅ | ⚠️ | ⚠️ | ⚠️ | ⚠️ | ⚠️ | ✅ | ✅ | ✅ | Label and description support varies by format |
| Parameters | ⚠️ | ❌ | ❌ | ❌ | ❌ | ❌ | ❌ | ❌ | ❌ | Deprecated; Python-only; filter interpolation only |
| Ungrouped Queries | ✅ | ❌ | ❌ | ❌ | ❌ | ❌ | ❌ | ❌ | ❌ | Sidemantic-only feature |
Legend:
- ✅ Full support - feature fully supported on import
- ⚠️ Partial support - feature works with limitations
- ❌ Not supported - feature not available in source format
Malloy, Snowflake Cortex, OSI, AtScale SML, GoodData LDM, Holistics AML, and ThoughtSpot TML are supported for import. Detailed mapping notes for each are included below.
Importing into Sidemantic
Quick Start: Auto-Discovery
The easiest way to load semantic models from any format:
```python
from sidemantic import SemanticLayer, load_from_directory

# Point at a directory with mixed formats
layer = SemanticLayer(connection="duckdb:///data.db")
load_from_directory(layer, "semantic_models/")

# That's it! Automatically:
# - Discovers .sql, .lkml, .malloy, .json, .aml, .tml, and .yml/.yaml files
# - Detects formats: Sidemantic, Cube, MetricFlow, LookML, Hex, BSL, Snowflake Cortex, Omni, Rill, Superset, OSI, GoodData, Holistics, ThoughtSpot
# - Detects AtScale SML repositories by catalog/object_type structure
# - Parses with the right adapter
# - Infers relationships from foreign key naming
# - Builds the join graph
```
How Relationship Inference Works
load_from_directory() automatically infers relationships based on foreign key naming conventions:
- `orders.customer_id` → `customers.id` (many-to-one)
- `line_items.order_id` → `orders.id` (many-to-one)
- `products.category_id` → `categories.id` (many-to-one)
It tries both singular and plural forms, so customer_id will match both customer and customers tables.
Reverse relationships (one-to-many) are automatically added to the target model.
Manual Adapter Usage
For more control over the import process, you can use adapters directly:
From Cube
Read Cube.js semantic models into Sidemantic:
```python
from sidemantic.adapters.cube import CubeAdapter

# Import from Cube YAML
adapter = CubeAdapter()
graph = adapter.parse("cube/schema/Orders.yml")

# Query with Sidemantic
from sidemantic import SemanticLayer

layer = SemanticLayer()
layer.graph = graph
result = layer.sql("SELECT orders.revenue FROM orders")
```
From MetricFlow
Read dbt MetricFlow models into Sidemantic:
```python
from sidemantic.adapters.metricflow import MetricFlowAdapter

# Import from MetricFlow YAML
adapter = MetricFlowAdapter()
graph = adapter.parse("models/metrics/")  # Directory of YAML files

# Query with Sidemantic
from sidemantic import SemanticLayer

layer = SemanticLayer()
layer.graph = graph
result = layer.sql("SELECT orders.revenue FROM orders")
```
From LookML
Read Looker LookML views into Sidemantic:
```python
from sidemantic.adapters.lookml import LookMLAdapter

# Import from LookML
adapter = LookMLAdapter()
graph = adapter.parse("views/orders.lkml")  # Single file or directory

# Query with Sidemantic
from sidemantic import SemanticLayer

layer = SemanticLayer()
layer.graph = graph
result = layer.sql("SELECT orders.revenue FROM orders")
```
From Hex
Read Hex semantic models into Sidemantic:
```python
from sidemantic.adapters.hex import HexAdapter

# Import from Hex YAML
adapter = HexAdapter()
graph = adapter.parse("hex/models/")  # Directory of YAML files

# Query with Sidemantic
from sidemantic import SemanticLayer

layer = SemanticLayer()
layer.graph = graph
result = layer.sql("SELECT orders.revenue FROM orders")
```
From Rill
Read Rill metrics views into Sidemantic:
```python
from sidemantic.adapters.rill import RillAdapter

# Import from Rill YAML
adapter = RillAdapter()
graph = adapter.parse("rill/metrics/")  # Directory of YAML files

# Query with Sidemantic
from sidemantic import SemanticLayer

layer = SemanticLayer()
layer.graph = graph
result = layer.compile(metrics=["orders.revenue"])
```
From Superset
Read Apache Superset datasets into Sidemantic:
```python
from sidemantic.adapters.superset import SupersetAdapter

# Import from Superset YAML
adapter = SupersetAdapter()
graph = adapter.parse("superset/datasets/")  # Directory of YAML files

# Query with Sidemantic
from sidemantic import SemanticLayer

layer = SemanticLayer()
layer.graph = graph
result = layer.sql("SELECT total_revenue FROM orders")
```
From Omni
Read Omni Analytics views into Sidemantic:
```python
from sidemantic.adapters.omni import OmniAdapter

# Import from Omni YAML views
adapter = OmniAdapter()
graph = adapter.parse("omni/")  # Directory with views/ subdirectory and model.yaml

# Query with Sidemantic
from sidemantic import SemanticLayer

layer = SemanticLayer()
layer.graph = graph
result = layer.sql("SELECT total_revenue FROM orders")
```
From BSL
Read Boring Semantic Layer models into Sidemantic:
```python
from sidemantic.adapters.bsl import BSLAdapter

# Import from BSL YAML
adapter = BSLAdapter()
graph = adapter.parse("bsl/models/")  # Directory of YAML files

# Query with Sidemantic
from sidemantic import SemanticLayer

layer = SemanticLayer()
layer.graph = graph
result = layer.sql("SELECT orders.revenue FROM orders")

# Export back to BSL (roundtrip supported)
adapter.export(graph, "output/bsl_models.yml")
```
From Malloy
Read Malloy semantic models into Sidemantic:
```python
from sidemantic.adapters.malloy import MalloyAdapter

# Import from Malloy
adapter = MalloyAdapter()
graph = adapter.parse("models/")  # Directory or single .malloy file

# Query with Sidemantic
from sidemantic import SemanticLayer

layer = SemanticLayer()
layer.graph = graph
result = layer.sql("SELECT orders.revenue FROM orders")
```
From Snowflake Cortex
Read Snowflake Cortex semantic model YAML into Sidemantic:
```python
from sidemantic.adapters.snowflake import SnowflakeAdapter

# Import from Snowflake semantic model YAML
adapter = SnowflakeAdapter()
graph = adapter.parse("snowflake/semantic_model.yaml")  # File or directory

# Query with Sidemantic
from sidemantic import SemanticLayer

layer = SemanticLayer()
layer.graph = graph
result = layer.sql("SELECT orders.total_revenue FROM orders")
```
From OSI (Open Semantic Interchange)
Read OSI YAML semantic models into Sidemantic:
```python
from sidemantic.adapters.osi import OSIAdapter

# Import from OSI YAML
adapter = OSIAdapter()
graph = adapter.parse("osi/semantic_model.yaml")  # File or directory

# Query with Sidemantic
from sidemantic import SemanticLayer

layer = SemanticLayer()
layer.graph = graph
result = layer.sql("SELECT orders.total_revenue FROM orders")

# Export back to OSI (roundtrip supported)
adapter.export(graph, "output/osi_semantic_model.yml")
```
From AtScale SML
Read AtScale SML repositories into Sidemantic:
```python
from sidemantic.adapters.atscale_sml import AtScaleSMLAdapter

# Import from an AtScale SML repository directory
adapter = AtScaleSMLAdapter()
graph = adapter.parse("atscale_repo/")

# Query with Sidemantic
from sidemantic import SemanticLayer

layer = SemanticLayer()
layer.graph = graph
result = layer.sql("SELECT sales.total_sales FROM sales")

# Export back to SML repository structure
adapter.export(graph, "output/atscale_repo/")
```
From GoodData LDM
Read GoodData LDM JSON into Sidemantic:
```python
from sidemantic.adapters.gooddata import GoodDataAdapter

# Import from GoodData LDM JSON (cloud or legacy)
adapter = GoodDataAdapter()
graph = adapter.parse("gooddata/ldm.json")  # File or directory of JSON files

# Query with Sidemantic
from sidemantic import SemanticLayer

layer = SemanticLayer()
layer.graph = graph
result = layer.sql("SELECT orders.total_revenue FROM orders")

# Export back to GoodData LDM JSON
adapter.export(graph, "output/gooddata_ldm.json")
```
From Holistics AML
Read Holistics AML semantic models into Sidemantic:
```python
from sidemantic.adapters.holistics import HolisticsAdapter

# Import from Holistics AML
adapter = HolisticsAdapter()
graph = adapter.parse("holistics/")  # Directory or single .aml file

# Query with Sidemantic
from sidemantic import SemanticLayer

layer = SemanticLayer()
layer.graph = graph
result = layer.sql("SELECT orders.total_revenue FROM orders")

# Export back to AML files
adapter.export(graph, "output/holistics/")
```
From ThoughtSpot TML
Read ThoughtSpot TML tables/worksheets into Sidemantic:
```python
from sidemantic.adapters.thoughtspot import ThoughtSpotAdapter

# Import from ThoughtSpot TML
adapter = ThoughtSpotAdapter()
graph = adapter.parse("thoughtspot/")  # Directory or single .tml/.yaml file

# Query with Sidemantic
from sidemantic import SemanticLayer

layer = SemanticLayer()
layer.graph = graph
result = layer.sql("SELECT sales.total_revenue FROM sales")

# Export back to ThoughtSpot TML
adapter.export(graph, "output/thoughtspot/")
```
Import Mapping
These sections describe how each format's concepts map to Sidemantic when importing.
Cube
- `cubes` → `models`
- `dimensions` → `dimensions`
- `measures` → `metrics`
- `joins` → `relationships` (inferred from join definitions)
- `${CUBE}` placeholder → `{model}` placeholder
- `segments` → `segments` (native support)
- Calculated measures (`type: number`) → derived metrics
- `rolling_window` → cumulative metrics
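As a rough illustration of the placeholder mapping above (not the adapter's actual code), translating Cube's self-reference into Sidemantic's placeholder is a plain string substitution:

```python
def rewrite_cube_placeholder(sql: str) -> str:
    # Cube measures reference their own cube as ${CUBE};
    # Sidemantic uses a {model} placeholder instead.
    return sql.replace("${CUBE}", "{model}")

print(rewrite_cube_placeholder("${CUBE}.amount - ${CUBE}.discount"))
# -> {model}.amount - {model}.discount
```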
MetricFlow
- `semantic_models` → `models`
- `entities` → inferred `relationships`
- `dimensions` → `dimensions`
- `measures` → model-level `metrics`
- `metrics` (graph-level) → graph-level `metrics`
- Segments/hierarchies from `meta` field → preserved
LookML
- `views` → `models`
- `explores` → `relationships` (parsed from join definitions)
- `dimensions` → `dimensions`
- `dimension_group` → multiple time dimensions (one per timeframe)
- `measures` → `metrics`
- `filters` (view-level) → `segments`
- `derived_table` → model with SQL
- `${TABLE}` placeholder → `{model}` placeholder
- Measure filters parsed from `filters__all`
- Foreign keys extracted from `sql_on` in explore joins
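Extracting join keys from a `sql_on` clause amounts to pulling out the `${view.field}` references. A minimal sketch, assuming a regex-based approach (the `extract_join_keys` helper is hypothetical, not Sidemantic's actual parser):

```python
import re

SQL_ON_REF = re.compile(r"\$\{(\w+)\.(\w+)\}")

def extract_join_keys(sql_on: str) -> list[tuple[str, str]]:
    """Pull (view, field) references out of a LookML sql_on clause."""
    return SQL_ON_REF.findall(sql_on)

pairs = extract_join_keys("${orders.customer_id} = ${customers.id}")
print(pairs)  # [('orders', 'customer_id'), ('customers', 'id')]
```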
Hex
- Model `id` and `base_sql_table`/`base_sql_query` → `models`
- `dimensions` with `expr_sql` or `expr_calc` → `dimensions`
- `measures` with `func`/`func_sql`/`func_calc` → `metrics`
- `relations` with `join_sql` → `relationships`
- Measure `filters` (inline or referenced) → metric filters
- `unique: true` dimensions → primary key detection
- `timestamp_tz`/`timestamp_naive`/`date` types → time dimensions
Rill
- `metrics_view` (type) → `models`
- `dimensions` with `column`/`expression` → `dimensions`
- `measures` with `expression` → `metrics`
- `timeseries` column → time dimension
- `smallest_time_grain` → time dimension granularity
- Derived measures (`type: derived`) → derived metrics
- Simple aggregation expressions parsed with sqlglot
Superset
- `table_name` → model `name`
- `schema` + `table_name` → model `table`
- `sql` → model `sql` (for virtual datasets)
- `columns` → `dimensions`
- `metrics` → model `metrics`
- `main_dttm_col` → time dimension detection
- `verbose_name` → `label` field
- `is_dttm` flag → time dimension type
- `metric_type` → aggregation mapping (count, sum, avg, etc.)
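The `metric_type` mapping can be pictured as a simple lookup table. This is a hypothetical sketch of the idea, not the adapter's actual table; the fallback value is an assumption:

```python
# Hypothetical metric_type -> aggregation lookup
SUPERSET_METRIC_TYPES = {
    "count": "count",
    "count_distinct": "count_distinct",
    "sum": "sum",
    "avg": "avg",
    "min": "min",
    "max": "max",
}

def map_metric_type(metric_type: str) -> str:
    # Unknown types fall back to treating the expression as derived SQL
    return SUPERSET_METRIC_TYPES.get(metric_type.lower(), "derived")

print(map_metric_type("SUM"))  # sum
```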
Omni
- `name` (view) → model `name`
- `schema` + `table_name` → model `table`
- `sql` → model `sql` (for SQL-based views)
- `dimensions` → `dimensions`
- `measures` with `aggregate_type` → `metrics`
- `timeframes` → time dimension granularity
- `label` → model `description` (if no description field)
- `${TABLE}` placeholder → `{model}` placeholder
- `${view.field}` references → simplified field references
- Measure `filters` → metric filters
- `relationships` (from model.yaml) → model relationships
BSL (Boring Semantic Layer)
- Model keys (top-level YAML) → `models`
- `table` → model `table`
- `dimensions` → `dimensions`
- `measures` → `metrics`
- `joins` → `relationships`
- `_.column` expression → `sql: "column"`
- `_.column.sum()` → `agg: sum, sql: "column"`
- `_.column.mean()` → `agg: avg, sql: "column"`
- `_.column.nunique()` → `agg: count_distinct, sql: "column"`
- `_.count()` → `agg: count`
- `_.column.year()` → `EXTRACT(YEAR FROM column)`
- `is_time_dimension: true` → `type: "time"`
- `smallest_time_grain: "TIME_GRAIN_DAY"` → `granularity: "day"`
- `is_entity: true` → primary key detection
- Calc measures (referencing other measures) → derived metrics
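The Ibis-style expression mapping above can be sketched as a small translator. This is an illustrative regex-based version under the stated mappings, not BSLAdapter's actual implementation:

```python
import re

# Ibis method name -> Sidemantic aggregation name
METHOD_TO_AGG = {"sum": "sum", "mean": "avg", "nunique": "count_distinct"}

METHOD_EXPR = re.compile(r"^_\.(\w+)\.(\w+)\(\)$")

def translate_bsl_measure(expr: str) -> dict:
    """Map a BSL Ibis-style measure string to Sidemantic-style fields."""
    if expr == "_.count()":
        return {"agg": "count"}
    match = METHOD_EXPR.match(expr)
    if match:
        column, method = match.groups()
        if method in METHOD_TO_AGG:
            return {"agg": METHOD_TO_AGG[method], "sql": column}
    # Bare column reference: _.amount -> sql: "amount"
    bare = re.match(r"^_\.(\w+)$", expr)
    if bare:
        return {"sql": bare.group(1)}
    raise ValueError(f"unrecognized expression: {expr}")

print(translate_bsl_measure("_.amount.sum()"))  # {'agg': 'sum', 'sql': 'amount'}
```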
Malloy
- `source` definitions → `models`
- `dimension` fields → `dimensions` (type inferred from expression/name)
- `measure` fields → `metrics` (aggregates and derived expressions)
- `join_one`/`join_many` → `relationships`
- Source-level `where` clauses → `segments`
- `pick ... when ... else ...` → SQL `CASE` expressions
- Imported `.malloy` files are resolved recursively (with cycle protection)
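The `pick ... when ... else ...` translation amounts to reordering the clause into a `CASE` expression. A toy single-branch sketch (hypothetical helper, not the adapter's real translator, which would need to handle multiple branches):

```python
import re

def pick_to_case(expr: str) -> str:
    """Translate a single-branch Malloy pick into a SQL CASE expression."""
    m = re.match(r"pick\s+(.+?)\s+when\s+(.+?)\s+else\s+(.+)", expr)
    if not m:
        return expr  # pass through anything we don't recognize
    value, cond, default = m.groups()
    return f"CASE WHEN {cond} THEN {value} ELSE {default} END"

print(pick_to_case("pick 'big' when amount > 100 else 'small'"))
# -> CASE WHEN amount > 100 THEN 'big' ELSE 'small' END
```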
Snowflake Cortex
- `tables` → `models`
- `base_table` (database/schema/table) → model `table`
- `dimensions` → `dimensions` (categorical/numeric/boolean by data type)
- `time_dimensions` → `dimensions` with `type: "time"`
- `facts` (`default_aggregation`) → simple `metrics`
- Table `metrics` expressions → simple or derived `metrics`
- Table `filters` → `segments`
- Semantic-model-level `relationships` → model `relationships`
OSI (Open Semantic Interchange)
- `semantic_model[].datasets` → `models`
- Dataset `fields` → `dimensions` (time via `dimension.is_time`)
- `semantic_model[].metrics` → graph-level `metrics`
- `semantic_model[].relationships` → model `relationships` (many-to-one)
- Multi-column keys in `primary_key`, `unique_keys`, and relationships are preserved
- `ai_context` and `custom_extensions` map into Sidemantic `meta` fields
AtScale SML
- `dataset` objects → `models`
- `dimension` level attributes/hierarchies → `dimensions` (including parent hierarchy links)
- `metric` and `metric_calc` objects → `metrics`
- `model` relationships → model `relationships`
- `aggregates` → `pre_aggregations`
- `drillthroughs` → metric `drill_fields`
GoodData LDM
- LDM `datasets` → `models`
- `attributes` → `dimensions`
- `facts` → `metrics`
- Dataset `references` → model `relationships`
- `dateInstances`/`dateDimensions` → date models with time dimensions
- Cloud and legacy JSON structures are both supported
Holistics AML
- `Model` blocks → `models`
- `dimension` blocks → `dimensions`
- `measure` blocks → `metrics`
- `Relationship` blocks and dataset relationship configs → model `relationships`
- `@sql`/`@aql` definitions → SQL expressions (AQL translated to SQL)
- `extend`, `PartialModel`, and `use` imports are resolved during parse
ThoughtSpot TML
- `table` TML columns → model `dimensions` and `metrics`
- `worksheet` TML tables/joins/formulas/worksheet columns → SQL-backed model with relationships
- `joins_with` and worksheet joins → model `relationships`
- Date bucket settings → time dimension granularity
- Unsupported aggregation functions are imported as derived metric SQL
Validating Imports
Always validate after importing:
```python
# Import (using any adapter from the sections above)
graph = adapter.parse("source.yml")

# Verify models loaded
print(f"Loaded {len(graph.models)} models")
for name, model in graph.models.items():
    print(f"  {name}: {len(model.metrics)} metrics, {len(model.dimensions)} dimensions")

# Verify metrics
print(f"Loaded {len(graph.metrics)} graph-level metrics")

# Test query generation
from sidemantic import SemanticLayer

layer = SemanticLayer()
layer.graph = graph
sql = layer.compile(metrics=["orders.revenue"])
print("Generated SQL:", sql)
```
Getting Help
If you encounter issues with format conversion:
- Check the compatibility table for known limitations
- Validate your source format is correctly structured
- Test with a simple model first before converting complex definitions
- File an issue at github.com/sidequery/sidemantic with:
- Source format and file
- Expected vs actual behavior
- Generated SQL or error messages