For 1.6, our plan is to have a basic integration of MetricFlow with dbt-core for the semantic layer. For dbt-core 1.6, this integration looks like the following:

- core will parse these semantic layer objects into the manifest as nodes
- core will output a new semantic manifest artifact
- adapters will allow for the optional installation of MetricFlow (e.g. `$ pip install dbt-snowflake[metricflow]`, or alternatively `$ pip install "dbt-metricflow[snowflake]"`)
- querying will happen via the MetricFlow CLI for 1.6 (e.g. `$ mf query <metric_name> ...`; see the workflow sketch below)
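As a rough sketch of the intended workflow under this plan (the package and extras naming is still being settled, as the two install variants above show), installation and querying might look like the following. The metric name `revenue` and the `metric_time` dimension are hypothetical placeholders for illustration, not names defined by this issue:

```shell
# Install the optional MetricFlow dependency alongside the Snowflake adapter.
# (Exact package/extra spelling is still under discussion; see the two variants above.)
pip install "dbt-metricflow[snowflake]"

# Parse the project so the manifest (including the new semantic nodes) is written out.
dbt parse

# Query a metric through the MetricFlow CLI.
# `revenue` and `metric_time` are placeholder names for this sketch.
mf query --metrics revenue --group-by metric_time
```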
[P0] End-to-end
- Protocols #7470

[P0/1] Follow-ups from end-to-end
- `expr` #7865
- `use_approximate_percentile` and `use_discrete_percentile` keys default to None but are defined as bool in the protocol #7866
- `semantic_nodes` -> `semantic_models` #7907
- `depends_on` #7854
- `create_metric` property from end user facing `Measure` spec #8064
- `filter` attrs on `Metric` nodes #8065

[P0] Migration support

[P1] Adapters
- `explain` or `dry_run` method to BaseAdapter #7839