- Adds a test for the hammer pipeline, running on dask, through the dagit codepath.
- Allows users to pass an instance to the make_airflow_dag constructor -- events returned
You should be able to get the instance from here: https://dagster.phacility.com/source/dagster/browse/master/python_modules/dagster/dagster/core/execution/context/system.py$164-166
I think it's probably better to enforce that it be passed, as opposed to defaulting to an ephemeral instance.
I would like to see this client eventually be user-facing (like, for people who want to noodle around with GraphQL but not do all the parsing and error handling and query construction themselves), so I'm not sure about this.
Why are we doing Airflow changes in a diff titled "Fixes for Dask execution"?
I think we should do

instance = check.inst_param(instance, 'instance', DagsterInstance) if instance else DagsterInstance.get(...)

because that function does some real work.
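The guard suggested above can be sketched as follows. This is a minimal, self-contained illustration of the pattern, not dagster's actual implementation: `DagsterInstance` and `inst_param` here are simplified stand-ins for the real `dagster` classes and `check` helpers.

```python
class DagsterInstance:
    """Stand-in for the real dagster instance class."""

    @staticmethod
    def get():
        # In the real library this does "real work" (loading config,
        # wiring up storage, etc.), which is why the review suggests
        # calling it only when no instance was passed in.
        return DagsterInstance()


def inst_param(obj, param_name, ttype):
    # Simplified version of check.inst_param: validate a parameter's type.
    if not isinstance(obj, ttype):
        raise TypeError(
            'Param "{0}" must be a {1}, got {2}'.format(
                param_name, ttype.__name__, type(obj).__name__
            )
        )
    return obj


def make_airflow_dag(instance=None):
    # Validate a caller-supplied instance; fall back to the (expensive)
    # default lookup only when none was provided.
    instance = (
        inst_param(instance, "instance", DagsterInstance)
        if instance is not None
        else DagsterInstance.get()
    )
    return instance
```

The point of the conditional is that `DagsterInstance.get()` is never invoked when the caller already supplied an instance, while a wrongly-typed argument still fails fast with a clear error.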
I still feel icky when I see business logic against the environment dict.
@alangenfeld and I discussed this -- the alternative is to push knowledge about the Airflow executor deeper into the core. We don't have a good abstraction right now to express semantic limitations/predicates on elements of the config dict beyond their types, so engines, etc., can't declare "I need these restrictions above and beyond typechecking". If we don't perform these checks/substitutions here, then perfectly valid config (which works fine against the in-process executor) will appear fine against Airflow, but then lead to opaque failures in the middle of the pipeline when intermediates are not available.
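The kind of semantic check being described can be sketched like this. This is an illustrative, hypothetical helper (not dagster's actual API): config that passes typechecking can still be invalid for the Airflow executor, because Airflow tasks run in separate processes and in-memory intermediates never survive across steps.

```python
def validate_config_for_airflow(environment_dict):
    # Hypothetical pre-flight check run before building the Airflow DAG.
    # Typechecking alone would accept in-memory storage, but steps running
    # in separate Airflow tasks can't see each other's in-memory
    # intermediates, so require a persistent storage key instead.
    storage = environment_dict.get("storage") or {}
    if not storage or "in_memory" in storage:
        raise ValueError(
            "Airflow execution requires persistent intermediates storage "
            '(e.g. {"storage": {"filesystem": {}}}); in-memory storage '
            "would cause opaque mid-pipeline failures when intermediates "
            "are not available."
        )
    return environment_dict
```

Failing loudly at DAG-construction time, rather than mid-run, is exactly the trade-off the comment above is defending.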