docs/content/concepts/solids-pipelines/pipeline-execution.mdx
```yaml file=/concepts/solids_pipelines/multiprocessing.yaml
execution:
  multiprocess:
```
Note that because solids will be executed in separate processes, the values passed between solids must also be stored in a way that makes them accessible across processes. For example, you can configure <PyObject object="fs_io_manager" /> as the pipeline-wide IO manager so that intermediate values are persisted to the filesystem:
```python file=/concepts/solids_pipelines/pipeline_execution.py startafter=start_parallel_pipeline_marker endbefore=end_parallel_pipeline_marker
@pipeline(
    mode_defs=[ModeDefinition(resource_defs={"io_manager": fs_io_manager})]
)
def parallel_pipeline():
    total(return_one(), return_one(), return_one(), return_one())
```
See [IO Managers](/concepts/io-management/io-managers) for more details about configuring IO managers.
In addition, the multiprocess executor reconstructs the pipeline in each child process, so the Python APIs require a reconstructable instance of the pipeline. This is handled for you when using Dagit or the Dagster CLI.