
Provide a more friendly error for incorrect usage of OutputDefinition
AbandonedPublic

Authored by natekupp on Fri, Aug 23, 6:01 PM.

Details

Reviewers
alangenfeld
Group Reviewers
Restricted Project
Summary
Test Plan

unit

Diff Detail

Repository
R1 dagster
Branch
output_err_msg
Lint
Lint OK
Unit
No Unit Test Coverage

Event Timeline

natekupp created this revision. Fri, Aug 23, 6:01 PM
natekupp retitled this revision from "Add OutputDefinition error message" to "Provide a more friendly error for incorrect usage of OutputDefinition". Fri, Aug 23, 6:03 PM
natekupp added a reviewer: Restricted Project.
alangenfeld requested changes to this revision.Fri, Aug 23, 6:28 PM
alangenfeld added a subscriber: alangenfeld.

it's definitely valuable to put in extra work making sure the error messages are good here

python_modules/dagster/dagster/core/definitions/output.py
39

hm, this is a bit specific - there are probably more likely errors, like referring to types that are not set up correctly or passing a value by accident.

I think if we just do a better job with the error message here, we don't need to mention "did you mean Output", since there will be clear context that creating an output definition failed.

If the handling is done in resolve_to_runtime_type, it should produce good error messages for InputDefinition as well
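A rough sketch of what centralized handling in the resolver could look like. The names `resolve_to_runtime_type` and `DagsterInvalidDefinitionError` come from the discussion above, but the body (including the placeholder type registry) is purely illustrative, not dagster's actual implementation:

```python
# Illustrative sketch only -- not dagster's real code. The idea is that one
# resolver produces clear errors for both OutputDefinition and InputDefinition.
class DagsterInvalidDefinitionError(Exception):
    pass

KNOWN_DAGSTER_TYPES = {"Int", "String", "Any"}  # placeholder registry


def resolve_to_runtime_type(dagster_type):
    # Passing a value (e.g. 5 or "foo") instead of a type is a common mistake.
    # Note: isinstance checks avoid equality comparisons, which can raise for
    # objects like pandas.DataFrame.
    if not isinstance(dagster_type, type) and not isinstance(dagster_type, str):
        raise DagsterInvalidDefinitionError(
            "Invalid type: expected a type or type name, got the value "
            "{value!r} of Python type {cls}.".format(
                value=dagster_type, cls=type(dagster_type).__name__
            )
        )
    name = dagster_type if isinstance(dagster_type, str) else dagster_type.__name__
    if name not in KNOWN_DAGSTER_TYPES:
        raise DagsterInvalidDefinitionError(
            "Invalid type: {name} is not a registered dagster type.".format(name=name)
        )
    return name
```

Because both definition classes would funnel through the same resolver, the error message improvements apply to each for free.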

python_modules/dagster/dagster_tests/core_tests/execution_tests/test_step_outputs.py
11

other test cases to make sure we error effectively on:

  • passed a value instead of a type during solid definition
  • passed a class that isn't a dagster type
  • the same cases for InputDefinition
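The cases above could be sketched roughly as follows. `OutputDefinition` here is a stand-in with made-up validation (the real class and its signature live in dagster); the point is just the shape of the tests:

```python
import pytest


class DagsterInvalidDefinitionError(Exception):
    """Stand-in for dagster's real error class."""


class OutputDefinition:
    # Stand-in that only validates its dagster_type argument; the real
    # signature and validation are dagster's, not this sketch's.
    def __init__(self, dagster_type):
        if not isinstance(dagster_type, type):
            raise DagsterInvalidDefinitionError(
                "Invalid type: got the value {!r} where a type was expected.".format(
                    dagster_type
                )
            )
        if not getattr(dagster_type, "is_dagster_type", False):
            raise DagsterInvalidDefinitionError(
                "Invalid type: {} is not a dagster type.".format(dagster_type.__name__)
            )


def test_value_instead_of_type():
    # Case: passed a value instead of a type.
    with pytest.raises(DagsterInvalidDefinitionError, match="where a type was expected"):
        OutputDefinition(5)


def test_non_dagster_type_class():
    # Case: passed a class that isn't a dagster type.
    class NotADagsterType:
        pass

    with pytest.raises(DagsterInvalidDefinitionError, match="is not a dagster type"):
        OutputDefinition(NotADagsterType)
```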
26–27

I would still keep this test case around - you can use the match argument to pytest.raises and match against a critical chunk of the error message, so minor wording changes don't force test updates
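For example (the error class and message here are placeholders; only the `pytest.raises(..., match=...)` pattern is the point):

```python
import pytest


class DagsterInvalidDefinitionError(Exception):  # placeholder for the real class
    pass


def make_bad_output_definition():
    # Placeholder that always fails, just to demonstrate the pattern.
    raise DagsterInvalidDefinitionError(
        "Invalid type: dagster_type must be a dagster type, got 5"
    )


def test_bad_output_definition_message():
    # `match` is applied with re.search against str(exc); matching only a
    # stable fragment means minor wording tweaks won't break the test.
    with pytest.raises(DagsterInvalidDefinitionError, match="must be a dagster type"):
        make_bad_output_definition()
```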

This revision now requires changes to proceed. Fri, Aug 23, 6:28 PM
natekupp abandoned this revision.Wed, Aug 28, 8:07 PM

I'm actually going to just abandon this; we can come back later. The real issue is that pandas.DataFrame raises an exception if you try to use a comparison against it in a boolean context, which breaks a lot of our value-detection code. I gave fixing it a shot, but it's not obvious to me how to correct it without introducing some really gross stuff.
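The failure mode described is easy to reproduce: comparing a DataFrame succeeds elementwise, but the resulting DataFrame has an ambiguous truth value, so naive `value == something` checks used in boolean context blow up. A minimal illustration (requires pandas):

```python
import pandas as pd

df = pd.DataFrame({"a": [1, 2]})

# Elementwise comparison succeeds and returns a DataFrame of booleans...
result = df == 1

# ...but using that result in a boolean context raises ValueError, which is
# what breaks value-detection code that does `if candidate == some_type:`.
try:
    bool(result)
except ValueError as exc:
    print("comparison is ambiguous:", exc)
```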