Proposal: required fields and related issues #822
Comments
One motivation for this breaking change is "good API design", OK, but what about configuration? Most of my configuration inputs are required values. Am I the only one?
Can you give a concrete example (ideally real-life) where you would expect things to be concrete? Also note that the plan is to not initially change it and give people plenty of opportunity to play around with it. But would be great to see good arguments in favor of the current semantics. |
@eonpatapon: part of the reasoning: either configurations are small, in which case it won't hurt to specify a few extra !'s, or they are not, in which case the likelihood that all fields are required is tiny, while specifying ? everywhere is both tedious and error prone. |
Anecdotal experience from my DevOps work... When I look at Terraform, Ansible, Kubernetes, Helm, cloud provider APIs, and a host of other programs, I find very few required fields and many more optional fields that typically have defaults. When creating configuration for tools I develop in this role, I try to follow this pattern as well.
Thanks for the extensive context given in the proposal. I'm a bit concerned with the direction of CUE after reading it, though, as much of the justification for the suggested changes seems to be either about performance or about cases in which the most obvious approach doesn't work as it implies. I'll exemplify the concern with a few quotes:
The intuitive notion we all have is that things are either required or optional, without in betweens. I realize that this is really concrete vs. non-concrete and present vs. non-present, but if we have to explain optional fields -- such a simple idea -- to the reader and point to a spec, we've lost a shot in terms of design. This is such a visible issue that even the proposal itself confuses the point by calling it required sometimes, and the chosen syntax reinforces the problem by them being opposites (! vs. ?). Intuitively, it seems fair to say that when one expresses something as simple as this:
What is meant here is that the outcome should be Now, speaking specifically about whether optional is more frequent than required or not, we could easily preserve the above semantics and have Demonstrating these points with another quote from the proposal:
This is a great example because it seems hard to argue about what the expression should mean. Instead of adding syntax to enable what people want to do to be possible, it would be better to make sure that this does what people think it would do. Another one:
Again, similar idea: there's little doubt when reading that query that the user did in fact expect name to be present in the document as a concrete value if the query was done against a body of source to extract a known value. It is only the lack of context that generates the possibility of One more:
Same idea: seeing both It's worth saying I'm fresh on CUE, which means I may still be misunderstanding concepts and be missing important design decisions, but it also means that for now I still speak as most people that spend less time on it would see it. Those groups are an important audience for projects I oversee more closely, at least. As a simplistic suggestion, given the issues I've learned about in this proposal, I would look into transforming the behavior of the bare Again, thanks for the clearly stated proposal, and I apologize for speaking without much background at this stage. Hopefully I didn't miss the key ideas too much. |
Like @niemeyer, my first reading of this proposal had me feeling unworthy. So much of it seemed backwards and confusing to me, but I left it assuming that I don't understand CUE well enough yet to see why it should be this way. |
Another thought, again echoing @niemeyer's statements: I read "required" and "optional" as antonyms, with no third choice. Including both a required designator ( I find it most intuitive to treat an unadorned field as required, with a special adornment to indicate that a field is optional. |
@seh and @niemeyer, thanks for the feedback. I'm not entirely happy, of course, about both flags being present, but so far I have not been able to find a satisfactory solution with only one of those. The current semantics does not function well for expressing policy as well as a slew of constraints. The introduction of Part of the issue is exactly having data and types in a single view. It simplifies a lot, but it does sometime require some extra expressiveness in clarifying intent. Perhaps the best way forward is to collect a few key examples that expose the entire set of problems and requirements to see if there is a solution where all cases can be satisfied by using either Of course I would be very open to a simpler solution that satisfies all these cases. |
All the big thoughts come to me after I write something here, so take whatever I'm writing this time with a grain of salt. As you noted, I can see how CUE's various roles for fields—or for the values in fields—makes this more complicated than something like Protocol Buffer messages. CUE fields can include default values, which makes them "optional" in the sense that no one has to provide a value, but the field might still be required to be present in an exported view. That's different from an optional field that has no default value, but could be omitted from an exported view if it lacks a concrete value. A few questions:
@seh, note that this proposal does not change the semantics of optional fields (using ?):
Nope, it would remain optional. A default value of an optional field is only relevant if unified with a non-optional field with a more general value, that leaves the default and non-default value. The same would hold for required fields with optional values. I haven't really seen a use case for this behavior in practice. It's just a consequence of keeping these constructs orthogonal. |
Same applies here. For both ! and ? specifying a concrete value does not make it a regular field. So Similarly, |
When you say "must be specified," is it necessary for someone to restate that, say, the "kind" field's value is "Service," to confirm the required value, or is that If you instead wrote |
Indeed that would make it easier to explore options. It would also be useful to collect some context about the use case, as given the proposal above it seems that part of the issue is contextual and perhaps we can do without some of the syntax by taking that into consideration.
It's easy to see the motivation for both of these use cases. What is not clear is what else sits in between them. That is, I'd guess most people would think that specifying Going back to the proposal to illustrate both this and the note above on contextual behavior, we see that example:
Isn't the following a more natural way to express the intent above:
Per earlier comment, the error is contextual since the latter expression remains valid while manipulating the state. It is the action on that value -- exporting, vetting, etc -- that turns the lack of concrete values into an error. Can't we take that context into account, instead of adding syntax to express the same thing in a slightly different way? Depending on use cases, it could be useful to expose some of that contextual difference to the language by having a built-in that does the same operation to a specific value. Using the example above, it might be possible to offer something along the lines of Again I need to add that disclaimer here pointing out that I don't have knowledge of the engine implementation at this time, and I'm trying to understand both the problem and the space of feasible solutions by extrapolating the proposal above. |
A correction to the above example:
The presence of |
@seh: with "must be specified," I mean indeed that it must be specified and cannot be omitted. So it will not be the same as just saying |
@mpvl We both understand that. What we're pointing out is that the intuition is a bare string being in fact required, and reading that issue (#740) I can see @narqo was saying precisely the same there:
Emphasis on feels unintuitive. He's not asking for a different feature. He's saying the current language surprises him, which is what I've been exploring above. We already have three different constructs that surround this behavior:
We shouldn't need a fourth "actually required" syntax, but rather just fine tune the existing one to match intuition a bit more closely. |
In case it isn't clear, the problem is that non-optional fields are often used to hold calculated values that aren't necessary to provide explicitly.
This is currently fine: If If we removed the existing behaviour, it would be a radical change to CUE. For example it would mean that |
I'm still missing something about Roger's point. In the example above, |
Also, at risk of putting words into @narqo's mouth from #740, I do now see that he intended that each
My current interpretation of the meaning of his I don't understand the motivation to require each |
Can you give an example of how you would express the issue in #740 with the constructs that are currently there? What kind of changes do you see that making possible? |
Here's a concrete example (you can run it with the testscript command):
In In other words, it's concrete in the exported data but not required in the input data. AIUI, the |
@mpvl I'm still trying to understand use cases and details of CUE itself, which means my proposal will likely miss important points, but with the conversation so far and the examples from the proposal, what if:

1. Regular fields become required

In other words, the behavior for the proposed
So this sorts out #740, but opens up problems in current usage. So let's fix those.

2. Default values are extended

We already have the idea of default values today, which is something people understand well from past experiences. So, this already works fine today and we'd keep it working as-is:
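For instance, with today's syntax (the field name is only illustrative):

```
replicas: int | *1
```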
In addition, we'd extend default values so that providing a single default is supported:
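Presumably something like this, where the bare default marker is the proposed extension (illustrative):

```
replicas: *3
```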
This matches what we have today as regular fields in definitions. The field is required per point 1, but has a default.

3. Optional fields are untouched

Doesn't look like we have any issues with those, so they'd stay as-is.

...

As I said, I'm probably missing important details. What issues would that create?
This interpretation works well for configuration generation, but it is unsatisfactory for validation or policy in general. A very simple but important use case is to be able to ask the question: "Is this yaml file valid for consumption by some service as is?". This cannot be specified currently. What is currently possible, and what concurs with your interpretation, is to answer "is this a valid YAML file for consumption after unifying it with a CUE schema converting it to YAML again?". But this is not the use case that is being addressed here. This extra unification step if often not an option. As far as I can tell, this is unsolvable with the current CUE. The use case in #740 is a pure validation use case. In the analysis of using CUE as a policy language as well as within the realm of querying, this inability gets more pervasive and also becomes more an intertwined problem where interpretation of meaning within the context of a certain CUE command will not help. I would say that this is the most important issue addressed by the required proposal. It would be possible to solve this without a language change, for instance Overall, the suggested semantics of In a nutshell, so far I can see a transition path to introduce I know I should compile more examples to expose the issues. In the meantime, though, seeing how #740 could be solved as a first step without the use of Not saying it isn't possible. In fact, some recent thoughts on CUE patterns and discriminator fields, as well as more reliance on |
Thank you for the "concrete" example. I see that specifying the We are missing the ability to do something. I don't know why someone would want to do that thing, so it's hard to think through a good design for a feature that I don't think should exist. If this feature did exist, I expect that I'd find it annoying. When it's time to write file foo.json, why do I need to restate a value that's fixed by other declarations? |
I wrote the comment above before reading @mpvl's #822 (comment), but got interrupted before I could post it. You are correct that I've been ignoring use of CUE to validate input data. My only use of CUE so far has been to help generate output data more concisely and correctly. That one language must carry the burden of checking input data while not augmenting it into compliance and also augmenting data for output makes for a difficult design, as you're struggling with here. Well, it's not so much that you're struggling to design it; the rest of us are struggling to figure out when each concern and feature matters. |
@seh To be clear, I actually see the issue, and there are apparently at least two of them:
Hopefully we can address both and still eat cake at the end of the day.
I'm curious about that option. If we can have only two distinct behaviors, wouldn't having "foo" and "foo!" be equivalent to having "foo?" and "foo", respectively? Wouldn't it be just syntax?
Sometimes that feels true, but in the above examples that's often not the case. For example, I hope we can type such expressions in the CLI eventually, and having to tell people to say Or do I misunderstand the examples? (Please note I've replied above simultaneously. Just want to make sure the reply doesn't get lost due to UI.) |
There are some interesting suggestions there I hadn't considered. As usual with CUE, one has to consider what the implications are for the value lattice. But some quick observations:
We've been writing back and forth concurrently, so I'll start by saying that I'm writing in response to #822 (comment). When validating input data that we don't intend to augment, the CUE field values express constraints the input data must meet. If the constraint is a concrete value, the input data must have that same concrete value. When generating output data that we can augment, the CUE field values express recipes to generate output data, safely combined with any starting values. Is it correct that we use unification during both validation and export? I take it that the problem is that with a regular field today that has a concrete value, unifying it with input data that lacks that field just causes the field to come into being, as opposed to catching that the field was absent in the input data. |
The merging of two values is a power feature which is great indeed, and not worth breaking. But definitions are already more schema-like than value-like, so flipping their regular fields to become a strict concrete requirement there unless adorned by the default marker would not break that aspect, or would it?
Agreed. If we do need to adorn the key or value, the original |
@mpvl Here is another perspective that occurred to me overnight, which I believe addresses #740 and is probably the least disruptive change to the language thus far. We've been thinking about that issue as "required concreteness", but as we discussed a few times above the idea of a required field already exists in the language, so that creates some confusion. What's curious is that the stated problem happens precisely because the field is concrete, which means it gets unified concretely instead of remaining a requirement constraint. So what if we think about it in terms of "required abstractness"? That's what we have normally and why despite unifications the idea of requirement isn't lost:
So wouldn't a simple solution be to transform the concrete value into an abstract one, using analogous language:
For the spec this just means turning That would prevent the unification of a concrete value, align with existing language, and make exporting and vetting that field able to catch the issue as it already does for required/regular fields which weren't properly defined. |
Yes, that is correct. There are a few reasons for that:
I believe you are suggesting here we could interpret definitions differently. Even if not, the following may provide useful context. There are in fact many things we can do if we go that road. To some extent we already do so, as definitions conflates the notions of "not being output during export" and "type checking on closed fields is enabled". The decision to conflate there was done with big trepidation and thorough consideration. In general, this does not seem to be a good idea. That said, I have considered it in this context as well and definitely don't want to rule it out. It is actually a whole new universe of neat solutions in that direction. Nonetheless, when working this out I ran into some issues that seemed to spell danger, so I backtracked and focussed on a solution that does not overload the meaning of definitions further. Again, that doesn't mean this cannot be a direction to take. |
That is a really neat idea! I always like it to use existing possible syntax when possible. I see one potential snag. The So using
and
Although for structs this may not be necessary. Note that the way structs work now is one of the warts of the current optional semantics. If a piece of JSON really doesn't need to specify a struct, it should be written as
But that is not necessary. That probably was the wrong choice, as it is rather inconsistent. If we were to fix the optional semantics, things would look more like:
This looks quite ugly to me, and provides quite a bad UX imo (it is easy to forget the It would on the other hand, make solving the |
I don't find that ugly, or beautiful either. I find it consistent and readable, which seems much more valuable in the context of CUE. I bet that if we share that snippet with someone that understands JSON but has never seen CUE before, they'll be able to guess exactly what sort of operation would be applied. The same is true for the proposed It also looks like dropping that default-to-empty behavior would fix a couple of the issues mentioned in the original proposal above too, so again feels like a move in the right direction on itself. So I guess I do find it beautiful after all. Not in terms of syntax, but in terms of solution. Reminds me a bit of conversations in the early days of Go itself. |
Ah, I see @rogpeppe point now. So here are some other things to consider:
Roger actually points out a very serious problem with the alternative approach. As Rog mentioned, it would still not be possible to default all non-concrete fields to required. A common use case is to have fields that have some expression that evaluates to a concrete value. For instance, the desired semantics for In other words, one can perhaps treat a field like
versus
Adopting the "sometimes required" semantics for regular fields, one would not be able to tell from the context whether something is required. Personally, I think it is quite a loss having to chase references to see figure out the semantics of a field.
I believe this was one of the reasons why we considered the LHS required specification to be superior over the RHS alternative. Note that aside from these issues, as @rogpeppe mentions, and I alluded to with "I can't see a transition path", a change in these semantics is a massive change to CUE. In contrast, adding This issue actually gets to the core of why this proposal is the way it is. Let's consider it in terms of types of fields we need:
Type 1. is less common, but important when it occurs and a must have. It will be more prevalent as CUE is used more for policy checks and querying. This is the semantics proposed with In the current implementation there is only 2, 3, and 4. If we don't introduce the notion of a required field, it is almost unavoidable to thinker with 2 and 3, as you've noticed, meaning we have to change the meaning of existing constructs in CUE, with all its consequences. Note also that the usage of Also note that the use of In fact, getting rid of With the current proposal, it is possible to add To take the route of only keeping Basically, the fact that To be fair, note that both paths actually change the meaning of regular fields. The big difference, though, is that the required field approach only changes this interpretation at the tooling level, becoming more permissive during Finally, more forward looking, one also has to consider symmetry with the query language. At the query level, it seems very natural to think of terms concrete versus non-concrete. Almost all JSON query languages will silently skip selections in a query projection if this field does not exists (there concreteness translates to present or not). So the most natural behavior seems to drop an element in In general, one observation was that the way one would think in terms of optional, required, concrete etc. in the context of the proposed query extension, quite closely matches the somewhat different way of thinking of optional vs required in this proposal. There is a reason that the query proposal hasn't yet been implemented and that the required proposal heavily refers to it. :) Anyway, again, the solution space is huge, so there can always be alternatives. Anyway. Just providing a few more data points. :) There are certainly remaining issues with |
I had many thoughts overnight, and found all of the intervening discussion this morning. I read all of it. Still, I am struggling with getting past this idea: A regular field pushes or creates data during export, and pulls and requires data during validation. When validating, it expresses a requirement of the input data. If the input data is absent, it can't satisfy the requirement. When exporting, it ensures that the exported data would satisfy the same validation rule coming back in later. If I write: kind: "Service" and I'm validating input, I'm demanding that the "kind" field is present and has the value "Service." If I write that and I'm exporting, I'm ensuring that the exported data has a "kind" field with the value "Service." If I didn't care whether or not the input data had a "kind" field, I could write As an author of CUE code, that directional treatment makes sense to me. What am I missing that we can't express that way? |
@seh. There are many moving parts and it is indeed easy to hold a view where things work different. I understand where you are coming from. I'll try to explain from that angle. The problem discussed in #740 can indeed be solved in various ways. The direction that this proposal takes is to have a single interpretation of CUE for all commands. A different approach, indeed, could be defining different behavior for different operations. In fact, There are many problems with the subsumption approach, though. With unification, a single operation satisfies all use cases. But subsumption needs to behave subtly different depending on the use case. The variables are: include optional field in parent/child?, pick defaults for parent/child?, project onto closed equivalent for parent/child? That's already 64 different forms of subsumption (not all make sense, granted), and there are probably more. None of this is relevant when solving the #740 and related issues with unification instead. The trick here is that the ambiguity is resolved by making configurations more explicit in intent. Note that the required/regular field approach is not just a syntactic change over the regular/optional field approach. It is this slight change in semantics that makes this possible. Another big problem with subsumption is that it is NP-complete in its general form. Not only that, its correct implementation is rather hard. There is currently no accurate implementation of subsumption for CUE: all applications where subsumption is used it suffices to allow false negatives. This is why the Note that, somewhat surprisingly, although the unification is expressed in terms of subsumption, it does not suffer from the same problems, with one exception: needing to determine whether regular expressions intersect to the empty set. This issue only manifests itself with Then a final problem with using subsumption is a matter of UX. Personally, I use As we probably need to introduce such a command anyway, if only to allow for backwards compatibility checks for apis, it may not be a bad idea to add regardless. It would be interesting to consider either way how this would look like in combination with this proposal. On top of these fundamental issues, there are a few practical issues going the subsumption route. Using subsumption still doesn't solve expressing an equivalent for Using Finally, the benefit of using unification requires CUE schema to be more precisely defined (or more accurately: the use of |
@mpvl I'm commenting mainly to give a solid +1 to that entire last message. Thanks for taking the time to explain, and really to think about all these corners of that foundational problem. I still wonder whether certain apparently simple cases such as querying will be confusing because the implied meaning for a field name there is for it to be present. But even if that turns out to be a problem in practice, it should be easier to solve it once we are in a position of consistent proper behavior, as suggested in that last comment, than from where we are today. |
We wondered the same. The reasoning that this is probably okay is that
It makes sense to enable an initial implementation for querying (or any of this, really) as an experiment or in some other way that allows us to make clear that the exact meaning may change. |
It's been almost a month since I last wrote here, and I finally ran into a situation that I think motivates required fields. I have a JSON file generated by a separate tool, and I want to consume that JSON file in a CUE instance. In that instance, I wrote a definition capturing the approximate schema to which I need that JSON file to conform. I'm using a pattern constraint for field names when defining the JSON schema. Later, I refine that general schema for the file format with a set of field names that I expect to be present in the input file. "TF" here means Terraform.

```
#TFOutput: [string]: {
	sensitive: bool
	// This is loose, but captures enough of the possibilities.
	type: string | [(string | [string, ...]), ...]
	value: _
}

#AWSTFOutput: #TFOutput & {
	let objectType = {
		type: ["object", ...]
	}
	output_name_1: objectType
	output_name_2: objectType
}
```

Here, this works to catch either of those output values being present but of the wrong type, but it doesn't catch either of those output values being absent. I suspect that's due to an interaction with the pattern constraint in the `#TFOutput` definition.
How do you vet the JSON file against this schema? For me this should work if you check for concreteness. It should fail, for example, if
That is, if you're using a version of CUE that accepts '!', right? I tried it with version 0.4.0, and it complained like this:
Indeed. There is no version of CUE that implements this proposal as yet. |
I found that it doesn't fail when I try to access |
This issue has been migrated to cue-lang/cue#822. For more details about CUE's migration to a new home, please see cue-lang/cue#1078. |
With extensive inputs from @myitcv
We propose a couple of small additions to the language, as well as a sharpening of the language specification, to solve a myriad of shortcomings related to specifying required and optional fields. We believe the result allows for a much more natural representation of required fields in CUE. This proposal also allows us to define tooling in a more consistent manner and lays the foundation for finalizing the query proposal.
The changes are mostly backwards compatible, but:

- the `-c` flag is needed for `cue eval` and `cue export` to get the same results with the old representation. This makes this flag consistent with the behavior of `vet` as well.

Automated tooling can be provided to rewrite CUE files.
Background
We now cover some core CUE concepts necessary to understand the proposal introduced in this document. In some cases, it points out general issues with these topics. People familiar with these concepts can skip this section.
Closedness
CUE closedness is intended to solve two problems:
A specific non-goal of defining closed structs is to define APIs that promise to never add an additional field. CUE does not offer a structured way to specify this, as it is believed that APIs should be open by definition.
A good analogy for the role of closedness in CUE is to compare it to Protobufs. By definition, protobufs are extensible. Fields with unknown tags on incoming messages should typically be ignored. However, when compiling a proto definition to, say, Go, the set of fields defined on a proto message is closed. An attempt to access an unknown field in a struct will result in a compile error. In CUE, the fact that a definition closes a struct should be interpreted in the sense of a compiled proto message and not as an indication that an API is closed for extension.
In that sense, enforcing mutual exclusivity in protos is a bit of an abuse of this mechanism. Although it has served the representation of protos quite well, it may help to consider alternatives.
Error types
To understand the proposal below, it is important to realize that CUE distinguishes between several error modes, most notably incomplete errors (for example `string + "foo"`, where an operand is not yet concrete) and fatal errors.

In the end these are all errors, but CUE evaluation and manifestation will fail at different points when encountering them.
Evaluation modes
CUE distinguishes two evaluation modes:
Unification
Unification combines two CUE values, preserving the entire space of possible values that is the intersection of the two input values. Unification is commutative, idempotent, and associative.

For instance, `cue def` presents the result of unification without any simplifications like picking defaults or resolving references if this would change the meaning of the CUE program.

There can be multiple ways to represent the same schema. For instance, `bool` is equivalent to `true | false`. The goal of unification is not to find a normalized representation of the result. Finding an optimal representation, by some definition, is an option, but not a necessity.

Default selection
Default selection is used when a full specification of a value is required, but an intermediate concrete value is needed to proceed. Essentially, this happens for references of the form `a.b` and `a["b"]`, where `a` needs to be a concrete list or struct to be able to select the appropriate value.

The steps are as follows:

The differentiating property of default selection is that all optional fields, pattern constraints, and closedness information need to be preserved. This puts limits on the amount of disambiguation that can be done ahead of time. It is the user's responsibility to disambiguate structs if applicable.
Manifestation
Manifestation projects a result from unification to a concrete value. It is used in `cue export`, but also, for instance, for:

- `x` and `y` in `x + y`
- `x.*` (corresponding to the Query Proposal: CUE Querying extension, #165)

We will write `manifest(x)` to denote the manifested value of `x`. This is not a CUE builtin.

Right now this encompasses:
Note that after this elimination different structs with different optional values are retained.
The Issues
We present some issues for the various use cases of CUE that motivated this proposal.
Encourage good API design
When evolving and augmenting an API, it is generally strongly recommended to not introduce backwards incompatible changes. This means that any added fields to an API should typically be optional.
In CUE, optional fields are indicated by a
?
. This means that when augmenting an API, one should always remember to add the?
. Especially for larger APIs, required fields should be relatively rare. It would be better if only the required fields had to be marked, while all other fields are optional. This would avoid inadvertently making fields required, and makes it more likely that making a field required is a deliberate choice.Specifying minimum and maximum number of required fields
The protobuf oneOf representation
OneOf fields in protobufs are currently represented as
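Roughly, such a representation is a disjunction of closed structs with optional fields, along these lines (hypothetical fields, not the exact snippet):

```
#Foo: {} | {a?: int} | {b?: string}
```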
This relies on all three disjuncts ultimately collapsing into the same value, assuming that optional fields can be ignored and the closedness to enforce mutual exclusivity.
The following could be used as an alternative.
This approach is a bit of an abuse of the closedness mechanism, whose predominant goal is to catch typos. It mostly works, although it does have a few issues. For instance, embedding
#Foo
would allow users to redefine a oneOf field, allowing it to be used in configurations that were not allowed by#Foo
. To some extent this is okay, considering what closedness is for, and the fact that embeddings disables this check. But it may be nice to address if possible.In the end, there is currently no explicit “official” way of specifying
oneOf
fields in CUE that can capture the most likely intent. CUE currently deploys a hacky workaround for this, but that is not a permanent solution.Specifying a certain number of required fields
The above mechanism does not generalize to specifying that, for instance, at least one of a set of fields should be given by a user.
Policy definitions and querying
One of the core ideas of CUE is that, with the right model, constraints can function as templates, resulting in a powerful combination. The way optional and required fields are marked now works fairly well in this context. These same mechanisms don’t work as smoothly in the context of querying and policy definitions, that is, using CUE to specify filters or additional constraints.
There are currently some gaps in the query proposal that can only be addressed with some refinement in the optionality model in CUE.
Requiring fields
Requiring the user to specify a field is currently achieved by specifying a field with a non-concrete value and then requiring non-concrete values to be concrete when exporting. Note that to CUE, non-concrete values are just values. So, in effect, this mechanism is also an abuse of the current semantics. Better would be to mark fields as required explicitly.
Requiring concrete values
A related problem is the ability to require the user to specify a concrete value. This is currently not possible. For instance
says that
intList
must be a list of integers. But since[...int]
defaults to[]
, CUE will happily fill in this value if a user doesn’t specify it. Similarly, requiring that a user specify a concrete value is not possible: unifying a user-supplied value withwould just happily fill in the missing
kind
.Error messages
A side-effect of having a mechanism to explicitly specify a field as required is that it will also lead to better error messages. Currently, when CUE discovers a non-concrete message it just complains about that: it has no way of knowing the author’s intent.
Having an error message explicitly state that the error is a missing field would be considerably clearer.
Evaluation
Disambiguating disjuncts
CUE currently only disambiguates identical disjuncts. This means that disjuncts with different optional fields will not disambiguate. The following is an amended example from #353
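A value of roughly this shape (hypothetical fields, not the exact snippet from #353) runs into the issue:

```
x: {a: 1, b?: int} | {a: 1, c?: string}
```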
As the two values have two different sets of optional fields, they are not considered equal.
This can result in confusing error messages to the user, like
incomplete value
{a: 1} | {a: 1}, for instance when running a command that does not show optional fields, like
cue evalor
cue export`.CUE does not do this as disambiguating optional fields is NP-complete in the general case. It is nonetheless confusing for the user.
A similar issue occurs with regular (non-optional) fields.
This can be solved by giving users better tools to disambiguate as well as having more permissive disambiguation.
cue
commandsAvoid confusion for users
CUE currently allows non-concrete values in the output. Unifying
and
succeeds with vet, because for CUE
int
is a valid value. The-c
option, will verify values are concrete. Forcue export
, however, values must be concrete to succeed.Overall, there are some discrepancies in how optionality and non-concreteness is handled across commands. A more consistent handling across commands would be useful.
What is
cue eval
for? How does it differ from cue export? More clarity for CUE commandscue eval
prints a form of CUE that is somewhat in between schema and concrete data. It may not even be correct CUE. The purpose of this was to get good feedback for debugging. But as CUE evolved this output has become increasingly confusing. A clearer model of output modes is in order.A more consistent output with more explicit debugging options shared between commands seems less confusing.
Proposal
This section gives a brief overview of the proposal. Details are given in the next section.
Required fields
We introduce the concept of a required field, written as
foo!: bar
, which requires that the field be unified with a namesake regular field (not required and not optional) which has a concrete value. A field that violates this constraint results in an “incomplete” error (the failure could be solved by adding a concrete value for the field).

Consider this simple example:
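A minimal sketch of the idea (the definition and field names are hypothetical):

```
#Person: {
	name!: string // must be unified with a concrete value
	age?:  int    // optional, may be omitted
}

alice: #Person & {name: "Alice"} // OK
bob:   #Person & {}              // incomplete error: name is required
```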
Further examples can be seen below.
A required field can be referenced as if it were a regular field:
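For example (sketch):

```
a!: int
b:  a + 1 // b references a and remains incomplete until a is given a concrete value
```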
A required field may have a concrete value. The same limitation holds for such fields, meaning that it must be unified with a regular field with that exact same concrete value.
numexist
builtinWe introduce a
numexist
builtin with the signaturewhich takes a numeric constraint and a variable list of expressions and returns
_
: if the number of expressions evaluating to a concrete value unifies withnum
._|_
otherwise, where the error is an incomplete error if the number could still be satisfied by having more concrete values.When evaluating the expressions, incomplete errors are ignored and counted as 0.
For instance, a protobuf oneOf with fields named
a
andb
could be written as:There is nothing special about the arguments
a
andb
here, they are just resolved as usual. In this case,numexist
passes as botha
andb
evaluates toint
, and thus the number of concrete values is 0, which matches<=1
. It would fail with a fatal error ifa
andb
were both, say,1
.Requiring that at least one of a set of fields is non-concrete can be written as:
In this case,
numexist
fails with an “incomplete” error, as the condition can still be resolved by setting eithera
,b
, or both to a concrete value.Ignoring incomplete errors is needed to allow the same construct for fields that have struct values:
without making
a
andb
optional fields, they would both always evaluate to a concrete value. For consistency, it is probably good style to always mark fields referred to by this builtin as optional.When optional fields are used, as in
#P3
above, we could consider another builtinnumexists
that checks for the existence of a reference, instead of concreteness.More specific manifestation
We propose adjusting manifestation as follows (new or changed steps emphasized):
Step 3 can be optional and only affects the outcome insofar it may lead to different disambiguation for structs. Which to choose may depend on the output mode requested by the user.
Unlike with unification, there is logically only one correct representation (disregarding field order for now).
Tooling alignment
The introduction of required fields and a more specific manifestation can be used to simplify the commands of the tooling layer.
The biggest change is that we will generally not interpret non-concrete fields as errors, making them behave somewhat similar to optional fields today. The old behavior would still be available through the
-c
option. Some commands, likecue vet
have always taken this interpretation. So this change will bring more consistency between commands, while addressing many of the above errors.Details of the command alignment is discussed below.
Detailed design and examples
Required fields
Semantics
A required field, denoted
foo!: bar
, requires that the field be unified with one or more regular fields (not required and not optional) giving a concrete result. A field that violates this constraint results in an “incomplete” error.

For instance, consider this example:
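The example is presumably of this shape (sketch):

```
x!: {
	a!: int
}
```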
This would require that the user specify field
x
which must have a fielda
with a concrete value. Dropping the!
fromx
would mean that users don’t have to specify a fieldx
but when they do, they also need to specify a concrete value fora
.The required constraint is a property of fields. Here are some examples of how required fields unify with regular fields and other required fields:
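A few illustrative cases, assuming the semantics described in this proposal (a sketch, not an exhaustive table):

```
{foo!: int} & {foo: 3}    // requirement satisfied by a concrete regular field; behaves as {foo: 3}
{foo!: int} & {foo!: <=5} // still a required field: {foo!: int & <=5}
{foo!: 3}   & {foo: <=3}  // cannot be simplified further (see below)
```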
A total ordering of all types of fields can be expressed as
Note that
{foo!: 3} & {foo: <=3}
: cannot be simplified further. The definition requires that the values of the regular fields together be concrete. Logically,{foo: <=3}
could unify with{foo: >=3}
to become{foo: 3}
(it represents the same set of numbers), so rewriting{foo!: 3} & {foo: <=3}
as either{foo!: 3}
would result in loss of information. Retaining this distinction is important to keep associativity and a valid lattice structure.Limitations
The definition imposes some limitations on the use of the
!
flag. For instance, if a definition were to have both a fieldfoo!: 3
andfoo: 3
, the latter would remove the requirement for the user to specify a field, becausefoo: 3
satisfies the requirement offoo!: 3
to provide a concrete value. It doesn’t matter that this comes from the same definition. In general, when marking a field as required, it is advisable that all other constraints on the same field in a definition (or at least those with concrete values) should specify the!
as well. These cases could be caught with acue vet
rule, requiring that in a single definition all fields must be either required or not. This would also help readability.Implementation
We now present a possible implementation. Understanding this section is not key to the proposal.
The implementation of required fields could be done with an “internal builtin”
__isconcrete
defined as follows: `__isconcrete(x)`
if unified with a valuey
returnsa fatal error if unification fails
_
if it succeeds andy
is concretean incomplete error otherwise
This would be implemented along the lines of current validators (like
<10
) and the proposedmust
builtin.Example rewrites:
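For instance, along these lines (the internal spelling is illustrative only):

```
a!: int              // as written by the user
a: __isconcrete(int) // internal rewrite
```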
Also
__isconcrete(int) & __isconcrete(3)
→__isconcrete(3)
.After this conversion, unification proceeds along the usual lines.
We opted for a syntactic addition to the language and not to have the user define this builtin directly, as its usage is rather confusing. For nested structures we only care about concreteness at the top level. So using
__isconcrete({name: int})
would work, but would do unnecessary work and also give the impression that fieldname
itself should be concrete.Another reason is that people will usually think of this functionality as specifying a
required
field. Technically speaking, though, as CUE treats non-concrete values as “present”, it should be called “concrete” and not “required”. The use of!
avoids this issue.Implications for
?
Note with the addition of
!
, the use of?
would be eliminated in most cases. As we saw for the proposal fornumconcrete
, however, there are still some use cases for it. For this reason, as well as backwards compatibility, we expect that?
will stay.Example: querying
Regardless of whether subsumption or unification is used for querying (see #165), it is currently not possible to specify a field should be concrete but of a certain set of values.
However, with the
!
constraint, we can writeto query all values of
a
that have a name field starting with a lowercase letter.Note, the
!
would not be necessary when using subsumption and querying using concrete values likename: “foo”
. It would also not be necessary if all values ina
can be assumed to be concrete or if including non-concrete values is desirable.Example: policy specification
It is currently awkward to write that a value should have either a value of one kind or another. For instance:
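Presumably a disjunction like the following (the bounds are only illustrative):

```
{a: >10} | {b: <10}
```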
would match both variants if a user didn’t specify any of the fields (assuming
a
andb
are valid fields).Using the exclamation mark (
{a!: >10} | {b!: <10}
) would explicitly require that one of these fields be present.Example: require user to specify values
This proposal allows for requiring a configuration to specify concrete values:
a!: [...]
a!: {...}
a!: 1
This can be useful, for instance, to require the user to explicitly specify a discriminator field:
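For example (sketch; the field name and values are placeholders):

```
kind!: "frontend" | "backend" // must be specified explicitly and concretely
```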
Example: discriminator fields
The use of
!
can signal to the evaluator the presence of discriminator fields. This could form the basis of a substantial performance boost for many configurations. Tooling could later help to annotate configurations with such fields (detecting discriminator fields is akin to anti-unification).

Consider:
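A sketch of the kind of setup meant here (the names follow the prose below):

```
#Object: {
	kind!: string
}

#Service: #Object & {
	kind!: "Service"
}
```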
In this case, the user is required to explicitly specify the discriminator field. But even if the
!
is dropped from#Service.kind
, as a convenience to the user, the!
from#Object
would still signal that this is likely a discriminator field. The same holds if a#MyService: #Service & kind: “Service”
instantiation would make the field concrete.Example: at least one of
The required field can be used to specify that at least one of a collection of fields should be concrete:
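Along these lines (sketch; the name and fields are placeholders):

```
#AtLeastOne: {a!: int, b: int} | {a: int, b!: int}
```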
If both were concrete, the resulting value will be identical and duplicate elimination will take care of the disambiguation.
numconcrete
builtinThe
numconcrete
builtin allows mutual exclusivity of values where this would be hard or expensive to express using the required field construct introduced above. It has the additional benefit that it more directly conveys the intent of the user.

In many cases it is possible to express the required field annotation in terms of this builtin. For instance,
a!: int
could be written asHowever, it is not possible to express requiring the user to specify a specific concrete value, such as
a!: 1
. For this reason, as well as for convenience in specifying required fields in policy definitions, it makes sense to have both constructs.Things are a bit tricker when requiring fields of structs or lists to be concrete. As these are always concrete, all values would pass.
Need for
?
In the current proposal, we rely on
?
to allow specifying a non-concrete struct for a field. Alternative, this could be done by a builtin, likemust
:or some dedicated builtin. The must builtin (as proposed in #575) would essentially remain incomplete as long as it is not unified with a concrete value.
The use of
?
, though, seems more elegant.numvalid
Considering the need for
?
, it may be useful to also have a builtinnumexist
which counts the number of valid references (values that are not an error). This has the advantage that it will enforce the consistent use of?
for those fields that are part of a oneOf, making them stand out a bit more. In the case of Protobuffers, this works well, and may more accurately reflect intent.We considered using
numvalid
instead ofnumexist
, but it does not cover all cases correctly. A separate builtin proposal should lay out all the builtins and their relevant consistency.Naming
numexist
seems to accurately describe what the builtin does. It may not be the most intuitive naming, though.numrequired
seems on the surface to be a likely candidate, but it would be a misnomer, as this is specifically not what that means. (In fact, the term required field is also not quite correct, even though it conveys its predominant use case.)An possible name, albeit less descriptive, could be
numof
. As may be confusing, however, as it alludes to the JSON schema constructsanyOf
andoneOf
, which are different in nature. In that sensenumof(count, ...expr)
would seem more to indicate a disjunction (like theor
builtin), where the user can indicate the number of disjuncts that should unify.Implementation
numconcrete
is a so-called non-monotonic construct: a failing condition may be negated by making a configuration more specific. CUE handles these constructs in a separate per-node post-evaluation validation phase. It must be careful to not return a permanent error if a resolution of the constraint is still possible. In this regard, it will be similar to the implementation ofstruct.MinFields
andstruct.MaxFields
.Other than that, the implementation is straightforward: evaluate the expressions, count the number of concrete values (non-recursively), counting incomplete errors as 0. The latter is necessary to allow non-required fields of structs and lists to fill this pattern.
vet rules
It may be surprising to the user that
always passes (
{}
is always considered concrete). A vet rule should detect use cases where an argument is always concrete, and suggest the use of?
or possiblynumexists
.Example: protobuf
To annotate protobuf oneOfs, under this proposal one could write
Note that this doesn’t change all that much for the current model and the current model would still work. The main difference is that it enables a stronger hint for early elimination of alternatives to the CUE compiler.
This notation also doesn’t address the embedding issue: it is still possible to add a field that before was mutually exclusive, even overriding its original type. For instance:
would evaluate to
Arguably, this comes with the territory of using embedding, which gives the power of extension, but disables checking: it is already the responsibility of the user to embed with care. Also, one can imagine having a
vet
check to guard against such likely mistaken use.That said, the
numexist
builtin would allow writing the above proto definition as:The use of optional fields here is unnecessary, but are needed to cover the general case, for instance if
a
andb
were fields with a struct value. One advantage is that this sets these fields visually apart from fields that are not subject to concreteness counts.Note how this closely resembles the “structural form” as used for OpenAPI for CRDs.
The resulting type cannot be extended to redefine
a
. Also, the definition gets rid of disjunctions, and defines the intent of the constraint more clearly. This, in turn, can help the CUE evaluator be more performant.Evaluation modes
We now briefly discuss the phases of CUE evaluation, to show how they will interplay with optional and non-concrete fields.
Manifestation
We proposed that in the manifestation phase we disambiguate disjuncts based on concrete values only. It will be important to not leak this final step into earlier phases of evaluation, such as default selection. Doing so may, for instance, cause a disjunct with arbitrary optional fields to be used for closedness checks.
Consider this example:
Purely on the basis of concrete values, these two are identical. However, simply picking the first or second when resolving
a.b
would give different results fora.b & {foo: 1}
.Doing disambiguation early, however, has quite considerable performance benefits. An implementation can work around this by clearly marking a result as ambiguous. For instance, deduped elements can have counts of the number of values that were collapsed into it.
Example: Disambiguating disjuncts.
With the new semantics of manifestation, the current example from issue #353 resolves as expected: consider
In this case all two disjuncts in the resulting disjunction will have the same concrete values. This will even be the case for non-optional fields if we allow non-concrete fields to be ignored.
Example: Schema simplification
A typical Kubernetes definition now looks like
littered with question marks.
Under this proposal, the above can be written as
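Roughly along these lines (a heavily trimmed, illustrative sketch rather than a real Kubernetes schema):

```
#Deployment: {
	apiVersion!: string
	kind!:       "Deployment"
	metadata!: {
		name!:  string
		labels: {[string]: string}
	}
	spec: {
		replicas: int | *1
		...
	}
}
```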
This eliminates the majority of uses for
?
and marks required fields more explicitly.Printing modes
cue
printingThe majority of use cases in cue seem to be to use
cue export
orcue eval
along similar lines.We propose a “default command”
cue <selection>
which uses manifestation with the following specification:Omitting non-concrete values from the output avoids flooding the printing from non-concrete fields merged in from definitions. For similar reasons CUE omits optional fields (those with a
?
) from the output today. So the choice to omit non-concrete values is a logical consequence of allowing users to omit the use of?
in the majority of cases in schema.Ignoring non-concrete values when exporting is a departure from how
cue export
works and how generally constraints are enforced in CUE. To provide backwards compatibility users could use the-c
flag toPrinting modes:
_foo#bar
-style package classification. This may be a debug option to reflect it is not valid CUE as of this moment.The default printing mode is CUE. Output formats can be chosen with the flag
--out
.cue export
Much of the current functionality of
cue export
is reproduced in commandcue
. We propose to repurpose theexport
command adding the functionality described below.Backwards compatibility
cue export
would differ in one major way: non-concrete fields would be omitted from export; only those explicitly marked as required would result in an error when not concrete.The old behavior can be obtained by using the
-c
flag, just as one would have to use it today forcue eval
andcue vet
, making behavior between all of these more consistent.File output
cue export
would otherwise be repurposed to be the inverse ofcue import
. It would be like command-lesscue
, but would interpret@export
attributes to generate specific files.More specifically, any struct may contain one or more export attributes that evaluates to a CUE “filetypes” specifier to direct export to write a file of a certain type.
Consider the following CUE file.
This would instruct
cue export
to generate two files.By default the files will be exported in the
txtar
format as follows:Note that
bar.yaml
represents a JSON schema here.The comment section of the txtar output (above all files) would describe how to reconstruct the original file from the generated files. Note that this example utilizes several currently unsupported features, like JSON imports and JSON Schema output.
Options:
cue def
The define command
cue def
will remain as the main way to simplify CUE schema without resolving them to concrete values.It is allowed to simplify disjunctions (retaining semantics), but it may be on a best-effort basis. Default values are retained
Special features to be addressed in a different design doc could include:
Query extension
One big motivation for this proposal is to narrow down some of the gaps needed for the query proposal (see #165). One such gap was how to define queries and whether such queries should only return concrete matches or everything.
It is not the goal of this proposal to fully define these remaining gaps, but at least to show in detail how the main proposal interacts with this proposal and solves some remaining puzzles.
Selectors
CUE is different from other query languages in that a user can query concrete and non-concrete values. At the same time, we would like to avoid confusion between these two modes of operation. In particular, we need to define what to expect for selecting values in projections.
CUE currently has one form of selection:
a.foo
. To understand the various modes of selection in projections, we also propose a variant of this:a.foo?
. For regular selection they are defined as follows:foo
if it exists ina
.foo
is never allowed in aa.foo
, but instead of an incomplete error it would return the constraints forfoo
iffoo
were defined as_
.So
a.foo?
wherea
is a struct is equivalent to(a&{foo:_)).foo
, andb.1?
, whereb
is a list (allowing integer indexes for lists in selection) is equivalent to(b & [_, _] )[1]
. Thefoo?
variant works around a common issue reported by users.Now pulling in the query proposal (#165), let’s consider the equivalent for projections, that is, how these selectors behave for a “stream” of values that is the result of a query. Firstly, we expect that typical query usage will either center on querying concrete data, or on API definitions that have not been manifested.
To facilitate this view and translating the semantics of selectors to that of projections, we assume that for the purpose of projections a non-concrete value (like
<10
) is treated as an “incomplete” error.Given this definition, we can then define:
Forms 1 and 2 will fail if an illegal value is selected (fatal error).
The semantics of treating non-concrete values as an incomplete error when querying were partly chosen to be consistent with the proposed default mode for `cue` commands of silently ignoring non-concrete values (unless the `-c` option is used), making behavior consistent and predictable across the spectrum.

Note that there is precedent within CUE for expecting values to be concrete. For instance, the operands of most binary expressions (except `&` and `|`) will result in an incomplete error when not concrete. The proposed semantics for queries are identical.

Querying with subsumption (instance-of relation)
The `!` notation solves an issue with allowing a value to be used for subsumption in query filters. Consider the following:
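(A sketch of the kind of data under discussion; `a`, `foo`, and `bar` are illustrative names.)

```cue
a: foo: name: string // name is declared, but not concrete
a: bar: name: "bar"  // name is concrete
```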
Using a subsumption filter `{name: string}` would also match `foo`, as it is, strictly speaking, subsumed. Using `!`, we can work around this:
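(Again a sketch, using the bracket syntax of the query proposal together with the required-field marker:)

```
a.[: {name!: string}]
```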
will select only `bar`.

We could require that if a field is specified in a query, it must be concrete. That is a bit of an odd rule, though. The `!` notation seems a natural solution.

Subsumption variants
Subsumption in the general case is an expensive operation. Using the definition of the different evaluation modes, however, we can distinguish two different kinds of subsumption:

Note that closed structs are defined in terms of pattern constraints, so any closed struct classifies as type 2.
Patterns of type 1 would be executed as actual subsumption.
For patterns of type 2, however, we would require that the queried values must have been explicitly unified with this value. For instance, the query
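(a sketch, using the bracket syntax of the query proposal, with `v1.#Service` taken from the surrounding text)

```
a.[: v1.#Service]
```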
would search for any value in `a` that was unified with `v1.#Service`. So a value that has all the same fields as a `#Service` would still not match unless it was unified with this explicit definition. In effect, this introduces a notion of explicit typing, rather than just relying on isomorphic equivalence.

Such selection is easy to implement efficiently, and may be a good compromise.
Transition
Although this is a big change to the language, we foresee a relatively smooth transition. The meaning of `?` would largely remain unaltered.

Phase 0
Introduce the new disambiguation semantics. This should be done before v0.3. Although somewhat different, v0.2 has similar semantics, and introducing this before a final release will allow for a smoother transition.
Phase 1
In phase one we would introduce the `!` annotation and the `numconcrete` builtin to work as proposed.

Phase 2
Add an experiment environment variable, like `CUE030EXPREQUIRED`, to enable the new semantics of eliding non-concrete fields. In this mode, the `-c` flag would mimic the old behavior.

A `cue fix` flag allows users to rewrite their CUE files to the new semantics.

If the flag does not allow for a fine-grained enough transition, we could consider defining a transitionary field attribute to define the interpretation of such a field on a per-field level.
Phase 3
Decide on whether to proceed with phase 4.
Phase 4
The biggest change will be relaxing the rules for non-concrete fields and moving away from excessive usage of `?`. This would be done as a minor pre-1.0.0 change (e.g. v0.4.0 or v0.5.0).

The biggest issue is for installations that rely on the absence of `?` meaning a field is required. It will be good to ask users whether the use of `cue fix` and/or `-c` is sufficient, or whether an API feature should be supported as well.

Adding a feature to `cue fix` to “promote” fields on a one-off basis would be straightforward. Generated configurations could just be regenerated.
The removal of `?` may also have performance implications, as CUE processes optional fields differently internally. The implementation can be adjusted, however, to overcome performance issues. Experience with v0.2, which processed optional fields similarly to regular fields, showed that the performance impact of this is relatively small. Structure sharing can further mitigate this issue, and we should probably ensure this is implemented before the transition.

Alternatives considered
Alternative disjunction simplifications
We also considered eliminating non-concrete values from disjunctions. For instance, at manifestation time (only!):
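A disjunction mixing a concrete and a non-concrete branch (an assumed example, chosen to match the simplification described next):

```cue
a: 1 | int
```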
could in such a case be simplified to `a: 1`. This would obviate the need for the default marker in this case.

The overall intuition here is that this would be weird, though.
By extension, we also chose not to simplify
To achieve this effect without using defaults, users would have to write
A variant that would allow such simplification is open for consideration, though, especially if it can help fully deprecate the use of `?`.

Other definitions of `foo!: int`
We have considered the following meanings of `foo!: int`:

`foo!: int` as an optional field

`foo!: bar` is an optional field that must unify with any concrete field to be valid.

It sort of would still work if people would diligently use `?` for fields in definitions, but this would defeat one of the main purposes of introducing `!` in the first place.

But logically the `foo: int > foo!: int` relation makes sense, as the `!` constrains the regular field. This alone created too much of a contradiction to work well.

require to be unified with a non-definition field with concrete value
The main advantage of this approach is that it would allow adding the required constraint to a schema that already defines that field as a concrete value.
The main drawback of this approach is that it is not possible to create a derivative definition of a schema with a required field that fills in that required field for the user.
For instance, suppose we have a definition:
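(a sketch; `#Def` and the field name are illustrative, with `kind` taken from the surrounding text)

```cue
#Def: {
	kind!: string // required, using this proposal's ! marker
}
```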
And we create the derivative:
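(continuing the sketch, filling in `kind` from within a definition)

```cue
#MyDef: #Def & {
	kind: "MyDef"
}
```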
then `kind` would still not be set.

It seems that this should be possible, though. Specifying a concrete field in this case is akin to setting a default value. In general it is considered good CUE style to define defaults separately from an API, so this would be consistent with that definition.

Also, although the CUE evaluator can track the origin of fields quite easily, there is no representation for a “field that has already been unified with a concrete field”.
require to be unified with a non-definition field and a concrete value
The distinction from the former is that the concrete value may originate from a definition. This definition, however, is not associative.
Required indication on the right-hand side
We considered writing `foo: int!` or `foo: required(int)` instead of `foo!: int`, making the requiredness a property of the value instead of the field.

This didn’t sit well with the requirement of needing to unify with a concrete value from an optional field: `{foo?: 1} & {foo: int!}` would be hard to represent correctly in CUE. A goal is to allow representing evaluation results in CUE itself, but we did not see a way to accomplish that here.

List type
One motivation for this proposal was the ability to define a non-concrete list. For this we considered introducing Go-style list types.

However, indicating the size can already be done by syntax introduced in the query notation:
which would just be a generalization of CUE.
Also, this would still not solve the same problem for non-concrete structs or scalar values.
So overall, this would be a heavy-weight solution for little benefit.
Querying using unification
The `!` operator would also be useful for querying values using unification. Normally, a query like `a.[: {name: 2}]` would produce quite unexpected results: it would unify with any element that either has `name` set to `2` or for which `name` is undefined, possibly setting that value to `2`.

To avoid this, users could write `a.[: {name!: 2}]`.

We considered this to be too cumbersome and surprising to be a viable solution, though.
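To make the surprise described above concrete (a sketch with made-up data):

```cue
a: [
	{name: 2},     // matches, as expected
	{name: 3},     // does not match: name conflicts with 2
	{other: true}, // name is undefined here, so unifying with {name: 2}
	               // would still succeed, implicitly adding name: 2
]
```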
Alternate definitions for querying by subsumption
There are many possible variants; we mention a few. All of these could be considered.
Definitions as types
This would allow queries like
but perhaps not more funky queries using pattern constraints.
Definitions as types, subsume concrete values only
If the subsuming value is a definition, the subsumed instances must have been unified with this value.
For non-definitions, we only match the manifestation (concrete values) of the value.
This would allow full pattern matching.
Always subsume manifested values only
This would allow unrestricted unification. This seems limited, though, as people may want to query APIs with certain properties.
The syntax `a.*.[:{}]?` could be used to query the non-manifested value. Similar restrictions may still have to be applied to subsumption in this mode, though they would typically be irrelevant to the casual user.

Alternative semantics for projection selectors

We’ve considered a more direct correspondence between selectors for projection and regular selection. This results in the following definitions:
For the common use case of requiring concrete data, this would mean that users would have to almost always use the third form. This seems undesirable and will likely result in too many gotchas. In the end, we were able to get the desired behavior for selectors in projections by only considering a non-concrete value to be an “incomplete” error. This seems to be a reasonable solution. Consider also that interpreting a non-concrete value as incomplete already happens at various points in CUE evaluation.